Setting Up Auto-reload

Part 1, Chapter 5


Live code reloading is a simple yet effective way for developers to get quick feedback on code changes. While Django provides this functionality out-of-the-box, Celery does not. So, you'll have to manually restart the workers every time you change a task's code, which makes for a frustrating developer experience.

In this chapter, we'll look at two solutions to the Celery worker auto-reload problem so that the workers restart whenever the codebase changes.

Each solution has two sections:

  1. Overview provides a broad overview of the solution
  2. Project Implementation shows how to add the solution into the course project

In this course, we'll use the second solution, but be sure to review the first as well. If you'd like an additional challenge, try implementing it too.

Objectives

By the end of this chapter, you will be able to:

  1. Describe two solutions to the Celery worker auto-reload problem so that the workers restart whenever the codebase changes
  2. Implement one of the solutions into your codebase

Solution 1: Custom Django Command

Overview

You can write a Django management command to restart the Celery workers and then hook that command into Django's autoreload utility.

Add the following management command:

import shlex
import subprocess
import sys

from django.core.management.base import BaseCommand
from django.utils import autoreload


def restart_celery():
    # kill celery process
    # run celery process
    pass


class Command(BaseCommand):
    def handle(self, *args, **options):
        print('Starting celery worker with autoreload...')
        autoreload.run_with_reloader(restart_celery)

Now, when you run this custom Django command, the worker will restart automatically whenever Django's autoreload utility detects a change to the codebase.

Project Implementation

To incorporate this into your project, let's add a Django management command.

Create the following files and folders inside "polls":

polls
└── management
    ├── __init__.py
    └── commands
        ├── __init__.py
        └── celery_worker.py
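If you prefer the command line, the same tree can be created from the project root like so (a quick sketch; adjust paths if your project layout differs):

```shell
mkdir -p polls/management/commands
touch polls/management/__init__.py
touch polls/management/commands/__init__.py
touch polls/management/commands/celery_worker.py
```

The empty __init__.py files are required so that Django discovers the management command as part of the package.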

Edit polls/management/commands/celery_worker.py like so:

import shlex
import subprocess
import sys

from django.core.management.base import BaseCommand
from django.utils import autoreload


def restart_celery():
    celery_worker_cmd = "celery -A django_celery_example worker"
    cmd = f'pkill -f "{celery_worker_cmd}"'
    if sys.platform == "win32":
        cmd = "taskkill /f /t /im celery.exe"

    subprocess.call(shlex.split(cmd))
    subprocess.call(shlex.split(f"{celery_worker_cmd} --loglevel=info"))


class Command(BaseCommand):
    def handle(self, *args, **options):
        print("Starting celery worker with autoreload...")
        autoreload.run_with_reloader(restart_celery)

Notes:

  1. In restart_celery, we declare celery_worker_cmd and use pkill -f to search the OS processes for it, killing the worker if it exists.
  2. Then, we start a new Celery worker.
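Note that both command strings are tokenized with shlex.split before being handed to subprocess.call, so the quoted worker command reaches pkill as a single pattern argument. A quick sketch of what that tokenization produces:

```python
import shlex

celery_worker_cmd = "celery -A django_celery_example worker"
cmd = f'pkill -f "{celery_worker_cmd}"'

# shlex.split respects shell-style quoting, so the full worker command
# stays together as one argument for pkill's -f pattern match
print(shlex.split(cmd))
# → ['pkill', '-f', 'celery -A django_celery_example worker']
```

Splitting on whitespace alone would break the pattern into several arguments, which is why shlex.split is used instead of str.split.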

Update compose/local/django/celery/worker/start:

#!/bin/bash

set -o errexit
set -o nounset

python manage.py celery_worker

As you can see, we replaced celery -A django_celery_example worker --loglevel=info with our new Django command.

Next, you'll need to install the procps package to use the pkill command, so install the package in compose/local/django/Dockerfile:

...
RUN apt-get update \
  # dependencies for building Python packages
  && apt-get install -y build-essential \
  # psycopg dependencies
  && apt-get install -y libpq-dev \
  # Translations dependencies
  && apt-get install -y gettext \
  # Additional dependencies
  && apt-get install -y git procps \
  # cleaning up unused files
  && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
  && rm -rf /var/lib/apt/lists/*
...

The full file should now look like this:

FROM python:3.12-slim-bullseye

ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1

RUN apt-get update \
  # dependencies for building Python packages
  && apt-get install -y build-essential \
  # psycopg dependencies
  && apt-get install -y libpq-dev \
  # Translations dependencies
  && apt-get install -y gettext \
  # Additional dependencies
  && apt-get install -y git procps \
  # cleaning up unused files
  && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
  && rm -rf /var/lib/apt/lists/*

# Requirements are installed here to ensure they will be cached.
COPY ./requirements.txt /requirements.txt
RUN pip install -r /requirements.txt

COPY ./compose/local/django/entrypoint /entrypoint
RUN sed -i 's/\r$//g' /entrypoint
RUN chmod +x /entrypoint

COPY ./compose/local/django/start /start
RUN sed -i 's/\r$//g' /start
RUN chmod +x /start

COPY ./compose/local/django/celery/worker/start /start-celeryworker
RUN sed -i 's/\r$//g' /start-celeryworker
RUN chmod +x /start-celeryworker

COPY ./compose/local/django/celery/beat/start /start-celerybeat
RUN sed -i 's/\r$//g' /start-celerybeat
RUN chmod +x /start-celerybeat

COPY ./compose/local/django/celery/flower/start /start-flower
RUN sed -i 's/\r$//g' /start-flower
RUN chmod +x /start-flower

WORKDIR /app

ENTRYPOINT ["/entrypoint"]
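As an aside, the repeated `sed -i 's/\r$//g'` lines strip Windows-style carriage returns from the start scripts so the shebang lines run cleanly under bash. The equivalent transformation, sketched in Python:

```python
# CRLF line endings (e.g., from editing a script on Windows) break the
# shebang line in Linux containers; stripping the trailing \r fixes it,
# which is what sed -i 's/\r$//g' does in the Dockerfile
script = "#!/bin/bash\r\nset -o errexit\r\n"
fixed = script.replace("\r\n", "\n")
print(fixed.splitlines())
# → ['#!/bin/bash', 'set -o errexit']
```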

Re-build the Docker image and spin up the new containers:

$ docker compose up -d --build 

To test out the auto-reload, first open the logs:

$ docker compose logs -f 

Now make a code change to the divide task in django_celery_example/celery.py. You should see the worker automatically restart in your terminal:

celery_worker_1 | /app/django_celery_example/celery.py changed, reloading.
celery_worker_1 | Starting celery worker with autoreload...

Solution 2: Watchfiles

Overview

Watchfiles (previously called Watchgod), a helpful tool for monitoring file system changes, can restart the Celery worker after a code change.

$ pip install watchfiles 

Assuming you run your Celery worker like so:

$ celery -A django_celery_example worker --loglevel=info 

To incorporate Watchfiles, you'd now run it like this:

$ watchfiles --filter python 'celery -A django_celery_example worker --loglevel=info' 

Notes:

  1. --filter python tells watchfiles to only watch .py files.
  2. 'celery -A django_celery_example worker --loglevel=info' is the command we want watchfiles to run
  3. By default, watchfiles will watch the current directory and all subdirectories
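Conceptually, `--filter python` acts as a predicate over changed file paths. A rough stdlib-only sketch of that idea (an illustration, not watchfiles' actual implementation):

```python
from pathlib import Path


def is_python_change(path: str) -> bool:
    # roughly what --filter python does: only react to .py files,
    # ignoring logs, requirements files, editor swap files, etc.
    return Path(path).suffix == ".py"


print(is_python_change("django_celery_example/celery.py"))  # → True
print(is_python_change("requirements.txt"))                 # → False
```

Filtering this way keeps the worker from restarting on irrelevant writes, such as log files changing inside the watched directory.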

Try it out. Make a change to a .py file inside the "django_celery_example" directory. Watchfiles will restart the worker. You should see something like:

[03:18:56] 2 changes detected
worker: Hitting Ctrl+C again will terminate all running tasks!

worker: Warm shutdown (MainProcess)

Project Implementation

To add this to your project, first add the following requirement to requirements.txt:

watchfiles==1.0.4 

Then, update compose/local/django/celery/worker/start:

#!/bin/bash

set -o errexit
set -o nounset

watchfiles \
  --filter python \
  'celery -A django_celery_example worker --loglevel=info'

Re-build the Docker image and spin up the new containers:

$ docker compose up -d --build 

To test out the auto-reload, first open the logs:

$ docker compose logs -f 

Now make a code change to the divide task in django_celery_example/celery.py. You should see the worker automatically restart in your terminal.



