Introduction
In today's digital landscape, background tasks are a crucial part of many applications, whether that means automating repetitive processes or keeping responses instant while heavy work runs elsewhere. One popular solution is Celery. In this article, we'll walk through creating and managing background tasks with Celery, integrating it seamlessly with FastAPI, and containerizing the whole project. Let's dive in!
FastAPI: A modern web framework for building APIs with Python. Its use of asynchronous programming, combined with the power of the Starlette framework, allows FastAPI to achieve remarkable performance.
Celery: A Python-based distributed task queue system with built-in support for task scheduling, result tracking, and fault tolerance (a minimal usage sketch follows this list).
Docker: An open-source platform that lets you bundle applications and their dependencies into compact, portable containers. These containers guarantee a consistent runtime environment, so your application runs the same on any host system.
Docker Compose: A tool that simplifies the management of multi-container applications. It lets you define and configure multiple Docker containers as a single application in a YAML file, including the dependencies and relationships between services.
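Before diving into the project, here's a minimal, self-contained sketch of how a Celery task is defined and dispatched. This is illustrative only, not the project's actual code; the broker URL and names here are assumptions:

from celery import Celery

# A throwaway Celery app pointed at a local Redis broker/result backend
celery = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)

@celery.task
def add(x: int, y: int) -> int:
    """A trivial background task."""
    return x + y

# .delay() enqueues the task and returns immediately with an AsyncResult
# result = add.delay(2, 3)
# print(result.get(timeout=10))  # -> 5

Calling .delay() never blocks the caller; the actual work happens in a separate worker process, which is exactly what keeps a web API responsive.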
Project Structure
celery-with-fastapi/
├── app/
│   ├── celery/
│   │   ├── app.py
│   │   ├── worker.py
│   │   └── __init__.py
│   ├── core/
│   │   ├── config.py
│   │   └── __init__.py
│   ├── __init__.py
│   └── main.py
├── .env
├── .gitignore
├── docker-compose.yml
├── Dockerfile
├── README.md
├── requirements.txt
└── run.sh
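The repository's app/main.py isn't reproduced in this article, but an endpoint matching the curl call shown later might look roughly like this. It's a sketch under assumptions: the task is named create_task and lives in app.celery.worker; the real project may differ:

from fastapi import FastAPI
from pydantic import BaseModel

from app.celery.worker import create_task  # assumed task location and name

app = FastAPI()

class TaskPayload(BaseModel):
    type: int

@app.post("/tasks", status_code=201)
def run_task(payload: TaskPayload):
    # Enqueue the task on the Celery worker; respond immediately with its id
    task = create_task.delay(payload.type)
    return {"task_id": task.id}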
Dockerization
Make sure you've installed Docker and docker-compose on your machine. Your Dockerfile, run.sh, and docker-compose.yml files should look as follows.
Dockerfile
# Base image
FROM python:3.10-buster

# set work directory
WORKDIR /app

# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# install dependencies
COPY ["requirements.txt", "run.sh", "./"]
RUN pip install --upgrade pip
RUN pip install --no-cache-dir -r requirements.txt && chmod +x ./run.sh

# copy project
COPY . .

# Set entrypoint
ENTRYPOINT bash ./run.sh
run.sh
celery -A app.celery.app worker -l info --concurrency=2 &
celery -A app.celery.app flower -l info &
celery -A app.celery.app beat -l INFO &
uvicorn app.main:app --host 0.0.0.0 --reload --workers 2
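This script starts a Celery worker (with two worker processes), the Flower monitoring dashboard, and Celery beat for periodic tasks in the background, then runs the FastAPI app with Uvicorn in the foreground. Beat only does useful work when a schedule is configured; the Celery app might declare one roughly like this (a sketch, where the task path and interval are assumptions):

from celery import Celery

celery = Celery("tasks", broker="redis://celerybackend:6379/0")

# Run an assumed task every 30 seconds via Celery beat
celery.conf.beat_schedule = {
    "periodic-example": {
        "task": "app.celery.worker.create_task",  # hypothetical task path
        "schedule": 30.0,  # seconds
        "args": (0,),
    },
}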
docker-compose.yml
version: "3.8" services: web: build: . ports: - 8001:8000 - 5556:5555 volumes: - .:/app depends_on: - celerybackend celerybackend: image: redis:latest ports: - 6379:6379 healthcheck: test: [ "CMD", "redis-cli", "ping" ] interval: 5s timeout: 30s retries: 50 restart: always
Up and Running
This is the final code. Before getting started, clone the repository first, then create a .env file in the project root and copy in these lines:
# .env file
CELERY_BROKER_URL=redis://celerybackend:6379/0
CELERY_RESULT_BACKEND=redis://celerybackend:6379/0
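Inside the application, these variables can be picked up from the environment; app/core/config.py might do something like this (a sketch using only the standard library; the real project may load its settings differently):

import os

class Settings:
    CELERY_BROKER_URL: str = os.environ.get(
        "CELERY_BROKER_URL", "redis://celerybackend:6379/0"
    )
    CELERY_RESULT_BACKEND: str = os.environ.get(
        "CELERY_RESULT_BACKEND", "redis://celerybackend:6379/0"
    )

settings = Settings()

With the .env file in place, clone the repository (if you haven't already), then build and start the stack: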
git clone https://github.com/shahnawaz-pabon/celery-with-fastapi.git
cd celery-with-fastapi
docker-compose up --build
You'll see the backend project running at http://0.0.0.0:8001/ and the Flower dashboard at http://0.0.0.0:5556/.
Trigger a task
Open a terminal and trigger a new task:
curl http://localhost:8001/tasks -H "Content-Type: application/json" --data '{"type": 0}'
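Assuming the endpoint responds with a task id (as in the endpoint sketch earlier), you can inspect the task's state from Python. This is a sketch; it assumes the Celery instance is importable from app.celery.app:

from celery.result import AsyncResult

from app.celery.app import celery  # assumed import path for the Celery app

task_id = "<paste the id returned by POST /tasks>"
result = AsyncResult(task_id, app=celery)
print(result.status)  # e.g. PENDING, STARTED, SUCCESS
print(result.result)  # the task's return value once it has finished

You can also watch the same task move through its states live in the Flower dashboard at http://0.0.0.0:5556/.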
To learn more about Celery, see its official documentation.
I would like to express my heartfelt gratitude to each and every one of you who took the time to read my articles. Your support and engagement mean the world to me. 🙏🙏🙏