FastAPI and Celery

Learning distributed task queues by doing. Since this is a greenfield project, it also uses a newer async web framework (FastAPI).

See: Celery docs

Debugging Celery

Check results in the Redis result backend

docker-compose exec redis sh
redis-cli
KEYS celery*
MGET celery-task-meta-<UID of Task>
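
You can also inspect a result from Python using Celery's AsyncResult instead of querying Redis directly. A minimal sketch, assuming the project's Celery app has already been imported and configured:

# sketch: run inside a shell/REPL where the project's Celery app is configured
from celery.result import AsyncResult

res = AsyncResult("<UID of Task>")  # same task id as in the redis key above
print(res.status)                   # e.g. PENDING, STARTED, SUCCESS, FAILURE
print(res.result)                   # return value (or exception) once the task has finished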

Checking results in Flower

Use the Flower dashboard at: http://0.0.0.0:5557

Eager Task Processing

CELERY_TASK_ALWAYS_EAGER: bool = True will execute Celery tasks synchronously. This allows us to use things like breakpoint() to enter a debugger within the execution context of the task.

Since the app currently uses the FastAPI application config for the Celery config, add CELERY_TASK_ALWAYS_EAGER = True to the config class when needed. You don't need to run the worker, message broker, or result backend processes to debug your code; the task is processed within the calling process.
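
A minimal sketch of what that could look like, assuming the Celery settings live on a pydantic-style config class (the actual class name and fields in this repo may differ):

# hypothetical config sketch; adapt to the real config class in this project
from pydantic import BaseSettings  # pydantic_settings.BaseSettings on Pydantic v2

class Settings(BaseSettings):
    CELERY_BROKER_URL: str = "redis://redis:6379/0"
    CELERY_RESULT_BACKEND: str = "redis://redis:6379/0"
    # Run tasks synchronously in the calling process so breakpoint() works.
    CELERY_TASK_ALWAYS_EAGER: bool = True
    # Re-raise task exceptions immediately instead of storing them.
    CELERY_TASK_EAGER_PROPAGATES: bool = True

settings = Settings()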

Using rdb in a Docker Compose environment

Celery ships with an extension to pdb called rdb that allows remote debugging with a tool like telnet.

To use this with your dockerized app you must either have telnet installed in the container, or publish the debug port on the container by setting CELERY_RDB_PORT in the config and adjusting the compose file accordingly. By default the remote debug session only listens on localhost; it can be made reachable from another host by also setting CELERY_RDB_HOST.

Set the breakpoint like you would with pdb:

def foo():
    from celery.contrib import rdb
    rdb.set_trace()
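
For example, dropping a breakpoint into this repo's divide task might look like this (a sketch; the real task module and body may differ):

# project/users/tasks.py -- hypothetical sketch
from celery import shared_task

@shared_task
def divide(x, y):
    from celery.contrib import rdb
    rdb.set_trace()  # the worker pauses here and logs the telnet host/port to use
    return x / y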

Run your dockerized environment:

docker compose up -d
docker compose logs -f

Trigger the execution path to the breakpoint:

docker compose exec web bash
python
>>> from main import app
>>> from project.users.tasks import divide
>>> divide.delay(1, 2)

The worker logs will show that a debug session has started, along with its connection details. Now you just need to connect to the remote session. Since this is running in a Docker container that doesn't have the port exposed, you will need to exec into the container and use telnet.

docker compose exec celery_work bash
telnet 127.0.0.1 <randomish port>
(Pdb)
(Pdb) help

It is also possible to enter a remote debugger via a signal by sending SIGUSR2 to the worker process when CELERY_RDBSIG is set:

CELERY_RDBSIG=1 celery worker -l INFO
kill -USR2 <celery_pid>