FastAPI and Celery
 
 
 
 
 

Learning distributed task queues by doing. Since this is a greenfield project, it also uses a newer async web framework, FastAPI.

See: Celery docs

Debugging Celery

Check results in redis result backend

docker-compose exec redis sh
redis-cli
KEYS celery*
MGET celery-task-meta-<UID of Task>
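Each `celery-task-meta-*` key holds a JSON document produced by Celery's default json result serializer. A sketch of decoding one — the values below are illustrative, only the field names come from the serializer's payload shape:

```python
import json

# Example payload as MGET would return it (illustrative values)
raw = """{"status": "SUCCESS", "result": 0.5, "traceback": null,
          "children": [], "date_done": "2023-01-01T00:00:00",
          "task_id": "8f3f0a7e-0000-0000-0000-000000000000"}"""

meta = json.loads(raw)
print(meta["status"], meta["result"])  # SUCCESS 0.5
```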

Checking results in Flower

Use the flower dashboard at: 0.0.0.0:5557

Eager Task Processing

Setting CELERY_TASK_ALWAYS_EAGER: bool = True executes Celery tasks synchronously. This allows us to use things like breakpoint() to enter a debugger within the execution context of the task.

Since the app currently uses the FastAPI application config for the Celery config, add CELERY_TASK_ALWAYS_EAGER=True to the config class if needed. You don't need to run the worker, message broker, or result backend processes to debug your code; the task is processed within the calling process.

Using rdb in Docker Compose environment

Celery ships with an extension to pdb called rdb that will allow for remote debugging using a tool like telnet.

To use this with your dockerized app you must either have telnet installed in your container or expose the debug port on the container by setting CELERY_RDB_PORT in the config and adjusting the compose file accordingly. By default the remote debugging session is only reachable from localhost; it can be made reachable from another host by also setting CELERY_RDB_HOST.
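celery.contrib.rdb reads its host and port from these environment variables at set_trace() time. A minimal sketch of pinning them so the port can be published in the compose file — the port value 6900 is an illustrative choice, not one used by this project:

```python
import os

# Pin rdb's listen address so the port can be published in docker-compose.yml
os.environ.setdefault("CELERY_RDB_HOST", "0.0.0.0")  # listen beyond localhost
os.environ.setdefault("CELERY_RDB_PORT", "6900")     # fixed, exposable port
```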

Set the breakpoint like you would with pdb:

def foo():
    from celery.contrib import rdb
    rdb.set_trace()

Run your dockerized environment

docker compose up -d
docker compose logs -f

Trigger the execution path to the break point:

docker compose exec web bash
python
>>> from main import app
>>> from project.users.tasks import divide
>>> divide.delay(1, 2)

You will see a debug session with connection details displayed in the logs. Now you just need to connect to the remote session. Since this is running in a docker container that doesn't have the port exposed, you will need to exec into the container and use telnet.

docker compose exec celery_work bash
telnet 127.0.0.1 <randomish port>
(Pdb)
(Pdb) help

It is also possible to enter a remote debugger by sending SIGUSR2 to the worker, provided CELERY_RDBSIG is set when the worker starts.

CELERY_RDBSIG=1 celery worker -l INFO
kill -USR2 <celery_pid>