# FastAPI and Celery
Learning distributed task queues by doing. Since it's a greenfield project, it also uses a newer async web framework, FastAPI.
See: [Celery docs](https://celery-safwan.readthedocs.io/en/latest/index.html)
## Debugging Celery
### Check results in redis result backend
```
docker-compose exec redis sh
redis-cli
KEYS celery*
MGET celery-task-meta-<UID of Task>
```
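Task state can also be inspected from Python through Celery's result API instead of raw redis keys. A minimal sketch; the import of the Celery app instance is an assumption, adjust it to wherever this project actually creates the app:
```
# Sketch: look up a task result through Celery rather than redis-cli.
from celery.result import AsyncResult

# Assumption: import the project's actual Celery app instance here.
from main import celery_app

result = AsyncResult("<UID of Task>", app=celery_app)
print(result.status)  # PENDING / STARTED / SUCCESS / FAILURE
print(result.result)  # the task's return value, or the exception on failure
```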
### Checking results in Flower
Use the Flower dashboard at `http://0.0.0.0:5557`.
### Eager Task Processing
Setting `CELERY_TASK_ALWAYS_EAGER: bool = True` makes Celery execute tasks synchronously. This allows us to use things like `breakpoint()` to enter a debugger within the execution context of the task.
Since the app currently uses the FastAPI application config for the Celery config, add `CELERY_TASK_ALWAYS_EAGER=True` to the config class if needed. You don't need to run the worker, message broker, or result backend processes to debug your code; the task is processed within the calling web application process.
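A minimal sketch of what that could look like, assuming a pydantic `BaseSettings`-style config class (the class and field names are illustrative; match them to the project's actual settings module):
```
# config.py (sketch) -- flip CELERY_TASK_ALWAYS_EAGER on for local debugging
from pydantic import BaseSettings  # pydantic v1; use pydantic-settings for v2


class Settings(BaseSettings):
    CELERY_BROKER_URL: str = "redis://redis:6379/0"
    CELERY_RESULT_BACKEND: str = "redis://redis:6379/0"
    # Run tasks synchronously in the calling process so breakpoint()/pdb works.
    CELERY_TASK_ALWAYS_EAGER: bool = True


settings = Settings()
```
With eager mode on, calling `divide.delay(1, 2)` runs the task inline and returns an `EagerResult`, so a `breakpoint()` inside the task drops you into the web process's debugger.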
### Using rdb in Docker Compose environment
Celery ships with [an extension to pdb called rdb](https://docs.celeryq.dev/en/stable/userguide/debugging.html) that will allow for remote debugging using a tool like `telnet`.
To use this with your dockerized app you must either have `telnet` installed in the container, or expose the debug port by setting `CELERY_RDB_PORT` in config and adjusting the compose file accordingly. By default the remote session is only reachable from localhost; it can also be made reachable from another host by setting `CELERY_RDB_HOST`.
Set the breakpoint like you would with pdb:
```
def foo():
    from celery.contrib import rdb
    rdb.set_trace()
```
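For instance, in this project the breakpoint goes inside whatever task you want to inspect. A sketch using the `divide` task referenced below (the real task body may differ; only the `rdb.set_trace()` call matters):
```
# project/users/tasks.py (sketch -- the actual task body may differ)
from celery import shared_task
from celery.contrib import rdb


@shared_task
def divide(x, y):
    rdb.set_trace()  # the worker log prints the telnet host/port to connect to
    return x / y
```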
Run your dockerized environment:
```
docker compose up -d
docker compose logs -f
```
Trigger the execution path to the break point:
```
docker compose exec web bash
python
>>> from main import app
>>> from project.users.tasks import divide
>>> divide.delay(1, 2)
```
You will see a debug session with connection details displayed in the logs. Now you just need to connect to the remote session. Since this is running in a docker container that doesn't have the port exposed, you will need to exec into the container and use `telnet`.
```
docker compose exec celery_work bash
telnet 127.0.0.1 <randomish port>
(Pdb)
(Pdb) help
```
It is also possible to enable a [signal to enter a remote debugger](https://docs.celeryq.dev/en/stable/userguide/debugging.html#enabling-the-break-point-signal): start the worker with `CELERY_RDBSIG=1` and then send the worker process a `SIGUSR2`.
```
CELERY_RDBSIG=1 celery worker -l INFO
```
```
kill -USR2 <celery_pid>
```