`CELERY_TASK_ALWAYS_EAGER: bool = True` makes Celery execute tasks synchronously, in the calling process, instead of sending them to a worker. This allows us to use things like `breakpoint()` to enter a debugger within the execution context of the task.
Since the app currently uses the FastAPI application config for the Celery config, add `CELERY_TASK_ALWAYS_EAGER = True` to the config class if needed. You don't need to run the worker, message broker, or result backend processes to debug your code; the task is processed within the process that calls it.
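As a sketch, the flag would sit alongside the other Celery settings in the config class (this assumes a plain settings class; the real app may use pydantic `BaseSettings`, and the broker/backend values here are illustrative):

```python
# Hypothetical settings class mirroring the app's FastAPI config pattern;
# every name except CELERY_TASK_ALWAYS_EAGER is an illustrative assumption.
class Settings:
    CELERY_BROKER_URL: str = "redis://localhost:6379/0"
    CELERY_RESULT_BACKEND: str = "redis://localhost:6379/0"
    # Execute tasks synchronously in the calling process -- a breakpoint()
    # inside a task now drops straight into pdb.
    CELERY_TASK_ALWAYS_EAGER: bool = True
```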
Celery ships with [an extension to pdb called rdb](https://docs.celeryq.dev/en/stable/userguide/debugging.html) that will allow for remote debugging using a tool like `telnet`.
To use this with your dockerized app, either install `telnet` in the container or expose the debug port by setting `CELERY_RDB_PORT` in the config and adjusting the compose file accordingly. By default the remote debugging session only listens on localhost; setting `CELERY_RDB_HOST` as well makes it reachable from another host.
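`rdb` reads these settings from environment variables, so one way to pin them is to set the variables before the worker process starts (the host and port values below are illustrative assumptions, not defaults):

```python
import os

# Illustrative values: listen on all interfaces and use a fixed, known port
# so the compose file can publish it. These must be set in the worker's
# environment before the worker process starts.
os.environ["CELERY_RDB_HOST"] = "0.0.0.0"
os.environ["CELERY_RDB_PORT"] = "6900"
```

In practice you would set these in the `environment:` section of the worker service in your compose file rather than in code.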
Set the breakpoint like you would with pdb:
```
def foo():
    from celery.contrib import rdb
    rdb.set_trace()
```
Run your dockerized environment and follow the logs:
```
docker compose up -d
docker compose logs -f
```
Trigger the execution path to the break point:
```
docker compose exec web bash
python
>>> from main import app
>>> from project.users.tasks import divide
>>> divide.delay(1, 2)
```
You will see a debug session with connection details displayed in the logs. Now you just need to connect
to the remote session. Since this is running in a docker container that doesn't have the port exposed, you
will need to exec into the container and use telnet.
```
docker compose exec celery_work bash
telnet 127.0.0.1 <randomishport>
(Pdb)
(Pdb) help
```
It is also possible to set a [signal to enter a remote debugger](https://docs.celeryq.dev/en/stable/userguide/debugging.html#enabling-the-break-point-signal) by sending a `SIGUSR2` signal to the worker process.