Add additional Celery debugging options to the README

drew/tilt-local-dev
Drew Bednar 2 years ago
parent e50d34e265
commit 6b3bee66c1

@ -24,3 +24,55 @@ Use the flower dashboard at: `0.0.0.0:5557`
`CELERY_TASK_ALWAYS_EAGER: bool = True` will execute Celery tasks synchronously. This allows us to use tools like `breakpoint()` to enter a debugger within the execution context of the task.
Since the app currently uses the FastAPI application config for the Celery config, add `CELERY_TASK_ALWAYS_EAGER=True` to the config class if needed. You don't need to run the worker, message broker, or result backend processes to debug your code; the task will be processed within the calling process.
### Using rdb in Docker Compose environment
Celery ships with [an extension to pdb called rdb](https://docs.celeryq.dev/en/stable/userguide/debugging.html) that allows remote debugging with a tool like `telnet`.
To use this with your dockerized app you must either have `telnet` installed in the container, or expose the debug port on the container by setting `CELERY_RDB_PORT` in the config and adjusting the compose file accordingly. By default the remote debugging session is only reachable on localhost; it can also be made reachable from another host by setting `CELERY_RDB_HOST`.
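For example (the values below are arbitrary illustrations, not the project's actual settings), the worker environment could pin the host and base port so a compose port mapping can be added to match:

```shell
# Illustrative values -- pick any free port and adjust the compose
# file's port mapping accordingly.
export CELERY_RDB_HOST=0.0.0.0   # listen on all interfaces, not just localhost
export CELERY_RDB_PORT=6900      # fixed base port instead of a random one
```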
Set the breakpoint like you would with pdb:
```
def foo():
    from celery.contrib import rdb
    rdb.set_trace()
```
Run your dockerized environment:
```
docker compose up -d
docker compose logs -f
```
Trigger the execution path to the breakpoint:
```
docker compose exec web bash
python
>>> from main import app
>>> from project.users.tasks import divide
>>> divide.delay(1, 2)
```
You will see a debug session with connection details displayed in the logs. Now you just need to connect
to the remote session. Since this is running in a Docker container that doesn't have the port exposed, you
will need to exec into the container and use `telnet`.
```
docker compose exec celery_worker bash
telnet 127.0.0.1 <randomish port>
(Pdb)
(Pdb) help
```
It is also possible to set a [signal to enter a remote debugger](https://docs.celeryq.dev/en/stable/userguide/debugging.html#enabling-the-break-point-signal) by sending `SIGUSR2` to a worker started with `CELERY_RDBSIG=1`.
```
CELERY_RDBSIG=1 celery worker -l INFO
```
```
kill -USR2 <celery_pid>
```
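Under the hood this is ordinary POSIX signal handling; a minimal stdlib-only sketch (not Celery's actual implementation, which installs a handler that calls `rdb.set_trace()`) of registering a `SIGUSR2` handler and triggering it in-process:

```python
import os
import signal

entered = []

def handler(signum, frame):
    # Celery's real handler enters the remote debugger here;
    # we just record that the signal arrived.
    entered.append(signum)

# Register the handler, then send SIGUSR2 to our own process.
signal.signal(signal.SIGUSR2, handler)
os.kill(os.getpid(), signal.SIGUSR2)

print(entered == [signal.SIGUSR2])  # True
```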

@ -22,21 +22,21 @@ COPY ./compose/local/fastapi/entrypoint /entrypoint
 RUN sed -i 's/\r$//g' /entrypoint
 RUN chmod +x /entrypoint
-COPY ./compose/local/fastapi/start /start-web
+COPY ./compose/local/fastapi/start.sh /start-web.sh
-RUN sed -i 's/\r$//g' /start-web
+RUN sed -i 's/\r$//g' /start-web.sh
-RUN chmod +x /start-web
+RUN chmod +x /start-web.sh
-COPY ./compose/local/fastapi/celery/worker/start /start-celeryworker
+COPY ./compose/local/fastapi/celery/worker/start.sh /start-celeryworker.sh
-RUN sed -i 's/\r$//g' /start-celeryworker
+RUN sed -i 's/\r$//g' /start-celeryworker.sh
-RUN chmod +x /start-celeryworker
+RUN chmod +x /start-celeryworker.sh
-COPY ./compose/local/fastapi/celery/beat/start /start-celerybeat
+COPY ./compose/local/fastapi/celery/beat/start.sh /start-celerybeat.sh
-RUN sed -i 's/\r$//g' /start-celerybeat
+RUN sed -i 's/\r$//g' /start-celerybeat.sh
-RUN chmod +x /start-celerybeat
+RUN chmod +x /start-celerybeat.sh
-COPY ./compose/local/fastapi/celery/flower/start /start-flower
+COPY ./compose/local/fastapi/celery/flower/start.sh /start-flower.sh
-RUN sed -i 's/\r$//g' /start-flower
+RUN sed -i 's/\r$//g' /start-flower.sh
-RUN chmod +x /start-flower
+RUN chmod +x /start-flower.sh
 WORKDIR /app

@ -7,7 +7,7 @@ services:
 dockerfile: ./compose/local/fastapi/Dockerfile
 image: fastapi_celery_example_web
 # '/start' is the shell script used to run the service
-command: /start-web
+command: /start-web.sh
 # this volume is used to map the files and folders on the host to the container
 # so if we change code on the host, code in the docker container will also be changed
 volumes:
@ -37,7 +37,7 @@ services:
 context: .
 dockerfile: ./compose/local/fastapi/Dockerfile
 image: fastapi_celery_example_celery_worker
-command: /start-celeryworker
+command: /start-celeryworker.sh
 volumes:
 - .:/app
 env_file:
@ -51,7 +51,7 @@ services:
 context: .
 dockerfile: ./compose/local/fastapi/Dockerfile
 image: fastapi_celery_example_celery_beat
-command: /start-celerybeat
+command: /start-celerybeat.sh
 volumes:
 - .:/app
 env_file:
@ -65,7 +65,7 @@ services:
 context: .
 dockerfile: ./compose/local/fastapi/Dockerfile
 image: fastapi_celery_example_celery_flower
-command: /start-flower
+command: /start-flower.sh
 volumes:
 - .:/app
 env_file:

@ -9,6 +9,7 @@ from celery import shared_task
 @shared_task
 def divide(x, y):
     import time
     time.sleep(5)
