Celery

  • Task queue; workers run as separate processes, typically on their own server

  • Worker jobs run in the background

  • All task data is serialized/deserialized; the default serializer is JSON

  • Needs a message broker (e.g. SQS, RabbitMQ) and a result backend (e.g. Redis, Postgres, MongoDB)

  • v5.2 supports Python versions up to 3.10

Setup

pip install celery
pip install "celery[sqs]"  # to use with SQS

Need to choose a message broker (see Deployment/Task Queues)
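
The broker choice is expressed as a URL passed to the Celery app. A couple of sketch values; hosts and credentials are placeholders:

broker_url = "amqp://user:password@rabbit-host:5672//"  # RabbitMQ
broker_url = "sqs://"  # SQS; credentials come from the environment or IAM role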

File Structure

  • module_name/

    • __init__.py

    • celery.py

    • tasks.py

SQS

  • Make sure to turn debug mode off in Django so the broker URL (which embeds your AWS credentials) isn’t leaked in error pages; use amqp (e.g. RabbitMQ) instead if you need the monitoring features below

  • SQS doesn’t yet support worker remote control commands or events, so celery events, celerymon, and the Django Admin monitor won’t work

Use predefined queues if the worker can’t (or shouldn’t) create/delete queues itself
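
This maps to Celery’s predefined_queues broker transport option; a sketch, with the queue name, region, account ID, and credentials as placeholders:

app.conf.broker_transport_options = {
    "predefined_queues": {
        "my-queue": {  # placeholder queue name
            "url": "https://sqs.us-west-2.amazonaws.com/123456789012/my-queue",
            "access_key_id": "xxx",
            "secret_access_key": "xxx",
        }
    }
}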

Basics

  • backend URL can be an external DB URL in the form db+postgresql://user:password@host/db_name

module_name/celery.py
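
A minimal sketch of what celery.py holds; module_name and the broker/backend URLs are placeholders:

from celery import Celery

# placeholder names/URLs; swap in your own module, broker, and backend
app = Celery(
    "module_name",
    broker="sqs://",
    backend="redis://localhost:6379/0",
    include=["module_name.tasks"],  # so the worker imports the task definitions
)

if __name__ == "__main__":
    app.start()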

module_name/tasks.py
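
A matching sketch for tasks.py; add is just an illustrative task:

from .celery import app

@app.task
def add(x, y):
    # placeholder task; real tasks would do the actual background work here
    return x + y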

Then run the worker from the folder above:
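
Assuming the module_name layout above:

celery -A module_name worker --loglevel=INFO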

run as a background process using daemonization in prod

Then, in a separate terminal or a Python script inside the folder, you can schedule tasks that return a result:
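
A sketch using the placeholder add task from above:

from module_name.tasks import add

result = add.delay(4, 4)       # enqueue on the broker; returns an AsyncResult immediately
print(result.get(timeout=10))  # block until the worker stores the result, prints 8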

Storage

Results can be stored in Redis, an SQLAlchemy-supported DB, MongoDB, RPC (transient messages sent back to the caller), etc.; set via the backend option

SQLAlchemy

  • Just add an SQLAlchemy connection string as the backend and Celery will store task info in two new tables (celery_taskmeta and celery_tasksetmeta)

  • db+postgresql://postgres:XXXXXXXXXXXXXXX@teachingassistant.aaaaaaaaa.us-west-2.rds.amazonaws.com:5432; note the db+ prefix (see the sketch below)
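
A sketch of setting it via config; user, password, host, and db_name are placeholders:

app.conf.result_backend = "db+postgresql://user:password@host:5432/db_name"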

Advanced
