what is celery?
An open source async task queue that manages tasks outside of the normal request/response cycle of frameworks like Flask and Django.
This means a task can run in the background while the web server stays free to respond to new requests.
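For example (a minimal sketch; start_division is a made-up view name and divide is the task defined later in these notes), a view can queue the work and return immediately:
# views.py
from django.http import JsonResponse
from project_x.celery import divide

def start_division(request):
    # queue the task on the broker and return without waiting for the result
    task = divide.delay(1, 2)
    return JsonResponse({"task_id": task.id})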
alternatives
other Python task queues include RQ, Huey, and Dramatiq
main celery parts
message broker
- The intermediary used to transport task messages between producers and consumers
- Commonly RabbitMQ or Redis
result backend
- Stores the results of tasks
- Commonly Redis
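Both parts can be backed by a single local Redis instance; one quick way to run one (assuming Docker is installed):
docker run -p 6379:6379 -d redis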
django configuration
requirements.txt
django==5.0
celery==5.3.6
redis==5.0.1
celery.py
should be in the same directory as wsgi.py
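assuming the project is named project_x, the layout looks like this:
project_x/
├── manage.py
└── project_x/
    ├── __init__.py
    ├── settings.py
    ├── wsgi.py
    └── celery.py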
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' app.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project_x.settings')
# whatever the django project's name is
app = Celery("project_x")
# set it so that celery configs in settings.py have CELERY_ prefix
app.config_from_object('django.conf:settings', namespace='CELERY')
# discover and load tasks.py from from all registered Django apps
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)__init__.py
update __init__.py that’s in the same directory as celery.py
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
settings.py
CELERY_BROKER_URL = "redis://127.0.0.1:6379/0"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/0"
adding a task
in the app's directory, create a tasks.py file (a task can also live next to the app instance, as in the example below)
app.task
# in project_x/celery.py, below app = Celery("project_x")
import time

@app.task
def divide(x, y):
    time.sleep(5)
    return x / y
shared_task
The @shared_task decorator lets you create tasks without having a concrete app instance present, which makes the code reusable across apps.
import requests
from celery import shared_task

@shared_task
def sample_task(email):
    requests.post('https://httpbin.org/delay/5')
running a celery worker and kicking off a task
celery -A project_x worker --loglevel=info
in a new terminal inside the root directory of the django project:
source venv/bin/activate
./manage.py shell
from project_x.celery import divide
task = divide.delay(1, 2)
# both kick off a task: delay() is a shortcut that uses default options,
# while apply_async() gives access to all the options
task = divide.apply_async((1, 2))  # note: positional args go in a tuple
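# apply_async also accepts options, e.g. countdown delays execution by n seconds
# (countdown and a blocking get() are standard Celery APIs)
task = divide.apply_async((1, 2), countdown=10)
# block until the result is ready (raises if the task failed)
task.get(timeout=30)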
# shows the current state of the task: PENDING, FAILURE, SUCCESS
task.state
# the task's return value; 1 / 2 will show 0.5
task.result
inspect celery
The Celery CLI can be used for monitoring, e.g. listing the tasks currently executing:
celery -A project_x inspect active
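other standard inspect subcommands:
celery -A project_x inspect registered   # task names each worker knows about
celery -A project_x inspect scheduled    # tasks queued with an eta/countdown
celery -A project_x inspect stats        # per-worker statistics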