Celery
Distributed task queue for Python. Current version is 5.6.2 (Jan 2026); requires Python >=3.9. A broker (Redis or RabbitMQ) is always required; there is no built-in broker. The old uppercase config settings (CELERY_TASK_SERIALIZER etc.) were removed in Celery 5.0. SQS transport: pycurl was replaced by urllib3 in 5.5, then reverted in 5.6, so SQS users must reinstall pycurl when upgrading from 5.5 to 5.6. Security fix: before 5.6, broker URL passwords were logged in plaintext.
Warnings
- breaking All old uppercase CELERY_* config keys (CELERY_BROKER_URL, CELERY_RESULT_BACKEND, CELERY_TASK_SERIALIZER, etc.) were deprecated in Celery 4.0 and removed in Celery 5.0. Setting them is either silently ignored or raises an error; use the lowercase equivalents (broker_url, result_backend, task_serializer, ...).
- breaking Python 3.8 support dropped in Celery 5.6.0. Minimum is now Python 3.9.
- breaking SQS transport: pycurl was replaced by urllib3 in Celery 5.5, then reverted back to pycurl in 5.6 due to critical issues. Users who uninstalled pycurl after upgrading to 5.5 must reinstall it before upgrading to 5.6.
- breaking Security: broker URLs containing passwords were logged in plaintext by the delayed delivery mechanism before 5.6. Credentials visible in log files.
- gotcha Celery requires a broker — there is no built-in broker. Without a running Redis or RabbitMQ instance, all .delay() and .apply_async() calls raise kombu.exceptions.OperationalError.
- gotcha The result backend is separate from the broker. Without a configured result_backend, result.get() blocks forever or raises NotImplementedError. Many tutorials configure the broker but forget the backend.
- gotcha Task serializer defaults to json in Celery 5.x. Passing non-JSON-serializable arguments (datetime, Decimal, custom classes) raises kombu.exceptions.EncodeError. Convert such objects to JSON-safe types manually, or opt into the pickle serializer explicitly via task_serializer and accept_content (with the usual pickle security caveats).
- gotcha The Celery app must be importable as a module: workers import it from the module path passed to -A. Tasks defined inline in a notebook cell or inside an if __name__ == '__main__': block are invisible to workers.
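A minimal sketch of working around the json serializer limit: convert non-JSON types to JSON-safe values before dispatch and restore them inside the task body. This is pure-Python illustration of the round trip kombu performs on task arguments; the payload shape is hypothetical.

```python
import json
from datetime import datetime, timezone

# datetime is not JSON-serializable, so passing it to .delay() would
# raise kombu.exceptions.EncodeError. Convert to ISO-8601 caller-side...
dt = datetime(2026, 1, 15, 12, 30, tzinfo=timezone.utc)
payload = {"generated_at": dt.isoformat(), "rows": 42}
body = json.dumps(payload)  # what the json serializer does to task args

# ...and parse it back inside the task body.
received = json.loads(body)
restored = datetime.fromisoformat(received["generated_at"])
assert restored == dt
print(restored.year)  # 2026
```

The same pattern applies to Decimal (send as str) and custom classes (send a dict of primitive fields).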
Install
- pip install celery
- pip install celery[redis]
- pip install celery[rabbitmq]
- pip install celery[sqs]
Imports
- Celery
from celery import Celery

app = Celery(
    'myapp',
    broker='redis://localhost:6379/0',
    backend='redis://localhost:6379/0'
)

# Correct: use @app.task decorator
@app.task
def add(x, y):
    return x + y

# Call async
result = add.delay(4, 6)
print(result.get(timeout=10))
Quickstart
# tasks.py
from celery import Celery

app = Celery(
    'tasks',
    broker='redis://localhost:6379/0',
    backend='redis://localhost:6379/0'
)

app.conf.update(
    task_serializer='json',
    accept_content=['json'],
    result_serializer='json',
    timezone='UTC',
)

@app.task
def add(x, y):
    return x + y

@app.task(bind=True, max_retries=3)
def fetch_data(self, url):
    try:
        import requests
        return requests.get(url).json()
    except Exception as exc:
        raise self.retry(exc=exc, countdown=5)

# --- Run worker ---
# celery -A tasks worker --loglevel=info

# --- Call from client ---
# result = add.delay(4, 6)
# print(result.get(timeout=10))  # 10
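The worker command above can be tuned; a few commonly used invocations of the standard Celery CLI (the queue name `emails` is hypothetical, and all commands assume a running broker):

```shell
# Run a worker with 4 prefork processes, consuming only the "emails" queue
celery -A tasks worker --loglevel=info --concurrency=4 -Q emails

# Inspect tasks currently executing across workers
celery -A tasks inspect active

# Run the beat scheduler (for periodic tasks) as a separate process
celery -A tasks beat --loglevel=info
```

Running beat inside a worker (`worker -B`) works for development but is discouraged in production, where a single dedicated beat process avoids duplicate scheduling.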