Celery

Distributed task queue for Python. Current version is 5.6.2 (Jan 2026). Requires Python >=3.9. A broker (Redis or RabbitMQ) is always required — there is no built-in broker. Old uppercase config settings (CELERY_TASK_SERIALIZER etc.) were removed in Celery 5.0. SQS transport: pycurl was replaced by urllib3 in 5.5, then reverted in 5.6 — SQS users must reinstall pycurl when upgrading from 5.5 to 5.6. Security fix: before 5.6, broker URL passwords could be logged in plaintext.

pip install celery
error kombu.exceptions.OperationalError: [Errno 111] Connection refused
cause Celery cannot connect to its message broker (e.g., Redis or RabbitMQ), usually because the broker is not running, is unreachable at the configured address, or network connectivity between worker and broker is blocked.
fix
Ensure your broker (Redis or RabbitMQ) is running and accessible from where your Celery worker is trying to connect. Verify the broker_url in your Celery configuration is correct (e.g., redis://localhost:6379/0 or amqp://guest:guest@localhost:5672//).
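
To verify connectivity from Python before starting a worker, a minimal sketch (assumes Redis on localhost, as in the quickstart below; ensure_connection retries the transport handshake before giving up):

from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

try:
    # retry the handshake up to 3 times, 5 s overall timeout
    app.connection_for_write().ensure_connection(max_retries=3, timeout=5)
    print('broker reachable')
except Exception as exc:
    print(f'broker unreachable: {exc}')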
error celery.exceptions.ImproperlyConfigured: The 'boto3' package is not installed.
cause This error occurs when a Celery backend or transport (like S3 for result storage or SQS for the broker) is configured but its required Python client library (e.g., 'boto3' for AWS services) is not installed in the environment.
fix
Install the missing dependency using pip, for example, pip install boto3 if you are using an S3 backend or SQS transport, or the appropriate package for your chosen backend/transport.
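
For context, an SQS broker configuration like this minimal sketch is what pulls in the boto3 requirement; the region value is an illustrative placeholder, and real credentials come from the usual boto3 credential chain:

from celery import Celery

# 'sqs://' with no inline credentials defers to the environment/IAM role
app = Celery('tasks', broker='sqs://')
app.conf.broker_transport_options = {'region': 'us-east-1'}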
error AttributeError: 'celery' object has no attribute 'CELERY_TASK_SERIALIZER'
cause In Celery 5.0 and later, the configuration settings are unified: they are lowercase and prefixed with `task_`, `broker_`, `result_`, etc. Old uppercase settings like `CELERY_TASK_SERIALIZER` were removed, and attempting to access them directly as attributes raises an `AttributeError`.
fix
Update your Celery configuration to use the new style settings, for example, change CELERY_TASK_SERIALIZER to task_serializer in your celeryconfig.py or directly in app.conf.update().
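
Side by side (a sketch; the same lowercase names apply when settings live in a celeryconfig.py module):

from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

# Old style, removed in 5.0; no longer recognized:
#   app.conf.update(CELERY_TASK_SERIALIZER='json')

# New lowercase style (Celery 4.0+):
app.conf.update(task_serializer='json', result_serializer='json')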
error celery.exceptions.SecurityWarning: A broker url with password was logged in plaintext
cause This warning indicates that your broker URL, which contains a password, was logged without being sanitized, posing a security risk by exposing sensitive credentials in logs. This was a known issue in versions prior to Celery 5.6.
fix
Upgrade Celery to version 5.6 or later, as this version includes a security fix to properly sanitize broker URL passwords in log output. Also, ensure your broker's logs are secured.
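
On versions you cannot upgrade yet, kombu's URL sanitizer can mask the password before you log anything yourself (small sketch; the credentials shown are fake):

from kombu.utils.url import maybe_sanitize_url

url = 'redis://user:s3cret@localhost:6379/0'
print(maybe_sanitize_url(url))  # -> redis://user:**@localhost:6379/0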
error celery.exceptions.NotRegistered: Task 'my_app.tasks.my_task' is not registered
cause Celery cannot find or load the specified task, usually because the module containing the task was not imported by the worker, the task function is not decorated with `@app.task`, or there's a typo in the task name or path.
fix
Ensure the module containing your task is imported by the Celery worker (e.g., by listing it in CELERY_IMPORTS or app.autodiscover_tasks()). Verify the task function is correctly decorated with @app.task and that the task name used for calling (.delay() or .apply_async()) exactly matches the registered name.
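
Two common ways to make the worker see your tasks, sketched below (assumes a my_app package containing a tasks.py module):

from celery import Celery

app = Celery('my_app', broker='redis://localhost:6379/0')

# Option 1: list task modules explicitly (new-style name for CELERY_IMPORTS)
app.conf.imports = ('my_app.tasks',)

# Option 2: scan the listed packages for a tasks.py module
app.autodiscover_tasks(['my_app'])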
breaking All old uppercase CELERY_* config keys (CELERY_BROKER_URL, CELERY_RESULT_BACKEND, CELERY_TASK_SERIALIZER, etc.) were deprecated in Celery 4.0 and removed in Celery 5.0. Depending on the setting, using them either does nothing silently or raises an error.
fix Replace all CELERY_* uppercase keys with lowercase equivalents: CELERY_BROKER_URL → broker_url, CELERY_RESULT_BACKEND → result_backend, CELERY_TASK_SERIALIZER → task_serializer. See the Celery 4.0 migration guide.
breaking Python 3.8 support dropped in Celery 5.6.0. Minimum is now Python 3.9.
fix Pin celery<5.6 for Python 3.8 environments.
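
A requirements.txt sketch using standard pip environment markers to keep Python 3.8 environments on the older line:

# requirements.txt
celery<5.6 ; python_version < "3.9"
celery>=5.6 ; python_version >= "3.9"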
breaking SQS transport: pycurl was replaced by urllib3 in Celery 5.5, then reverted back to pycurl in 5.6 due to critical issues. Users who uninstalled pycurl after upgrading to 5.5 must reinstall it before upgrading to 5.6. Note: 'pycurl' often requires system-level development headers (e.g., libcurl-dev or curl-devel) to be installed before 'pip install pycurl' can succeed.
fix Ensure libcurl development headers are installed before pip installing pycurl. For Alpine-based environments (like python:3.13-alpine), this means running 'apk add curl-dev'. Then, pip install pycurl before upgrading to celery 5.6 if you use the SQS transport.
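
A Dockerfile sketch for an Alpine image using the SQS transport; the exact package list is an assumption (pycurl compiles from source on musl, so it needs libcurl headers plus a C toolchain):

FROM python:3.13-alpine
# curl-dev provides libcurl headers; gcc/musl-dev compile the pycurl extension
RUN apk add --no-cache curl-dev openssl-dev gcc musl-dev
RUN pip install pycurl 'celery[sqs]'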
breaking Security: broker URLs containing passwords were logged in plaintext by the delayed delivery mechanism before 5.6. Credentials visible in log files.
fix Upgrade to celery>=5.6.0. Audit existing logs for exposed credentials.
gotcha Celery requires a broker — there is no built-in broker. Without a running Redis or RabbitMQ instance, all .delay() and .apply_async() calls raise kombu.exceptions.OperationalError.
fix Start Redis locally: docker run -d -p 6379:6379 redis. Then set broker='redis://localhost:6379/0' in your Celery app.
gotcha The result backend is separate from the broker. Without a configured result_backend, result.get() blocks forever or raises NotImplementedError. Many tutorials configure the broker but forget the backend.
fix Set backend= in Celery() constructor or result_backend in app.conf. For Redis: backend='redis://localhost:6379/0'.
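
The difference in code (a sketch; the two Redis URLs use different database numbers only to keep broker and result keys apart):

from celery import Celery

# Broker only: .delay() works, but result.get() has nowhere to read from
app_no_results = Celery('tasks', broker='redis://localhost:6379/0')

# Broker + result backend: result.get() can fetch return values
app = Celery(
    'tasks',
    broker='redis://localhost:6379/0',
    backend='redis://localhost:6379/1',
)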
gotcha The task serializer defaults to json in Celery 5.x. Passing non-JSON-serializable objects to tasks raises kombu.exceptions.EncodeError. Complex Python objects (datetime, custom classes) must be converted manually, or the serializer switched to pickle via task_serializer (a security risk).
fix Convert task arguments to JSON-serializable types before calling .delay(). For datetime: pass .isoformat() string, parse inside the task.
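
For example (a sketch; log_event is a hypothetical task, and the pattern is isoformat() on the caller side, fromisoformat() inside the task):

from datetime import datetime, timezone
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def log_event(name, when_iso):
    # parse the ISO 8601 string back into a datetime inside the task
    when = datetime.fromisoformat(when_iso)
    return f'{name} at {when:%Y-%m-%d %H:%M}'

# caller side: convert before .delay() so the JSON serializer accepts it
log_event.delay('deploy', datetime.now(timezone.utc).isoformat())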
gotcha The Celery app must be importable as a module — it cannot be defined inline in a __main__ block, a notebook cell, or a script run directly. Workers import the app from the module path passed to -A.
fix Define the Celery app in a separate module file (e.g. tasks.py). Start worker with: celery -A tasks worker
pip install celery[redis]
pip install celery  # RabbitMQ (AMQP) support is built in; there is no separate 'rabbitmq' extra
pip install celery[sqs]
python  os/libc        variant   import  disk
3.9     alpine (musl)  celery    0.33s   33.3M
3.9     alpine (musl)  rabbitmq  0.32s   33.3M
3.9     alpine (musl)  redis     0.33s   35.9M
3.9     alpine (musl)  sqs       -       -
3.9     slim (glibc)   celery    0.30s   34M
3.9     slim (glibc)   rabbitmq  0.31s   34M
3.9     slim (glibc)   redis     0.32s   36M
3.9     slim (glibc)   sqs       0.34s   79M
3.10    alpine (musl)  celery    0.38s   34.0M
3.10    alpine (musl)  rabbitmq  0.36s   34.0M
3.10    alpine (musl)  redis     0.36s   36.6M
3.10    alpine (musl)  sqs       -       -
3.10    slim (glibc)   celery    0.28s   34M
3.10    slim (glibc)   rabbitmq  0.27s   34M
3.10    slim (glibc)   redis     0.27s   37M
3.10    slim (glibc)   sqs       0.28s   79M
3.11    alpine (musl)  celery    0.49s   37.7M
3.11    alpine (musl)  rabbitmq  0.48s   37.7M
3.11    alpine (musl)  redis     0.50s   40.8M
3.11    alpine (musl)  sqs       -       -
3.11    slim (glibc)   celery    0.43s   38M
3.11    slim (glibc)   rabbitmq  0.42s   38M
3.11    slim (glibc)   redis     0.43s   41M
3.11    slim (glibc)   sqs       0.43s   84M
3.12    alpine (musl)  celery    0.45s   29.0M
3.12    alpine (musl)  rabbitmq  0.43s   29.0M
3.12    alpine (musl)  redis     0.46s   32.0M
3.12    alpine (musl)  sqs       -       -
3.12    slim (glibc)   celery    0.45s   30M
3.12    slim (glibc)   rabbitmq  0.45s   30M
3.12    slim (glibc)   redis     0.46s   32M
3.12    slim (glibc)   sqs       0.43s   75M
3.13    alpine (musl)  celery    0.40s   28.6M
3.13    alpine (musl)  rabbitmq  0.41s   28.6M
3.13    alpine (musl)  redis     0.40s   31.6M
3.13    alpine (musl)  sqs       -       -
3.13    slim (glibc)   celery    0.41s   29M
3.13    slim (glibc)   rabbitmq  0.41s   29M
3.13    slim (glibc)   redis     0.42s   32M
3.13    slim (glibc)   sqs       0.41s   75M

(import = time to import celery; disk = installed size. sqs rows on alpine are blank: the pycurl build fails without curl-dev; see the SQS breaking change above.)

Requires a running Redis instance. Start the worker in a separate terminal: celery -A tasks worker --loglevel=info

# tasks.py
import requests

from celery import Celery

app = Celery(
    'tasks',
    broker='redis://localhost:6379/0',
    backend='redis://localhost:6379/0'
)

app.conf.update(
    task_serializer='json',
    accept_content=['json'],
    result_serializer='json',
    timezone='UTC',
)

@app.task
def add(x, y):
    return x + y

@app.task(bind=True, max_retries=3)
def fetch_data(self, url):
    try:
        # bind=True exposes self so the task can retry itself
        return requests.get(url, timeout=10).json()
    except Exception as exc:
        # retry up to max_retries times, waiting 5 s between attempts
        raise self.retry(exc=exc, countdown=5)

# --- Run worker ---
# celery -A tasks worker --loglevel=info

# --- Call from client ---
# result = add.delay(4, 6)
# print(result.get(timeout=10))  # 10
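
A client-side sketch expanding on the comments above (the URL passed to fetch_data is illustrative):

# client.py
from tasks import add, fetch_data

result = add.delay(4, 6)
print(result.get(timeout=10))  # 10

# apply_async offers more control than .delay()
async_result = fetch_data.apply_async(
    args=['https://example.com/data.json'],
    countdown=2,  # earliest start: 2 seconds from now
)
print(async_result.get(timeout=30))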