{"id":2864,"library":"apache-airflow-providers-celery","title":"Apache Airflow Celery Provider","description":"This provider package integrates Apache Airflow with Celery, enabling the use of Celery workers for task execution. It allows Airflow to scale task processing by distributing tasks to a pool of Celery workers via a message broker such as RabbitMQ or Redis. The current version is 3.17.2; provider packages are released on their own cadence, typically monthly, independently of Airflow core releases.","status":"active","version":"3.17.2","language":"en","source_language":"en","source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/celery","tags":["airflow","celery","executor","provider","task-distribution"],"install":[{"cmd":"pip install apache-airflow-providers-celery","lang":"bash","label":"Install Celery Provider"}],"dependencies":[{"reason":"This is an Airflow provider and requires a functional Airflow installation to operate.","package":"apache-airflow","optional":false},{"reason":"Required for Celery executor functionality.","package":"celery","optional":false},{"reason":"Common message broker and result backend for Celery. Other brokers (e.g., RabbitMQ) can also be used.","package":"redis","optional":true}],"imports":[{"symbol":"CeleryExecutor","correct":"from airflow.providers.celery.executors.celery_executor import CeleryExecutor"}],"quickstart":{"code":"import pendulum\n\nfrom airflow.models.dag import DAG\nfrom airflow.operators.bash import BashOperator\n\nwith DAG(\n    dag_id=\"celery_executor_example\",\n    start_date=pendulum.datetime(2023, 1, 1, tz=\"UTC\"),\n    catchup=False,\n    schedule=None,\n    tags=[\"celery\", \"example\"],\n) as dag:\n    # This task will run on a worker listening to the 'high_priority' queue\n    high_priority_task = BashOperator(\n        task_id=\"run_on_high_priority_queue\",\n        bash_command=\"echo 'Running on high priority queue' && sleep 5\",\n        queue=\"high_priority\",  # A Celery worker must be started with -q high_priority\n    )\n\n    # This task will run on a worker listening to the 'default' queue\n    default_queue_task = BashOperator(\n        task_id=\"run_on_default_queue\",\n        bash_command=\"echo 'Running on default queue' && sleep 2\",\n        queue=\"default\",  # Or omit; tasks fall back to the configured default queue\n    )\n\n    high_priority_task >> default_queue_task\n","lang":"python","description":"This quickstart demonstrates a basic Airflow DAG that uses Celery-specific task queuing. For it to work, you must configure `airflow.cfg` with `executor = CeleryExecutor` and start Celery workers that listen to the specified queues (e.g., `airflow celery worker -q high_priority,default`)."},"warnings":[{"fix":"Ensure your `airflow.cfg` has `executor = CeleryExecutor` under the `[core]` section, in addition to configuring the `[celery]` section for broker and result backend settings.","message":"Airflow 2.0+ changed the recommended way to enable the Celery Executor. The `[celery]` section in `airflow.cfg` is still relevant for broker/backend settings, but the executor must now be explicitly set in the `[core]` section.","severity":"breaking","affected_versions":"Airflow 2.0.0 and newer"},{"fix":"Verify that `broker_url` and `result_backend` in `airflow.cfg` are correctly formatted (e.g., `redis://localhost:6379/1` or `amqp://guest:guest@localhost:5672//`) and that the specified services (Redis, RabbitMQ) are running and accessible.","message":"Incorrect `broker_url` or `result_backend` configuration in `airflow.cfg` is a common source of issues. These must point to your running message broker (e.g., RabbitMQ, Redis) and result store, respectively, and be accessible from all Airflow components (webserver, scheduler, workers).","severity":"gotcha","affected_versions":"All versions"},{"fix":"When starting Celery workers, specify all queues they should listen to using the `-q` flag, e.g., `airflow celery worker -q default,high_priority` for workers that should process tasks from both queues.","message":"Tasks can be assigned to specific Celery queues using the `queue` parameter. If no Celery worker is configured to listen to a given queue (e.g., via `airflow celery worker -q default,my_custom_queue`), tasks assigned to that queue will remain in the 'queued' state indefinitely.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Always use `queue='your_queue_name'` when specifying the Celery queue for a task. The `celery_queue` parameter might still work for backward compatibility but should be avoided.","message":"The `celery_queue` parameter for tasks has been deprecated in favor of the `queue` parameter.","severity":"deprecated","affected_versions":"Airflow 2.0.0 and newer"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}