{"id":3397,"library":"apache-airflow-providers-dbt-cloud","title":"Apache Airflow dbt Cloud Provider","description":"This provider package allows Apache Airflow to interact with dbt Cloud, enabling orchestration of dbt Cloud jobs and retrieval of job run details. It includes operators, sensors, and hooks for various dbt Cloud functionalities. The current version is 4.8.0. Airflow provider packages follow a regular release cadence, often aligned with Airflow's own releases or driven by new features and bug fixes.","status":"active","version":"4.8.0","language":"en","source_language":"en","source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/dbt_cloud","tags":["airflow","provider","dbt","dbt cloud","etl","orchestration","data transformation"],"install":[{"cmd":"pip install apache-airflow-providers-dbt-cloud","lang":"bash","label":"Install the dbt Cloud Provider"}],"dependencies":[{"reason":"This is an Airflow provider, requiring Apache Airflow (>=2.4.0) to function.","package":"apache-airflow","optional":false},{"reason":"The `DbtCloudHook` is built on the HTTP provider's `HttpHook` for dbt Cloud API interactions.","package":"apache-airflow-providers-http","optional":false},{"reason":"Used for asynchronous dbt Cloud API calls when deferrable mode is enabled.","package":"aiohttp","optional":false}],"imports":[{"symbol":"DbtCloudHook","correct":"from airflow.providers.dbt_cloud.hooks.dbt_cloud import DbtCloudHook"},{"symbol":"DbtCloudRunJobOperator","correct":"from airflow.providers.dbt_cloud.operators.dbt_cloud import DbtCloudRunJobOperator"},{"symbol":"DbtCloudJobRunSensor","correct":"from airflow.providers.dbt_cloud.sensors.dbt_cloud import DbtCloudJobRunSensor"}],"quickstart":{"code":"from __future__ import annotations\n\nimport os\nimport pendulum\n\nfrom airflow.models.dag import DAG\nfrom airflow.providers.dbt_cloud.operators.dbt_cloud import DbtCloudRunJobOperator\nfrom airflow.providers.dbt_cloud.sensors.dbt_cloud import DbtCloudJobRunSensor\n\n\nDBT_CLOUD_CONN_ID = os.environ.get('DBT_CLOUD_CONN_ID', 'dbt_cloud_default')\nDBT_CLOUD_ACCOUNT_ID = os.environ.get('DBT_CLOUD_ACCOUNT_ID', '12345')  # Your dbt Cloud account ID\nDBT_CLOUD_JOB_ID = os.environ.get('DBT_CLOUD_JOB_ID', '67890')  # Your dbt Cloud job ID\n\nwith DAG(\n    dag_id=\"dbt_cloud_example_dag\",\n    schedule=None,\n    start_date=pendulum.datetime(2023, 1, 1, tz=\"UTC\"),\n    catchup=False,\n    tags=[\"dbt_cloud\", \"example\"],\n) as dag:\n    # Trigger the job without blocking; the sensor below handles the waiting.\n    trigger_dbt_cloud_job = DbtCloudRunJobOperator(\n        task_id=\"trigger_dbt_cloud_job\",\n        dbt_cloud_conn_id=DBT_CLOUD_CONN_ID,\n        account_id=DBT_CLOUD_ACCOUNT_ID,\n        job_id=DBT_CLOUD_JOB_ID,\n        wait_for_termination=False,\n    )\n\n    # The sensor monitors a specific *run*, so it takes the run_id the\n    # operator pushes to XCom (via .output) -- not the job_id.\n    wait_for_dbt_cloud_job = DbtCloudJobRunSensor(\n        task_id=\"wait_for_dbt_cloud_job\",\n        dbt_cloud_conn_id=DBT_CLOUD_CONN_ID,\n        account_id=DBT_CLOUD_ACCOUNT_ID,\n        run_id=trigger_dbt_cloud_job.output,\n        poke_interval=10,  # Check run status every 10 seconds\n        timeout=60 * 20,  # Fail after 20 minutes\n        deferrable=True,  # Free the worker slot while waiting\n    )\n\n    trigger_dbt_cloud_job >> wait_for_dbt_cloud_job\n","lang":"python","description":"This example DAG triggers a dbt Cloud job with `DbtCloudRunJobOperator` (using `wait_for_termination=False` so the trigger task returns immediately) and then waits for the resulting run with a deferrable `DbtCloudJobRunSensor`. Note that the sensor targets a specific run, so it is given the `run_id` the operator pushes to XCom (via `.output`) rather than the `job_id`. Ensure you configure an Airflow connection of type 'dbt Cloud' named `dbt_cloud_default` (or your chosen `DBT_CLOUD_CONN_ID`) with your dbt Cloud API token, and provide your `account_id` and `job_id`."},"warnings":[{"fix":"Review the latest dbt Cloud provider documentation and update your DAGs to use the current parameter names and structures; for instance, ensure `account_id` and `job_id` are passed explicitly.","message":"Version 4.0.0 removed all previously deprecated parameters from operators and hooks. If you were using any parameters marked as deprecated in earlier versions (e.g., `schema`, `project_id`, `environment_id`), your DAGs will break.","severity":"breaking","affected_versions":">=4.0.0"},{"fix":"Remove `poll_interval` from `DbtCloudRunJobOperator` instances. To poll at a specific interval, use `check_interval` on `DbtCloudRunJobOperator` or set the polling interval on `DbtCloudJobRunSensor`. To leverage deferrable mode, set `deferrable=True` on the operator or sensor.","message":"In version 3.0.0, the `poll_interval` parameter was removed from `DbtCloudRunJobOperator`, as it was only ever supported by `DbtCloudJobRunSensor`. Additionally, the `deferrable` parameter was added to `DbtCloudRunJobOperator` for async execution.","severity":"breaking","affected_versions":">=3.0.0"},{"fix":"Verify that your Airflow connection (e.g., `dbt_cloud_default`) is set up with the 'dbt Cloud' type and a valid, unexpired API token. Ensure the token has the necessary read/write permissions for the specific dbt Cloud account and jobs you are interacting with.","message":"Incorrect or missing dbt Cloud connection configuration. The provider requires an Airflow connection of type 'dbt Cloud' with a valid API token. Common issues include using the wrong connection ID, an expired token, or a token with insufficient permissions.","severity":"gotcha","affected_versions":"All"},{"fix":"Double-check that you are passing the correct `account_id` and `job_id` for your dbt Cloud environment. Both can be found in the dbt Cloud UI (e.g., in the URL when viewing a job or the account settings). Remember that `DbtCloudJobRunSensor` takes a `run_id`, not a `job_id`.","message":"Confusion between `account_id`, `job_id`, `run_id`, and other identifiers. Users often provide incorrect IDs for dbt Cloud resources, leading to 'resource not found' or 'permission denied' errors.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}