Astronomer Cosmos
astronomer-cosmos 1.13.1 (verified Tue May 12)
Astronomer Cosmos is an open-source Python library that seamlessly integrates dbt (data build tool) Core projects with Apache Airflow. It allows users to orchestrate dbt models as native Airflow DAGs and Task Groups, providing enhanced observability, retries, and test execution directly within the Airflow UI. The library is actively maintained with frequent releases, evolving to support new dbt features like Fusion and various execution modes for optimal performance.
pip install astronomer-cosmos

Common errors
error ModuleNotFoundError: No module named 'astronomer_cosmos' ↓
cause Either the package is not installed, or the code imports `astronomer_cosmos`: the PyPI package 'astronomer-cosmos' installs under the module name `cosmos`.
fix
pip install astronomer-cosmos
# then import from the `cosmos` module, e.g. `from cosmos import DbtDag`
error ImportError: cannot import name 'DbtDag' from 'astronomer_cosmos.dag' ↓
cause There is no `astronomer_cosmos` module; `DbtDag` is exported from the top-level `cosmos` package.
fix
from cosmos import DbtDag
error AttributeError: module 'astronomer_cosmos' has no attribute 'DbtTaskGroup' ↓
cause The package's import name is `cosmos`, and `DbtTaskGroup` is exported from the top-level `cosmos` package.
fix
from cosmos import DbtTaskGroup
error ModuleNotFoundError: No module named 'flask_limiter.wrappers' ↓
cause A breaking change in 'flask-limiter' version 3.13 removed the 'wrappers' module.
fix
pip install 'flask-limiter==3.12'
error ImportError: cannot import name 'PackageFinder' from 'pip._internal.index' ↓
cause The 'PackageFinder' class was moved to `pip._internal.index.package_finder` in newer versions of pip (around pip 20).
fix
from pip._internal.index.package_finder import PackageFinder
# Note: `pip._internal` is not a stable public API; pin your pip version or avoid importing it.
Warnings
breaking Cosmos v1.14.0 (currently in alpha releases) drops support for Apache Airflow versions earlier than 2.9. ↓
fix Upgrade your Airflow environment to version 2.9 or higher before upgrading to Cosmos 1.14.0 or later.
gotcha Airflow's `DagBag` import timeout can occur with large dbt projects due to slow manifest parsing, especially when using `ExecutionMode.LOCAL` without a pre-generated `manifest.json`. ↓
fix Increase the `core.dagbag_import_timeout` Airflow configuration. Consider refactoring Cosmos DAGs to use a pre-generated `manifest.json` file, or explicitly setting `LoadMode.DBT_MANIFEST` and pre-generating the manifest during image build or deployment.
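A minimal sketch of the manifest-based setup, assuming the manifest was pre-generated at build time (e.g. via `dbt parse`); the project path, `dag_id`, and `jaffle_shop` project name are placeholders:

```python
from pathlib import Path

from cosmos import DbtDag, ProfileConfig, ProjectConfig, RenderConfig
from cosmos.constants import LoadMode

# Hypothetical project location; adjust to your deployment layout.
PROJECT_DIR = Path("/usr/local/airflow/dags/dbt/jaffle_shop")

manifest_dag = DbtDag(
    dag_id="dbt_manifest_example",
    # Point Cosmos at the pre-generated manifest instead of parsing the
    # whole dbt project at DAG-parse time.
    project_config=ProjectConfig(
        manifest_path=PROJECT_DIR / "target" / "manifest.json",
        project_name="jaffle_shop",
    ),
    render_config=RenderConfig(load_method=LoadMode.DBT_MANIFEST),
    profile_config=ProfileConfig(
        profile_name="default",
        target_name="dev",
        profiles_yml_filepath=PROJECT_DIR / "profiles.yml",
    ),
)
```

Because Cosmos reads the manifest instead of invoking dbt, DAG parsing stays fast even for large projects.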
gotcha Dependency conflicts between `dbt-core`/dbt adapters and Airflow packages are common. ↓
fix Astronomer Cosmos strongly recommends installing dbt and its adapters within a separate Python virtual environment, distinct from the one where Airflow is installed. The `dbt_executable_path` in `ExecutionConfig` should then point to this isolated virtual environment's dbt executable.
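A sketch of that wiring, assuming the virtual environment was created at image build time (e.g. `python -m venv /opt/airflow/dbt_venv && /opt/airflow/dbt_venv/bin/pip install dbt-postgres`); the path is a placeholder:

```python
from cosmos import ExecutionConfig

# dbt runs from its own virtual environment, so its dependency pins never
# clash with Airflow's.
execution_config = ExecutionConfig(
    dbt_executable_path="/opt/airflow/dbt_venv/bin/dbt",
)
```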
gotcha Cosmos's default dbt test behavior (`TestBehavior.AFTER_EACH`) runs tests immediately after each model, which differs from dbt Core's default of running all tests after all models have completed. This 'fail-fast' approach can sometimes lead to unexpected failures, especially for tests that depend on multiple upstream models. ↓
fix If this behavior is not desired, configure `RenderConfig` to use `TestBehavior.AFTER_ALL`. For tests with multiple parents, consider setting `should_detach_multiple_parents_tests=True` in `RenderConfig` (introduced in Cosmos 1.8.2) to create separate test tasks for better isolation.
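A sketch of both options as `RenderConfig` fragments (pass one to your `DbtDag` or `DbtTaskGroup`):

```python
from cosmos import RenderConfig
from cosmos.constants import TestBehavior

# Option 1: match dbt Core's ordering — run every test once, after all models.
render_config = RenderConfig(test_behavior=TestBehavior.AFTER_ALL)

# Option 2: keep the default AFTER_EACH behavior, but give tests that depend
# on multiple models their own detached task (requires Cosmos >= 1.8.2).
render_config_detached = RenderConfig(should_detach_multiple_parents_tests=True)
```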
gotcha Intermittent `FileNotFoundError: [Errno 2] No such file or directory` errors for `/tmp` files can occur during dbt model run tasks, particularly with `ExecutionMode.LOCAL`. ↓
fix This issue is often related to concurrent dbt processes and temporary file management. Ensure proper temporary directory handling, explicitly configure `TMPDIR` environment variables if needed, and increase task retries (e.g., `retries=2`) as re-runs often succeed.
gotcha When incorporating dbt snapshots, ensure they are run in a separate `DbtTaskGroup` from dbt models to guarantee the correct execution order. ↓
fix Create distinct `DbtTaskGroup` instances for your snapshots and models, and chain them in your Airflow DAG to enforce the desired sequence (e.g., snapshots before models).
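A sketch of that pattern; the path-based selectors assume the conventional `snapshots/` and `models/` directory layout, and the project/profile paths are placeholders:

```python
from datetime import datetime

from airflow.models.dag import DAG
from cosmos import DbtTaskGroup, ProfileConfig, ProjectConfig, RenderConfig

project_config = ProjectConfig("/usr/local/airflow/dags/dbt/jaffle_shop")
profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profiles_yml_filepath="/usr/local/airflow/dags/dbt/jaffle_shop/profiles.yml",
)

with DAG(dag_id="snapshots_then_models", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    snapshots = DbtTaskGroup(
        group_id="dbt_snapshots",
        project_config=project_config,
        profile_config=profile_config,
        render_config=RenderConfig(select=["path:snapshots"]),  # snapshot nodes only
    )
    models = DbtTaskGroup(
        group_id="dbt_models",
        project_config=project_config,
        profile_config=profile_config,
        render_config=RenderConfig(select=["path:models"]),  # model nodes only
    )
    snapshots >> models  # snapshots always complete before models start
```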
Install
pip install 'astronomer-cosmos[dbt-<adapter>]'

Install compatibility (last tested: 2026-05-12; data may be stale)
| python | os / libc | variant | status | wheel | install | import | disk |
|--------|---------------|-------------------|-------------|-------|---------|--------|------|
| 3.9 | alpine (musl) | dbt-<adapter> | build_error | - | - | - | - |
| 3.9 | alpine (musl) | dbt-<adapter> | - | - | - | - | - |
| 3.9 | alpine (musl) | astronomer-cosmos | build_error | - | - | - | - |
| 3.9 | alpine (musl) | astronomer-cosmos | - | - | - | - | - |
| 3.9 | slim (glibc) | dbt-<adapter> | build_error | - | 1.8s | - | - |
| 3.9 | slim (glibc) | dbt-<adapter> | - | - | - | - | - |
| 3.9 | slim (glibc) | astronomer-cosmos | wheel | 31.3s | - | 7.42s | 230M |
| 3.9 | slim (glibc) | astronomer-cosmos | - | - | - | 7.21s | 226M |
| 3.10 | alpine (musl) | dbt-<adapter> | build_error | - | - | - | - |
| 3.10 | alpine (musl) | dbt-<adapter> | - | - | - | - | - |
| 3.10 | alpine (musl) | astronomer-cosmos | build_error | - | - | - | - |
| 3.10 | alpine (musl) | astronomer-cosmos | - | - | - | - | - |
| 3.10 | slim (glibc) | dbt-<adapter> | build_error | - | 1.5s | - | - |
| 3.10 | slim (glibc) | dbt-<adapter> | - | - | - | - | - |
| 3.10 | slim (glibc) | astronomer-cosmos | wheel | 26.6s | - | 5.50s | 266M |
| 3.10 | slim (glibc) | astronomer-cosmos | - | - | - | 4.54s | 257M |
| 3.11 | alpine (musl) | dbt-<adapter> | build_error | - | - | - | - |
| 3.11 | alpine (musl) | dbt-<adapter> | - | - | - | - | - |
| 3.11 | alpine (musl) | astronomer-cosmos | build_error | - | - | - | - |
| 3.11 | alpine (musl) | astronomer-cosmos | - | - | - | - | - |
| 3.11 | slim (glibc) | dbt-<adapter> | build_error | - | 1.5s | - | - |
| 3.11 | slim (glibc) | dbt-<adapter> | - | - | - | - | - |
| 3.11 | slim (glibc) | astronomer-cosmos | wheel | 25.9s | - | 7.65s | 288M |
| 3.11 | slim (glibc) | astronomer-cosmos | - | - | - | 6.56s | 278M |
| 3.12 | alpine (musl) | dbt-<adapter> | build_error | - | - | - | - |
| 3.12 | alpine (musl) | dbt-<adapter> | - | - | - | - | - |
| 3.12 | alpine (musl) | astronomer-cosmos | build_error | - | - | - | - |
| 3.12 | alpine (musl) | astronomer-cosmos | - | - | - | - | - |
| 3.12 | slim (glibc) | dbt-<adapter> | build_error | - | 1.4s | - | - |
| 3.12 | slim (glibc) | dbt-<adapter> | - | - | - | - | - |
| 3.12 | slim (glibc) | astronomer-cosmos | wheel | 20.6s | - | 7.92s | 279M |
| 3.12 | slim (glibc) | astronomer-cosmos | - | - | - | 7.88s | 269M |
| 3.13 | alpine (musl) | dbt-<adapter> | build_error | - | - | - | - |
| 3.13 | alpine (musl) | dbt-<adapter> | - | - | - | - | - |
| 3.13 | alpine (musl) | astronomer-cosmos | build_error | - | - | - | - |
| 3.13 | alpine (musl) | astronomer-cosmos | - | - | - | - | - |
| 3.13 | slim (glibc) | dbt-<adapter> | build_error | - | 1.5s | - | - |
| 3.13 | slim (glibc) | dbt-<adapter> | - | - | - | - | - |
| 3.13 | slim (glibc) | astronomer-cosmos | wheel | 20.6s | - | 7.48s | 281M |
| 3.13 | slim (glibc) | astronomer-cosmos | - | - | - | 7.24s | 271M |
Imports
- DbtDag
  from cosmos import DbtDag
- DbtTaskGroup
  from cosmos import DbtTaskGroup
- ProjectConfig
  from cosmos import ProjectConfig
- ProfileConfig
  from cosmos import ProfileConfig
- ExecutionConfig
  from cosmos import ExecutionConfig
- PostgresUserPasswordProfileMapping
  from cosmos.profiles import PostgresUserPasswordProfileMapping
Quickstart (last tested: 2026-04-24)
import os
from datetime import datetime
from pathlib import Path
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig
from cosmos.profiles import PostgresUserPasswordProfileMapping
# Path to your dbt project, relative to this DAG file.
# For this example, assume a dbt project named 'jaffle_shop' exists in 'dags/dbt/'.
# Example structure: dags/my_cosmos_dag.py, dags/dbt/jaffle_shop/dbt_project.yml
DEFAULT_DBT_PROJECT_PATH = Path(__file__).parent / "dbt" / "jaffle_shop"
# Map an Airflow connection to a dbt profile. Cosmos generates profiles.yml
# at runtime from this mapping. Replace the connection ID and adjust
# profile_args as needed for your database schema.
profile_config = ProfileConfig(
    profile_name="default",  # Must match the `profile:` key in your dbt_project.yml
    target_name="dev",  # Target name written into the generated profiles.yml
    profile_mapping=PostgresUserPasswordProfileMapping(
        conn_id=os.environ.get("AIRFLOW_DB_CONN_ID", "airflow_db"),  # Airflow connection ID
        profile_args={"schema": "public"},  # Example schema
    ),
)
# Configure dbt execution, e.g., to run dbt from a dedicated virtual environment.
# Ensure 'dbt_executable_path' points to that environment's dbt executable.
# If dbt is installed in the same environment as Airflow, this can be omitted.
execution_config = ExecutionConfig(
    dbt_executable_path=os.environ.get("DBT_EXECUTABLE_PATH", "/opt/airflow/dbt_venv/bin/dbt"),
)
# DbtDag is itself an Airflow DAG subclass, so it is instantiated directly
# rather than nested inside a `with DAG(...)` block.
dbt_example_dag = DbtDag(
    project_config=ProjectConfig(DEFAULT_DBT_PROJECT_PATH),
    profile_config=profile_config,
    execution_config=execution_config,
    operator_args={
        "install_deps": True,  # Install dbt dependencies before each run
    },
    # Standard Airflow DAG arguments:
    dag_id="cosmos_dbt_example_dag",
    schedule="@daily",  # `schedule_interval` was deprecated in Airflow 2.4
    start_date=datetime(2023, 1, 1),
    catchup=False,
    tags=["dbt", "cosmos", "example"],
    default_args={"retries": 2},  # Recommended for dbt tasks
)
# DbtDag automatically creates one Airflow task per dbt model (plus tests),
# which can be observed in the Airflow UI with their lineage.
# Upstream/downstream Airflow tasks can be wired in as usual, e.g. a
# PythonOperator that runs before or after the dbt tasks.