{"id":2865,"library":"apache-airflow-providers-datadog","title":"Datadog Provider for Apache Airflow","description":"This provider package integrates Apache Airflow with Datadog, enabling users to send metrics, post events, and query metrics from Datadog directly within their Airflow DAGs. It supports monitoring of DAGs, tasks, and other Airflow components through Datadog's platform. The current version is 3.10.3; releases follow the Apache Airflow provider cadence, often aligning with Airflow core releases but also shipping independently for bug fixes and features.","status":"active","version":"3.10.3","language":"en","source_language":"en","source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/datadog","tags":["airflow","datadog","monitoring","observability","etl","data-pipeline","provider","metrics"],"install":[{"cmd":"pip install apache-airflow-providers-datadog","lang":"bash","label":"Install latest version"}],"dependencies":[{"reason":"Core Airflow functionality for providers.","package":"apache-airflow","optional":false},{"reason":"Python client for the Datadog API.","package":"datadog","optional":false}],"imports":[{"note":"Legacy Airflow 1.x `contrib` import path; since Airflow 2.0, providers live under the `airflow.providers` namespace.","wrong":"from airflow.contrib.hooks.datadog_hook import DatadogHook","symbol":"DatadogHook","correct":"from airflow.providers.datadog.hooks.datadog import DatadogHook"},{"note":"Legacy Airflow 1.x `contrib` import path; since Airflow 2.0, providers live under the `airflow.providers` namespace.","wrong":"from airflow.contrib.sensors.datadog_sensor import DatadogSensor","symbol":"DatadogSensor","correct":"from airflow.providers.datadog.sensors.datadog import DatadogSensor"}],"quickstart":{"code":"from datetime import datetime\n\nfrom airflow import DAG\nfrom airflow.operators.python import PythonOperator\nfrom airflow.providers.datadog.hooks.datadog import DatadogHook\n\n# Datadog API and APP keys are typically configured as an Airflow Connection\n# with conn_id='datadog_default'; DatadogHook reads them from that connection.\n# Prefer Airflow Connections over hardcoding credentials in DAG files.\n\n\ndef send_custom_metric(**context):\n    hook = DatadogHook(datadog_conn_id='datadog_default')\n    hook.send_metric(\n        metric_name='my.airflow.custom.metric',\n        datapoint=context['params']['metric_value'],\n        type_='gauge',\n        tags=[\n            'env:dev',\n            f\"dag_id:{context['dag'].dag_id}\",\n            f\"task_id:{context['ti'].task_id}\",\n        ],\n    )\n\n\nwith DAG(\n    dag_id='datadog_metrics_example',\n    start_date=datetime(2023, 1, 1),\n    schedule=None,\n    catchup=False,\n    tags=['datadog', 'metrics'],\n    params={'metric_value': 123.45},\n) as dag:\n    send_custom_metric_task = PythonOperator(\n        task_id='send_custom_metric',\n        python_callable=send_custom_metric,\n    )\n\n    # Note: For this DAG to send metrics to Datadog, configure a connection\n    # in the Airflow UI (Admin -> Connections) with Conn Id 'datadog_default'\n    # and your Datadog API and APP keys, or set the DATADOG_API_KEY and\n    # DATADOG_APP_KEY environment variables where the worker runs.","lang":"python","description":"This example demonstrates how to use `DatadogHook.send_metric` from a `PythonOperator` task to send a custom gauge metric to Datadog; the provider ships a `DatadogHook` and a `DatadogSensor`, but no dedicated operator.
It assumes that Datadog API and APP keys are configured either via an Airflow connection named `datadog_default` or directly through environment variables `DATADOG_API_KEY` and `DATADOG_APP_KEY`."},"warnings":[{"fix":"Upgrade your Apache Airflow instance to at least the minimum version required by the provider, or pin the provider version to one compatible with your Airflow installation.","message":"Provider version 3.10.0+ requires Apache Airflow 2.11.0+. Older provider versions have different minimum Airflow requirements (e.g., 3.0.0 requires Airflow 2.2+, 2.0.0 requires Airflow 2.1.0+). Always check the provider's changelog for specific Airflow compatibility when upgrading.","severity":"breaking","affected_versions":">=3.10.0"},{"fix":"Update import statements in your DAGs to use the `airflow.providers.datadog` namespace (e.g., `from airflow.providers.datadog.hooks.datadog import DatadogHook`).","message":"The import paths for the provider's classes (hooks and sensors) changed from `airflow.contrib.*` and other legacy paths to `airflow.providers.datadog.*` with the introduction of Airflow 2.0 and the unified provider package system.","severity":"breaking","affected_versions":"<2.0.0"},{"fix":"Refer to the official Datadog documentation for a complete Airflow integration setup, including Agent installation, `airflow.cfg` modifications for StatsD, and log/trace collection configuration.","message":"For comprehensive monitoring (metrics, logs, traces), installing the provider alone is not enough: you also need to configure the Datadog Agent, enable Airflow's StatsD metrics in `airflow.cfg`, and potentially set up log collection and OpenLineage.","severity":"gotcha","affected_versions":"All"},{"fix":"Design your tags carefully, keeping them concise and drawn from a finite set of values. Avoid highly dynamic or unique values (such as UUIDs or full timestamps) as tags, and aggregate metrics where appropriate to reduce cardinality.","message":"Using too many unique tags or high-cardinality tags (e.g., dynamic, unbounded values) when sending metrics to Datadog can significantly increase your Datadog bill and impact performance.","severity":"gotcha","affected_versions":"All"},{"fix":"Create an Airflow connection of type 'Datadog' (or 'HTTP' for older versions) with `conn_id='datadog_default'` (or a custom ID) and store your API/APP keys there. Reference this `conn_id` in your hooks and sensors.","message":"When using `DatadogHook` or `DatadogSensor`, ensure your Datadog API and APP keys are securely configured in Airflow Connections (recommended) or via environment variables accessible to the Airflow worker. Hardcoding credentials in DAG files is a security risk.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}
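The provider's `DatadogSensor` accepts a `response_check` callable that receives the raw event-query response from Datadog and decides whether the sensor succeeds. A minimal sketch of such a callable, tested against an illustrative payload (the sample dict below is an assumption for demonstration, not a real API response):

```python
def deployment_event_seen(response: dict) -> bool:
    """Return True if any returned event carries a 'deployment' tag.

    Intended for use as DatadogSensor(response_check=deployment_event_seen, ...);
    the sensor passes the event-query result to this callable on each poke.
    """
    events = response.get('events', [])
    return any('deployment' in (e.get('tags') or []) for e in events)


# Illustrative payload shaped like an event-query result.
sample_response = {
    'events': [
        {'title': 'Deploy finished', 'tags': ['deployment', 'env:dev']},
        {'title': 'Disk alert', 'tags': ['env:dev']},
    ]
}

print(deployment_event_seen(sample_response))  # True
print(deployment_event_seen({'events': []}))   # False
```

Keeping the callable a pure function of the response dict makes it easy to unit-test outside a running Airflow scheduler.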
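The tag-cardinality gotcha above can be enforced mechanically before metrics are sent. A hypothetical helper (the names `ALLOWED_TAG_KEYS` and `sanitize_tags` are assumptions for illustration, not part of the provider) that keeps only a whitelisted set of tag keys and drops unbounded values such as run IDs:

```python
# Hypothetical guard: restrict metric tags to a fixed, low-cardinality set
# of keys so dynamic values (run ids, timestamps, UUIDs) never reach Datadog.
ALLOWED_TAG_KEYS = {'env', 'dag_id', 'task_id'}


def sanitize_tags(tags):
    """Keep only 'key:value' tags whose key is in the allowed set."""
    kept = []
    for tag in tags:
        key, _, _ = tag.partition(':')
        if key in ALLOWED_TAG_KEYS:
            kept.append(tag)
    return kept


print(sanitize_tags(['env:dev', 'run_id:scheduled__2023-01-01', 'dag_id:example']))
# ['env:dev', 'dag_id:example']
```

Passing every tag list through such a filter at one choke point keeps billing-relevant cardinality decisions in one reviewable place.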