{"id":6306,"library":"airflow-exporter","title":"Airflow Exporter","description":"Airflow Exporter is an Apache Airflow plugin that exposes DAG- and task-level metrics at a Prometheus-compatible endpoint. The current version, 2.0.0, targets Airflow 3.0+ and Python 3.9+. The project is actively maintained, with ongoing development and regular releases.","status":"active","version":"2.0.0","language":"en","source_language":"en","source_url":"https://github.com/epoch8/airflow-exporter","tags":["apache-airflow","prometheus","monitoring","metrics","plugin","exporter"],"install":[{"cmd":"pip install airflow-exporter","lang":"bash","label":"Install latest version"}],"dependencies":[{"reason":"This library is an Airflow plugin and requires a compatible Apache Airflow installation to function.","package":"apache-airflow","optional":false}],"imports":[],"quickstart":{"code":"# Assuming Apache Airflow is already installed and running in your environment.\n# airflow-exporter works as an Airflow plugin: no explicit Python imports are\n# needed in your DAG files for its core functionality. Installing the package\n# into your Airflow environment is sufficient.\n\n# 1. Install the airflow-exporter package:\n#    pip install airflow-exporter\n\n# 2. Restart your Airflow scheduler and webserver components\n#    (or redeploy if in a containerized environment) so the plugin loads.\n\n# 3. Access the Prometheus metrics endpoint exposed by Airflow Exporter:\n#    Metrics will be available at http://<your_airflow_host_and_port>/admin/metrics/\n\n# Example: construct the metrics URL from environment variables.\nimport os\n\nairflow_host = os.environ.get('AIRFLOW_WEBSERVER_HOST', 'localhost')\nairflow_port = os.environ.get('AIRFLOW_WEBSERVER_PORT', '8080')\nmetrics_url = f\"http://{airflow_host}:{airflow_port}/admin/metrics/\"\nprint(f\"Access metrics at: {metrics_url}\")\n\n# You would typically scrape this endpoint with Prometheus. To check it\n# manually, use curl or the `requests` library (if installed):\n# import requests\n# try:\n#     response = requests.get(metrics_url)\n#     response.raise_for_status()\n#     print(\"Successfully fetched metrics (first 500 chars):\")\n#     print(response.text[:500])\n# except requests.exceptions.RequestException as e:\n#     print(f\"Failed to fetch metrics: {e}\")\n","lang":"python","description":"Install the `airflow-exporter` package into your Airflow environment. Once the Airflow components (webserver, scheduler) are restarted, the exporter automatically exposes a Prometheus-compatible metrics endpoint; no Python code is needed in your DAGs for the basic functionality."},"warnings":[{"fix":"For Airflow 2.x, pin `airflow-exporter` to `<2.0.0`. For Airflow 3.x+, install `apache-airflow>=3.0` together with `airflow-exporter>=2.0.0`.","message":"`airflow-exporter` 2.0.0 and above drops support for Apache Airflow versions older than 3.0. Users running Airflow 2.x must stay on `airflow-exporter` 1.x or upgrade their Apache Airflow instance to 3.0 or newer.","severity":"breaking","affected_versions":">=2.0.0"},{"fix":"Ensure `pip install airflow-exporter` is run within the active virtual environment or container image used by Airflow. Restart the Airflow scheduler and webserver after installation.","message":"As an Airflow plugin, `airflow-exporter` must be installed in the same Python environment where your Airflow scheduler and webserver run. If metrics are not appearing at the `/admin/metrics/` endpoint, verify that the package is correctly installed and that the Airflow components have been restarted.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Configure network rules (firewalls, security groups) to allow access to the Airflow webserver's `/admin/metrics/` endpoint from your monitoring system. Verify the exact host and port of your Airflow webserver.","message":"The Prometheus metrics endpoint is exposed at `http://<your_airflow_host_and_port>/admin/metrics/`. Ensure that your Prometheus server or scraping agent has network access to this endpoint on your Airflow webserver.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Modify your DAG definition: `dag = DAG(..., params={'labels': {'env': 'production', 'team': 'data'}})`","message":"To add custom labels to DAG-related metrics, include a `labels` dictionary within the `params` argument of your DAG definition. These labels will then be attached to the exported metrics for that DAG.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Create a minimal DAG named `canary_dag` in your `dags_folder`. The DAG can contain a simple BashOperator or PythonOperator that always succeeds.","message":"Some scheduler-specific metrics, particularly `airflow_dag_scheduler_delay`, may rely on the existence of a DAG named `canary_dag`. If these metrics are missing or behave unexpectedly, consider deploying a simple `canary_dag` in your environment.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-15T00:00:00.000Z","next_check":"2026-07-14T00:00:00.000Z"}