Airflow Exporter
Airflow Exporter is an Apache Airflow plugin that exposes DAG- and task-level metrics on a Prometheus-compatible endpoint. The current version, 2.0.0, targets Airflow 3.0+ and Python 3.9+, and the project is actively maintained with regular releases.
Warnings
- breaking Version 2.0.0 and above of `airflow-exporter` drops support for Apache Airflow versions older than 3.0. Users running Airflow 2.x must use `airflow-exporter` versions 1.x.x or upgrade their Apache Airflow instance to 3.0 or newer.
- gotcha As an Airflow plugin, `airflow-exporter` must be installed in the same Python environment where your Airflow scheduler and webserver run. If metrics are not appearing at the `/admin/metrics/` endpoint, verify that the package is correctly installed and that Airflow components have been restarted.
- gotcha The Prometheus metrics endpoint is exposed at `http://<your_airflow_host_and_port>/admin/metrics/`. Ensure that your Prometheus server or scraping agent has network access to this specific endpoint on your Airflow webserver.
- gotcha To add custom labels to DAG-related metrics, include a `labels` dictionary within the `params` argument of your DAG definition. These labels will then be attached to the exported metrics for that DAG.
- gotcha Some scheduler-specific metrics, particularly `airflow_dag_scheduler_delay`, might implicitly rely on the existence of a DAG named `canary_dag`. If you observe these metrics missing or behaving unexpectedly, consider deploying a simple `canary_dag` in your environment.
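The `labels` mechanism described above can be sketched as a plain `params` payload. The label names `team` and `env` below are illustrative, and in a real DAG file the dict would be passed as `DAG(..., params=dag_params)`:

```python
# Illustrative `params` payload carrying custom metric labels, as described
# in the gotcha above. In a DAG file this dict would be passed to
# airflow.DAG(..., params=dag_params).
dag_params = {
    "labels": {
        "team": "data-platform",  # hypothetical label name/value
        "env": "prod",            # hypothetical label name/value
    }
}

# The exporter attaches these key/value pairs to that DAG's exported metrics.
print(dag_params["labels"])
```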
Install
pip install airflow-exporter
Quickstart
# Apache Airflow is assumed to be installed and running in your environment.
# airflow-exporter works as an Airflow plugin: installing the package into
# the Airflow environment is sufficient; no explicit imports are needed in
# your DAG files.
#
# 1. Install the package:
#        pip install airflow-exporter
# 2. Restart your Airflow scheduler and webserver components (or redeploy
#    if running containerized) so the plugin is loaded.
# 3. Scrape the Prometheus metrics endpoint exposed by the plugin:
#        http://<your_airflow_host_and_port>/admin/metrics/

import os

# Build the metrics URL from environment variables, falling back to defaults.
airflow_host = os.environ.get('AIRFLOW_WEBSERVER_HOST', 'localhost')
airflow_port = os.environ.get('AIRFLOW_WEBSERVER_PORT', '8080')
metrics_url = f"http://{airflow_host}:{airflow_port}/admin/metrics/"
print(f"Access metrics at: {metrics_url}")

# Optionally fetch the endpoint with the third-party `requests` library:
# import requests
# try:
#     response = requests.get(metrics_url, timeout=10)
#     response.raise_for_status()
#     print("Successfully fetched metrics (first 500 chars):")
#     print(response.text[:500])
# except requests.exceptions.RequestException as e:
#     print(f"Failed to fetch metrics: {e}")
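The payload returned by the endpoint is plain Prometheus text exposition format. As a sketch, here is how one might filter it for a single metric family; the sample lines and the `airflow_dag_status` metric name are illustrative, not guaranteed output of the plugin:

```python
# Filter Prometheus text-exposition lines for one metric family.
# The sample payload below is illustrative; real output comes from
# the /admin/metrics/ endpoint.
sample_metrics = """\
# HELP airflow_dag_status Number of DAG runs with this status (illustrative)
# TYPE airflow_dag_status gauge
airflow_dag_status{dag_id="example_dag",owner="airflow",status="success"} 3.0
airflow_dag_status{dag_id="example_dag",owner="airflow",status="failed"} 1.0
"""

def metric_samples(text: str, family: str) -> list[str]:
    """Return the sample lines (not HELP/TYPE comments) for a metric family."""
    return [
        line for line in text.splitlines()
        if not line.startswith("#") and line.startswith(family)
    ]

for line in metric_samples(sample_metrics, "airflow_dag_status"):
    print(line)
```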