Apache Airflow Standard Providers
This package provides the "standard" operators, hooks, and sensors for Apache Airflow: common building blocks that historically shipped inside the main `apache-airflow` package before the provider modularization that began in Airflow 2.0. The current version is 1.12.2; providers are released and versioned independently from Airflow core following SemVer, often on a faster release cadence than Airflow itself.
Warnings
- breaking When upgrading to Airflow 3.x, many operators and sensors that previously lived in core modules such as `airflow.operators` and `airflow.sensors` (e.g., `PythonOperator`, `BashOperator`) were moved into dedicated provider packages, including `apache-airflow-providers-standard`. Existing DAGs must update their import paths accordingly.
- breaking Support for Python 3.9 was dropped in `apache-airflow-providers-standard` version 1.4.0. Users on Python 3.9 must upgrade their Python environment to `3.10` or newer to use current provider versions.
- deprecated In Airflow 3.x, task code, including custom operators, can no longer open database sessions against the Airflow metadata database directly. This isolation improves security and maintainability.
- gotcha The minimum Apache Airflow version supported by this provider is 2.11.0. Installing this provider on older Airflow versions may lead to compatibility issues or automatic, potentially breaking, upgrades of the `apache-airflow` package.
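Because the import paths changed across major versions, DAG code that must run on both Airflow 2.x and 3.x sometimes guards the import. A minimal sketch of that pattern (the final fallback to `None` exists only so the snippet runs without Airflow installed; in a real deployment one of the two imports succeeds):

```python
# Version-tolerant import sketch for the Airflow 3.x provider split.
try:
    # Airflow 3.x: PythonOperator lives in the standard provider package.
    from airflow.providers.standard.operators.python import PythonOperator
    OPERATOR_SOURCE = "provider"
except ImportError:
    try:
        # Airflow 2.x: legacy core import path.
        from airflow.operators.python import PythonOperator
        OPERATOR_SOURCE = "core"
    except ImportError:
        # Illustration-only fallback for environments without Airflow.
        PythonOperator, OPERATOR_SOURCE = None, "unavailable"

print(OPERATOR_SOURCE)
```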
Install
- `pip install apache-airflow-providers-standard`
Imports
- PythonOperator
from airflow.providers.standard.operators.python import PythonOperator
- BranchDateTimeOperator
from airflow.providers.standard.operators.datetime import BranchDateTimeOperator
- LatestOnlyOperator
from airflow.providers.standard.operators.latest_only import LatestOnlyOperator
- ExternalTaskSensor
from airflow.providers.standard.sensors.external_task import ExternalTaskSensor
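`BranchDateTimeOperator` routes execution down one of two branches depending on whether the run falls inside a datetime window. The decision itself reduces to a simple interval check; a hedged sketch in plain Python (the function and its parameter names mirror the operator's arguments but are purely illustrative, not the provider's implementation):

```python
from datetime import datetime, time

def choose_branch(now: datetime, target_lower: time, target_upper: time,
                  follow_task_ids_if_true: str,
                  follow_task_ids_if_false: str) -> str:
    # Illustrative stand-in for BranchDateTimeOperator's core decision:
    # follow one set of task ids if `now` falls inside [target_lower, target_upper].
    in_window = target_lower <= now.time() <= target_upper
    return follow_task_ids_if_true if in_window else follow_task_ids_if_false

# 10:30 lies inside the 09:00-12:00 window, so the "true" branch is chosen.
print(choose_branch(datetime(2024, 1, 1, 10, 30), time(9), time(12),
                    "morning_task", "other_task"))  # → morning_task
```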
Quickstart
from __future__ import annotations
import pendulum
from airflow.models.dag import DAG
from airflow.providers.standard.operators.python import PythonOperator
def _print_hello():
    print("Hello from Apache Airflow Standard Providers!")

with DAG(
    dag_id="standard_provider_quickstart",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
    tags=["example", "standard", "provider"],
) as dag:
    greet_task = PythonOperator(
        task_id="greet_with_standard_python_operator",
        python_callable=_print_hello,
    )