Apache Airflow Common Compatibility Provider
The `apache-airflow-providers-common-compat` package provides a compatibility layer for Apache Airflow, enabling seamless migration from Airflow 2 to Airflow 3. It utilizes lazy imports that first attempt Airflow 3 paths and then fall back to Airflow 2 paths, abstracting away version differences. This provider aims to centralize and replace scattered version-specific conditional imports within other Airflow providers. The current version is 1.14.2, and Airflow providers generally follow independent release cadences, adhering to SemVer.
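The fallback mechanism can be sketched generically. The helper below is illustrative only (not the provider's actual internals), and it uses stdlib modules as stand-ins for the Airflow 3 / Airflow 2 paths so it runs anywhere:

```python
import importlib


def lazy_resolve(new_path: str, old_path: str, name: str):
    """Try the newer module path first, then fall back to the older one.

    A minimal sketch of the lazy-import fallback the compat layer performs;
    the module paths and this helper's name are placeholders, not Airflow's.
    """
    try:
        module = importlib.import_module(new_path)
    except ImportError:
        module = importlib.import_module(old_path)
    return getattr(module, name)


# Stdlib demo: the first path does not exist, so we fall back to math.sqrt.
sqrt = lazy_resolve("no_such_module", "math", "sqrt")
print(sqrt(9))  # → 3.0
```

Because the fallback happens at import-resolution time, caller code sees a single stable name regardless of which Airflow version is installed.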
Warnings
- breaking Importing the `conf` object directly from `airflow.configuration` is deprecated. For Airflow 3.x compatibility, replace `from airflow.configuration import conf` with `from airflow.providers.common.compat.sdk import conf`.
- breaking Custom Airflow providers or plugins that currently use local `version_compat.py` files or version-specific conditional imports should migrate to use the `airflow.providers.common.compat.sdk` layer. This is a recommended architectural change for seamless Airflow 3.x compatibility.
- gotcha This provider is primarily intended for *Apache Airflow provider developers* to write version-agnostic code for their custom components (Operators, Hooks, Sensors, etc.). While end-users writing DAGs *could* technically use it for custom tasks or modules, its main purpose is to centralize and abstract compatibility logic for the broader provider ecosystem, not typically for direct DAG definitions.
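The local `version_compat.py` idiom mentioned above can be sketched as follows; the flag name and version string are illustrative (in real provider code the flag is computed from `airflow.__version__`):

```python
# Sketch of the per-provider version_compat.py pattern this provider replaces.
def airflow_v3_plus(airflow_version: str) -> bool:
    """Return True when the given Airflow version string is 3.0 or newer."""
    major = int(airflow_version.split(".", 1)[0])
    return major >= 3


AIRFLOW_V_3_0_PLUS = airflow_v3_plus("3.0.2")  # real code: airflow.__version__

# Each provider would then branch on the flag at import time:
# if AIRFLOW_V_3_0_PLUS:
#     from airflow.sdk import BaseOperator      # Airflow 3 path
# else:
#     from airflow.models import BaseOperator   # Airflow 2 path
#
# With common.compat, that conditional collapses to a single import:
# from airflow.providers.common.compat.sdk import BaseOperator
```

Centralizing this check means the fallback logic is maintained in one place rather than copied into every provider.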
Install
-
pip install apache-airflow-providers-common-compat
Imports
- BaseOperator
from airflow.providers.common.compat.sdk import BaseOperator
- conf
from airflow.providers.common.compat.sdk import conf
- task
from airflow.providers.common.compat.sdk import task
Quickstart
import os
# This example demonstrates how a custom Airflow component (e.g., an operator or hook)
# would use the common.compat.sdk to ensure compatibility across Airflow versions.
# In a real scenario, this would typically be part of a custom provider package's code.
# Ensure Airflow environment variables are set for a minimal run, e.g., for 'conf'
# In a live Airflow environment, these are usually handled by the Airflow setup.
if 'AIRFLOW_HOME' not in os.environ:
    os.environ['AIRFLOW_HOME'] = os.path.expanduser('~/airflow')

try:
    # Attempt to import common Airflow components via the compatibility layer
    from airflow.providers.common.compat.sdk import BaseOperator, conf, task

    print("Successfully imported BaseOperator, conf, and task via common.compat.sdk")

    # Example usage (simplified, as these are typically used within an operator/hook definition)
    # The actual 'conf' object would be more complex and used for configuration access
    print(f"Retrieved BaseOperator from compat layer: {BaseOperator.__name__}")
    print(f"Retrieved conf object from compat layer: {conf}")
    print(f"Retrieved task decorator from compat layer: {task}")

    # Minimal example of using a compat-imported BaseOperator (not runnable as a full DAG)
    class MyCompatOperator(BaseOperator):
        def __init__(self, **kwargs):
            super().__init__(task_id='my_compat_task', **kwargs)

    my_op_instance = MyCompatOperator()
    print(f"Instantiated a custom operator using compat BaseOperator: {my_op_instance.task_id}")
except ImportError as e:
    print(f"Failed to import from common.compat.sdk: {e}")
    print("Ensure 'apache-airflow-providers-common-compat' is installed and Airflow is properly set up.")
except Exception as e:
    print(f"An unexpected error occurred: {e}")