{"library":"apache-airflow-providers-common-compat","title":"Apache Airflow Common Compatibility Provider","description":"The `apache-airflow-providers-common-compat` package provides a compatibility layer for Apache Airflow that eases migration from Airflow 2 to Airflow 3. It uses lazy imports that first attempt the Airflow 3 import paths and then fall back to the Airflow 2 paths, abstracting away version differences. The provider centralizes, and is intended to replace, the version-specific conditional imports scattered across other Airflow providers. The current version is 1.14.2; Airflow providers follow independent release cadences and adhere to SemVer.","status":"active","version":"1.14.2","language":"en","source_language":"en","source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/common/compat","tags":["airflow","provider","compatibility","migration","apache"],"install":[{"cmd":"pip install apache-airflow-providers-common-compat","lang":"bash","label":"Install latest version"}],"dependencies":[{"reason":"Core Airflow dependency","package":"apache-airflow","optional":false},{"reason":"ASGI reference implementation, required for certain Python versions.","package":"asgiref","optional":false}],"imports":[{"note":"Use the compatibility layer for version-agnostic imports of core Airflow components in custom providers or plugins.","wrong":"from airflow.models.baseoperator import BaseOperator","symbol":"BaseOperator","correct":"from airflow.providers.common.compat.sdk import BaseOperator"},{"note":"The direct import from `airflow.configuration` is being replaced by the compat SDK for Airflow 3.x compatibility.","wrong":"from airflow.configuration import conf","symbol":"conf","correct":"from airflow.providers.common.compat.sdk import conf"},{"note":"Import the `@task` decorator through the compat layer so DAGs and providers behave consistently across Airflow versions.","symbol":"task","correct":"from airflow.providers.common.compat.sdk import task"}],
"quickstart":{"code":"import os\n\n# This example demonstrates how a custom Airflow component (e.g., an operator or hook)\n# would use common.compat.sdk to stay compatible across Airflow versions.\n# In practice, this code would live inside a custom provider package.\n\n# Ensure AIRFLOW_HOME is set for a minimal run (e.g., so 'conf' can resolve).\n# In a live Airflow environment, this is handled by the Airflow setup.\nif 'AIRFLOW_HOME' not in os.environ:\n    os.environ['AIRFLOW_HOME'] = os.path.expanduser('~/airflow')\n\ntry:\n    # Attempt to import common Airflow components via the compatibility layer\n    from airflow.providers.common.compat.sdk import BaseOperator, conf, task\n    print(\"Successfully imported BaseOperator, conf, and task via common.compat.sdk\")\n\n    # Example usage (simplified; these are typically used within an operator/hook definition).\n    # The real 'conf' object is used for configuration access.\n    print(f\"Retrieved BaseOperator from compat layer: {BaseOperator.__name__}\")\n    print(f\"Retrieved conf object from compat layer: {conf}\")\n    print(f\"Retrieved task decorator from compat layer: {task}\")\n\n    # Minimal example of subclassing a compat-imported BaseOperator (not a runnable DAG).\n    # Accepting task_id as a parameter avoids a duplicate-keyword TypeError\n    # if a caller passes their own task_id.\n    class MyCompatOperator(BaseOperator):\n        def __init__(self, task_id='my_compat_task', **kwargs):\n            super().__init__(task_id=task_id, **kwargs)\n\n    my_op_instance = MyCompatOperator()\n    print(f\"Instantiated a custom operator using compat BaseOperator: {my_op_instance.task_id}\")\n\nexcept ImportError as e:\n    print(f\"Failed to import from common.compat.sdk: {e}\")\n    print(\"Ensure 'apache-airflow-providers-common-compat' is installed and Airflow is properly set up.\")\nexcept Exception as e:\n    print(f\"An unexpected error occurred: {e}\")","lang":"python","description":"This quickstart shows how a developer building a custom Airflow provider or plugin would use `airflow.providers.common.compat.sdk` to import core Airflow components such as `BaseOperator`, `conf`, or `@task` in a way that works across major versions of Apache Airflow (e.g., Airflow 2.x and 3.x). The snippet demonstrates the imports and basic instantiation of a class built on the `BaseOperator` obtained from the compatibility layer."},
"warnings":[{"fix":"Update provider or plugin code to import `conf` from `airflow.providers.common.compat.sdk`.","message":"Direct imports of the `conf` object from `airflow.configuration` are being deprecated. For Airflow 3.x compatibility, replace `from airflow.configuration import conf` with `from airflow.providers.common.compat.sdk import conf`.","severity":"breaking","affected_versions":"Airflow 2.11.0+ (in preparation for Airflow 3.x)"},{"fix":"Replace version-specific import logic (e.g., `if AIRFLOW_V_3_0_PLUS:`) with imports from `airflow.providers.common.compat.sdk`.","message":"Custom Airflow providers or plugins that currently rely on local `version_compat.py` files or version-specific conditional imports should migrate to the `airflow.providers.common.compat.sdk` layer. This is the recommended architectural change for seamless Airflow 3.x compatibility.","severity":"breaking","affected_versions":"All versions when targeting Airflow 3.x compatibility"},{"fix":"Understand that its main value is for library/provider development rather than routine DAG authoring.","message":"This provider is primarily intended for *Apache Airflow provider developers* who need version-agnostic code for their custom components (Operators, Hooks, Sensors, etc.). While end users writing DAGs *could* technically use it for custom tasks or modules, its main purpose is to centralize and abstract compatibility logic for the broader provider ecosystem, not to support direct DAG definitions.","severity":"gotcha","affected_versions":"All"}],
"env_vars":null,"last_verified":"2026-04-05T00:00:00.000Z","next_check":"2026-07-04T00:00:00.000Z"}