Apache Airflow Common SQL Provider

version 1.34.0 · verified Tue May 12 · auth: no · python install: verified · quickstart: stale

The `apache-airflow-providers-common-sql` package provides the shared SQL layer for Apache Airflow: generic operators, hooks, sensors, and triggers for working with SQL databases. Its `DbApiHook` base class wraps Python DB-API drivers, manages connections through Airflow Connections, and can expose SQLAlchemy engines; database-specific providers (Postgres, MySQL, Snowflake, and others) build on top of it. The version documented here is 1.34.0.

pip install apache-airflow-providers-common-sql
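After installation, a quick sanity check that the provider's main entry points import cleanly (a minimal sketch; the import paths are the provider's current public modules, and the selection of classes is illustrative):

from airflow.providers.common.sql.hooks.sql import DbApiHook   # base class wrapped by database-specific hooks
from airflow.providers.common.sql.operators.sql import (
    SQLExecuteQueryOperator,  # run arbitrary SQL against any DB-API connection
    SQLCheckOperator,         # fail the task if the first returned row contains a falsy value
    SQLTableCheckOperator,    # aggregate-level data quality checks on a table
)
from airflow.providers.common.sql.sensors.sql import SqlSensor  # poll a query until it returns a result

print("apache-airflow-providers-common-sql imports OK")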
error ModuleNotFoundError: No module named 'airflow.providers.common.sql'
cause The 'apache-airflow-providers-common-sql' package is not installed.
fix
pip install apache-airflow-providers-common-sql
error ImportError: cannot import name 'SQLExecuteQueryOperator' from 'airflow.providers.common.sql.operators.sql'
cause The 'SQLExecuteQueryOperator' class is not available in the installed version of 'apache-airflow-providers-common-sql'.
fix
Upgrade to a version where 'SQLExecuteQueryOperator' is available (it was added in 1.3.0, which was later yanked, so use a newer patch release), e.g., pip install "apache-airflow-providers-common-sql>=1.3.4"
error AttributeError: module 'airflow.providers.common.sql.hooks.sql' has no attribute 'DbApiHook'
cause The installed provider does not expose 'DbApiHook' at that path, which usually points to a missing, broken, or mismatched 'apache-airflow-providers-common-sql' installation rather than a rename.
fix
Reinstall or upgrade the provider ('pip install --upgrade apache-airflow-providers-common-sql') and import it as 'from airflow.providers.common.sql.hooks.sql import DbApiHook'. The legacy 'from airflow.hooks.dbapi import DbApiHook' path still works on Airflow 2.x but is deprecated.
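For clarity, both import paths side by side (the connection id and the `get_hook()` usage are illustrative):

from airflow.providers.common.sql.hooks.sql import DbApiHook  # preferred, provider-based path

# Deprecated legacy path, kept for backwards compatibility on Airflow 2.x:
# from airflow.hooks.dbapi import DbApiHook

# DbApiHook is a base class; concrete hooks usually come from a connection:
# from airflow.hooks.base import BaseHook
# hook = BaseHook.get_connection("my_database_conn").get_hook()
# rows = hook.get_records("SELECT 1")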
error TypeError: 'NoneType' object is not iterable
cause The 'SQLExecuteQueryOperator' is returning 'None' due to an issue with the 'split_statements' parameter.
fix
Ensure that the 'split_statements' parameter is set correctly, or upgrade to a version where this issue is resolved.
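A sketch of setting the statement-splitting behaviour explicitly on a multi-statement script (the connection id and SQL are placeholders):

from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

run_script = SQLExecuteQueryOperator(
    task_id="run_migration_script",
    conn_id="my_database_conn",
    sql="""
        CREATE TABLE IF NOT EXISTS audit_log (id INT);
        INSERT INTO audit_log VALUES (1);
        SELECT COUNT(*) FROM audit_log;
    """,
    split_statements=True,  # execute each ';'-separated statement individually
    return_last=True,       # push only the result of the last statement to XCom
)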
error ImportError: cannot import name 'SQLCheckOperator' from 'airflow.providers.common.sql.operators.sql'
cause The 'SQLCheckOperator' class is not available in the installed version of 'apache-airflow-providers-common-sql'.
fix
Upgrade to a version where 'SQLCheckOperator' is available, e.g., pip install "apache-airflow-providers-common-sql>=1.1.0"
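With a compatible version installed, a minimal `SQLCheckOperator` looks like this (connection id and query are examples):

from airflow.providers.common.sql.operators.sql import SQLCheckOperator

# The task fails if any value in the first returned row is falsy (0, NULL, '').
check_users_not_empty = SQLCheckOperator(
    task_id="check_users_not_empty",
    conn_id="my_database_conn",
    sql="SELECT COUNT(*) FROM users",
)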
breaking Provider version 1.33.0 (and later) removes all previously deprecated classes, parameters, and features. DAGs written against very old provider versions, or custom code relying on private functions of `common.sql`, may break. This release line is only available for Airflow 2.9+.
fix Upgrade Airflow to at least 2.9.0 and refactor DAGs to use current, non-deprecated APIs. Consult the official changelog for specific removals.
breaking The `apache-airflow-providers-common-sql==1.3.0` release was known to break BigQuery operators from `apache-airflow-providers-google==8.4.0` due to implicit dependencies and refactoring. This resulted in `1.3.0` being yanked.
fix Avoid installing `apache-airflow-providers-common-sql==1.3.0`. Downgrade to `1.2.0` or upgrade to a later compatible version (e.g., `1.3.4` or newer for common-sql, and potentially `8.5.0` or newer for google provider, depending on Airflow version).
breaking Provider versions `1.3.0` and `1.5.0` were yanked from PyPI due to potential issues: `1.3.0` broke BigQuery operators, and `1.5.0` could cause unconstrained installation of old Airflow versions leading to `RuntimeError`.
fix Do not use `apache-airflow-providers-common-sql==1.3.0` or `==1.5.0`. Ensure your `requirements.txt` or `pip install` commands specify a working version range.
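One way to encode this in a pip requirement, excluding the yanked releases (the lower bound is illustrative):

pip install "apache-airflow-providers-common-sql>=1.3.4,!=1.5.0"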
gotcha The `apache-airflow-providers-common-sql` package is one of the providers preinstalled with Apache Airflow (since Airflow 2.4.0). It can be upgraded independently, but each provider release declares a minimum supported Airflow version, and Airflow's providers manager checks this at runtime: a provider installed on an Airflow version below that minimum will not load (or will fail) at runtime, even if pip let the installation through.
fix Ensure your Airflow installation meets the minimum requirement for the desired provider version (e.g., Airflow 2.11.0 for provider 1.33.0+). Do not attempt to install this provider with Airflow versions lower than its minimum requirement.
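The standard way to keep Airflow and its providers on mutually compatible versions is the official constraints file; the Airflow and Python versions in the URL below are placeholders to adapt to your environment:

pip install "apache-airflow==2.9.3" apache-airflow-providers-common-sql --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.9.3/constraints-3.10.txt"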
breaking A SQL Injection vulnerability (CVE-2025-30473) existed in `SQLTableCheckOperator` when using the `partition_clause` parameter with externally-influenced input. An authenticated UI user could inject arbitrary SQL commands.
fix Upgrade `apache-airflow-providers-common-sql` to version `1.24.1` or higher to fix this vulnerability.
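Even on a patched version, `partition_clause` should only ever receive trusted, static input; a sketch of the intended usage (table, checks, and connection id are examples):

from airflow.providers.common.sql.operators.sql import SQLTableCheckOperator

table_checks = SQLTableCheckOperator(
    task_id="users_table_checks",
    conn_id="my_database_conn",
    table="users",
    checks={
        "row_count_check": {"check_statement": "COUNT(*) > 0"},
    },
    # Keep this a static, trusted string; never build it from user- or UI-supplied values.
    partition_clause="registration_date >= '2023-01-01'",
)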
breaking The `schedule_interval` parameter of the `DAG` class was deprecated in Airflow 2.4 (when the unified `schedule` parameter was introduced) and removed in Airflow 3.0. Passing it to `DAG` on Airflow 3.0 or newer results in a `TypeError`.
fix Refactor your DAG definition to use the `schedule` parameter instead of `schedule_interval` (e.g., `schedule='@daily'` or `schedule=timedelta(days=1)`).
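A minimal before/after for the DAG argument (the `timedelta` schedule is one option; cron strings and presets such as '@daily' work the same way):

from datetime import datetime, timedelta
from airflow import DAG

# Old, removed form: DAG("my_dag", schedule_interval=timedelta(days=1), ...)
with DAG(
    "my_dag",
    start_date=datetime(2023, 1, 1),
    schedule=timedelta(days=1),  # replaces schedule_interval
    catchup=False,
):
    ...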
Install and import benchmarks by Python version and base image:

| python | os / libc     | status | wheel install | import | disk   |
|--------|---------------|--------|---------------|--------|--------|
| 3.9    | alpine (musl) | sdist  | -             | 6.19s  | 210.8M |
| 3.9    | alpine (musl) | -      | -             | 6.23s  | 209.1M |
| 3.9    | slim (glibc)  | wheel  | 29.3s         | 6.18s  | 206M   |
| 3.9    | slim (glibc)  | -      | -             | 6.03s  | 204M   |
| 3.10   | alpine (musl) | wheel  | -             | 4.47s  | 244.0M |
| 3.10   | alpine (musl) | -      | -             | 4.82s  | 236.8M |
| 3.10   | slim (glibc)  | wheel  | 24.7s         | 3.39s  | 242M   |
| 3.10   | slim (glibc)  | -      | -             | 3.65s  | 235M   |
| 3.11   | alpine (musl) | wheel  | -             | 5.59s  | 263.9M |
| 3.11   | alpine (musl) | -      | -             | 6.80s  | 256.0M |
| 3.11   | slim (glibc)  | wheel  | 23.6s         | 5.03s  | 262M   |
| 3.11   | slim (glibc)  | -      | -             | 5.53s  | 254M   |
| 3.12   | alpine (musl) | wheel  | -             | 5.09s  | 254.0M |
| 3.12   | alpine (musl) | -      | -             | 5.92s  | 246.3M |
| 3.12   | slim (glibc)  | wheel  | 18.3s         | 5.32s  | 253M   |
| 3.12   | slim (glibc)  | -      | -             | 5.95s  | 245M   |
| 3.13   | alpine (musl) | wheel  | -             | 4.80s  | 255.9M |
| 3.13   | alpine (musl) | -      | -             | 5.56s  | 248.1M |
| 3.13   | slim (glibc)  | wheel  | 18.6s         | 4.79s  | 255M   |
| 3.13   | slim (glibc)  | -      | -             | 5.79s  | 248M   |

This quickstart defines a simple Airflow DAG that uses `SQLExecuteQueryOperator` to execute SQL queries. It shows an inline SQL string and a Jinja-templated query; the `sql` argument can also point to a `.sql` file resolved via the DAG's `template_searchpath`.

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
from datetime import datetime

with DAG(
    'sql_example_dag',
    start_date=datetime(2023, 1, 1),
    schedule=None,  # 'schedule' replaces the removed 'schedule_interval' parameter
    catchup=False,
    tags=['sql', 'example']
) as dag:
    # Ensure 'my_database_conn' is configured in Airflow Connections
    execute_query_task = SQLExecuteQueryOperator(
        task_id='run_simple_sql_query',
        conn_id='my_database_conn',
        sql='SELECT 1;'
    )

    # Example of a templated SQL query. The 'sql' field is Jinja-templated, so it can also
    # point to a file such as 'my_query.sql' placed on the DAG's template_searchpath
    # (e.g., an 'include' directory next to the DAG file).
    # Here an inline templated string is used instead.
    execute_templated_query = SQLExecuteQueryOperator(
        task_id='run_templated_sql_query',
        conn_id='my_database_conn',
        # '{{ ds }}' is rendered by Jinja to the logical date before execution.
        # For driver-side value binding, pass a 'parameters' dict referenced from the SQL instead.
        sql="SELECT * FROM users WHERE registration_date < '{{ ds }}';"
    )