Apache Airflow Common SQL Provider
The `apache-airflow-providers-common-sql` package provides the foundational SQL layer for Apache Airflow: operators, hooks, sensors, and triggers for interacting with SQL databases. Its hooks wrap Python DB-API drivers (with optional SQLAlchemy support) and centralize connection management. As of version 1.34.0, it is actively maintained with regular releases.
Warnings
- breaking Provider version 1.33.0 and later remove all previously deprecated classes, parameters, and features. Custom code that relies on deprecated APIs or on private functions of `common.sql` may break. These releases require Airflow 2.9+.
- breaking Provider versions `1.3.0` and `1.5.0` were yanked from PyPI: `1.3.0` broke BigQuery operators from `apache-airflow-providers-google==8.4.0` (refactoring exposed implicit dependencies), and `1.5.0` could cause unconstrained installation of old Airflow versions, leading to a `RuntimeError`.
- gotcha The `apache-airflow-providers-common-sql` package is preinstalled with Apache Airflow 2.4.0 and later, and can be upgraded independently. On older Airflow installations (<2.4.0), however, newer provider releases fail at runtime because of the provider's minimum-Airflow compatibility checks, even when `pip install` itself succeeds.
- breaking A SQL Injection vulnerability (CVE-2025-30473) existed in `SQLTableCheckOperator` when using the `partition_clause` parameter with externally-influenced input. An authenticated UI user could inject arbitrary SQL commands.
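The vulnerability above is an instance of a general rule: externally-influenced values must reach the database as bound parameters, never by string interpolation. A minimal, Airflow-independent `sqlite3` sketch of the difference (table and values are illustrative only):

```python
import sqlite3

# The class of bug behind CVE-2025-30473 is interpolating user input into
# SQL text. Driver-bound parameters keep the input as pure data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, registration_date TEXT)")
conn.execute("INSERT INTO users VALUES (1, '2022-12-31'), (2, '2023-06-01')")

user_input = "2023-01-01' OR '1'='1"  # attacker-controlled value

# Unsafe: the embedded quote breaks out of the string literal,
# so the OR clause makes the predicate match every row.
unsafe_sql = f"SELECT * FROM users WHERE registration_date < '{user_input}'"
assert len(conn.execute(unsafe_sql).fetchall()) == 2  # injection succeeded

# Safe: the driver binds the value; the quote is just a character.
safe_sql = "SELECT * FROM users WHERE registration_date < ?"
assert len(conn.execute(safe_sql, (user_input,)).fetchall()) == 1
```

The same reasoning applies to `partition_clause`: keep it a static, developer-authored string rather than assembling it from user-supplied input.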
Install
pip install apache-airflow-providers-common-sql
Imports
- SQLExecuteQueryOperator
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
- SQLColumnCheckOperator
from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator
- SQLTableCheckOperator
from airflow.providers.common.sql.operators.sql import SQLTableCheckOperator
- DbApiHook
from airflow.providers.common.sql.hooks.sql import DbApiHook
- SqlSensor
from airflow.providers.common.sql.sensors.sql import SqlSensor
Quickstart
from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

with DAG(
    'sql_example_dag',
    start_date=datetime(2023, 1, 1),
    schedule=None,  # 'schedule_interval' is deprecated since Airflow 2.4
    catchup=False,
    tags=['sql', 'example'],
) as dag:
    # Ensure 'my_database_conn' is configured in Airflow Connections
    execute_query_task = SQLExecuteQueryOperator(
        task_id='run_simple_sql_query',
        conn_id='my_database_conn',
        sql='SELECT 1;',
    )

    # To run SQL from a file, pass e.g. sql='my_query.sql' and set
    # template_searchpath on the DAG so the file can be found.
    # Here we inline a Jinja-templated string instead: 'sql' is a
    # templated field, so {{ ds }} renders to the logical date.
    execute_templated_query = SQLExecuteQueryOperator(
        task_id='run_templated_sql_query',
        conn_id='my_database_conn',
        sql="SELECT * FROM users WHERE registration_date < '{{ ds }}';",
    )