Apache Airflow Common SQL Provider

1.34.0 · active · verified Sat Mar 28

The `apache-airflow-providers-common-sql` package provides a foundational set of SQL functionality for Apache Airflow: operators, hooks, sensors, and triggers for working with SQL databases through a common `DbApiHook` interface. Connections are managed through standard Airflow Connections, and the hooks can expose SQLAlchemy engines for integration with other tooling; database-specific providers (Postgres, MySQL, and others) build on this package. As of its current version, 1.34.0, it is actively maintained with regular updates.

Install
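The provider is distributed on PyPI under the package name given above; a typical install, optionally pinned to the version documented on this page, looks like:

```shell
pip install "apache-airflow-providers-common-sql==1.34.0"
```

Note that many database-specific providers depend on this package, so it may already be present in an existing Airflow environment.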

Imports
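Commonly used entry points, with module paths as of recent provider versions (verify against your installed version):

```python
# Core operator for running SQL statements, plus data-quality check operators
from airflow.providers.common.sql.operators.sql import (
    SQLExecuteQueryOperator,
    SQLColumnCheckOperator,
    SQLTableCheckOperator,
)

# Base hook that database-specific provider hooks extend
from airflow.providers.common.sql.hooks.sql import DbApiHook

# Sensor that polls a SQL query until it returns a truthy result
from airflow.providers.common.sql.sensors.sql import SqlSensor
```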

Quickstart

This quickstart demonstrates how to define a simple Airflow DAG using the `SQLExecuteQueryOperator` to execute SQL queries. It shows both direct execution of an inline SQL string and a Jinja-templated query; the same `sql` argument can also take a path to a `.sql` file.

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
from datetime import datetime

with DAG(
    'sql_example_dag',
    start_date=datetime(2023, 1, 1),
    schedule=None,  # `schedule_interval` is deprecated since Airflow 2.4
    catchup=False,
    tags=['sql', 'example']
) as dag:
    # Ensure 'my_database_conn' is configured in Airflow Connections
    execute_query_task = SQLExecuteQueryOperator(
        task_id='run_simple_sql_query',
        conn_id='my_database_conn',
        sql='SELECT 1;'
    )

    # Example of a Jinja-templated SQL query: {{ ds }} is rendered to the
    # run's logical date at execution time. To load SQL from a file instead,
    # pass a relative path (e.g. sql='my_query.sql') and set
    # template_searchpath on the DAG so Airflow can find it.
    execute_templated_query = SQLExecuteQueryOperator(
        task_id='run_templated_sql_query',
        conn_id='my_database_conn',
        sql="""SELECT * FROM users WHERE registration_date < '{{ ds }}';""",
    )
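The provider also ships sensors. A minimal sketch of `SqlSensor`, which re-runs a query every `poke_interval` seconds and succeeds once the first cell of the first row is truthy, assuming the same hypothetical `my_database_conn` connection and an illustrative `users` table:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.sensors.sql import SqlSensor

with DAG(
    'sql_sensor_dag',
    start_date=datetime(2023, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Succeeds once at least one user has registered on the logical date;
    # times out (and fails the task) after one hour of polling.
    wait_for_new_users = SqlSensor(
        task_id='wait_for_new_users',
        conn_id='my_database_conn',  # assumed Airflow Connection, as above
        sql="SELECT COUNT(*) FROM users WHERE registration_date = '{{ ds }}'",
        poke_interval=60,
        timeout=60 * 60,
    )
```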
