Apache Airflow MySQL Provider

6.5.1 · active · verified Mon Apr 06

The Apache Airflow MySQL Provider enables interaction with MySQL databases from within Airflow DAGs. It provides operators and hooks to execute SQL queries, transfer data, and manage connections. The provider is released and updated as part of the Airflow ecosystem; the current version is 6.5.1 and requires Python >= 3.10.

Warnings

Install
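The provider is published on PyPI as `apache-airflow-providers-mysql` and installs with pip into the same environment that runs Airflow:

```shell
# Install the MySQL provider alongside an existing Airflow installation.
pip install apache-airflow-providers-mysql

# Or pin to the version this page documents:
pip install "apache-airflow-providers-mysql==6.5.1"
```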

Imports
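The imports most commonly used with this provider are `MySqlHook` from the provider package and `SQLExecuteQueryOperator` from the common SQL provider (the quickstart below uses the latter). The `try`/`except` guard is only so this snippet can also be run outside an Airflow installation:

```python
# Imports commonly used with the MySQL provider. The guard is only so this
# snippet degrades gracefully where Airflow is not installed.
try:
    from airflow.providers.mysql.hooks.mysql import MySqlHook
    from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
    HAS_AIRFLOW = True
except ImportError:
    MySqlHook = SQLExecuteQueryOperator = None
    HAS_AIRFLOW = False

print("MySQL provider imports available:", HAS_AIRFLOW)
```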

Quickstart

This quickstart demonstrates how to use `SQLExecuteQueryOperator` against a MySQL database: the DAG below creates a table, inserts rows, and then selects them. Ensure a MySQL connection named `mysql_default` is configured in the Airflow UI before triggering it.
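As an alternative to the UI, Airflow can also read connections from environment variables named `AIRFLOW_CONN_<CONN_ID>` in URI form. A sketch of building such a URI for the `mysql_default` connection used below; the host, credentials, and schema are illustrative placeholders matching the quickstart comments:

```python
from urllib.parse import quote, urlparse

# Illustrative credentials matching the quickstart comments below.
user, password, host, port, schema = "airflow", "airflow", "localhost", 3306, "airflow_db"

# Airflow connection-URI format: scheme://user:password@host:port/schema
conn_uri = f"mysql://{quote(user)}:{quote(password)}@{host}:{port}/{schema}"
print(conn_uri)  # export this as AIRFLOW_CONN_MYSQL_DEFAULT before starting Airflow

# Sanity-check that the URI round-trips through a standard parser.
parsed = urlparse(conn_uri)
```

Exporting `AIRFLOW_CONN_MYSQL_DEFAULT` with this value makes the connection available without touching the metadata database, which is convenient for containerized deployments.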

from __future__ import annotations

import pendulum

from airflow.models.dag import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

# Ensure you have a MySQL connection named 'mysql_default' configured in Airflow UI
# Host: localhost, Schema: airflow_db, User: airflow, Pass: airflow

with DAG(
    dag_id="mysql_quickstart_dag",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    catchup=False,
    schedule=None,
    tags=["mysql", "example"],
) as dag:
    create_table = SQLExecuteQueryOperator(
        task_id="create_test_table",
        conn_id="mysql_default",
        sql="""
            CREATE TABLE IF NOT EXISTS test_table (
                id INT AUTO_INCREMENT PRIMARY KEY,
                name VARCHAR(255)
            );
        """,
    )

    insert_data = SQLExecuteQueryOperator(
        task_id="insert_test_data",
        conn_id="mysql_default",
        sql="INSERT INTO test_table (name) VALUES ('Airflow User 1'), ('Airflow User 2');",
    )

    select_data = SQLExecuteQueryOperator(
        task_id="select_test_data",
        conn_id="mysql_default",
        sql="SELECT * FROM test_table;",
    )

    create_table >> insert_data >> select_data
