Apache Airflow Provider for Exasol

Version 4.10.2, verified Mon Apr 27

Airflow provider package that integrates Exasol as a data source/sink via Exasol hooks and operators. The current version is 4.10.2; it requires Python >=3.10 and is part of the Airflow providers ecosystem. Release cadence is irregular, often with minor version bumps for Airflow core compatibility.

pip install apache-airflow-providers-exasol
error ModuleNotFoundError: No module named 'airflow.providers.exasol'
cause Provider package not installed or Airflow version too old.
fix
Run pip install apache-airflow-providers-exasol and ensure Airflow >=2.6.0.
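Before debugging imports, it can help to confirm the package is actually installed. A minimal check using only the standard library; the distribution name matches the pip install command above:

```python
from importlib import metadata

def provider_installed(dist: str = "apache-airflow-providers-exasol") -> bool:
    """Return True if the given distribution is installed in this environment."""
    try:
        metadata.version(dist)
        return True
    except metadata.PackageNotFoundError:
        return False

# Prints True if the provider is installed, False otherwise.
print(provider_installed())
```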
error pyexasol.exceptions.ExaCommunicationError: Communication error: could not connect to host
cause Incorrect host, port, or network connectivity to Exasol database.
fix
Verify the connection details (host, port, use_tls) in your Airflow connection.
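Connection details can also be supplied outside the UI via Airflow's standard `AIRFLOW_CONN_<CONN_ID>` environment-variable mechanism. A minimal sketch with placeholder host and credentials (port 8563 is Exasol's usual default):

```python
import os
from urllib.parse import quote

# Placeholder connection details -- substitute your own Exasol host and
# credentials.
host, port = "exasol.example.com", 8563
user, password = "airflow_user", "s3cr3t/pass"

# Airflow resolves connections from AIRFLOW_CONN_<CONN_ID> environment
# variables, so this URI backs the conn_id 'exasol_default'. Special
# characters in the password must be percent-encoded.
uri = f"exasol://{user}:{quote(password, safe='')}@{host}:{port}"
os.environ["AIRFLOW_CONN_EXASOL_DEFAULT"] = uri
print(uri)
```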
error AttributeError: 'ExasolOperator' object has no attribute 'sql'
cause The SQL statement was passed via the wrong keyword; the correct parameter is `sql`, not `query`.
fix
Use ExasolOperator(sql='SELECT 1').
breaking Airflow 2.0 dropped support for legacy hooks/operators located under `airflow.hooks.*` and `airflow.operators.*`. All Exasol hooks and operators must be imported from `airflow.providers.exasol.*`.
fix Update imports to use the `airflow.providers.exasol.*` namespace.
breaking Provider version 4.0.0 dropped support for Airflow versions <2.6.0 and Python <3.10.
fix Upgrade Airflow to 2.6.0+ and Python to 3.10+.
gotcha The Exasol connection must include a `schema` parameter in the Extras field (as JSON) unless the default schema is acceptable; a missing schema can cause 'Schema not set' errors.
fix In the Airflow connection UI, set Extras to {"schema": "your_schema"}.
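For programmatic setup, the Extras payload is plain JSON. A minimal sketch, assuming a placeholder schema name "ANALYTICS":

```python
import json

# Hypothetical schema name -- replace with your target Exasol schema.
extras = {"schema": "ANALYTICS"}

# This is the string to paste into the connection's Extra field
# (or to pass when creating the connection programmatically).
extra_field = json.dumps(extras)
print(extra_field)
```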
deprecated The `ExasolOperator` now uses the `pyexasol` library. Older `ExasolConnection` usage is deprecated.
fix Ensure your environment has `pyexasol` installed (it is a dependency). Use `ExasolHook` for connection management.

Minimal DAG that executes a query on Exasol using the ExasolOperator.

from airflow import DAG
from airflow.providers.exasol.operators.exasol import ExasolOperator
from datetime import datetime

with DAG(
    dag_id='exasol_example',
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False
) as dag:
    run_query = ExasolOperator(
        task_id='run_query',
        exasol_conn_id='exasol_default',
        sql='SELECT 1;'
    )