{"id":3876,"library":"apache-airflow-providers-odbc","title":"Apache Airflow ODBC Provider","description":"This provider package enables Apache Airflow to connect to various ODBC data sources, including MS SQL Server, to execute queries and perform database operations. It is released independently from Airflow core and follows semantic versioning, with major version upgrades indicating breaking changes.","status":"active","version":"4.12.1","language":"en","source_language":"en","source_url":"https://airflow.apache.org/docs/apache-airflow-providers-odbc/stable/index.html","tags":["Airflow","Provider","ODBC","Database","SQL"],"install":[{"cmd":"pip install apache-airflow-providers-odbc","lang":"bash","label":"Install only ODBC provider"},{"cmd":"pip install apache-airflow[odbc]","lang":"bash","label":"Install Airflow with ODBC provider (includes pyodbc)"}],"dependencies":[{"reason":"Core Airflow dependency. Provider version 4.12.1 requires Apache Airflow >=2.11.0.","package":"apache-airflow","optional":false},{"reason":"Python driver for ODBC connectivity. Included with 'apache-airflow[odbc]' extra, but also a direct dependency of the provider.","package":"pyodbc","optional":false},{"reason":"Provides common SQL functionalities. 
Can be installed with `pip install apache-airflow-providers-odbc[common.sql]`.","package":"apache-airflow-providers-common-sql","optional":true}],"imports":[{"symbol":"OdbcHook","correct":"from airflow.providers.odbc.hooks.odbc import OdbcHook"},{"note":"As of provider version 3.1.0, SQL operators such as SQLExecuteQueryOperator live in the common-sql provider; the ODBC provider itself ships only OdbcHook.","symbol":"SQLExecuteQueryOperator","correct":"from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator"}],"quickstart":{"code":"from __future__ import annotations\n\nimport pendulum\n\nfrom airflow.models.dag import DAG\nfrom airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator\n\n# Ensure you have an Airflow Connection named 'my_odbc_conn'\n# type: ODBC\n# host: your_odbc_server\n# login: your_username\n# password: your_password\n# extra: {\"autocommit\": true}\n# Since provider 4.0.0 the ODBC driver may not be set in the connection's\n# extra field by default; pass it via hook_params instead.\n\nwith DAG(\n    dag_id=\"odbc_example_dag\",\n    start_date=pendulum.datetime(2023, 1, 1, tz=\"UTC\"),\n    schedule=None,\n    catchup=False,\n    tags=[\"odbc\", \"example\"],\n) as dag:\n    # T-SQL has no CREATE TABLE IF NOT EXISTS; guard with OBJECT_ID instead.\n    create_table = SQLExecuteQueryOperator(\n        task_id=\"create_table\",\n        sql=\"\"\"\n            IF OBJECT_ID('my_odbc_table', 'U') IS NULL\n            CREATE TABLE my_odbc_table (\n                id INT IDENTITY(1,1) PRIMARY KEY,\n                value VARCHAR(255)\n            );\n        \"\"\",\n        conn_id=\"my_odbc_conn\",\n        hook_params={\"driver\": \"{ODBC Driver 18 for SQL Server}\"},\n    )\n\n    insert_data = SQLExecuteQueryOperator(\n        task_id=\"insert_data\",\n        sql=\"\"\"\n            INSERT INTO my_odbc_table (value) VALUES ('test_value_{{ ds }}');\n        \"\"\",\n        conn_id=\"my_odbc_conn\",\n        hook_params={\"driver\": \"{ODBC Driver 18 for SQL Server}\"},\n    )\n\n    # Note: For fetching data, you'd typically use OdbcHook in a PythonOperator\n    # or a custom operator, as SQLExecuteQueryOperator is for execution.\n\n    create_table >> 
insert_data","lang":"python","description":"This quickstart demonstrates a basic Airflow DAG using the `SQLExecuteQueryOperator` to interact with an ODBC database. It assumes an Airflow ODBC connection (`my_odbc_conn`) is configured in the UI or via environment variables; from provider 4.0.0 onwards the ODBC driver is supplied via `hook_params` rather than the connection's `extra` field. Remember to install system-level ODBC drivers for your specific database."},"warnings":[{"fix":"Pass `driver` as a keyword argument to `OdbcHook`, or via the `hook_params` dictionary for SQL operators. Example: keep the connection's `extra` as `{\"autocommit\": true}` and pass `driver='{ODBC Driver 18 for SQL Server}'` separately to the hook.","message":"The `driver` parameter for `OdbcHook` must now be passed directly via the hook constructor or `hook_params` in SQL operators, not through the connection's `extra` field, due to a security vulnerability fix.","severity":"breaking","affected_versions":"From 4.0.0 onwards"},{"fix":"Ensure your Airflow core installation meets the minimum version requirement for the installed provider. Refer to the provider's changelog for specific version compatibility.","message":"This provider version has increased minimum Airflow requirements. For example, provider `3.0.0` requires Airflow `2.2+` and provider `4.12.1` requires Airflow `2.11.0+`. 
Older provider versions had lower Airflow requirements.","severity":"breaking","affected_versions":"From 2.0.0 onwards (specific to provider version)"},{"fix":"Update your Airflow Connection's `extra` field to use boolean literals for `connect_kwargs` values: `\"connect_kwargs\": {\"autocommit\": true, \"ansi\": false}`.","message":"When passing keyword arguments to the ODBC connection via the `connect_kwargs` key in the connection's `extra` field (e.g., `autocommit`, `ansi`), values must now be booleans (e.g., `true`, `false`) instead of strings (`\"true\"`, `\"false\"`).","severity":"breaking","affected_versions":"From 2.0.0 onwards"},{"fix":"Manually install the appropriate ODBC driver for your database (e.g., Microsoft ODBC Driver for SQL Server) on the operating system where Airflow workers run.","message":"The `apache-airflow-providers-odbc` package and its `pyodbc` dependency require system-level ODBC drivers to be installed on your Airflow worker machines. These are not installed by `pip`.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Avoid setting `allow_driver_in_extra` to `True` unless you fully trust all users who can modify Airflow connections. Ensure drivers are only specified directly in DAG code or via the hook constructor.","message":"Enabling the `allow_driver_in_extra` configuration (via `AIRFLOW__PROVIDERS_ODBC__ALLOW_DRIVER_IN_EXTRA=true` environment variable or `airflow.cfg`) is a security risk. It allows users to specify arbitrary ODBC drivers via the Airflow Connection's `extra` field, potentially leading to privilege escalation or arbitrary code execution if untrusted users can modify connections.","severity":"gotcha","affected_versions":"All versions"},{"fix":"If encountering encoding errors, ensure the correct character set is configured in your ODBC driver and/or connection string. 
Consult the specific database and ODBC driver documentation, or consider upgrading/downgrading provider/Airflow versions if a known fix exists.","message":"Some users have reported `UnicodeDecodeError` (e.g., `'utf-8' codec can't decode byte 0x91`) when using specific ODBC drivers (like Netezza) with certain Airflow and provider versions.","severity":"gotcha","affected_versions":"Specific combinations (e.g., Airflow 2.10.2, provider 4.7.0 with Netezza)"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}