{"id":824,"library":"apache-airflow-providers-snowflake","title":"Apache Airflow Snowflake Provider","description":"The `apache-airflow-providers-snowflake` library is an official Apache Airflow provider package that enables seamless interaction with Snowflake Data Cloud from Airflow DAGs. It includes hooks, operators, and transfers for executing SQL queries, managing data, and leveraging Snowflake-specific features. The current version is 6.12.0, and it follows the Airflow provider release cadence, with frequent updates introducing new features and bug fixes.","status":"active","version":"6.12.0","language":"python","source_language":"en","source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/snowflake","tags":["Apache Airflow","Snowflake","ETL","Data Warehouse","Provider"],"install":[{"cmd":"pip install apache-airflow-providers-snowflake","lang":"bash","label":"Install latest version"}],"dependencies":[{"reason":"Core Airflow functionality; minimum version 2.11.0 is required.","package":"apache-airflow","optional":false},{"reason":"Compatibility layer for Airflow providers; minimum version 1.12.0 is required.","package":"apache-airflow-providers-common-compat","optional":false},{"reason":"Common SQL provider for shared SQL functionalities; minimum version 1.32.0 is required.","package":"apache-airflow-providers-common-sql","optional":false},{"reason":"Official Snowflake connector for Python; minimum version 3.16.0 is required.","package":"snowflake-connector-python","optional":false},{"reason":"SQLAlchemy dialect for Snowflake; minimum version 1.7.0 is required.","package":"snowflake-sqlalchemy","optional":false},{"reason":"Required for certain data handling functionalities; version requirements vary by Python version.","package":"pandas","optional":true},{"reason":"Required for certain data handling functionalities, especially with Pandas; version requirements vary by Python version.","package":"pyarrow","optional":true},{"reason":"Required 
for Snowpark-related features; version requirements vary by Python version.","package":"snowflake-snowpark-python","optional":true}],"imports":[{"symbol":"SnowflakeOperator","correct":"from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator"},{"symbol":"SnowflakeHook","correct":"from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook"},{"note":"The S3ToSnowflakeOperator is part of the 'transfers' submodule, not 'operators'.","wrong":"from airflow.providers.snowflake.operators.s3_to_snowflake import S3ToSnowflakeOperator","symbol":"S3ToSnowflakeOperator","correct":"from airflow.providers.snowflake.transfers.s3_to_snowflake import S3ToSnowflakeOperator"}],"quickstart":{"code":"from __future__ import annotations\nimport os\nimport pendulum\n\nfrom airflow.models.dag import DAG\nfrom airflow.providers.snowflake.operators.snowflake import SnowflakeOperator\n\nSNOWFLAKE_CONN_ID = os.environ.get('SNOWFLAKE_CONN_ID', 'snowflake_default')\n\nwith DAG(\n    dag_id='snowflake_quickstart_example',\n    start_date=pendulum.datetime(2023, 1, 1, tz='UTC'),\n    catchup=False,\n    schedule=None,\n    tags=['snowflake', 'example'],\n    doc_md=\"\"\"### Snowflake Quickstart DAG\n    This DAG demonstrates a basic interaction with Snowflake using the SnowflakeOperator.\n    It creates a table, inserts data, and then queries the table.\n    \"\"\",\n) as dag:\n    create_table = SnowflakeOperator(\n        task_id='create_snowflake_table',\n        snowflake_conn_id=SNOWFLAKE_CONN_ID,\n        sql=(\n            \"CREATE TABLE IF NOT EXISTS AIRFLOW_TEST_TABLE (id INTEGER, name VARCHAR);\"\n        ),\n    )\n\n    insert_data = SnowflakeOperator(\n        task_id='insert_data_into_table',\n        snowflake_conn_id=SNOWFLAKE_CONN_ID,\n        sql=\"INSERT INTO AIRFLOW_TEST_TABLE (id, name) VALUES (1, 'Airflow'), (2, 'Snowflake');\",\n    )\n\n    query_data = SnowflakeOperator(\n        task_id='query_snowflake_table',\n        
snowflake_conn_id=SNOWFLAKE_CONN_ID,\n        sql=\"SELECT * FROM AIRFLOW_TEST_TABLE;\",\n        handler=lambda cursor: [print(row) for row in cursor.fetchall()],\n    )\n\n    # Define task dependencies\n    create_table >> insert_data >> query_data\n","lang":"python","description":"This quickstart DAG demonstrates how to use the `SnowflakeOperator` to execute SQL commands through an existing Airflow Snowflake connection. It first defines a DAG, then creates a table, inserts sample data, and finally queries the data, printing the results to the task logs. Ensure your Airflow environment has a Snowflake connection named `snowflake_default` (or the value of the `SNOWFLAKE_CONN_ID` environment variable) configured with appropriate credentials (e.g., account, login, password, warehouse, database, schema)."},"warnings":[{"fix":"Upgrade your Apache Airflow instance to version 2.11.0 or higher. Refer to the Airflow upgrade guide for detailed instructions.","message":"The minimum supported Apache Airflow version for `apache-airflow-providers-snowflake` 6.12.0 is 2.11.0. Older provider versions had different Airflow minimums (e.g., 2.1.0+, 2.2.0+, 2.3.0+). Ensure your Airflow installation meets this requirement to avoid compatibility issues.","severity":"breaking","affected_versions":"<6.x, <2.11.0 Airflow"},{"fix":"Adjust your DAGs to expect a sequence of sequences from `SnowflakeHook.run()` or `SnowflakeOperator`'s `handler` function. For example, iterate through `cursor.fetchall()` directly.","message":"In provider versions 4.x and above, the `SnowflakeHook`'s `run` method conforms to `DbApiHook` semantics, returning a sequence of sequences (DB-API-compatible results) instead of a dictionary of `{'column': 'value'}` pairs. This change affects how results are processed.","severity":"breaking","affected_versions":"4.x and later"},{"fix":"Base64-encode your private key content when configuring Snowflake connections in the Airflow UI or environment variables. 
Update existing connections to use the base64-encoded string.","message":"As of provider version 6.3.0, the `private_key_content` field in Snowflake connections using key-pair authentication is expected to be a base64-encoded string. Existing connections with unencoded private key content will break.","severity":"breaking","affected_versions":">=6.3.0"},{"fix":"Always check the `requirements` section in the official documentation for specific compatible versions. For `snowflake-sqlalchemy` issues, downgrading to `1.2.4` has resolved conflicts with older `sqlalchemy` versions in the past.","message":"When upgrading, ensure `snowflake-connector-python` and `snowflake-sqlalchemy` versions are compatible with your `apache-airflow-providers-snowflake` and `apache-airflow` versions. Conflicts can lead to `ModuleNotFoundError` or unexpected behavior (e.g., `sqlalchemy.sql.roles` issues with `snowflake-sqlalchemy==1.2.5`).","severity":"gotcha","affected_versions":"All versions, especially during upgrades"},{"fix":"Rename any custom Python files named `snowflake.py` to avoid conflicts with the `airflow.providers.snowflake` package structure.","message":"If you encounter 'No module named 'snowflake'' errors, especially when a `snowflake.py` file is present in your DAGs folder or another location on the import path, it might be due to a Python import conflict where your local file shadows the installed `snowflake` connector package.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Update your DAGs and custom code to use the current, non-deprecated classes and parameters. Refer to the changelog for specific removals if migrating from older versions.","message":"In versions 6.x and later, all deprecated classes, parameters, and features have been removed from the Snowflake provider package. 
This includes removal of the `apply_default` decorator.","severity":"deprecated","affected_versions":">=6.0.0"},{"fix":"Explicitly set `autocommit=True` in `SnowflakeOperator` if that's the desired behavior. For more robust transactional control, consider using `SQLExecuteQueryOperator` from `apache-airflow-providers-common-sql`.","message":"Historically, `SnowflakeOperator` defaulted `autocommit` to `True`, but since the operator was aligned with the `common.sql` provider in the 4.x releases, autocommit handling follows `common.sql` semantics and may differ in some contexts. If you rely on autocommit behavior, set `autocommit=True` explicitly. `SQLExecuteQueryOperator` is recommended for better control over transactions for complex SQL.","severity":"gotcha","affected_versions":"4.0.0, 4.0.1 (yanked), and later versions with `common.sql` provider"},{"fix":"Install the `aiohttp` package in your Airflow environment (e.g., `pip install aiohttp`).","message":"The `apache-airflow-providers-snowflake` package, specifically the `SnowflakeSqlApiHook`, now requires the `aiohttp` library. If `aiohttp` is not installed, importing or using Snowflake components will result in a `ModuleNotFoundError: No module named 'aiohttp'`.","severity":"breaking","affected_versions":">=6.x"},{"fix":"Before installing `snowflake-connector-python`, ensure the necessary build tools are installed in your environment. For Alpine Linux, add the `python3-dev` and `build-base` packages: `apk add --no-cache python3-dev build-base`.","message":"When installing `snowflake-connector-python` in minimal Linux environments (like Alpine Linux, often used in Docker images), the installation may fail with `error: command 'g++' failed: No such file or directory`. 
This occurs because `snowflake-connector-python` requires a C/C++ compiler (like g++) to build its C extensions (e.g., for Nanoarrow), and these build tools are not included by default in such minimal distributions.","severity":"breaking","affected_versions":"All versions of `snowflake-connector-python` when installed on Alpine Linux or other minimal environments missing build tools."}],"env_vars":null,"last_verified":"2026-05-12T20:06:45.630Z","next_check":"2026-06-27T00:00:00.000Z","problems":[{"fix":"Install the package using 'pip install apache-airflow-providers-snowflake'.","cause":"The 'apache-airflow-providers-snowflake' package is not installed or not accessible in the Python environment.","error":"ModuleNotFoundError: No module named 'airflow.providers.snowflake'"},{"fix":"Update the import statement to 'from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook'.","cause":"The 'SnowflakeHook' has been moved from 'airflow.contrib.hooks.snowflake_hook' to 'airflow.providers.snowflake.hooks.snowflake' in newer versions of Airflow.","error":"ImportError: cannot import name 'SnowflakeHook' from 'airflow.contrib.hooks.snowflake_hook'"},{"fix":"Modify the import statement to 'from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator'.","cause":"The 'SnowflakeOperator' has been relocated from 'airflow.contrib.operators.snowflake_operator' to 'airflow.providers.snowflake.operators.snowflake' in recent Airflow versions.","error":"ImportError: cannot import name 'SnowflakeOperator' from 'airflow.contrib.operators.snowflake_operator'"},{"fix":"Install the package using 'pip install snowflake-connector-python'.","cause":"The 'snowflake-connector-python' package, required for Snowflake integration, is not installed.","error":"ModuleNotFoundError: No module named 'snowflake'"},{"fix":"Update the import statement to 'from airflow.providers.snowflake.transfers.copy_into_snowflake import CopyFromExternalStageToSnowflakeOperator'.","cause":"The 'CopyFromExternalStageToSnowflakeOperator' is defined in 'airflow.providers.snowflake.transfers.copy_into_snowflake'; it replaced the deprecated 'S3ToSnowflakeOperator' from 'airflow.providers.snowflake.transfers.s3_to_snowflake' in newer versions.","error":"ImportError: cannot import name 'CopyFromExternalStageToSnowflakeOperator' from 'airflow.providers.snowflake.transfers.s3_to_snowflake'"}],"ecosystem":"pypi","meta_description":null,"install_score":0,"install_tag":"stale","quickstart_score":null,"quickstart_tag":null,"pypi_latest":"6.12.2","cli_name":null,"install_checks":{"last_tested":"2026-05-12","tag":"stale","tag_description":"widespread failures or data too old to trust","results":[{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":"build_error","install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":38.5,"import_time_s":null,"mem_mb":null,"disk_size":"595M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":"build_error","install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine 
(musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":37,"import_time_s":null,"mem_mb":null,"disk_size":"640M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":"build_error","install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":32,"import_time_s":null,"mem_mb":null,"disk_size":"627M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":"build_error","install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine 
(musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":32.3,"import_time_s":null,"mem_mb":null,"disk_size":"628M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":"build_error","install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":43.5,"import_time_s":null,"mem_mb":null,"disk_size":"546M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim 
(glibc)","variant":"default","exit_code":1,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":null}]},"quickstart_checks":{"last_tested":"2026-04-24","tag":null,"tag_description":null,"results":[{"runtime":"python:3.10-alpine","exit_code":1},{"runtime":"python:3.10-slim","exit_code":-1},{"runtime":"python:3.11-alpine","exit_code":1},{"runtime":"python:3.11-slim","exit_code":-1},{"runtime":"python:3.12-alpine","exit_code":1},{"runtime":"python:3.12-slim","exit_code":-1},{"runtime":"python:3.13-alpine","exit_code":1},{"runtime":"python:3.13-slim","exit_code":-1},{"runtime":"python:3.9-alpine","exit_code":1},{"runtime":"python:3.9-slim","exit_code":-1}]}}