{"library":"apache-airflow-providers-google",
"title":"Google Cloud Provider for Apache Airflow",
"description":"The `apache-airflow-providers-google` package extends Apache Airflow with operators, hooks, sensors, and transfers for integration with Google services, including Google Cloud Platform (GCP), Google Ads, Google Firebase, and Google Workspace. Currently at version 21.0.0, the provider is released independently of the Airflow core release cycle, with updates tracking new features, bug fixes, and changes to the underlying Google Cloud client libraries.",
"status":"active",
"version":"21.0.0",
"language":"en",
"source_language":"en",
"source_url":"https://github.com/apache/airflow/tree/main/airflow/providers/google",
"tags":["apache-airflow","google-cloud","gcp","etl","orchestration","data-pipeline"],
"install":[{"cmd":"pip install apache-airflow-providers-google","lang":"bash","label":"Install latest version"}],
"dependencies":[{"reason":"Core Airflow dependency; provider requires >=2.11.0","package":"apache-airflow","optional":false},{"reason":"Required for BigQuery operations; the pinned version varies with the provider release","package":"google-cloud-bigquery","optional":true},{"reason":"Required for Google Cloud Storage operations; the pinned version varies with the provider release","package":"google-cloud-storage","optional":true},{"reason":"Required for Google Ads operations; the pinned version varies with the provider release","package":"google-ads","optional":true}],
"imports":[{"symbol":"BigQueryInsertJobOperator","correct":"from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator"},{"symbol":"GCSUploadSessionCompleteSensor","correct":"from airflow.providers.google.cloud.sensors.gcs import GCSUploadSessionCompleteSensor"},{"symbol":"GCSHook","correct":"from airflow.providers.google.cloud.hooks.gcs import GCSHook"},{"symbol":"GoogleBaseHook","correct":"from airflow.providers.google.common.hooks.base_google import GoogleBaseHook"}],
"quickstart":{"code":"import os\nfrom datetime import datetime\n\nfrom airflow.models.dag import DAG\nfrom airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator\n\n# Ensure you have a 'google_cloud_default' connection configured in Airflow.\n# This connection typically uses Application Default Credentials (ADC).\n# For local testing, point GOOGLE_APPLICATION_CREDENTIALS at a service account key.\n\nwith DAG(\n    dag_id='gcp_bigquery_quickstart',\n    start_date=datetime(2023, 1, 1),\n    schedule=None,\n    catchup=False,\n    tags=['gcp', 'bigquery', 'example'],\n) as dag:\n    insert_job = BigQueryInsertJobOperator(\n        task_id='insert_row_to_bigquery',\n        project_id=os.environ.get('GCP_PROJECT_ID', 'your-gcp-project-id'),\n        configuration={\n            'query': {\n                # DML statements name their target table in the SQL itself,\n                # so 'destinationTable' must not be set for DML jobs.\n                'query': 'INSERT INTO `your_dataset_id.your_table_id` (column1, column2) VALUES (\"value1\", \"value2\")',\n                'useLegacySql': False,\n            }\n        },\n        gcp_conn_id='google_cloud_default',\n    )\n","lang":"python","description":"This quickstart demonstrates a simple Airflow DAG using the `BigQueryInsertJobOperator` from the Google Cloud Provider. It assumes a 'google_cloud_default' Airflow connection is configured, typically backed by Application Default Credentials (ADC) or a service account key file. Set the `GCP_PROJECT_ID` environment variable or replace 'your-gcp-project-id' with your actual GCP project ID, and substitute your own dataset and table IDs in the query. Note that DML statements such as INSERT name their target table in the SQL itself, so the job configuration does not set 'destinationTable'."},
"warnings":[
{"fix":"Always check the `apache-airflow-providers-google` documentation (or PyPI page) for the `apache-airflow` core version requirement of your specific provider version. Upgrade `apache-airflow` if necessary before upgrading the provider.","message":"Each version of `apache-airflow-providers-google` has a minimum required Apache Airflow core version. For example, provider version 21.0.0 requires `apache-airflow>=2.11.0`. Installing a provider version incompatible with your Airflow core can lead to unexpected behavior or dependency conflicts.","severity":"breaking","affected_versions":"All versions"},
{"fix":"Pin specific versions of `apache-airflow-providers-google` and `google-ads` to known compatible combinations. If encountering `VersionConflict` with `proto-plus`, explicitly constrain its version, though this may require downgrading other `google-cloud-xxx` libraries. Test thoroughly after any updates involving `google-ads`.","message":"Frequent updates to the underlying `google-ads` client library (e.g., v5 to v8, v12 to v13) and changes in object types (from native protobuf to proto-plus) have historically caused breaking changes and dependency conflicts, especially when other Google client libraries are used in the same environment. This can lead to `VersionConflict` errors.","severity":"breaking","affected_versions":"Versions 5.0.0+, 10.1.0+, 17.0.0+, 19.0.0+, 21.0.0+"},
{"fix":"Replace `delegate_to='service-account-email'` with `impersonation_chain=['service-account-email']` in operators, hooks, and triggers where service account impersonation is used.","message":"The `delegate_to` parameter for service account impersonation has been deprecated and removed in favor of `impersonation_chain`. Using `delegate_to` in newer provider versions will cause errors.","severity":"breaking","affected_versions":"Provider versions that removed `delegate_to` (e.g., around v10.0.0+)"},
{"fix":"Update import paths and operator names to use the `dataplex` module for Data Catalog functionality. Refer to the provider's changelog for the specific operator renames.","message":"Operators related to Google Cloud Data Catalog have been renamed and/or moved to the Dataplex module. For example, `CloudDataCatalogCreateEntryOperator` has been replaced by `DataplexCatalogCreateEntryOperator`.","severity":"breaking","affected_versions":"Versions 19.0.0+"},
{"fix":"Ensure your Airflow environment has the `google_cloud_default` connection properly configured. If using a service account key, provide it via 'Keyfile Path' or 'Keyfile JSON' in the connection details. For ADC, ensure the environment where Airflow workers run has appropriate credentials (e.g., the `GOOGLE_APPLICATION_CREDENTIALS` environment variable set, or running on GCP infrastructure such as GCE/GKE).","message":"Authentication to Google Cloud relies on Airflow connections. The default `google_cloud_default` connection typically uses Application Default Credentials (ADC). Misconfigured ADC (e.g., `GOOGLE_APPLICATION_CREDENTIALS` not set, or an incorrect service account key in the Airflow connection) is a common source of authorization errors.","severity":"gotcha","affected_versions":"All versions"},
{"fix":"Refer to the provider's documentation on cross-provider dependencies and extras. If installing both, use the recommended `pip install apache-airflow-providers-google[apache.beam]` (or similar) to ensure compatible dependencies are installed.","message":"Historically, conflicts between `apache-airflow-providers-google` and `apache-airflow-providers-apache-beam` have arisen from differing dependencies on `google-cloud-bigquery` client versions, especially when using the `apache-beam[gcp]` extra. This can lead to unexpected behavior in BigQuery operators.","severity":"gotcha","affected_versions":"Provider versions before 3.0.0 (where specific integration changes were made); new conflicts remain possible with future `apache-beam` updates."}],
"env_vars":null,
"last_verified":"2026-04-06T21:49:17.940Z",
"next_check":"2026-07-05T00:00:00.000Z"}