Google Cloud BigQuery Data Transfer
raw JSON → 3.21.0 verified Tue May 12 auth: no python install: verified quickstart: stale
The Google Cloud BigQuery Data Transfer API client library, currently at version 3.21.0, allows users to programmatically manage scheduled data transfers from various partner SaaS applications and other Google services into Google BigQuery. It enables automation of ETL processes and data replication. The `google-cloud-python` repository, which contains this library, generally follows a regular release cadence with frequent updates across its client libraries.
pip install google-cloud-bigquery-datatransfer

Common errors
error The caller does not have permission. ↓
cause The service account or user initiating the data transfer lacks the necessary IAM permissions for the BigQuery project, dataset, or the source data.
fix Grant the appropriate IAM roles (e.g., roles/bigquery.admin, roles/bigquery.dataEditor, or the relevant Data Transfer Service Agent roles) to the service account or user on the relevant Google Cloud resources.
error AttributeError: 'DataTransferServiceClient' object has no attribute 'project_transfer_config_path' ↓
cause This error occurs when code written against a pre-2.0.0 version of the `google-cloud-bigquery-datatransfer` client library runs against version 2.0.0 or later, where utility methods such as `project_transfer_config_path` were removed or renamed.
fix Update your code to use the transfer_config_path method, or refer to the library's migration guide for other API changes and use the correct path helper methods for your client library version.
error BigQuery Data Transfer Service has not been used in project <project_id> before or it is disabled. ↓
cause The BigQuery Data Transfer API is not enabled for the specified Google Cloud project.
fix Enable the BigQuery Data Transfer API in the Google Cloud Console by navigating to 'APIs & Services' -> 'Enabled APIs & services' and searching for 'BigQuery Data Transfer API'.
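The same fix is available from the command line; a minimal sketch, assuming a placeholder project ID (the command is printed for review rather than executed):

```shell
# Sketch: enable the BigQuery Data Transfer API via gcloud.
# PROJECT_ID is a placeholder; the command is echoed for review —
# run it without the echo to actually enable the API.
PROJECT_ID="my-project"
CMD="gcloud services enable bigquerydatatransfer.googleapis.com --project=$PROJECT_ID"
echo "$CMD"
```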
error Failed to find a valid credential. The request to create a transfer config is supposed to contain an authorization code. ↓
cause The client library could not find valid authentication credentials, or the service account used by the application lacks the necessary permissions to authenticate or create tokens for the transfer configuration.
fix Ensure Application Default Credentials are set up correctly by running gcloud auth application-default login for local development, or by verifying that the service account used by the application has the roles/iam.serviceAccountTokenCreator role if impersonation is involved.
error BigQuery Data Transfer Service does not yet support location: <location> ↓
cause The specified geographic location (region or multi-region) for the BigQuery dataset or the transfer configuration is not yet supported by the BigQuery Data Transfer Service.
fix Use a supported region for the destination BigQuery dataset and the transfer configuration. Consult the official BigQuery Data Transfer documentation for a list of currently supported locations.
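The AttributeError fix above reduces to the resource-path string formats. A minimal sketch of the paths the removed pre-2.0.0 helpers used to build (the project, location, and config IDs are placeholders); in 2.0.0+ you can format them directly or use the `transfer_config_path` helper:

```python
# Resource-path formats used by the BigQuery Data Transfer API (v1).
# Pre-2.0.0 helpers such as project_transfer_config_path built these strings;
# in 2.0.0+ they can be constructed directly. IDs below are placeholders.
project_id = "my-project"
location = "us"
config_id = "12345"

# Parent path for list/create calls (replaces the removed location_path helper).
parent = f"projects/{project_id}/locations/{location}"

# Fully qualified transfer config name (the project/config form that
# transfer_config_path returns).
transfer_config = f"projects/{project_id}/transferConfigs/{config_id}"

print(parent)
print(transfer_config)
```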
Warnings
breaking Future versions of Google Cloud client libraries, including `google-cloud-bigquery-datatransfer`, will drop support for Python 3.7 and 3.8. Python 3.7 reached end-of-life (EOL) in June 2023, and 3.8 in October 2024. New major versions of the library will be incompatible with these EOL Python runtimes. ↓
fix Upgrade your Python environment to Python 3.9 or newer to ensure continued compatibility and support. Check the official `google-cloud-python` documentation for the latest supported Python versions.
gotcha BigQuery Data Transfer Service's data source connectors (e.g., Google Ads, Display & Video 360) periodically undergo API version upgrades. These upgrades can introduce schema changes, such as new, modified, or deprecated columns, which may affect existing transfer configurations and downstream data processing pipelines. ↓
fix Regularly monitor the BigQuery Data Transfer Service data source change log for announcements related to your specific data sources. Be prepared to update your BigQuery schemas, ETL processes, and queries as needed to accommodate schema changes.
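When a connector upgrade changes columns, a cheap guard is to diff the column list your pipeline expects against what the destination table actually reports. A minimal sketch with hypothetical column sets (in practice `actual_columns` would come from the table's schema):

```python
# Hypothetical drift check after a data source connector upgrade.
# Both column sets are illustrative placeholders, not real connector schemas.
expected_columns = {"campaign_id", "impressions", "clicks", "cost"}
actual_columns = {"campaign_id", "impressions", "clicks", "cost_micros"}  # column renamed upstream

added = actual_columns - expected_columns      # new columns the pipeline doesn't handle yet
removed = expected_columns - actual_columns    # columns the pipeline still depends on

if added or removed:
    print(f"Schema drift detected: added={sorted(added)} removed={sorted(removed)}")
```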
gotcha Authentication requires appropriate IAM roles. For service accounts creating transfers, 'BigQuery Admin' and 'BigQuery Data Transfer Service Agent' roles are often necessary, in addition to permissions to access the source data. Incorrect permissions are a common cause of transfer failures. ↓
fix Ensure the service account or user initiating the transfer has the 'BigQuery Admin' (roles/bigquery.admin) role, the 'BigQuery Data Transfer Service Agent' role (roles/bigquery.datatransfer.serviceAgent), and read/write access to the source/destination datasets/tables. For external data sources, manual authorization via OAuth may also be required initially.
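The role grants above can be scripted; a sketch assuming placeholder project and service-account names (the gcloud commands are composed and printed for review rather than executed, so they can be checked before applying):

```shell
# Sketch: IAM grants a transfer-creating service account typically needs.
# PROJECT_ID and SA_EMAIL are placeholders; the commands are printed for
# review — copy-paste them (or pipe to sh) to actually apply the bindings.
PROJECT_ID="my-project"
SA_EMAIL="transfer-sa@${PROJECT_ID}.iam.gserviceaccount.com"
CMDS=$(for ROLE in roles/bigquery.admin roles/bigquery.datatransfer.serviceAgent; do
  echo "gcloud projects add-iam-policy-binding $PROJECT_ID --member=serviceAccount:$SA_EMAIL --role=$ROLE"
done)
printf '%s\n' "$CMDS"
```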
gotcha Effective February 1, 2026, BigQuery requests that read data from multi-region Cloud Storage buckets will incur Cloud Storage multi-region data transfer fees. This can significantly impact billing for transfers involving data stored across different regions. ↓
fix Review your data transfer configurations. To minimize costs, consider co-locating Cloud Storage buckets with your BigQuery datasets in the same region where possible.
deprecated BigQuery will limit the use of Legacy SQL starting June 1, 2026. If a project has not used Legacy SQL between November 1, 2025, and June 1, 2026, it will no longer be able to use it. Existing workloads might continue, but new ones may fail. ↓
fix Migrate any existing transfer configurations or queries that rely on Legacy SQL to Standard SQL. Review BigQuery documentation on migrating from Legacy SQL.
Install compatibility verified last tested: 2026-05-12
python  os / libc      status  install  import  disk
3.9     alpine (musl)  wheel   -        1.55s   68.7M
3.9     alpine (musl)  -       -        1.56s   67.6M
3.9     slim (glibc)   wheel   7.0s     1.21s   66M
3.9     slim (glibc)   -       -        1.06s   65M
3.10    alpine (musl)  wheel   -        1.67s   68.6M
3.10    alpine (musl)  -       -        1.63s   67.5M
3.10    slim (glibc)   wheel   6.2s     1.00s   66M
3.10    slim (glibc)   -       -        1.00s   65M
3.11    alpine (musl)  wheel   -        2.26s   73.2M
3.11    alpine (musl)  -       -        2.58s   72.1M
3.11    slim (glibc)   wheel   5.2s     1.49s   71M
3.11    slim (glibc)   -       -        2.00s   70M
3.12    alpine (musl)  wheel   -        2.24s   64.7M
3.12    alpine (musl)  -       -        2.51s   63.5M
3.12    slim (glibc)   wheel   4.4s     1.81s   62M
3.12    slim (glibc)   -       -        2.01s   61M
3.13    alpine (musl)  wheel   -        2.10s   64.4M
3.13    alpine (musl)  -       -        2.61s   63.2M
3.13    slim (glibc)   wheel   4.8s     1.67s   62M
3.13    slim (glibc)   -       -        2.11s   61M
Imports
- DataTransferServiceClient
wrong
from google.cloud.bigquery_datatransfer import DataTransferServiceClient
correct
from google.cloud import bigquery_datatransfer_v1
client = bigquery_datatransfer_v1.DataTransferServiceClient()
Quickstart stale last tested: 2026-04-24
import os
from google.cloud import bigquery_datatransfer_v1
# Set your Google Cloud Project ID and Location
# For local development, set the GOOGLE_APPLICATION_CREDENTIALS environment variable.
# Example: export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/key.json"
project_id = os.environ.get("GOOGLE_CLOUD_PROJECT", "your-project-id")
location = "us"
client = bigquery_datatransfer_v1.DataTransferServiceClient()
# client.location_path() was removed in v2.0.0; build the parent path directly.
parent = f"projects/{project_id}/locations/{location}"

try:
    print(f"Listing data sources in project '{project_id}' and location '{location}':")
    for data_source in client.list_data_sources(parent=parent):
        print(f"  Name: {data_source.display_name} (ID: {data_source.data_source_id})")
except Exception as e:
    print(f"Error listing data sources: {e}")
    print(f"Ensure the BigQuery Data Transfer API is enabled for project "
          f"'{project_id}' and your credentials have sufficient permissions.")