Google Cloud Storage Transfer API Client Library
version 1.20.0 · verified Tue May 12 · auth: no · install: verified
The `google-cloud-storage-transfer` library is the official Python client for the Google Cloud Storage Transfer Service. It enables programmatic control over data transfers to and from Google Cloud Storage, supporting various sources including other cloud providers and on-premises systems. Currently at version 1.20.0, this library adheres to Google Cloud's frequent release cadence, often receiving updates alongside other Python client libraries in the `googleapis/google-cloud-python` monorepo.
Install

pip install google-cloud-storage-transfer

Common errors
error ModuleNotFoundError: No module named 'google.cloud.storage_transfer' ↓
cause The `google-cloud-storage-transfer` Python package is not installed or not accessible in the current Python environment.
fix
pip install google-cloud-storage-transfer
error Service lacked sufficient permissions, SERVICE_PERMISSION_FAILURE ↓
cause The Google-managed service account used by Storage Transfer Service (typically `project-PROJECT_NUMBER@storage-transfer-service.iam.gserviceaccount.com`) does not have the necessary IAM permissions to access the source or destination resources.
fix
Grant the Storage Transfer Service Agent role (roles/storagetransfer.serviceAgent) to the service account on the project, and the Storage Admin role (roles/storage.admin) on all relevant source and destination buckets.

error The destination bucket doesn't exist in Cloud Storage. ↓
cause The specified destination bucket in the transfer job configuration either does not exist, or the Storage Transfer Service agent does not have permission to view or write to it.
fix
Verify the spelling and existence of the destination bucket, and ensure the Storage Transfer Service agent has the Storage Admin role on the destination bucket.

Warnings
gotcha The Storage Transfer Service API must be explicitly enabled in your Google Cloud Project before use. Additionally, ensure proper authentication is configured, ideally using Application Default Credentials (ADC). ↓
fix Go to the Google Cloud Console, navigate to 'APIs & Services' > 'Library', search for 'Storage Transfer API', and enable it. For authentication, run `gcloud auth application-default login` locally, or ensure your deployment environment (e.g., GCE, Cloud Run) has a service account with necessary roles.
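A quick way to confirm Application Default Credentials are visible before running any transfer code is to ask `google-auth` (already installed as a dependency of this library) for the default credentials. The helper name `adc_project` below is illustrative, not part of any library:

```python
from typing import Optional

import google.auth
from google.auth.exceptions import DefaultCredentialsError


def adc_project() -> Optional[str]:
    """Returns the ADC default project ID, or None if no credentials exist."""
    try:
        _credentials, project = google.auth.default()
        return project
    except DefaultCredentialsError:
        return None


if __name__ == "__main__":
    project = adc_project()
    if project is None:
        print("No ADC found; run `gcloud auth application-default login`.")
    else:
        print(f"ADC configured for project: {project}")
```

Note that `google.auth.default()` can succeed yet return `None` for the project (e.g. credentials without a quota project), so treat the project ID as optional.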
gotcha The service has specific quotas and limits, including rate limits (e.g., 600 requests/min/project) and a 5 TiB maximum object size for transfers to Cloud Storage. Exceeding these limits can lead to failures or throttling. ↓
fix Review the official 'Quotas & Limits' documentation for Storage Transfer Service. Implement exponential backoff for retries and design transfer jobs to stay within known limits. For very large transfers, consider splitting the data into multiple jobs.
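One way to stay under the rate limits is to space retries exponentially. The sketch below shows the delay schedule such a policy produces (`backoff_delays` is an illustrative helper, not a library function); in real code, add random jitter, or pass a `google.api_core.retry.Retry` policy via the `retry` argument that generated client methods accept:

```python
def backoff_delays(initial: float, multiplier: float, maximum: float,
                   attempts: int) -> list:
    """Delay in seconds before each retry attempt, capped at `maximum`."""
    delays = []
    delay = initial
    for _ in range(attempts):
        delays.append(min(delay, maximum))
        delay *= multiplier
    return delays


# Doubling from 1s, capped at 60s.
print(backoff_delays(1.0, 2.0, 60.0, 8))
```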
gotcha Insufficient IAM permissions are a common cause of transfer failures. The service account or user initiating the transfer needs permissions to create and manage transfer jobs, and read/write access to the source and destination resources. ↓
fix Grant the necessary IAM roles to the principal performing the transfer. For example, 'Storage Transfer Admin' for managing jobs, and 'Storage Object Viewer'/'Storage Object Creator'/'Storage Object Admin' for source/sink buckets as appropriate.
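Granting the Google-managed transfer agent access to a bucket can also be done from Python. This is a sketch: it assumes the separate `google-cloud-storage` package for the IAM calls, and the function names are illustrative:

```python
def service_agent_member(project_number: str) -> str:
    """IAM member string for the Google-managed Storage Transfer agent."""
    return (
        "serviceAccount:"
        f"project-{project_number}@storage-transfer-service.iam.gserviceaccount.com"
    )


def grant_bucket_role(bucket_name: str, member: str,
                      role: str = "roles/storage.admin") -> None:
    """Adds `member` to `role` on the bucket's IAM policy."""
    from google.cloud import storage  # pip install google-cloud-storage

    client = storage.Client()
    bucket = client.bucket(bucket_name)
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({"role": role, "members": {member}})
    bucket.set_iam_policy(policy)


print(service_agent_member("123456789"))
```

The project *number* (not the project ID) appears in the service agent's email; it is shown on the Cloud Console dashboard.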
breaking If you are migrating from or encounter older code using the `google-api-services-storagetransfer` library, be aware that it's a legacy Google API Client Library. The `google-cloud-storage-transfer` is the recommended Cloud Client Library. ↓
fix Update your dependencies to `google-cloud-storage-transfer` and refactor client instantiation and method calls according to the Cloud Client Library's patterns. Consult the 'Migrating to the Storage Transfer Service Cloud Client Library' guide for specific changes.
gotcha When troubleshooting failed transfer jobs, it's crucial to enable and inspect logs to understand the root cause, especially for agent-based transfers (on-premises to cloud). ↓
fix Configure your transfer jobs to store logs (e.g., using `--log-dir` in gcloud CLI or equivalent API parameter). Review these logs in Cloud Logging for detailed error messages, such as permission issues, missing files, or network problems.
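A transfer job can carry a logging configuration so per-file actions land in Cloud Logging. The sketch below builds such a job as a plain dict; field names follow the v1 `TransferJob` resource, but treat the exact values as an assumption to verify against the current API reference:

```python
def transfer_job_with_logging(project_id: str, source_bucket: str,
                              sink_bucket: str) -> dict:
    """GCS-to-GCS transfer job dict with per-file action logging enabled."""
    return {
        "project_id": project_id,
        "transfer_spec": {
            "gcs_data_source": {"bucket_name": source_bucket},
            "gcs_data_sink": {"bucket_name": sink_bucket},
        },
        "status": "ENABLED",
        # Log FIND and COPY actions, recording both outcomes for each file.
        "logging_config": {
            "log_actions": ["FIND", "COPY"],
            "log_action_states": ["SUCCEEDED", "FAILED"],
        },
    }


job = transfer_job_with_logging("my-project", "src-bucket", "dst-bucket")
print(job["logging_config"]["log_actions"])
```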
gotcha Using Python 3.9 or older versions with `google-cloud-storage-transfer` (and its dependencies like `google-api-core`, `google-auth`) will trigger `FutureWarning`s indicating that these Python versions are unsupported and will receive limited or no further updates. This may lead to unexpected behavior or lack of critical fixes in the future. ↓
fix Upgrade your Python environment to version 3.10 or newer. After upgrading Python, update all Google Cloud Client Libraries and their core dependencies (e.g., `google-api-core`, `google-auth`) to their latest compatible versions using `pip install --upgrade google-cloud-storage-transfer google-api-core google-auth`.
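A minimal runtime guard can surface the interpreter-version problem early instead of relying on `FutureWarning`s; the helper name here is illustrative:

```python
import sys


def python_supported(version_info=sys.version_info) -> bool:
    """True when the interpreter meets the recommended 3.10+ floor."""
    return tuple(version_info[:2]) >= (3, 10)


if not python_supported():
    print("Warning: Python < 3.10 receives limited support from Google Cloud "
          "client libraries; consider upgrading.")
```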
gotcha Users must provide valid Google Cloud Project ID, source bucket name, and destination bucket name for transfer operations. These are fundamental identifiers required to configure and execute transfer jobs. ↓
fix Ensure that the Google Cloud Project ID, source bucket name, and destination bucket name are correctly provided to your application, typically via environment variables (e.g., `GCP_PROJECT_ID`, `GCP_SOURCE_BUCKET`, `GCP_SINK_BUCKET`), or passed explicitly as parameters to the Storage Transfer Service client methods.
Install compatibility (last tested: 2026-05-12)

| python | os / libc     | status | wheel install | import | disk  |
|--------|---------------|--------|---------------|--------|-------|
| 3.9    | alpine (musl) | wheel  | -             | 1.73s  | 68.6M |
| 3.9    | alpine (musl) | -      | -             | 1.62s  | 67.5M |
| 3.9    | slim (glibc)  | wheel  | 6.7s          | 1.43s  | 66M   |
| 3.9    | slim (glibc)  | -      | -             | 1.21s  | 65M   |
| 3.10   | alpine (musl) | wheel  | -             | 1.87s  | 68.6M |
| 3.10   | alpine (musl) | -      | -             | 1.72s  | 67.4M |
| 3.10   | slim (glibc)  | wheel  | 5.8s          | 1.11s  | 66M   |
| 3.10   | slim (glibc)  | -      | -             | 1.07s  | 65M   |
| 3.11   | alpine (musl) | wheel  | -             | 2.51s  | 73.1M |
| 3.11   | alpine (musl) | -      | -             | 2.63s  | 72.0M |
| 3.11   | slim (glibc)  | wheel  | 5.1s          | 1.66s  | 71M   |
| 3.11   | slim (glibc)  | -      | -             | 1.52s  | 70M   |
| 3.12   | alpine (musl) | wheel  | -             | 2.57s  | 64.6M |
| 3.12   | alpine (musl) | -      | -             | 2.69s  | 63.5M |
| 3.12   | slim (glibc)  | wheel  | 4.2s          | 1.95s  | 62M   |
| 3.12   | slim (glibc)  | -      | -             | 2.11s  | 61M   |
| 3.13   | alpine (musl) | wheel  | -             | 2.37s  | 64.3M |
| 3.13   | alpine (musl) | -      | -             | 2.54s  | 63.1M |
| 3.13   | slim (glibc)  | wheel  | 4.4s          | 1.88s  | 62M   |
| 3.13   | slim (glibc)  | -      | -             | 2.04s  | 61M   |
Imports
- StorageTransferServiceClient

wrong:   from google.cloud.storagetransfer_v1 import StorageTransferServiceClient
correct: from google.cloud.storage_transfer import StorageTransferServiceClient
Quickstart (last tested: 2026-04-24)
import os

from google.cloud.storage_transfer import StorageTransferServiceClient


def create_and_run_gcs_to_gcs_transfer_job(
    project_id: str,
    source_bucket_name: str,
    sink_bucket_name: str,
    job_description: str,
):
    """Creates and runs a one-time transfer job between two GCS buckets."""
    client = StorageTransferServiceClient()

    # Transfer job configuration: a GCS-to-GCS copy, enabled at creation.
    transfer_job = {
        "project_id": project_id,
        "description": job_description,
        "transfer_spec": {
            "gcs_data_source": {"bucket_name": source_bucket_name},
            "gcs_data_sink": {"bucket_name": sink_bucket_name},
        },
        "status": "ENABLED",
    }

    try:
        # create_transfer_job takes a request message; the job definition is
        # passed under the "transfer_job" key.
        created_job = client.create_transfer_job({"transfer_job": transfer_job})
        print(f"Created transfer job: {created_job.name}")

        # A job created without a schedule does not start on its own, so
        # trigger a single run explicitly.
        client.run_transfer_job(
            {"job_name": created_job.name, "project_id": project_id}
        )
        print(f"Transfer job '{created_job.name}' initiated.")
    except Exception as e:
        print(f"Error creating or running transfer job: {e}")


if __name__ == "__main__":
    # Set these environment variables or replace the placeholder values.
    # For local development, `gcloud auth application-default login`
    # typically provides credentials.
    PROJECT_ID = os.environ.get("GCP_PROJECT_ID", "your-gcp-project-id")
    SOURCE_BUCKET = os.environ.get("GCP_SOURCE_BUCKET", "your-source-gcs-bucket")
    SINK_BUCKET = os.environ.get("GCP_SINK_BUCKET", "your-sink-gcs-bucket")
    JOB_DESCRIPTION = "My Python quickstart GCS to GCS transfer"

    placeholders = {
        "your-gcp-project-id",
        "your-source-gcs-bucket",
        "your-sink-gcs-bucket",
    }
    if {PROJECT_ID, SOURCE_BUCKET, SINK_BUCKET} & placeholders:
        print("Please set GCP_PROJECT_ID, GCP_SOURCE_BUCKET, and GCP_SINK_BUCKET "
              "environment variables or replace placeholder values.")
    else:
        create_and_run_gcs_to_gcs_transfer_job(
            PROJECT_ID, SOURCE_BUCKET, SINK_BUCKET, JOB_DESCRIPTION
        )
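After a job has been created and run, its state can be polled. The sketch below assumes the v1 `TransferJob` resource's `status` and `latest_operation_name` fields; the helper names are illustrative:

```python
def job_request(project_id: str, job_name: str) -> dict:
    """Request body shared by get_transfer_job and run_transfer_job."""
    return {"job_name": job_name, "project_id": project_id}


def describe_job(project_id: str, job_name: str) -> dict:
    """Fetches a transfer job and summarizes its current state."""
    from google.cloud.storage_transfer import StorageTransferServiceClient

    client = StorageTransferServiceClient()
    job = client.get_transfer_job(job_request(project_id, job_name))
    return {
        "name": job.name,
        "status": job.status.name,
        "latest_operation": job.latest_operation_name,
    }


print(job_request("my-project", "transferJobs/123456"))
```

`latest_operation_name` identifies the most recent transfer operation for the job, which can then be inspected for per-run counters and error breakdowns.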