gcloud-aio-storage
gcloud-aio-storage is an asyncio-compatible Python client library for Google Cloud Storage. It provides full CRUD operations for buckets and blobs, including streaming support for large files, parallel upload capabilities, and built-in session management. Designed for high-performance cloud storage operations with modern async/await patterns, it is currently at version 9.6.4 and follows an active release cadence.
Warnings
- gotcha When downloading large files and using `response.text()`, `aiohttp` (which `gcloud-aio-storage` uses) may default to `chardet` for character encoding detection, which can be very slow. Explicitly setting the `Content-Type` with a `charset` (e.g., `text/plain; charset=utf-8`) when uploading can significantly improve performance for text-based files.
- breaking Support for Python 3.9 was dropped in the `gcloud-aio-auth` component (version 5.4.4) of the `gcloud-aio` ecosystem. Because `gcloud-aio-storage` depends on `gcloud-aio-auth`, this effectively constrains the whole library: users on Python 3.9 may encounter issues when upgrading any `gcloud-aio` component.
- gotcha Recent releases include fixes related to `auto_decompress` handling in `download_stream` and `ClientSession` settings. Ensure you are not inadvertently overwriting or misconfiguring `auto_decompress` if you have custom `aiohttp.ClientSession` settings.
- gotcha When passing custom metadata during an `upload()` operation, ensure it's nested under a 'metadata' key in the dictionary. Incorrectly structured metadata (e.g., top-level key-value pairs) will not be stored as custom metadata.
- gotcha Users have reported issues uploading files larger than 2GB, particularly when running on 32-bit Python installations. This is a common limitation for 32-bit systems regarding file sizes and memory addressing.
Install
- pip install gcloud-aio-storage
Imports
- Storage
from gcloud.aio.storage import Storage
- Bucket
from gcloud.aio.storage import Bucket
- Blob
from gcloud.aio.storage import Blob
- StreamResponse
from gcloud.aio.storage import StreamResponse
Quickstart
import asyncio

from gcloud.aio.storage import Storage


async def main():
    # GOOGLE_APPLICATION_CREDENTIALS (or a service_file= argument to
    # Storage) should point at service-account credentials.
    bucket_name = 'your-gcs-bucket-name'
    file_name = 'hello_gcloud_aio.txt'
    content = b'Hello, gcloud-aio-storage!'

    async with Storage() as storage:
        print(f"Uploading '{file_name}' to bucket '{bucket_name}'...")
        await storage.upload(bucket_name, file_name, content,
                             content_type='text/plain')
        print(f"'{file_name}' uploaded successfully.")

        print(f"Downloading '{file_name}' from bucket '{bucket_name}'...")
        downloaded_content = await storage.download(bucket_name, file_name)
        print(f"Downloaded content: {downloaded_content.decode()}")

        print(f"Deleting '{file_name}' from bucket '{bucket_name}'...")
        await storage.delete(bucket_name, file_name)
        print(f"'{file_name}' deleted successfully.")


if __name__ == '__main__':
    asyncio.run(main())