GCSFS

version 2026.3.0 · verified Tue May 12 · auth: no · python install: verified · quickstart: verified

GCSFS provides a Pythonic file-system interface to Google Cloud Storage, letting you work with GCS as if it were a local file system. The current version is 2026.3.0, and releases follow a regular cadence, with updates roughly every few months.

pip install gcsfs
error ModuleNotFoundError: No module named 'gcsfs'
cause The gcsfs library is not installed in your Python environment.
fix
Install the library using pip: pip install gcsfs or with conda: conda install -c conda-forge gcsfs.
error Anonymous caller does not have storage.objects.list access to the Google Cloud Storage bucket
cause The Python environment or the executing service account lacks the necessary Google Cloud Storage permissions to list or access objects in the specified bucket.
fix
Ensure that your Google Cloud authentication is correctly set up (e.g., via gcloud auth application-default login, GOOGLE_APPLICATION_CREDENTIALS environment variable, or by passing token to gcsfs.GCSFileSystem). Verify that the authenticated identity has the 'Storage Object Viewer' or 'Storage Admin' IAM role for the GCS bucket.
error OSError: Project was not passed and could not be determined from the environment
cause The Google Cloud project ID was not explicitly provided to gcsfs.GCSFileSystem and could not be automatically inferred from environment variables or gcloud configuration.
fix
Initialize gcsfs.GCSFileSystem with the project argument (e.g., gcsfs.GCSFileSystem(project='your-project-id')), or ensure the GOOGLE_CLOUD_PROJECT environment variable is set, or configure a default project using gcloud config set project YOUR_PROJECT_ID.
error FileNotFoundError: gs://your-bucket/path/to/file
cause The specified file or bucket path does not exist in Google Cloud Storage, or the authenticated identity does not have permission to view it, making it appear as nonexistent.
fix
Double-check the bucket name and object path for any typos. Verify that the authenticated user or service account has storage.objects.get and storage.objects.list permissions for the target bucket and its contents.
breaking Default filesystem implementation change: gcsfs now uses ExtendedFileSystem as the default entry point for all bucket types, to support specialized storage buckets such as Hierarchical Namespace (HNS) buckets out of the box. This may affect workflows that relied on the previous default implementation.
fix If you experience unexpected behavior due to this change, you can revert to the previous implementation by setting the environment variable GCSFS_EXPERIMENTAL_ZB_HNS_SUPPORT=false before importing gcsfs.
gotcha FUSE functionality is experimental and may change. It is not recommended for production use, and data loss could occur if not used cautiously.
fix Use FUSE functionality with caution and ensure data is backed up before performing operations.
error 401 Unauthorized / 'Permission denied' when listing or reading objects in a bucket
cause The caller (e.g., your application) is anonymous or lacks the IAM roles required for the requested operation (e.g., `storage.objectViewer` or `storage.legacyBucketReader`) on the specified bucket.
fix
Ensure your environment is correctly authenticated to Google Cloud (e.g., via `gcloud auth application-default login`, service account keys, or running on a GCE instance with appropriate scopes). Verify that the authenticated identity has the necessary IAM permissions (e.g., `storage.objects.list`) for the target bucket. For anonymously accessible public buckets, ensure the bucket's IAM policy or ACLs explicitly allow public access for the operation.
error gcsfs operation failed with 401 Unauthorized due to missing or invalid Google Cloud credentials
cause Application Default Credentials (ADC) were not found or were incorrectly configured, so the request was made as an anonymous caller lacking the necessary permissions (e.g., storage.objects.list).
fix
Configure ADC in the execution environment: set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to a service account key file, authenticate via `gcloud auth application-default login`, or ensure the compute environment (e.g., GCE, GKE, Cloud Run) has an attached service account with an appropriate Storage Object Viewer or Storage Admin role.
conda install -c conda-forge gcsfs
python  os / libc      status  wheel install  import  disk
3.9     alpine (musl)  -       -              1.25s   63.6M
3.9     slim (glibc)   -       -              1.18s   66M
3.10    alpine (musl)  -       -              2.62s   85.0M
3.10    slim (glibc)   -       -              1.44s   85M
3.11    alpine (musl)  -       -              3.39s   91.5M
3.11    slim (glibc)   -       -              2.26s   91M
3.12    alpine (musl)  -       -              3.29s   82.7M
3.12    slim (glibc)   -       -              2.52s   82M
3.13    alpine (musl)  -       -              3.49s   82.1M
3.13    slim (glibc)   -       -              2.55s   82M

This script demonstrates how to initialize a GCSFileSystem instance, list files in a GCS bucket, and read a file from GCS.

import gcsfs

# Initialize the GCSFileSystem
fs = gcsfs.GCSFileSystem(project='your-project-id')

# List files in a bucket
print(fs.ls('your-bucket-name'))

# Read a file
with fs.open('your-bucket-name/your-file.txt', 'rb') as f:
    print(f.read())