S3Fs

2026.2.0 · active · verified Fri Mar 27

S3Fs is a Pythonic filesystem interface to Amazon S3, built on top of aiobotocore and fsspec. The top-level class S3FileSystem exposes familiar file-system operations (ls, cp, mv, du, glob, put, get) and a file open() API that emulates Python's standard file protocol, making it a drop-in replacement anywhere libraries like pandas, dask, and gzip accept file-like objects. It also supports S3-compatible stores (MinIO, Ceph, R2) via the endpoint_url parameter. Versions follow calendar versioning (YYYY.MM.PATCH); the current release is 2026.2.0, released February 2026, on a roughly monthly cadence.

Warnings

s3fs pins aiobotocore, which in turn pins botocore to a narrow version range; installing s3fs alongside boto3 or awscli can produce dependency conflicts unless compatible versions are selected.

Install

pip install s3fs

Imports

import s3fs

Quickstart

Connect with explicit credentials from environment variables, list a bucket, read a file, and write a file.

import os
import s3fs

# Credentials via env vars; if unset, key/secret are None and s3fs falls back
# to the standard boto chain (~/.aws/credentials, IAM roles, etc.)
fs = s3fs.S3FileSystem(
    key=os.environ.get('AWS_ACCESS_KEY_ID'),
    secret=os.environ.get('AWS_SECRET_ACCESS_KEY'),
    # token=os.environ.get('AWS_SESSION_TOKEN'),   # uncomment for STS/assumed-role
    # endpoint_url='https://s3.example.com',       # uncomment for MinIO / S3-compatible
)

# List bucket contents
bucket = os.environ.get('S3_BUCKET', 'my-bucket')
print(fs.ls(bucket))

# Read a file
with fs.open(f'{bucket}/hello.txt', 'rb') as f:
    print(f.read())

# Write a file (data is buffered and uploaded on close, using multipart for
# large writes; the context manager handles the close)
with fs.open(f'{bucket}/output.txt', 'wb') as f:
    f.write(b'hello s3fs')

# Works transparently with pandas via storage_options
import pandas as pd
df = pd.read_csv(
    f's3://{bucket}/data.csv',
    storage_options={
        'key': os.environ.get('AWS_ACCESS_KEY_ID'),
        'secret': os.environ.get('AWS_SECRET_ACCESS_KEY'),
    },
)
print(df.head())
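Beyond open() and ls(), the other operations named in the intro (glob, du, cp, rm) come from the generic fsspec interface that S3FileSystem implements. The sketch below exercises them against fsspec's in-memory backend (fsspec is installed as a dependency of s3fs) so it runs without credentials; the /bucket paths are stand-ins for real s3:// keys:

```python
import fsspec

# The fsspec interface that s3fs implements, exercised on the in-memory
# backend so it runs offline; swap in s3fs.S3FileSystem for real S3.
fs = fsspec.filesystem('memory')

with fs.open('/bucket/hello.txt', 'wb') as f:
    f.write(b'hello')

print(fs.ls('/bucket'))                           # list entries under the prefix
print(fs.glob('/bucket/*.txt'))                   # wildcard matching
print(fs.du('/bucket'))                           # total bytes under the prefix
fs.copy('/bucket/hello.txt', '/bucket/copy.txt')  # copy within the store
print(fs.exists('/bucket/copy.txt'))
fs.rm('/bucket/copy.txt')                         # delete
print(fs.exists('/bucket/copy.txt'))
```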
