s3pypi - S3-backed Python Package Repository
s3pypi is a Python command-line interface and library for creating and managing a private Python Package Index hosted in an Amazon S3 bucket. It supports uploading packages (wheels and source distributions), generating `index.html` files, and integrating with `pip`. The current version is `2.0.1`; releases are typically tied to feature enhancements or bug fixes.
Common errors
- `botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied`
  - cause: The AWS credentials used by s3pypi lack the S3 permissions needed to upload objects to the specified bucket.
  - fix: Verify that your IAM user/role has `s3:PutObject`, `s3:GetObject`, and `s3:ListBucket` permissions on the target S3 bucket.
- `botocore.exceptions.ClientError: An error occurred (NoSuchBucket) when calling the GetBucketLocation operation: The specified bucket does not exist`
  - cause: The S3 bucket name provided to s3pypi is incorrect, or the bucket does not exist in the specified region.
  - fix: Double-check the bucket name in your configuration (CLI argument or `S3PyPI` constructor) and ensure the bucket exists in the AWS region you're targeting.
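The two failure modes above can be told apart by inspecting the botocore error code (available on a caught `ClientError` as `err.response["Error"]["Code"]`). A minimal sketch; the `diagnose_s3_error` helper is hypothetical and not part of s3pypi:

```python
# Hypothetical helper (not part of s3pypi): map common botocore error
# codes to the remediation steps described above.
def diagnose_s3_error(error_code: str) -> str:
    hints = {
        "AccessDenied": (
            "Grant s3:PutObject, s3:GetObject, and s3:ListBucket on the "
            "target bucket to the IAM user/role that s3pypi uses."
        ),
        "NoSuchBucket": (
            "Check the bucket name and region passed to s3pypi; "
            "the bucket must already exist."
        ),
    }
    return hints.get(error_code, "Unknown error; check AWS credentials and configuration.")

print(diagnose_s3_error("AccessDenied"))
```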
Warnings
- breaking The CLI subcommand `s3pypi upload` was renamed to `s3pypi deploy` in version 2.0.0.
- breaking The programmatic API method `S3PyPI.upload()` was removed and replaced by `S3PyPI.deploy()` in version 2.0.0.
- breaking The CLI argument `--region` was renamed to `--bucket-region` in version 2.0.0 to clarify its scope to the S3 bucket's region.
- gotcha Deploying packages requires appropriate AWS S3 permissions (`s3:PutObject`, `s3:GetObject`, `s3:ListBucket`) on the target bucket for the IAM user or role `s3pypi` is using.
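The permissions named in the gotcha above could be granted with an IAM policy along these lines. A sketch only, not an audited policy; the bucket name is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::your-s3pypi-bucket-name/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::your-s3pypi-bucket-name"
    }
  ]
}
```

Note that the object-level actions (`s3:PutObject`, `s3:GetObject`) apply to the `/*` resource, while `s3:ListBucket` applies to the bucket ARN itself.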
Install
```shell
pip install s3pypi
```
Imports
- S3PyPI
```python
from s3pypi import S3PyPI
# or:
from s3pypi.main import S3PyPI
```
Quickstart
```python
import os
import shutil
from pathlib import Path

from s3pypi.main import S3PyPI

# 1. Create a temporary directory and a dummy package file
dist_dir = Path("dist_temp_s3pypi")
dist_dir.mkdir(exist_ok=True)
dummy_package_name = "my_dummy_package-1.0.0-py3-none-any.whl"
dummy_package_path = dist_dir / dummy_package_name
dummy_package_path.write_text("This is a dummy package file content.")
print(f"Created dummy package file: {dummy_package_path}")

# 2. Configure S3PyPI using environment variables for AWS credentials.
# Ensure AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION are set,
# and that S3_PYPI_BUCKET points to your target S3 bucket.
aws_region = os.environ.get("AWS_REGION", "us-east-1")
s3_bucket_name = os.environ.get("S3_PYPI_BUCKET", "your-s3pypi-bucket-name")
print(f"\nDeploying package to s3://{s3_bucket_name} in region {aws_region}...")

try:
    s3 = S3PyPI(
        bucket=s3_bucket_name,
        region=aws_region,
        files=[str(dummy_package_path)],  # paths to the artifacts to publish
    )
    s3.deploy()  # .deploy() replaced .upload(), removed in v2.0.0
    print(f"Successfully deployed {dummy_package_name} to s3://{s3_bucket_name}.")
except Exception as e:
    print(f"Error deploying package: {e}")
    print("Ensure AWS credentials are configured (e.g. environment variables, ~/.aws/credentials)")
    print(f"and that bucket '{s3_bucket_name}' exists with s3:PutObject, s3:GetObject, s3:ListBucket permissions.")
finally:
    # 3. Clean up temporary files
    shutil.rmtree(dist_dir)
    print("Cleaned up temporary files.")
```
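Once packages are deployed, `pip` can be pointed at the bucket's index. Assuming the bucket is served over HTTPS (e.g. via S3 static website hosting or CloudFront; the URL below is a placeholder), a `pip.conf` fragment might look like:

```ini
[global]
extra-index-url = https://pypi.example.com/
```

Or per invocation: `pip install my-dummy-package --extra-index-url https://pypi.example.com/`. Using `extra-index-url` keeps the public PyPI available as a fallback, while `index-url` would replace it entirely.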