SQLAlchemy Aurora Data API

Provides an SQLAlchemy dialect for the AWS Aurora Serverless Data API, enabling SQL access to Aurora clusters without managing persistent database connections. Current version 0.5.0; releases are infrequent.

pip install sqlalchemy-aurora-data-api
error sqlalchemy.exc.NoSuchModuleError: Can't load plugin: sqlalchemy.dialects:aurora_data_api
cause The dialect package is not installed, or the engine URL uses a scheme SQLAlchemy cannot resolve to a registered dialect.
fix
Ensure sqlalchemy-aurora-data-api is installed (pip install sqlalchemy-aurora-data-api) and that the engine URL uses a scheme the package registers, e.g. 'postgresql+auroradataapi'. Also verify SQLAlchemy version compatibility.
error botocore.exceptions.NoCredentialsError: Unable to locate credentials
cause AWS credentials not configured for boto3.
fix
Set environment variables AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION, or use IAM role when running on AWS.
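A minimal sketch of checking that the environment is configured before creating an engine; the placeholder values and the `have_aws_env` helper are invented for illustration and are not part of the library:

```python
import os

# Placeholder values for illustration only -- real credentials come from
# your environment, a shared credentials file, or an IAM role.
os.environ["AWS_ACCESS_KEY_ID"] = "EXAMPLE_KEY_ID"
os.environ["AWS_SECRET_ACCESS_KEY"] = "EXAMPLE_SECRET"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

def have_aws_env() -> bool:
    """Return True if the standard credential variables boto3 reads are set."""
    required = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_DEFAULT_REGION")
    return all(os.environ.get(name) for name in required)

print(have_aws_env())  # → True
```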
error sqlalchemy.exc.OperationalError: (aurora_data_api.exceptions.DatabaseError) ... must appear in the GROUP BY clause or be used in an aggregate function
cause PostgreSQL (and MySQL with ONLY_FULL_GROUP_BY enabled) requires every non-aggregated SELECT column to appear in the GROUP BY clause; the query violates this rule.
fix
Adjust the SQL query to include all non-aggregated columns in GROUP BY or use aggregate functions.
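The GROUP BY rule can be illustrated with an in-memory SQLite table standing in for Aurora; the `orders` table and its columns are hypothetical, invented for this sketch:

```python
import sqlite3

# In-memory SQLite stands in for Aurora; "orders" is a hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# Valid query: the only non-aggregated SELECT column (customer_id) appears
# in GROUP BY; the other output column is wrapped in an aggregate (SUM).
rows = conn.execute(
    "SELECT customer_id, SUM(amount) FROM orders "
    "GROUP BY customer_id ORDER BY customer_id"
).fetchall()
print(rows)  # → [(1, 15.0), (2, 7.5)]
```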
breaking The dialect registers the URL schemes 'postgresql+auroradataapi' and 'mysql+auroradataapi' (not 'awsaurora' or 'data-api'). Using a wrong scheme raises NoSuchModuleError.
fix Use create_engine('postgresql+auroradataapi://...') or create_engine('mysql+auroradataapi://...')
gotcha The Data API has a 1MB payload limit for results. Large queries may fail with a 'Payload too large' error. Use pagination or limit result sets.
fix Add LIMIT with pagination (OFFSET or keyset) so each response stays under the limit; the Data API does not provide server-side cursors.
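One way to keep each response under the limit is keyset pagination, sketched here against an in-memory SQLite table standing in for Aurora; the `events` table and the page size are invented for illustration:

```python
import sqlite3

# SQLite stands in for Aurora; "events" is a hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events (payload) VALUES (?)", [("x",)] * 10)

def fetch_pages(page_size=4):
    """Yield result pages, resuming from the last seen primary key."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, payload FROM events WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, page_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # resume after the last row of this page

pages = list(fetch_pages())
print([len(p) for p in pages])  # → [4, 4, 2]
```

Keyset pagination (WHERE id > ? ... LIMIT ?) stays fast on large tables, where a growing OFFSET would force the server to skip ever more rows.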
deprecated This library is no longer actively maintained by the original authors (Chan Zuckerberg). It may lack support for newer SQLAlchemy versions.
fix Consider calling the RDS Data API directly with boto3 (client('rds-data')) or using the underlying 'aurora-data-api' driver if you need updates.

Basic usage: create an engine with the aurora_data_api dialect and query the database.

from sqlalchemy import create_engine, text

# Requires AWS credentials configured via environment variables or an IAM role.
# Replace the ARNs below with your cluster ARN and Secrets Manager secret ARN.
cluster_arn = "arn:aws:rds:us-east-1:123456789012:cluster:my-cluster"
secret_arn = "arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret"

engine = create_engine(
    "postgresql+auroradataapi://:@/my_database",
    connect_args=dict(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn),
)
with engine.connect() as conn:
    result = conn.execute(text("SELECT 1"))  # text() is required on SQLAlchemy 1.4+
    print(result.fetchone())