sqs-extended-client
Python port of the Amazon SQS Extended Client Library. Handles sending and receiving large SQS messages (>256KB) by storing the message payload in S3 and sending a reference pointer via SQS. Current version: 0.0.11. Low activity, infrequent releases.
pip install sqs-extended-client
Common errors
error ModuleNotFoundError: No module named 'sqs_extended_client'
cause The library is not installed, or it was installed into a different Python environment than the one running your code.
fix Run 'pip install sqs-extended-client' to install it from PyPI into the active environment.
error botocore.exceptions.NoCredentialsError: Unable to locate credentials
cause boto3 cannot find AWS credentials for the session used to create the SQS and S3 clients.
fix Provide credentials via environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN if applicable) or via ~/.aws/credentials.
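A minimal sketch of wiring credentials into the boto3 session used by the extended client; the region and profile name are placeholders, and boto3 resolves environment variables and ~/.aws/credentials automatically if you pass nothing explicit:
import boto3

# Option 1: rely on the default credential chain (environment variables
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_SESSION_TOKEN, then
# ~/.aws/credentials, then instance or role credentials).
session = boto3.Session(region_name='us-east-1')

# Option 2: pin a named profile from ~/.aws/credentials instead.
# session = boto3.Session(profile_name='my-profile', region_name='us-east-1')

sqs_client = session.client('sqs')
s3_client = session.client('s3')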
error sqs_extended_client.exceptions.MessageTooLargeError: Message size ... exceeds max allowed size 262144
cause With always_through_s3=False, a message larger than 256KB was sent directly to SQS instead of through S3. This can happen if the size check misses the true payload size (message attributes also count toward the limit) or if the body grows after the check.
fix Set always_through_s3=True, or verify that the encoded size of the body plus attributes stays under 262144 bytes before sending.
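A hedged sketch of a pre-send size check; must_use_s3 is a hypothetical helper, not part of the library, and SQS measures the UTF-8 encoded body (plus any message attributes) against the 262144-byte limit:
SQS_MAX_BYTES = 262144  # 256KB SQS limit

def must_use_s3(message_body):
    # Hypothetical helper: SQS counts bytes, not characters, so measure the
    # UTF-8 encoded size of the body. Message attributes also count toward
    # the limit and would need to be added here if you use them.
    return len(message_body.encode('utf-8')) > SQS_MAX_BYTES

body = 'a' * 300000
if must_use_s3(body):
    # Either rely on always_through_s3=True, or make sure the extended
    # client is configured with an S3 bucket before sending this payload.
    print('payload exceeds the SQS limit; it must go through S3')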
Warnings
breaking The library is a port of the Java Extended Client Library, but not all features are implemented; notably, batch operations and message-attribute handling may differ.
fix Check the changelog and source code for missing features before migrating from the Java client.
gotcha With always_through_s3=False, the client stores the payload in S3 only when the message exceeds the 256KB SQS limit; smaller messages go directly through SQS. With always_through_s3=True, every payload is stored in S3 regardless of size.
fix Set always_through_s3=True if you want all messages stored in S3; the default is False.
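If you want every payload in S3, flip the flag when constructing the client; this reuses the constructor from the Quickstart below, with a placeholder bucket name:
extended = SQSExtendedClient(
    sqs_client=sqs_client,
    s3_client=s3_client,
    s3_bucket_name='my-large-messages-bucket',  # placeholder bucket
    always_through_s3=True  # every message body is offloaded to S3
)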
gotcha The client does not configure S3 bucket policy or lifecycle for you. The producing client writes payload objects to S3 and the consuming client reads them back, so both need appropriate S3 permissions on the bucket, and old payload objects are not cleaned up automatically.
fix Grant producers write access and consumers read access to the bucket, and consider S3 lifecycle rules that expire objects after a retention period.
deprecated The library has not seen updates since 2020; rely on it at your own risk. Consider AWS Lambda functions or Step Functions for large-message processing as an alternative.
fix Evaluate whether the library meets your long-term requirements; check for newer alternatives or contribute to its maintenance.
Imports
- SQSExtendedClient
  wrong: from sqs_extended_client.core import SQSExtendedClient
  correct: from sqs_extended_client import SQSExtendedClient
- S3BackedQueue
  from sqs_extended_client import S3BackedQueue
Quickstart
import boto3
from sqs_extended_client import SQSExtendedClient
session = boto3.Session()
sqs_client = session.client('sqs')
s3_client = session.client('s3')
# Initialize extended client
# Use environment variables for credentials in production
extended = SQSExtendedClient(
sqs_client=sqs_client,
s3_client=s3_client,
s3_bucket_name='my-large-messages-bucket', # Replace with your bucket
always_through_s3=False # Only use S3 when size > 256KB
)
# Send a message
queue_url = 'https://sqs.us-east-1.amazonaws.com/123456789012/MyQueue'
message_body = 'a' * 300000 # ~300KB body, above the 256KB SQS limit
response = extended.send_message(
QueueUrl=queue_url,
MessageBody=message_body
)
print('Message sent with SQS message ID:', response['MessageId'])
# Receive and process messages
receive_response = extended.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=5
)
# receive_message returns a dict; the messages live under the 'Messages' key,
# which is absent when the queue is empty.
for msg in receive_response.get('Messages', []):
    body = msg['Body']
    print('Received message body length:', len(body))
    extended.delete_message(QueueUrl=queue_url, ReceiptHandle=msg['ReceiptHandle'])