{"id":7957,"library":"aws-logging-handlers","title":"AWS Logging Handlers","description":"AWS Logging Handlers is a Python library that provides multithreaded logging handlers for streaming log records to Amazon S3 and Kinesis Data Streams. It leverages asynchronous uploading with multiple worker threads and supports gzip compression for S3 logs. The library is actively maintained, with version 2.0.5 currently available, and receives regular minor updates for improvements and bug fixes.","status":"active","version":"2.0.5","language":"en","source_language":"en","source_url":"https://github.com/omrikiei/aws_logging_handlers/","tags":["AWS","logging","S3","Kinesis","handler","cloud"],"install":[{"cmd":"pip install aws-logging-handlers","lang":"bash","label":"Install stable release"}],"dependencies":[{"reason":"Required for interacting with AWS S3 and Kinesis services for logging. The library relies on boto3 for asynchronous multipart uploading and AWS API calls.","package":"boto3"}],"imports":[{"note":"Module names for handlers were refactored in version 2.x; ensure 'S3' is capitalized.","wrong":"from aws_logging_handlers.s3 import S3Handler","symbol":"S3Handler","correct":"from aws_logging_handlers.S3 import S3Handler"},{"note":"Module names for handlers were refactored in version 2.x; ensure 'Kinesis' is capitalized.","wrong":"from aws_logging_handlers.kinesis import KinesisHandler","symbol":"KinesisHandler","correct":"from aws_logging_handlers.Kinesis import KinesisHandler"}],"quickstart":{"code":"import logging\nimport os\nfrom aws_logging_handlers.S3 import S3Handler\nfrom aws_logging_handlers.Kinesis import KinesisHandler\n\n# Resolve the target bucket, stream, and region (replace the defaults with your own)\nbucket_name = os.environ.get('AWS_S3_LOG_BUCKET', 'your-s3-log-bucket')\nkinesis_stream_name = os.environ.get('AWS_KINESIS_LOG_STREAM', 'your-kinesis-log-stream')\naws_region = os.environ.get('AWS_REGION', 'us-east-1')\n\n# Configure the S3 handler; records are buffered and uploaded asynchronously\n# (size/time rotation is configurable; see the project README for the defaults)\ns3_handler = S3Handler(\n    'test_log_s3',  # object key prefix for uploaded log files\n    bucket_name,\n    workers=3  # number of upload worker threads\n)\n\n# Configure the Kinesis handler\nkinesis_handler = KinesisHandler(\n    kinesis_stream_name,\n    aws_region,\n    workers=1  # number of upload worker threads\n)\n\n# Set up a formatter\nformatter = logging.Formatter(\n    '[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s'\n)\ns3_handler.setFormatter(formatter)\nkinesis_handler.setFormatter(formatter)\n\n# Get a logger and add the handlers\nlogger = logging.getLogger('my_app')\nlogger.setLevel(logging.INFO)\nlogger.addHandler(s3_handler)\nlogger.addHandler(kinesis_handler)\n\n# Log some messages\nlogger.info(\"This is an info message to S3 and Kinesis.\")\nlogger.warning(\"A warning occurred in the application.\")\nlogger.error(\"An error message that should go to AWS services.\")\n\n# Ensure all buffered logs are flushed and workers shut down gracefully\nlogging.shutdown()","lang":"python","description":"This quickstart demonstrates how to configure both `S3Handler` and `KinesisHandler` to stream log records to their respective AWS services. It shows how to set a formatter, add the handlers to a logger, and, crucially, how to call `logging.shutdown()` so buffered records are flushed and worker threads terminate gracefully. Constructor arguments are shown positionally, following the project README; verify them against the documentation for your installed version. Ensure that AWS credentials (e.g., via environment variables or an IAM role) are available and that the specified S3 bucket and Kinesis stream already exist and are correctly configured."},"warnings":[{"fix":"Update import statements to use capitalized submodule names (e.g., `from aws_logging_handlers.S3 import S3Handler` instead of `from aws_logging_handlers.s3 import S3Handler`). 
Review the latest documentation for handler constructor arguments.","message":"Major architectural and directory tree refactoring in version 2.0.1 introduced breaking changes, particularly in import paths and potentially handler instantiation. Code written for 0.x versions will likely require updates.","severity":"breaking","affected_versions":"0.x to 2.x"},{"fix":"Always call `logging.shutdown()` at the end of your application's lifecycle to ensure all buffered logs are flushed to S3 or Kinesis and worker threads terminate gracefully.","message":"Failure to call `logging.shutdown()` can result in lost log messages, especially in short-lived applications (like AWS Lambda functions or scripts). The library uses worker threads for asynchronous uploads, and these threads need a signal to flush remaining buffers before the application exits.","severity":"gotcha","affected_versions":"All"},{"fix":"Ensure the target S3 bucket and Kinesis stream are pre-provisioned in AWS. Verify that the IAM user/role associated with your application has `s3:PutObject`, `kinesis:PutRecord`, and `kinesis:PutRecords` permissions (among others as needed for specific configurations) for the target resources.","message":"The S3 bucket and Kinesis stream specified in the handler configuration must already exist and have appropriate IAM permissions for the AWS credentials being used. The library does not create these resources.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Update your import statements: change `from aws_logging_handlers.s3 import S3Handler` to `from aws_logging_handlers.S3 import S3Handler`. Apply similar changes for `KinesisHandler`.","cause":"This error typically occurs when upgrading from `aws-logging-handlers` version 0.x to 2.x. 
The module structure was refactored, changing `s3` (lowercase) to `S3` (capitalized).","error":"ModuleNotFoundError: No module named 'aws_logging_handlers.s3'"},{"fix":"Verify your AWS credentials and IAM permissions. Ensure environment variables like `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_REGION` are correctly set and not expired. If using an IAM role, confirm it has policies allowing `s3:PutObject` for the target S3 bucket and `kinesis:PutRecord`/`kinesis:PutRecords` for the target Kinesis stream.","cause":"The AWS credentials configured for the Boto3 session (either via environment variables, shared credentials file, or IAM role) are invalid, expired, or lack the necessary permissions to perform the logging operation.","error":"botocore.exceptions.ClientError: An error occurred (ExpiredToken) when calling the PutObject operation: The provided token has expired."},{"fix":"1. Ensure `logging.shutdown()` is called at the application's exit. 2. Increase the `workers` parameter for `S3Handler` or `KinesisHandler` if dealing with high log volumes. 3. Check network connectivity to AWS. 4. Double-check that `bucket_name`, `stream_name`, and `session_kwargs={'region_name': ...}` are accurate and match your AWS setup.","cause":"This can happen due to several reasons: `logging.shutdown()` not being called (leading to un-flushed buffers), insufficient worker threads for high log volume, network connectivity issues to AWS, or incorrect configuration of bucket/stream names or regions.","error":"Logs are not appearing in the configured S3 bucket or Kinesis stream, or are significantly delayed."}]}