robotframework-aws

Robot Framework library for testing AWS services. Version 1.0.0 provides keywords for S3 (multipart upload, bucket policy) and SQS (send/receive/delete messages), with library-level configuration via an endpoint URL. Last updated 2022.

pip install robotframework-aws
error ModuleNotFoundError: No module named 'robotframework_aws'
cause Outdated import paths from before version 0.2.0.
fix
Use 'AWS.services.S3.AWSS3Client' instead of 'robotframework_aws.AWSS3Client'.
error AttributeError: 'NoneType' object has no attribute 'upload_file'
cause Missing AWS credentials (boto3 session not initialized).
fix
Set environment variables AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION.
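When credentials are absent the boto3 session silently fails to initialize, which surfaces later as the NoneType error above. A minimal pre-flight check (pure stdlib; the helper names and the exact variable list are illustrative, not part of the library) can fail fast with a clearer message:

```python
import os

# Variables boto3 reads for static credentials; adjust if you use profiles instead.
REQUIRED_AWS_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_DEFAULT_REGION")

def missing_aws_vars(environ=os.environ):
    """Return the names of required AWS variables that are unset or empty."""
    return [name for name in REQUIRED_AWS_VARS if not environ.get(name)]

def assert_aws_env(environ=os.environ):
    """Raise early with a readable message instead of a later NoneType error."""
    missing = missing_aws_vars(environ)
    if missing:
        raise EnvironmentError(f"Missing AWS credentials: {', '.join(missing)}")
```

Calling assert_aws_env() from a suite setup aborts a misconfigured run before any keyword touches S3 or SQS.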
breaking Version 0.2.0 refactored module paths: classes moved from 'robotframework_aws' package to 'AWS.services.*'.
fix Update imports to use 'AWS.services.S3.AWSS3Client' etc.
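Suites written against the pre-0.2.0 paths fail with the ModuleNotFoundError shown above. A guarded import makes the migration explicit (a sketch: importlib is stdlib, the error message is ours, and 'AWS.services.S3' is the post-0.2.0 path from this library):

```python
import importlib

def load_client_class(module_path="AWS.services.S3", class_name="AWSS3Client"):
    """Import a client class from the post-0.2.0 module layout.

    Re-raises ModuleNotFoundError with a migration hint so stale
    'robotframework_aws' imports are easy to diagnose."""
    try:
        module = importlib.import_module(module_path)
    except ModuleNotFoundError as exc:
        raise ModuleNotFoundError(
            f"Cannot import '{module_path}'. Since 0.2.0 classes live under "
            "'AWS.services.*'; the old 'robotframework_aws' package is gone."
        ) from exc
    return getattr(module, class_name)
```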
gotcha Many keywords require explicit 'bucket_name' and 'key' parameters; the library does not assume a default region or credentials – you must configure boto3 separately (environment variables or an AWS profile).
fix Set AWS credentials via environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION) before test run.
gotcha The library uses 'endpoint_url' parameter in library import. If not provided, it defaults to None, which may cause connection errors when testing against localstack.
fix Always set endpoint_url when using localstack or custom endpoints.
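One way to make that default explicit is to resolve the endpoint before the library import and pass the result through a Robot variable. This is a sketch: the resolver function and the AWS_ENDPOINT_URL environment-variable lookup are our conventions, not part of the library.

```python
import os

def resolve_endpoint_url(explicit=None, environ=os.environ,
                         localstack_default="http://localhost:4566"):
    """Choose the endpoint_url passed to the library import.

    Precedence: explicit value, then the AWS_ENDPOINT_URL environment
    variable, then the standard localstack edge port. Returning a real
    URL instead of None avoids boto3 silently targeting live AWS."""
    return explicit or environ.get("AWS_ENDPOINT_URL") or localstack_default
```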

Basic usage of S3 upload and SQS send with localstack endpoint.

*** Settings ***
Library    AWS.services.S3.AWSS3Client    endpoint_url=${ENDPOINT_URL}
Library    AWS.services.SQS.AWSSQSClient    endpoint_url=${ENDPOINT_URL}

*** Variables ***
${ENDPOINT_URL}    http://localhost:4566

*** Test Cases ***
Upload File
    Put Object    bucket_name=my-bucket    file_path=/tmp/test.txt    key=test.txt

Send Message
    Send Message    queue_url=http://sqs.us-east-1.localhost:4566/000000000000/my-queue    message_body=Hello