DataRobot Storage


A library providing reusable storage access (S3, GCS, Azure Blob, local filesystem) for DataRobot. Current version 2.2.0, compatible with Python >=3.9, <4.0. Released under the MIT license with quarterly updates.

pip install datarobot-storage

error ImportError: No module named datarobot.storage
cause Using the old dotted import path (pre-v2.0.0).
fix Use from datarobot_storage import Storage instead of from datarobot.storage import Storage.
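
A quick before/after for the rename; both paths are taken from the fix above, nothing else is assumed:

# Pre-v2.0.0 dotted path (now raises ImportError):
# from datarobot.storage import Storage

# v2.0.0+ underscored package:
from datarobot_storage import Storage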
error TypeError: Storage() takes at least 1 argument (0 given)
cause Calling Storage() without a backend argument (v2+).
fix Provide a backend object: Storage(backend).
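
A minimal sketch of the failing and corrected calls. S3Backend(bucket=...) mirrors the example at the end of this page; constructing it without credentials assumes an IAM role (see the gotcha below):

from datarobot_storage import Storage
from datarobot_storage.backends import S3Backend

backend = S3Backend(bucket='my-bucket')  # no keys: assumes an IAM role

# storage = Storage()     # raises TypeError in v2+
storage = Storage(backend)  # correct: pass a backend instance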
error datarobot_storage.exceptions.InvalidBackendError: The backend must be an instance of BaseBackend
cause Passing a dict or string instead of a backend instance.
fix Use a proper backend class: Storage(S3Backend(...)).
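
The same constructor with the inputs that trigger InvalidBackendError; the dict and string shapes are illustrative only:

from datarobot_storage import Storage
from datarobot_storage.backends import S3Backend

# Both raise InvalidBackendError (kept commented so the snippet runs):
# Storage({'bucket': 'my-bucket'})  # a dict is not a backend
# Storage('s3://my-bucket')         # a string is not a backend

storage = Storage(S3Backend(bucket='my-bucket'))  # a BaseBackend instance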
breaking v2.0.0 renamed the main package from datarobot.storage to datarobot_storage. All imports must use underscores.
fix Update imports: replace datarobot.storage with datarobot_storage.
breaking Storage constructor now requires a backend object instead of a dict configuration.
fix Instantiate a backend explicitly (e.g., S3Backend(...)) and pass to Storage.
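
Taken together, the two breaking changes amount to the following migration. The v1-style lines are an assumed shape shown for illustration; only the v2 form is documented here:

# v1.x (assumed shape; both lines now fail):
# from datarobot.storage import Storage
# storage = Storage({'backend': 's3', 'bucket': 'my-bucket'})

# v2.x equivalent:
from datarobot_storage import Storage
from datarobot_storage.backends import S3Backend

storage = Storage(S3Backend(bucket='my-bucket'))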
gotcha Environment variables for credentials are not automatically picked up; you must pass them explicitly unless using an IAM role or service account.
fix Pass aws_access_key_id and aws_secret_access_key explicitly, or configure boto3 default session.
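
For the boto3 route, a minimal sketch using boto3's setup_default_session helper; that the S3 backend honors this session is an assumption based on the fix above:

import os
import boto3

# Configure boto3's default session once at startup; credentials then come
# from this session instead of explicit S3Backend arguments.
boto3.setup_default_session(
    aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'],
)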

Initialize an S3-backed Storage instance and upload a file.

import os

from datarobot_storage import Storage
from datarobot_storage.backends import S3Backend

# Credentials must be passed explicitly (see the gotcha above); here they
# are read from the environment, defaulting to empty strings if unset.
backend = S3Backend(
    bucket='my-bucket',
    aws_access_key_id=os.environ.get('AWS_ACCESS_KEY_ID', ''),
    aws_secret_access_key=os.environ.get('AWS_SECRET_ACCESS_KEY', '')
)
storage = Storage(backend)

# Upload the local file to the bucket under the given remote key.
storage.upload('local/path.txt', 'remote/path.txt')
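
Reading the keys with os.environ.get and passing them in explicitly is deliberate: per the gotcha above, the library does not pick up credentials from the environment on its own. On a host with an IAM role or service account, the two key arguments can be omitted.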