LLM Sandbox
LLM Sandbox is a lightweight, portable Python library for running large language model (LLM) generated code in a safe, isolated environment. It supports multiple container backends (Docker, Kubernetes, Podman) and multi-language execution (Python, JavaScript, Java, C++, Go, R). The project ships frequent minor releases covering features, fixes, and security enhancements, and now includes a Model Context Protocol (MCP) server for direct AI assistant integration.
Warnings
- breaking Older versions (prior to 0.3.35) contained command injection and path traversal vulnerabilities. Update immediately to version 0.3.35 or newer to secure your environment.
- gotcha When using features like pooled sessions or `skip_environment_setup`, ensure consistent Python environments. Version 0.3.32 fixed an issue where pooled sessions didn't correctly use the venv Python, and 0.3.29 addressed `skip_environment_setup` using a non-existent venv.
- gotcha To use Docker, Kubernetes, or Podman as a backend, you must install the corresponding optional dependencies (e.g., `pip install 'llm-sandbox[docker]'`) in addition to the base `llm-sandbox` package. Failure to do so will result in runtime errors when attempting to use specific backends.
- gotcha By default, sandbox containers are destroyed when the `SandboxSession` closes, so any in-container state (installed libraries, created files) is lost. Setting `keep_template=True` at session initialization preserves the built container image for reuse, but it does not persist container state; persisting richer state requires managing custom images or a `SandboxPoolManager`.
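A minimal sketch of the persistence gotcha above. This assumes `keep_template=True` is accepted by the `SandboxSession` constructor (as described in the warning); the helper function and its fallback messages are illustrative, and the code degrades gracefully when `llm-sandbox` or a container backend is unavailable.

```python
# Hedged sketch: reusing the built sandbox image across sessions via keep_template=True.
# Note: keeping the template image speeds up later session startup, but files and
# pip installs made inside a container are still discarded when the session closes.
try:
    from llm_sandbox import SandboxSession
    HAVE_SANDBOX = True
except ImportError:
    HAVE_SANDBOX = False

def run_with_kept_template(code: str) -> str:
    """Run code in a session whose built image is kept for reuse (illustrative helper)."""
    if not HAVE_SANDBOX:
        return "llm-sandbox not installed"
    try:
        with SandboxSession(lang="python", keep_template=True) as session:
            return session.run(code).stdout.strip()
    except Exception as exc:  # e.g. Docker daemon not running
        return f"sandbox unavailable: {exc}"

print(run_with_kept_template("print('state is per-session')"))
```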
Install
- pip install llm-sandbox
- pip install 'llm-sandbox[docker]'      # For Docker backend
- pip install 'llm-sandbox[kubernetes]'  # For Kubernetes backend
- pip install 'llm-sandbox[podman]'      # For Podman backend
Imports
- SandboxSession
from llm_sandbox import SandboxSession
Quickstart
from llm_sandbox import SandboxSession

# Ensure Docker or another backend is running and accessible.
# For this example, install the Docker extra: pip install 'llm-sandbox[docker]'
try:
    with SandboxSession(lang="python") as session:
        print("Running Python code in sandbox...")
        result = session.run("import math\nprint(math.factorial(5))")
        print(f"Sandbox Output: {result.stdout.strip()}")
        if result.stderr:
            print(f"Sandbox Error: {result.stderr.strip()}")

        print("\nRunning Python code with a library (numpy)...")
        # libraries=["numpy"] installs numpy in the sandbox before the code runs.
        result_numpy = session.run(
            "import numpy as np\nprint(np.mean([1, 2, 3, 4]))",
            libraries=["numpy"],
        )
        print(f"Sandbox NumPy Output: {result_numpy.stdout.strip()}")
        if result_numpy.stderr:
            print(f"Sandbox NumPy Error: {result_numpy.stderr.strip()}")
except Exception as e:
    print(f"An error occurred during sandbox execution: {e}")
    print("Ensure your container backend (e.g., Docker) is running and configured correctly.")
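Since the library supports multiple container backends, here is a hedged sketch of selecting one explicitly. It assumes a `SandboxBackend` enum and a `backend=` constructor parameter exist in your installed release; check your version's API if the import fails. The `run_on_backend` helper is illustrative, and the code falls back gracefully when the library or the chosen backend is unavailable.

```python
# Hedged sketch: selecting a non-default container backend (e.g. Podman).
# Assumes the SandboxBackend enum and backend= parameter; verify against your version.
try:
    from llm_sandbox import SandboxSession, SandboxBackend
    HAVE_BACKENDS = True
except ImportError:
    HAVE_BACKENDS = False

def run_on_backend(code: str, backend_name: str = "podman") -> str:
    """Run code on a named backend (illustrative helper)."""
    if not HAVE_BACKENDS:
        return "llm-sandbox (with backend extras) not installed"
    backend = getattr(SandboxBackend, backend_name.upper())
    try:
        # Requires the matching extra, e.g. pip install 'llm-sandbox[podman]'
        with SandboxSession(lang="python", backend=backend) as session:
            return session.run(code).stdout.strip()
    except Exception as exc:  # backend daemon/cluster not reachable
        return f"backend unavailable: {exc}"

print(run_on_backend("print(2 + 2)"))
```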