Open Interpreter

0.4.3 · active · verified Thu Apr 16

Open Interpreter is an open-source library that lets Large Language Models (LLMs) run code (Python, JavaScript, Shell, and more) locally on your machine. It provides a natural-language chat interface to your computer's general-purpose capabilities, such as creating and editing files, controlling browsers, and analyzing data. It serves as a powerful local alternative to ChatGPT's Code Interpreter, operating without sandbox limits. The current version is 0.4.3, and the project ships frequent updates and major releases, indicating an active development cadence.

Install

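A minimal install from PyPI (the package name `open-interpreter` follows the project's README; `pip` is assumed to be available):

```shell
# Install from PyPI. Note the naming split: the PyPI package is
# "open-interpreter", but the Python module it provides is "interpreter".
pip install open-interpreter
```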
Quickstart

This quickstart demonstrates how to initialize Open Interpreter and start an interactive chat session. It includes a placeholder for setting the OpenAI API key, which is necessary when using hosted models. By default, Open Interpreter asks for user confirmation before executing any code, for safety. Setting `auto_run = True` bypasses this, but should be used with extreme caution, as code executes directly on your system.

import os
from interpreter import interpreter

# Configure the API key for hosted models (e.g., OpenAI).
# Local models may not need a key, or may be set up differently.
os.environ['OPENAI_API_KEY'] = os.environ.get('OPENAI_API_KEY', 'YOUR_OPENAI_API_KEY')

# Leave auto_run disabled so the interpreter prompts before executing code.
interpreter.auto_run = False  # Set to True to bypass confirmation; use with extreme caution

print("Starting interactive Open Interpreter chat. Type 'quit' to exit.\n")

# Start an interactive chat
# The interpreter will ask for confirmation before executing code by default.
interpreter.chat()
