Open Interpreter
Open Interpreter is an open-source library that lets Large Language Models (LLMs) run code (Python, JavaScript, Shell, and more) locally on your machine. It provides a natural-language chat interface to your computer's general-purpose capabilities, such as creating and editing files, controlling browsers, and analyzing data. It serves as a powerful local alternative to ChatGPT's Code Interpreter, operating without sandbox limits. The current version is 0.4.3, with frequent updates and major releases indicating an active development cadence.
Common errors
- `OSError: [WinError 10106] The requested service provider could not be loaded or initialized` / `Error during installation with OpenBLAS: Command [...] 'llama-cpp-python']' returned non-zero exit status 1.`
  - cause: A problem installing `llama-cpp-python` on Windows, often due to missing C++ build tools or specific compiler requirements.
  - fix: Manually install `llama-cpp-python` first: `pip install llama-cpp-python`. If it fails, install Visual Studio 2019/2022 Community Edition with the 'Desktop development with C++' workload, retry the `llama-cpp-python` installation, and finally run `pip install open-interpreter`.
- `openai.error.InvalidRequestError: The model gpt-4 does not exist or you do not have access to it.`
  - cause: Your OpenAI account does not have API access to the GPT-4 model.
  - fix: Check your OpenAI account's API access and billing status. If GPT-4 is unavailable, use `interpreter.llm.model = "gpt-3.5-turbo"` or another accessible model.
- `No module named pip`
  - cause: The Python `pip` package manager is not properly installed or accessible in your environment.
  - fix: Ensure Python is correctly installed and that pip is part of the installation. You can often reinstall or update pip with `python -m ensurepip --upgrade` or `python -m pip install -U pip`.
- The interpreter runs code, but expected changes (e.g., created files, changed directories) don't persist or land in the wrong location.
  - cause: The interpreter may be executing Python scripts in an isolated subprocess, or its notion of the current working directory may not match the shell's, especially for commands that affect shell state.
  - fix: Be explicit about full paths when changing directories or creating files. Python's `os.chdir()` only affects the Python process; for persistent shell-level changes, have the interpreter run shell commands (e.g., `cd`) directly in the desired context, or use its `computer.run()` methods for more direct system interaction.
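The working-directory pitfall above can be reproduced with nothing but the standard library: a `chdir` inside a child Python process never propagates to the parent, which is why absolute paths are the safer default. This is a minimal illustrative sketch, not Open Interpreter code.

```python
import os
import subprocess
import sys
import tempfile

parent_cwd = os.getcwd()

# Run a child Python process that changes its own working directory.
child = subprocess.run(
    [sys.executable, "-c",
     "import os, tempfile; os.chdir(tempfile.gettempdir()); print(os.getcwd())"],
    capture_output=True, text=True, check=True,
)

print("child cwd: ", child.stdout.strip())  # the temp directory
print("parent cwd:", os.getcwd())           # unchanged by the child's chdir

# Safer pattern: build an absolute path instead of relying on the cwd.
target = os.path.join(tempfile.gettempdir(), "report.txt")
with open(target, "w") as f:
    f.write("written to a known location\n")
print("wrote:", target)
```

The same reasoning applies when the interpreter runs generated code: anything that depends on the process-local working directory can silently land somewhere other than where the surrounding shell expects.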
Warnings
- breaking: Open Interpreter executes code directly on your local system, allowing it to modify files, install packages, and interact with system settings. While it prompts for confirmation by default, bypassing this with `interpreter.auto_run = True` or `interpreter -y` carries significant risk, including data loss and security vulnerabilities.
- gotcha: Python version compatibility can be strict, especially for optional dependencies like `llama-cpp-python`. While PyPI states `>=3.9, <4`, the official documentation often recommends Python 3.10 or 3.11 and notes issues with Python 3.12 for some dependencies.
- gotcha: When using hosted LLMs (e.g., OpenAI's GPT-4), an API key must be configured. For OpenAI, this is typically done by setting the `OPENAI_API_KEY` environment variable.
- gotcha: Users without GPT-4 API access often hit `InvalidRequestError: The model gpt-4 does not exist or you do not have access to it`; switch to an accessible model such as `gpt-3.5-turbo`.
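Because a missing `OPENai_API_KEY`-style misconfiguration only surfaces as an authentication error at request time, a small guard at startup gives a clearer failure. The helper below is illustrative (the name `require_api_key` is ours, not part of Open Interpreter's API):

```python
import os

def require_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Return the API key from the environment, or fail with a clear message."""
    key = os.environ.get(var, "").strip()
    if not key:
        raise RuntimeError(
            f"{var} is not set. Export it before starting the interpreter, "
            f"or configure a local model instead."
        )
    return key

# Demo only: a placeholder so the guard has something to find.
os.environ.setdefault("OPENAI_API_KEY", "sk-demo-placeholder")
print("key found:", require_api_key()[:7] + "...")
```

Running this check before `interpreter.chat()` turns a mid-session API error into an immediate, self-explanatory one.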
Install
- `pip install open-interpreter`
- `pip install 'open-interpreter[local]'` (quote the extra so shells like zsh don't expand the brackets)
Imports
- interpreter: `from interpreter import interpreter`
Quickstart
```python
import os
from interpreter import interpreter

# Configure an API key for hosted models (e.g., OpenAI).
# For local models, this may not be needed or is set up differently.
if 'OPENAI_API_KEY' not in os.environ:
    os.environ['OPENAI_API_KEY'] = 'YOUR_OPENAI_API_KEY'  # replace with a real key

# The interpreter asks for confirmation before executing code by default.
# Setting auto_run = True bypasses confirmation; use with extreme caution.
interpreter.auto_run = False

print("Starting interactive Open Interpreter chat. Type 'quit' to exit.\n")

# Start an interactive chat.
interpreter.chat()
```