{"id":9855,"library":"jupyter-ai","title":"Jupyter AI","description":"Jupyter AI is a set of extensions that provides agentic AI capabilities directly within JupyterLab. It integrates large language models (LLMs) and agents into notebooks and the JupyterLab interface, offering features like real-time chat, code generation, and AI-powered data analysis. The current version is 3.0.0; development is active, with frequent releases, especially in the run-up to major versions.","status":"active","version":"3.0.0","language":"en","source_language":"en","source_url":"https://github.com/jupyterlab/jupyter-ai","tags":["jupyter","ai","llm","agent","notebook","jupyterlab","extension"],"install":[{"cmd":"pip install jupyter-ai","lang":"bash","label":"Install core package"},{"cmd":"jupyter labextension list","lang":"bash","label":"Verify JupyterLab extension (look for @jupyter-ai/jupyter-ai enabled)"},{"cmd":"jupyter server extension list","lang":"bash","label":"Verify Jupyter server extension (look for jupyter_ai enabled)"}],"dependencies":[{"reason":"Jupyter AI is a JupyterLab extension and requires JupyterLab to function.","package":"jupyterlab","optional":false},{"reason":"Required for agent capabilities and managing tool calls within JupyterLab, introduced in v3.0.0.","package":"jupyter_server_mcp","optional":false},{"reason":"A core dependency for integrating commands into JupyterLab, added as required in v3.0.0rc1.","package":"jupyterlab_commands_toolkit","optional":false},{"reason":"Provides integration with over 1000 LLM providers; replaced Langchain in v3.0.0.","package":"litellm","optional":false},{"reason":"Required for connecting to Amazon Bedrock models.","package":"boto3","optional":true}],"imports":[{"note":"Jupyter AI is primarily used via the JupyterLab UI and magic commands (e.g., `%%ai`). Directly importing core classes such as `AiMagics` is uncommon for end users, but developers can import them for advanced scripting against the magic commands.","symbol":"AiMagics","correct":"from jupyter_ai_magics import AiMagics"}],"quickstart":{"code":"import os\n\n# Before starting JupyterLab, set your API key for a model provider.\n# Example for OpenAI:\n# os.environ['OPENAI_API_KEY'] = os.environ.get('OPENAI_API_KEY', 'YOUR_OPENAI_KEY')\n# For other providers like Anthropic:\n# os.environ['ANTHROPIC_API_KEY'] = os.environ.get('ANTHROPIC_API_KEY', 'YOUR_ANTHROPIC_KEY')\n\n# --- In a Jupyter notebook cell: ---\n\n# (Optional) Explicitly load the AI magic commands, though often loaded automatically\n# %load_ext jupyter_ai_magics\n\n# List available models and providers (line magic, single %)\n# %ai list\n\n# Use a model for code generation (`-f code` formats the response as code)\n# %%ai chatgpt -f code\n# Create a Python function to calculate the Nth Fibonacci number recursively.\n\n# Use an agent to interact with Jupyter (requires an agent-capable model like Claude or Gemini).\n# Models are addressed as provider-id:model-id, e.g. via Amazon Bedrock:\n# %%ai bedrock-chat:anthropic.claude-3-haiku-20240307-v1:0\n# Write a simple Python script to list files in the current directory and save it as 'list_files.py'.\n# Note: Agent actions that modify files or execute commands will prompt for approval in the UI.\n\n# --- Using the Jupyter AI chat interface: ---\n# 1. Open the 'Jupyter AI' panel in the left sidebar of JupyterLab.\n# 2. Select a model provider and model from the dropdowns.\n# 3. Start a conversation or ask for assistance with your notebook content.","lang":"python","description":"To use Jupyter AI, ensure it's installed and your JupyterLab server is running. You primarily interact with Jupyter AI through its dedicated chat interface in the JupyterLab sidebar or directly within notebook cells using the magic commands (e.g., `%%ai`). Most LLM providers require an API key, which should be set as an environment variable before launching JupyterLab or configured via JupyterLab's settings. The example demonstrates loading the magic commands, listing models, and basic usage for code generation and agent interaction."},"warnings":[{"fix":"Review the official Jupyter AI v3.0.0 documentation and migration guides. Existing codebases or configurations relying on Langchain providers will need to be updated to use LiteLLM. Agent interactions will follow the new ACP model, which may require adjustments to prompts or expectations.","message":"Jupyter AI v3.0.0 introduces significant breaking changes, including a complete overhaul of the agent architecture via the Agent Client Protocol (ACP) and a migration from Langchain to LiteLLM for LLM integration.","severity":"breaking","affected_versions":">=3.0.0"},{"fix":"Reconfigure your LLM providers and models according to LiteLLM's conventions. Consult the LiteLLM documentation for details on setting up credentials and models. Remove any explicit Langchain dependencies from your environment if they were only there for Jupyter AI.","message":"The internal LLM integration framework changed from Langchain to LiteLLM in v3.0.0b6, a change that carried through to the stable v3.0.0 release. Any custom integrations or model configurations built specifically on Langchain will no longer work.","severity":"breaking","affected_versions":">=3.0.0"},{"fix":"Be aware that agent actions might pause, awaiting your approval in the JupyterLab UI (often in the chat panel or a pop-up dialog). This is a security feature; monitor the agent's output and approve or deny actions as prompted.","message":"Jupyter AI agents in v3.0.0 now require explicit user permission for tool calls that modify the filesystem or execute commands within JupyterLab (e.g., writing files, running terminal commands).","severity":"gotcha","affected_versions":">=3.0.0"},{"fix":"If you need to monitor Dask activity for development or debugging, enable it by starting your Jupyter server with `--AiExtension.enable_dask_dashboard=True`. The dashboard will typically be available on port `8787`.","message":"The Dask dashboard, which shows progress for '/learn' calls, is disabled by default starting from v2.31.7. This is intended to avoid unintended resource usage in production environments.","severity":"gotcha","affected_versions":">=2.31.7"}],"env_vars":null,"last_verified":"2026-04-17T00:00:00.000Z","next_check":"2026-07-16T00:00:00.000Z","problems":[{"fix":"If you were previously relying on Langchain for custom model setups, you need to migrate your configuration to LiteLLM. Ensure you are running Jupyter AI v3.0.0+ and that LiteLLM is correctly installed (it's a core dependency).","cause":"Jupyter AI v3.0.0 and later versions have removed Langchain in favor of LiteLLM for all LLM integrations.","error":"ModuleNotFoundError: No module named 'langchain'"},{"fix":"Ensure `jupyter-ai` is installed (`pip install jupyter-ai`), and restart your JupyterLab server. Verify the extension is enabled using `jupyter labextension list` and `jupyter server extension list`. If issues persist, try `jupyter lab build`.","cause":"The Jupyter AI server extension or magic commands were not loaded or enabled correctly, or the Jupyter server was not restarted after installation.","error":"Command '%%ai' not found or 'jupyter_ai' server extension not found."},{"fix":"Set the required API key as an environment variable (e.g., `export OPENAI_API_KEY=\"your_key_here\"` in your shell before starting JupyterLab) or configure it via the JupyterLab settings editor under the Jupyter AI section in the UI.","cause":"The selected LLM provider requires an API key that has not been set in the environment variables or configured within JupyterLab's settings.","error":"Error: Missing API key for model provider 'openai' (or another provider). Please set the OPENAI_API_KEY environment variable."},{"fix":"Look for a prompt in the JupyterLab UI (often in the chat panel or a pop-up dialog) asking for permission to execute the action. You must explicitly approve or deny the action for the agent to proceed.","cause":"A Jupyter AI agent attempted to perform an action (e.g., file write, command execution) that requires user approval under the new tool call permissions system in v3.0.0.","error":"Permission denied: Agent tried to write file 'example.py'."}]}