LlamaIndex OpenAI Agent
The `llama-index-agent-openai` library provides an integration for creating LlamaIndex agents backed by OpenAI's language models. It lets you define tools and orchestrate them through OpenAI's function-calling API, making it straightforward to build LLM-powered applications. As part of the LlamaIndex ecosystem, it tracks the modular LlamaIndex architecture (v0.10+). The current version is `0.4.12`, with updates typically aligning with LlamaIndex core releases.
Warnings
- breaking LlamaIndex Modularization (v0.10+): `llama-index-agent-openai` is designed exclusively for the modular LlamaIndex (v0.10 and later). It explicitly requires `llama-index>=0.10.12`. Attempts to use it with older, monolithic LlamaIndex versions (e.g., v0.9.x) will result in import errors or runtime failures due to incompatible API structures.
- gotcha `openai` Python package version: This integration requires `openai>=1.1.0`. Using older `openai` versions (e.g., `0.x.x`) will cause `APIError` or `TypeError` due to significant changes in `openai`'s API, especially regarding client initialization and method calls (e.g., `openai.Completion` vs `client.chat.completions.create`).
- gotcha `OpenAIAgent` initialization with tools: The recommended and most robust way to initialize `OpenAIAgent` is using `OpenAIAgent.from_tools()`. Directly passing a `tools` argument to the constructor, or using older initialization patterns, might behave differently or be deprecated in future versions, potentially leading to incorrect tool invocation or agent behavior.
- gotcha OpenAI API Key Configuration: The OpenAI API key (traditionally `OPENAI_API_KEY`) must be correctly configured. If it's not set as an environment variable, it must be explicitly passed to the `OpenAI` LLM instance during initialization. Forgetting this will result in `AuthenticationError` from the OpenAI API.
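Since the first two warnings above are version mismatches, it can help to verify installed dependency versions up front. A minimal sketch using only the standard library; `meets_minimum` and `check` are illustrative helpers, not part of llama-index, and the comparison handles plain dotted versions only (no pre-release tags):

```python
from importlib.metadata import PackageNotFoundError, version


def meets_minimum(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically, e.g. "1.1.0" >= "0.28.1"."""
    def to_tuple(v: str) -> tuple:
        return tuple(int(part) for part in v.split("."))
    return to_tuple(installed) >= to_tuple(required)


def check(package: str, minimum: str) -> bool:
    """Return True if `package` is installed at version >= `minimum`."""
    try:
        return meets_minimum(version(package), minimum)
    except PackageNotFoundError:
        return False


# e.g. check("openai", "1.1.0") and check("llama-index", "0.10.12")
```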
Install
pip install llama-index-agent-openai
Imports
- OpenAIAgent
from llama_index.agent.openai import OpenAIAgent
- OpenAI LLM
from llama_index.llms.openai import OpenAI
Quickstart
import os

from llama_index.agent.openai import OpenAIAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer."""
    return a * b


# Define a tool from a function; the tool name and description are
# inferred from the function signature and docstring
multiply_tool = FunctionTool.from_defaults(fn=multiply)

# Initialize the LLM; passing api_key=None makes the client fall back
# to the OPENAI_API_KEY environment variable
llm = OpenAI(
    model="gpt-3.5-turbo",
    api_key=os.environ.get("OPENAI_API_KEY"),
)

# Initialize the OpenAI agent with the tool list
agent = OpenAIAgent.from_tools(
    tools=[multiply_tool],
    llm=llm,
    verbose=True,
)

# Chat with the agent
response = agent.chat("What is 2 * 2?")
print(f"Agent Response: {response}")

response_complex = agent.chat("What is 10 times 5 plus 3?")
print(f"Agent Response (complex): {response_complex}")
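Under the hood, the agent runs a function-calling loop: the model emits a tool call (a tool name plus JSON-encoded arguments), the agent routes it to the matching Python function, and the result is fed back to the model. The dispatch step can be sketched with the standard library alone; `TOOLS` and `dispatch` are hypothetical names used for illustration, not llama-index APIs:

```python
import json


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer."""
    return a * b


# Registry mapping tool names to Python callables
TOOLS = {"multiply": multiply}


def dispatch(tool_call: dict):
    """Route a model-emitted tool call of the form
    {"name": "multiply", "arguments": '{"a": 10, "b": 5}'}
    to the registered function and return its result."""
    fn = TOOLS[tool_call["name"]]
    kwargs = json.loads(tool_call["arguments"])
    return fn(**kwargs)


print(dispatch({"name": "multiply", "arguments": '{"a": 10, "b": 5}'}))  # 50
```

In the real agent, this result would be appended to the chat history as a tool message and sent back to the model, which either issues another tool call or produces the final answer.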