{"id":4615,"library":"llama-index-program-openai","title":"LlamaIndex OpenAI Pydantic Program","description":"The `llama-index-program-openai` library is an integration for LlamaIndex that facilitates the generation of structured data using OpenAI's API in conjunction with Pydantic objects. It allows users to define explicit output schemas using Pydantic models, enabling robust and type-safe data extraction from LLM responses. Currently at version 0.3.2, this package follows the rapid release and modular development cadence of the broader LlamaIndex ecosystem.","status":"active","version":"0.3.2","language":"en","source_language":"en","source_url":"https://github.com/run-llama/llama_index","tags":["LlamaIndex","OpenAI","Pydantic","LLM","Structured Data","AI","Integrations"],"install":[{"cmd":"pip install llama-index-program-openai","lang":"bash","label":"Install llama-index-program-openai"}],"dependencies":[{"reason":"Provides core LlamaIndex abstractions, essential for any LlamaIndex integration.","package":"llama-index-core","optional":false},{"reason":"Required for interfacing with OpenAI's Large Language Models (LLMs) which power the Pydantic program.","package":"llama-index-llms-openai","optional":false},{"reason":"Used to define the structured output schemas (BaseModel classes) that the program will populate.","package":"pydantic","optional":false},{"reason":"The underlying Python client for interacting with the OpenAI API.","package":"openai","optional":false}],"imports":[{"note":"The modular structure of LlamaIndex requires importing from the specific integration package's namespace.","wrong":"from llama_index.program import OpenAIPydanticProgram","symbol":"OpenAIPydanticProgram","correct":"from llama_index.program.openai import OpenAIPydanticProgram"},{"symbol":"BaseModel","correct":"from pydantic import BaseModel"},{"symbol":"List","correct":"from typing import List"}],"quickstart":{"code":"import os\nfrom pydantic import BaseModel\nfrom typing import 
List\nfrom llama_index.program.openai import OpenAIPydanticProgram\n\n# Set your OpenAI API key (replace 'sk-...' or ensure it's in your environment)\nos.environ[\"OPENAI_API_KEY\"] = os.environ.get(\"OPENAI_API_KEY\", \"sk-YOUR_OPENAI_KEY_HERE\")\n\n# Define your desired structured output schema using Pydantic\nclass Song(BaseModel):\n    title: str\n    length_seconds: int\n\nclass Album(BaseModel):\n    name: str\n    artist: str\n    songs: List[Song]\n\n# Define the prompt template for the LLM\nprompt_template_str = \"\"\"\nGenerate an example album, with an artist and a list of songs,\nusing the movie {movie_name} as inspiration.\n\"\"\"\n\n# Initialize the OpenAI Pydantic program\nprogram = OpenAIPydanticProgram.from_defaults(\n    output_cls=Album,\n    prompt_template_str=prompt_template_str,\n    verbose=True\n)\n\n# Run the program to get structured output\noutput = program(movie_name=\"The Shining\")\n\n# Print the structured output as JSON (Pydantic V2 API)\nprint(output.model_dump_json(indent=2))\n","lang":"python","description":"This quickstart demonstrates how to use `OpenAIPydanticProgram` to extract structured data. It defines a Pydantic `Album` schema, constructs a prompt using a template, and then executes the program with an OpenAI LLM to populate the `Album` object based on the given inspiration. The result is serialized with Pydantic V2's `model_dump_json` (the V1-style `.json()` method is deprecated). Ensure your `OPENAI_API_KEY` environment variable is set."},"warnings":[{"fix":"Explicitly install integration packages (`pip install llama-index-llms-openai`) and import from their specific namespaces (e.g., `from llama_index.llms.openai import OpenAI`). Replace `ServiceContext` usage with `Settings` or direct `llm`/`embed_model` parameters.","message":"Beginning with LlamaIndex v0.10, the library adopted a modular package structure. `llama-index-program-openai` is now a standalone package, and core components are in `llama-index-core`. 
Ensure you install specific integration packages (e.g., `llama-index-llms-openai`) and adjust imports if migrating from older versions where all modules were consolidated under `llama_index`. The `ServiceContext` object was deprecated in v0.10 and fully removed in v0.11; use the `Settings` object or direct parameter passing instead.","severity":"breaking","affected_versions":"llama-index >= 0.10.0"},{"fix":"Remove or update any `pydantic.v1` imports and ensure your Pydantic models are compatible with Pydantic V2.","message":"LlamaIndex v0.11 fully migrated to Pydantic V2. If your project previously used `pydantic.v1` imports to work around compatibility issues, these should now be removed or updated to Pydantic V2 syntax.","severity":"breaking","affected_versions":"llama-index >= 0.11.0"},{"fix":"Set `os.environ['OPENAI_API_KEY'] = 'your_key_here'` or export it in your shell (`export OPENAI_API_KEY='your_key_here'`) before running your application.","message":"An `OPENAI_API_KEY` environment variable must be set for `llama-index-program-openai` to function, as it relies on OpenAI's API. Failure to set this will result in an `openai.AuthenticationError` (the legacy `openai.error` module was removed in the v1.x OpenAI Python client).","severity":"gotcha","affected_versions":"All versions"},{"fix":"Shorten tool descriptions or move extensive details into the LLM prompt instead of the tool's metadata to avoid the character limit.","message":"When using `OpenAIPydanticProgram` within agents that leverage OpenAI's function calling, tool descriptions have a maximum length of 1024 characters. Exceeding this limit will cause an error during tool construction.","severity":"gotcha","affected_versions":"All versions utilizing OpenAI function calling with tools"}],"env_vars":null,"last_verified":"2026-04-12T00:00:00.000Z","next_check":"2026-07-11T00:00:00.000Z"}