LlamaIndex Legacy

0.9.48.post4 · deprecated · verified Wed Apr 15

LlamaIndex Legacy is a compatibility package containing the pre-v0.10.0 monolithic codebase of LlamaIndex, a data framework for building LLM applications by connecting large language models with external data sources. The package is maintained at version 0.9.48.post4 for users who have not yet migrated to the modular LlamaIndex v0.10+ architecture. While the broader LlamaIndex ecosystem has a rapid release cadence, this legacy package receives only minimal updates: it exists as a bridge for older codebases.

Warnings

This package is deprecated and serves only as a bridge for codebases written against the pre-v0.10 API. New projects should target the modular LlamaIndex v0.10+ packages instead.

Install
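Assuming the standard PyPI distribution name for this compatibility package, it installs with pip:

```shell
# Install the legacy compatibility package from PyPI
pip install llama-index-legacy
```

Pinning the exact version (e.g. `llama-index-legacy==0.9.48.post4`) keeps older codebases reproducible, since this package is deprecated and not expected to change.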

Imports

Quickstart

This quickstart demonstrates how to load documents from a local directory, create a vector store index, and query it using the legacy LlamaIndex API. Ensure you have an `OPENAI_API_KEY` set as an environment variable and a `data` directory with at least one text file.

import os
from llama_index.legacy import VectorStoreIndex, SimpleDirectoryReader

# Set your OpenAI API key as an environment variable
os.environ["OPENAI_API_KEY"] = os.environ.get("OPENAI_API_KEY", "YOUR_OPENAI_API_KEY")

# Create a 'data' directory and place some text files inside it
# e.g., echo "The quick brown fox jumps over the lazy dog." > data/example.txt

# Load documents from the specified directory
documents = SimpleDirectoryReader("data").load_data()

# Create a vector store index from the loaded documents
index = VectorStoreIndex.from_documents(documents)

# Create a query engine and query the index
query_engine = index.as_query_engine()
response = query_engine.query("What did the fox do?")

print(response.response)
