LlamaIndex Legacy
LlamaIndex Legacy is a compatibility package containing the pre-v0.10.0 monolithic codebase of LlamaIndex, a data framework for building LLM applications by connecting large language models with external data sources. It is currently at version 0.9.48.post4 and is maintained for users who have not yet migrated to the modular LlamaIndex v0.10+ architecture. While the broader LlamaIndex ecosystem has a rapid release cadence, this legacy package receives minimal updates, as it exists only as a bridge for older codebases.
Warnings
- breaking The primary LlamaIndex library underwent a significant modularization in version 0.10.0, splitting into `llama-index-core` and numerous integration-specific packages. `llama-index-legacy` is provided as a temporary bridge for pre-v0.10.0 code, but new development should use the modular `llama-index` packages.
- deprecated The `ServiceContext` object has been deprecated in favor of a global `Settings` object or local configurations passed directly to APIs. Many agent-related classes (e.g., `AgentRunner`, `FunctionCallingAgent`) are also deprecated in favor of `AgentWorkflow`.
- deprecated The `llama-index-legacy` package itself is deprecated and has been removed from the main LlamaIndex repository. As a result, it will receive no new features and only limited maintenance.
- gotcha While `llama-index-legacy` explicitly supports Python >=3.8.1 and <4.0, the broader `llama-index` ecosystem has begun deprecating Python 3.9 in recent releases of its modular packages. Users planning a migration should be aware of future Python version requirements.
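Given the version constraints above, it can help to confirm the running interpreter before planning a migration. A minimal stdlib-only sketch (the cutoffs reflect the declared `>=3.8.1,<4.0` range for the legacy package and the ecosystem's ongoing deprecation of Python 3.9):

```python
import sys

# llama-index-legacy declares support for Python >=3.8.1 and <4.0.
version = sys.version_info
version_tuple = (version.major, version.minor, version.micro)

supported_by_legacy = (3, 8, 1) <= version_tuple < (4, 0, 0)
print(f"Python {version.major}.{version.minor}.{version.micro}: "
      f"{'supported' if supported_by_legacy else 'unsupported'} by llama-index-legacy")

# Recent modular llama-index releases are deprecating Python 3.9,
# so migrations are best planned against Python 3.10+.
if version_tuple < (3, 10, 0):
    print("Note: plan any migration to modular llama-index against Python 3.10+.")
```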
Install
- pip install llama-index-legacy
Imports
- VectorStoreIndex
from llama_index.legacy import VectorStoreIndex
- SimpleDirectoryReader
from llama_index.legacy import SimpleDirectoryReader
- Replicate
from llama_index.legacy.llms import Replicate
Quickstart
import os
from llama_index.legacy import VectorStoreIndex, SimpleDirectoryReader
# Requires your OpenAI API key in the environment, e.g.:
#   export OPENAI_API_KEY="sk-..."
assert os.environ.get("OPENAI_API_KEY"), "Set OPENAI_API_KEY before running"
# Create a 'data' directory and place some text files inside it, e.g.:
#   mkdir -p data && echo "The quick brown fox jumps over the lazy dog." > data/example.txt
# Load documents from the specified directory
documents = SimpleDirectoryReader("data").load_data()
# Create a vector store index from the loaded documents
index = VectorStoreIndex.from_documents(documents)
# Create a query engine and query the index
query_engine = index.as_query_engine()
response = query_engine.query("What did the fox do?")
print(response.response)