LlamaIndex Core

0.14.20 · active · verified Thu Apr 09

LlamaIndex Core provides the foundational interfaces and components for building LLM-powered applications, letting you connect large language models to private or domain-specific data. It includes data structures, indexing tools, query engines, and base abstractions for LLMs and embedding models. The current version is 0.14.20, with frequent, often daily, releases across its modular ecosystem.

Warnings

Install
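The quickstart below requires the core package plus the OpenAI LLM and embedding integrations. A typical install, assuming the standard PyPI package names for these integrations:

```shell
pip install llama-index-core llama-index-llms-openai llama-index-embeddings-openai
```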

Imports

Quickstart

This quickstart demonstrates loading data, configuring the LLM and embedding model via global `Settings`, creating a vector store index, and performing a simple query. Ensure you have the `OPENAI_API_KEY` environment variable set and the `llama-index-llms-openai` and `llama-index-embeddings-openai` packages installed.

import os
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.openai import OpenAIEmbedding

# Ensure you have your OpenAI API key set as an environment variable
# os.environ["OPENAI_API_KEY"] = "sk-..."
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", "")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY environment variable not set.")

# Create a dummy data directory and file
os.makedirs("data", exist_ok=True)
with open("data/hello.txt", "w") as f:
    f.write("The quick brown fox jumps over the lazy dog.\n")
    f.write("LlamaIndex is a data framework for LLM applications.")

# 1. Load data
documents = SimpleDirectoryReader("data").load_data()

# 2. Configure global settings (LLM and Embedding Model)
Settings.llm = OpenAI(api_key=OPENAI_API_KEY, model="gpt-3.5-turbo")
Settings.embed_model = OpenAIEmbedding(api_key=OPENAI_API_KEY, model="text-embedding-ada-002")

# 3. Create an index
index = VectorStoreIndex.from_documents(documents)

# 4. Create a query engine
query_engine = index.as_query_engine()

# 5. Query the index
response = query_engine.query("What is LlamaIndex?")
print(response.response)
