Llama-Index Managed Llama Cloud Indices
This library provides integration for LlamaIndex to create, manage, and query indices hosted on Llama Cloud. It enables persistent, scalable, and production-ready Retrieval-Augmented Generation (RAG) solutions without needing to manage vector databases or other infrastructure directly. The current version is `0.11.1`, and it follows the frequent release cadence of the broader LlamaIndex ecosystem, often tied to Llama Cloud service updates.
Common errors
- Error: `ModuleNotFoundError: No module named 'llama_index.indices.managed.llama_cloud'`
  Cause: The `llama-index-indices-managed-llama-cloud` package is not installed in your Python environment, or a conflict is preventing its discovery. This can also occur if the import path changed in a newer LlamaIndex version, or because the package itself is deprecated.
  Fix: Install the package with `pip install llama-index-indices-managed-llama-cloud` and verify you are using the intended Python environment. Note that this package is deprecated; consider migrating to `pip install llama-cloud` and updating your imports to `from llama_cloud_services import LlamaCloudIndex` for ongoing support.
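During a migration you may need code that works whether the new or the deprecated package is installed. A minimal stdlib-only sketch (the helper name `resolve_llama_cloud_index` is illustrative; the two import paths are the ones named above, and no Llama Cloud calls are made):

```python
def resolve_llama_cloud_index():
    """Return the LlamaCloudIndex class from whichever package is installed.

    Prefers the maintained `llama-cloud` package (imported as
    `llama_cloud_services`); falls back to the deprecated import path, and
    returns None if neither package is available.
    """
    try:
        from llama_cloud_services import LlamaCloudIndex  # new package
        return LlamaCloudIndex
    except ImportError:
        pass
    try:
        # Deprecated path, kept only for older environments.
        from llama_index.indices.managed.llama_cloud import LlamaCloudIndex
        return LlamaCloudIndex
    except ImportError:
        return None
```

Calling this once at startup and failing fast with a clear message when it returns `None` gives a friendlier error than a bare `ModuleNotFoundError` deep inside your application.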
- Error: `{"error": "InvalidAPIKey", "message": "The provided API key is invalid or expired."}`
  Cause: The Llama Cloud API key used for authentication is incorrect, expired, not set in the `LLAMA_CLOUD_API_KEY` environment variable, or lacks the necessary permissions. A mismatch with the target region or project scope can also trigger this error.
  Fix: Verify the key in the Llama Cloud dashboard, generate a new one if it is expired or compromised, and ensure it is set as an environment variable (e.g., `os.environ["LLAMA_CLOUD_API_KEY"] = "llx-..."`) or passed directly to the `LlamaCloudIndex` constructor (`api_key="llx-..."`). Also confirm the key's associated region and project.
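A cheap pre-flight check catches most key misconfigurations before any network call is made. This sketch assumes only the `llx-` key prefix shown in the examples above; `get_llama_cloud_api_key` is a hypothetical helper name:

```python
import os


def get_llama_cloud_api_key() -> str:
    """Read the Llama Cloud API key from the environment and sanity-check it.

    Keys shown in the Llama Cloud examples start with the "llx-" prefix, so
    anything else is very likely a misconfigured or truncated value.
    """
    key = os.environ.get("LLAMA_CLOUD_API_KEY", "")
    if not key:
        raise RuntimeError("LLAMA_CLOUD_API_KEY is not set")
    if not key.startswith("llx-"):
        raise RuntimeError(
            "LLAMA_CLOUD_API_KEY does not look like a Llama Cloud key "
            "(expected an 'llx-' prefix)"
        )
    return key
```

Running this before constructing `LlamaCloudIndex` turns a vague `InvalidAPIKey` response into an immediate, local error message.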
- Error: `pydantic.v1.error_wrappers.ValidationError: 1 validation error for ParsingModel[List[llama_cloud.types.pipeline.Pipeline]] __root__ -> 0 -> preset_retrieval_parameters -> retrieval_mode value is not a valid enumeration member; permitted: 'chunks', 'files_via_metadata', 'files_via_content'`
  Cause: The client library's data model for API responses (specifically the `retrieval_mode` field) does not match what the Llama Cloud API is returning or accepting. This usually means the client and the Llama Cloud service are at incompatible versions, or the client is simply outdated.
  Fix: Upgrade with `pip install --upgrade llama-index-indices-managed-llama-cloud` and make sure `llama-index-core` is also up to date. If the issue persists, check the Llama Cloud documentation for the `retrieval_mode` parameter and align your code with the current API specification.
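When debugging this kind of client/service version skew, it helps to report exactly which versions are installed. A small stdlib-only sketch using `importlib.metadata` (the package names are the ones discussed above):

```python
from importlib import metadata


def installed_version(package: str):
    """Return the installed version string of *package*, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None


# Prints each package's installed version, or "not installed".
for pkg in ("llama-index-core", "llama-index-indices-managed-llama-cloud"):
    print(pkg, "->", installed_version(pkg) or "not installed")
```

Including this output in a bug report makes version-mismatch issues much faster to triage.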
- Error: `NOTE: This package has been deprecated and is no longer maintained. Please use the llama-cloud package instead.`
  Cause: The `llama-index-indices-managed-llama-cloud` package has been deprecated by the LlamaIndex project. It is no longer actively maintained, and its functionality has been integrated into, or replaced by, the `llama-cloud` package to support the evolving Llama Cloud services.
  Fix: Migrate to the `llama-cloud` package for continued support, new features, and bug fixes. Uninstall the deprecated package (`pip uninstall llama-index-indices-managed-llama-cloud`), install the replacement (`pip install llama-cloud`), and update your imports, e.g. from `from llama_index.indices.managed.llama_cloud import LlamaCloudIndex` to `from llama_cloud_services import LlamaCloudIndex`.
Warnings
- gotcha: The `LLAMA_CLOUD_API_KEY` environment variable (or the `api_key` parameter of the `LlamaCloudIndex` constructor) is mandatory for authenticating with Llama Cloud. Forgetting to set it results in authentication errors and blocks all index operations.
- breaking: The LlamaIndex ecosystem undergoes rapid development. Frequent updates to the core `llama-index` library can introduce API changes that affect `llama-index-indices-managed-llama-cloud`, including breaking changes between minor versions.
- gotcha: The `name` parameter of `LlamaCloudIndex` must be unique within your Llama Cloud project. Calling `from_documents` or `from_index_name` with an existing name connects to that index, and any new documents are upserted into it, which may not be the intended behavior if you expect a fresh index.
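If you want a fresh index on every run rather than upserting into an existing one, a simple way is to suffix the index name with a random token. A hypothetical stdlib-only helper (`unique_index_name` is not part of the library):

```python
import uuid


def unique_index_name(prefix: str) -> str:
    """Append a short random suffix so repeated runs do not reuse an index."""
    return f"{prefix}-{uuid.uuid4().hex[:8]}"
```

The resulting name (e.g. passed as `name=unique_index_name("my_index")`) is unique per run; remember to delete throwaway indices afterwards so they do not accumulate in your project.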
Install
- pip install llama-index-indices-managed-llama-cloud
Imports
- LlamaCloudIndex
  from llama_index.indices.managed.llama_cloud import LlamaCloudIndex
Quickstart
import os
from llama_index.indices.managed.llama_cloud import LlamaCloudIndex
from llama_index.core.schema import Document
# Set your Llama Cloud API key as an environment variable
# You can obtain one from app.llamacloud.ai
api_key = os.environ.get("LLAMA_CLOUD_API_KEY", "")
if not api_key:
    print("Warning: LLAMA_CLOUD_API_KEY environment variable not set. The example will not fully function without it.")
# Define some documents to be indexed
documents = [
    Document(text="The quick brown fox jumps over the lazy dog."),
    Document(text="LlamaIndex helps build LLM applications with external data."),
]
# Create a new managed index or connect to an existing one by name.
# If the index 'my_managed_index_example' doesn't exist, it will be created.
# If it exists, the documents will be upserted.
print("Attempting to create/connect to Llama Cloud Index 'my_managed_index_example'...")
index = LlamaCloudIndex.from_documents(
    documents,
    name="my_managed_index_example",  # use a name that is unique within your project
    api_key=api_key,
)
print(f"Successfully connected to Llama Cloud Index: {index.name}")
# Query the index using a query engine
query_engine = index.as_query_engine()
query_text = "What does LlamaIndex help with?"
print(f"\nQuery: {query_text}")
response = query_engine.query(query_text)
print(f"Response: {response}")