Llama Cloud Services SDK (Deprecated)

Version 0.6.94 · verified Tue May 12 · auth: no · python · install: reviewed · quickstart: verified · deprecated

This library provides tailored SDK clients for LlamaCloud services such as LlamaParse, LlamaExtract, and LlamaCloud Index. As of January 2026, this package is officially deprecated and will only be maintained until May 1, 2026. Users are strongly advised to migrate to the `llama-cloud` Python package for continued support and new features.

pip install llama-cloud-services
error ModuleNotFoundError: No module named 'llama_cloud_services'
cause The `llama-cloud-services` package is deprecated in favor of the `llama-cloud` package, which has a different top-level module name and structure. This error occurs when old import statements (`from llama_cloud_services import ...`) run in an environment where the deprecated package is no longer installed, for example after uninstalling it during migration.
fix
Migrate to the `llama-cloud` package. First, uninstall the old package (`pip uninstall llama-cloud-services`), then install the new one (`pip install llama-cloud`), and finally update your import statements, e.g. replace `from llama_cloud_services import LlamaParse` with `from llama_cloud import LlamaParse`.
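As a quick sanity check during migration, a stdlib-only sketch like the following can report which SDK package is actually importable in the current environment, without making any LlamaCloud API calls. The printed hints are illustrative:

```python
import importlib.util

def installed(module_name: str) -> bool:
    """Return True if a top-level module can be imported in this environment."""
    return importlib.util.find_spec(module_name) is not None

# Warn early if the deprecated SDK is still present alongside (or instead of)
# the new one -- a common source of stale-import errors after migration.
if installed("llama_cloud_services"):
    print("llama-cloud-services detected: uninstall it and migrate to llama-cloud")
if not installed("llama_cloud"):
    print("llama-cloud is not installed: run pip install llama-cloud")
```

Running this at application startup surfaces a half-finished migration immediately, instead of as a `ModuleNotFoundError` deep inside a request handler.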
error ImportError: cannot import name 'ExtractAgentCreate' from 'llama_cloud'
cause This usually indicates a version mismatch or conflict between the `llama-cloud-services` and `llama-cloud` packages, or an attempt to import a class that was renamed or refactored in the newer `llama-cloud` package, so the name no longer exists at that import path.
fix
Ensure you have uninstalled `llama-cloud-services` and installed only `llama-cloud`. Review the `llama-cloud` documentation for the correct class names and import paths, as `ExtractAgentCreate` might have been renamed or moved within the new package structure. If using LlamaExtract, the main class is `LlamaExtract`.
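When a class seems to have moved, introspecting the installed module is often faster than guessing import paths. This hypothetical helper lists public names matching a fragment; the example below runs it against the stdlib `json` module, but with the new SDK installed you would run e.g. `matching_names("llama_cloud", "Extract")`:

```python
import importlib

def matching_names(module_name: str, fragment: str) -> list:
    """List public attributes of a module whose names contain `fragment`.

    Handy for locating a class that was renamed or moved after a refactor.
    """
    module = importlib.import_module(module_name)
    return sorted(
        name for name in dir(module)
        if fragment.lower() in name.lower() and not name.startswith("_")
    )

# Demonstrated on the stdlib so it runs anywhere:
print(matching_names("json", "decode"))
```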
error 401 Unauthorized on every request
cause This HTTP error indicates an issue with authentication, usually due to an invalid, expired, or incorrectly provided API key, or attempting to use an API key from one region (e.g., North America) with a different regional API endpoint (e.g., EU).
fix
Verify your API key is correct and active in your LlamaCloud dashboard. Ensure it is passed correctly to the client constructor (e.g., `LlamaParse(api_key="YOUR_API_KEY")`) or set as the `LLAMA_CLOUD_API_KEY` environment variable. If using a specific region, confirm the `base_url` parameter matches the region where your API key was generated (e.g., `base_url="https://api.cloud.eu.llamaindex.ai"` for EU).
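One way to avoid key/endpoint mismatches is to build the constructor kwargs in one place. A minimal sketch: the EU URL comes from the text above, while the default (North America) URL is an assumption — confirm both against your LlamaCloud dashboard.

```python
# Regional endpoints: the EU URL appears in the error guidance above; the
# default (North America) URL is an assumption -- verify it in your dashboard.
EU_BASE_URL = "https://api.cloud.eu.llamaindex.ai"
DEFAULT_BASE_URL = "https://api.cloud.llamaindex.ai"

def client_kwargs(api_key: str, region: str = "us") -> dict:
    """Keep the key and endpoint together so they cannot drift out of sync."""
    base_url = EU_BASE_URL if region.lower() == "eu" else DEFAULT_BASE_URL
    return {"api_key": api_key, "base_url": base_url}

# Usage sketch (requires the SDK and a real key):
# parser = LlamaParse(**client_kwargs(os.environ["LLAMA_CLOUD_API_KEY"], "eu"))
```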
error ERROR_DURING_PROCESSING: An unknown error occurred during processing.
cause This error from the LlamaParse service indicates a failure during document processing, which can be due to various reasons such as document complexity, large file size leading to timeouts, unsupported file formats, or internal service issues.
fix
Check the job status via the API or web UI for a more specific error code or message. For large or complex documents, consider using `target_pages` to parse only specific pages, choosing a faster tier if applicable, splitting the document, or confirming the file format is supported and the file is not corrupted. Review LlamaParse's troubleshooting guide for parsing-related error codes and solutions.
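Checking job status programmatically usually means polling with a timeout. The sketch below is service-agnostic: `get_status` is a stand-in for whatever status call your client exposes, and the state names are illustrative, not the service's exact enum.

```python
import time

def wait_for_job(get_status, timeout: float = 300.0, interval: float = 2.0) -> str:
    """Poll `get_status()` until it reports a terminal state or we time out.

    `get_status` is any zero-argument callable returning a status string;
    the terminal-state names below are illustrative placeholders.
    """
    deadline = time.monotonic() + timeout
    status = "UNKNOWN"
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("SUCCESS", "ERROR", "CANCELLED"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"job still {status!r} after {timeout}s")
```

A bounded wait like this turns a silent hang on a stuck job into an actionable `TimeoutError` you can log alongside the job ID.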
breaking The `llama-cloud-services` package is officially deprecated and will no longer be maintained after May 1, 2026. New features and bug fixes will only be released for the new `llama-cloud` package.
fix Migrate to the new `llama-cloud` Python package by installing it with `pip install "llama-cloud>=1.0"` (quote the specifier so the shell does not treat `>` as a redirect) and updating your imports and code accordingly. Refer to the official LlamaIndex documentation for migration guides.
gotcha When migrating to the new `llama-cloud` package, import paths and client initialization patterns have changed. Direct replacements of `LlamaParse`, `LlamaExtract`, and `LlamaCloudIndex` from `llama_cloud_services` will not work.
fix For the new `llama-cloud` package, imports are generally from `llama_cloud` (e.g., `from llama_cloud import LlamaCloud`). Services are accessed via the main client object (e.g., `client.parsing.create(...)`). Review the `llama-cloud` documentation for the updated API.
gotcha API keys are sensitive. Storing them directly in code is a security risk. The `api_key` parameter is mandatory for client initialization.
fix Always retrieve your API key from environment variables (e.g., `os.environ.get('LLAMA_CLOUD_API_KEY')`) or a secure secrets management system. The `python-dotenv` package can also be used for local development.
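A fail-fast sketch of that pattern: resolve the key from the environment up front and raise a clear error before any request is sent, rather than letting a missing key surface later as a 401.

```python
import os

def require_api_key(var_name: str = "LLAMA_CLOUD_API_KEY") -> str:
    """Fail fast with a clear message instead of issuing requests with no key."""
    key = os.environ.get(var_name)
    if not key:
        raise RuntimeError(
            f"{var_name} is not set; export it, or load it from a .env file "
            "(e.g. with python-dotenv) before constructing the client"
        )
    return key

# Usage sketch (requires the SDK):
# parser = LlamaParse(api_key=require_api_key())
```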
gotcha LlamaCloud services can be hosted in different regions (e.g., US and EU). The default base URL might not be suitable for all deployments.
fix If you are using LlamaCloud services in the EU, you must specify `base_url=EU_BASE_URL` during client initialization (e.g., `LlamaParse(api_key=..., base_url=EU_BASE_URL)`). Ensure your API key is also created in the correct region.
breaking The installed `llama-index-core` version uses type hints (`X | Y`) that are incompatible with Python 3.9. This syntax for union types was introduced in Python 3.10 (PEP 604).
fix Upgrade your Python environment to 3.10 or newer, or downgrade the `llama-index-core` package to a version that is compatible with Python 3.9 (e.g., `<0.14.0`).
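To illustrate the incompatibility: PEP 604's `X | Y` syntax in annotations raises `TypeError` at import time on Python 3.9, while the `typing.Union`/`typing.Optional` spelling works on both. `first_page` is a made-up function used only to show the two spellings side by side:

```python
from typing import Optional, Union

# Python 3.10+ only -- on 3.9 this definition raises TypeError at import time:
#   def first_page(pages: list | None) -> str | None: ...

# Python 3.9-compatible spelling of the same annotations (PEP 484):
def first_page(pages: Union[list, None]) -> Optional[str]:
    """Return the first page of a parsed document, or None if there is none."""
    return pages[0] if pages else None
```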
pip install "llama-cloud>=1.0"
| Python | OS / libc variant | Package | Status | Install | Import | Disk |
|--------|-------------------|---------|--------|---------|--------|------|
| 3.10 | alpine (musl) | llama-cloud-services | wheel | - | 7.06s | 242.1M |
| 3.10 | alpine (musl) | llama-cloud>=1.0 | wheel | - | - | 90.7M |
| 3.10 | alpine (musl) | llama-cloud-services | - | - | 7.26s | 242.3M |
| 3.10 | alpine (musl) | llama-cloud>=1.0 | - | - | - | - |
| 3.10 | slim (glibc) | llama-cloud-services | wheel | 21.1s | 5.30s | 239M |
| 3.10 | slim (glibc) | llama-cloud>=1.0 | wheel | 4.5s | - | 161M |
| 3.10 | slim (glibc) | llama-cloud-services | - | - | 4.80s | 239M |
| 3.10 | slim (glibc) | llama-cloud>=1.0 | - | - | - | - |
| 3.11 | alpine (musl) | llama-cloud-services | wheel | - | 7.82s | 268.7M |
| 3.11 | alpine (musl) | llama-cloud>=1.0 | wheel | - | - | 98.9M |
| 3.11 | alpine (musl) | llama-cloud-services | - | - | 8.47s | 268.9M |
| 3.11 | alpine (musl) | llama-cloud>=1.0 | - | - | - | - |
| 3.11 | slim (glibc) | llama-cloud-services | wheel | 19.0s | 7.21s | 265M |
| 3.11 | slim (glibc) | llama-cloud>=1.0 | wheel | 4.0s | - | 169M |
| 3.11 | slim (glibc) | llama-cloud-services | - | - | 6.73s | 265M |
| 3.11 | slim (glibc) | llama-cloud>=1.0 | - | - | - | - |
| 3.12 | alpine (musl) | llama-cloud-services | wheel | - | 6.87s | 257.7M |
| 3.12 | alpine (musl) | llama-cloud>=1.0 | wheel | - | - | 89.1M |
| 3.12 | alpine (musl) | llama-cloud-services | - | - | 8.24s | 257.9M |
| 3.12 | alpine (musl) | llama-cloud>=1.0 | - | - | - | - |
| 3.12 | slim (glibc) | llama-cloud-services | wheel | 15.9s | 7.03s | 254M |
| 3.12 | slim (glibc) | llama-cloud>=1.0 | wheel | 3.4s | - | 159M |
| 3.12 | slim (glibc) | llama-cloud-services | - | - | 7.76s | 254M |
| 3.12 | slim (glibc) | llama-cloud>=1.0 | - | - | - | - |
| 3.13 | alpine (musl) | llama-cloud-services | wheel | - | 6.64s | 256.8M |
| 3.13 | alpine (musl) | llama-cloud>=1.0 | wheel | - | - | 85.7M |
| 3.13 | alpine (musl) | llama-cloud-services | - | - | 7.33s | 256.9M |
| 3.13 | alpine (musl) | llama-cloud>=1.0 | - | - | - | - |
| 3.13 | slim (glibc) | llama-cloud-services | wheel | 17.0s | 6.70s | 253M |
| 3.13 | slim (glibc) | llama-cloud>=1.0 | wheel | 3.6s | - | 158M |
| 3.13 | slim (glibc) | llama-cloud-services | - | - | 7.21s | 253M |
| 3.13 | slim (glibc) | llama-cloud>=1.0 | - | - | - | - |
| 3.9 | alpine (musl) | llama-cloud-services | wheel | - | - | 246.5M |
| 3.9 | alpine (musl) | llama-cloud>=1.0 | wheel | - | - | 89.7M |
| 3.9 | alpine (musl) | llama-cloud-services | - | - | - | - |
| 3.9 | alpine (musl) | llama-cloud>=1.0 | - | - | - | - |
| 3.9 | slim (glibc) | llama-cloud-services | wheel | 24.8s | - | 247M |
| 3.9 | slim (glibc) | llama-cloud>=1.0 | wheel | 5.4s | - | 160M |
| 3.9 | slim (glibc) | llama-cloud-services | - | - | - | - |
| 3.9 | slim (glibc) | llama-cloud>=1.0 | - | - | - | - |

This quickstart demonstrates how to initialize the LlamaParse client using the `llama-cloud-services` package. It uses an API key, preferably from an environment variable. Note that to actually parse a document, you would first need to upload a file to LlamaCloud and use its `file_id`.

import os
from llama_cloud_services import LlamaParse

# Get your API key from LlamaCloud. Recommended to use environment variables.
LLAMA_CLOUD_API_KEY = os.environ.get('LLAMA_CLOUD_API_KEY', 'your_api_key_here')

if LLAMA_CLOUD_API_KEY == 'your_api_key_here':
    print("WARNING: Please set the LLAMA_CLOUD_API_KEY environment variable or replace 'your_api_key_here' with your actual key.")

# Initialize LlamaParse
parser = LlamaParse(api_key=LLAMA_CLOUD_API_KEY)

# Example usage (assuming you have a file_id from an uploaded document)
# This is a conceptual example as parsing a file requires prior upload.
# For a full workflow, refer to the LlamaCloud documentation.
print("LlamaParse client initialized. You can now use it to parse documents.")
# To parse a document, you would typically upload a file first and then use its ID:
# job = parser.create(tier="agentic", version="latest", file_id="your_uploaded_file_id")
# print(f"Parsing job created with ID: {job.id}")