{"id":2575,"library":"llama-index-cli","title":"llama-index-cli","description":"llama-index-cli was a command-line interface tool for LlamaIndex, primarily designed to facilitate Retrieval-Augmented Generation (RAG) with local files without writing Python code. It allowed users to ingest documents and perform Q&A or chat within the terminal. The package's final release is version 0.5.7; it was officially deprecated and is no longer maintained as of April 2, 2026.","status":"deprecated","version":"0.5.7","language":"en","source_language":"en","source_url":"https://github.com/run-llama/llama_index/tree/main/llama-index-cli","tags":["LLM","RAG","CLI","LlamaIndex","deprecated","AI"],"install":[{"cmd":"pip install llama-index-cli","lang":"bash","label":"Latest Version (Deprecated)"}],"dependencies":[{"reason":"Core LlamaIndex functionality that the CLI operates upon.","package":"llama-index","optional":false},{"reason":"Default vector database used by the RAG CLI tool.","package":"chromadb","optional":true},{"reason":"Required to parse PDF files during ingestion.","package":"pypdf","optional":true}],"imports":[{"note":"While this was the programmatic entry point for customizing the RAG CLI, the `llama-index-cli` package is deprecated. Direct usage is discouraged in favor of the main `llama-index` library or its modern CLI tool, `create-llama`.","symbol":"RagCLI","correct":"from llama_index.cli.rag_cli import RagCLI"}],"quickstart":{"code":"import os\nimport subprocess\n\n# NOTE: llama-index-cli is deprecated. Use the main llama-index library\n# or the 'create-llama' CLI for new projects.\n\n# 1. 
Set your OpenAI API key (required by default for embeddings/LLM)\n# Replace 'YOUR_OPENAI_API_KEY' with your actual key or set it as an env var.\nos.environ['OPENAI_API_KEY'] = os.environ.get('OPENAI_API_KEY', 'YOUR_OPENAI_API_KEY')\n\n# Create a dummy file for demonstration\nwith open('example.txt', 'w') as f:\n    f.write('LlamaIndex is a data framework for LLM applications. It helps connect custom data sources with LLMs.')\n\nprint(\"Ingesting 'example.txt' and asking a question via llamaindex-cli...\")\n\n# 2. Ingest a file and ask a question\ntry:\n    result = subprocess.run(\n        ['llamaindex-cli', 'rag', '--files', 'example.txt', '--question', 'What is LlamaIndex?'],\n        capture_output=True, text=True, check=True\n    )\n    print(\"--- CLI Output ---\")\n    print(result.stdout)\n    if result.stderr:\n        print(\"--- CLI Errors (if any) ---\")\n        print(result.stderr)\nexcept FileNotFoundError:\n    print(\"Error: 'llamaindex-cli' command not found. Ensure the package is installed and in your PATH.\")\nexcept subprocess.CalledProcessError as e:\n    print(f\"Error running command: {e}\")\n    print(f\"Stdout: {e.stdout}\")\n    print(f\"Stderr: {e.stderr}\")\n\n# Clean up the dummy file\nos.remove('example.txt')\n","lang":"python","description":"This quickstart demonstrates how to use the `llamaindex-cli` tool to ingest a local file and query it using Retrieval-Augmented Generation (RAG). It requires an OpenAI API key by default. Note that `llama-index-cli` is deprecated; for new projects, consider using `create-llama` or directly integrating with the `llama-index` Python library."},"warnings":[{"fix":"Migrate to `create-llama` (for project scaffolding) or the main `llama-index` library for programmatic RAG workflows.","message":"The `llama-index-cli` package is officially deprecated and no longer maintained. It will not receive further updates or bug fixes. 
Users are advised to migrate to the `create-llama` CLI tool for scaffolding LlamaIndex projects, or to interact directly with the core `llama-index` Python library.","severity":"deprecated","affected_versions":"0.5.7 and older"},{"fix":"Set the `OPENAI_API_KEY` environment variable or customize the CLI tool to use local models or other providers. Check the LlamaIndex documentation for customization options.","message":"By default, `llama-index-cli` uses OpenAI for embeddings and LLM, which requires setting the `OPENAI_API_KEY` environment variable. This means that, by default, any local data ingested with this tool will be sent to OpenAI's API.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Refer to the LlamaIndex v0.10 Migration Guide for changes in `llama-index` core. The `llama-index-cli` package included an `upgrade-file` command to assist with these migrations, though its usefulness is now limited by the CLI's deprecation.","message":"Upgrading the underlying `llama-index` library to version 0.10.0 or later introduced significant breaking changes, including a massive packaging refactor, deprecation of `ServiceContext`, and changed import paths. While `llama-index-cli` itself is deprecated, users may encounter these issues if their `llama-index` dependency is updated.","severity":"breaking","affected_versions":"llama-index >= 0.10.0 (dependency of llama-index-cli)"},{"fix":"Consider installing `onnxruntime` separately with `SYSTEM_VERSION_COMPAT=0 pip install --no-cache-dir 'onnxruntime>=1.17.1'` before installing `llama-index-cli` and `chromadb`, or use a fresh virtual environment.","message":"Users installing `llama-index-cli` alongside `chromadb` (a default dependency for RAG) may encounter dependency conflicts, particularly with `onnxruntime` in certain environments (e.g., Conda). 
This can lead to installation failures.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-10T00:00:00.000Z","next_check":"2026-07-09T00:00:00.000Z"}