{"id":7376,"library":"llama-index-llms-openai-like","title":"LlamaIndex OpenAI-Like LLM Integration","description":"This package provides an integration for LlamaIndex to use OpenAI-compatible Large Language Models (LLMs). It acts as a thin wrapper, allowing LlamaIndex applications to interact with any API that mimics the OpenAI API, making it flexible for various third-party LLM providers. The current version is 0.7.1, with releases typically tied to the broader LlamaIndex ecosystem updates.","status":"active","version":"0.7.1","language":"en","source_language":"en","source_url":"https://github.com/run-llama/llama_index/tree/main/llama-index-integrations/llms/llama-index-llms-openai-like","tags":["LlamaIndex","LLM","OpenAI-compatible","integration","AI","Python"],"install":[{"cmd":"pip install llama-index-llms-openai-like","lang":"bash","label":"Install with pip"}],"dependencies":[{"reason":"Provides core LlamaIndex abstractions and components.","package":"llama-index-core","optional":false},{"reason":"Often a peer dependency or implicitly required by other LlamaIndex components when working with OpenAI-style models.","package":"llama-index-llms-openai","optional":false},{"reason":"Optional dependency, primarily for tokenizer functionality if not explicitly provided or if certain LlamaIndex features that rely on it are used.","package":"transformers","optional":true}],"imports":[{"symbol":"OpenAILike","correct":"from llama_index.llms.openai_like import OpenAILike"}],"quickstart":{"code":"import os\nfrom llama_index.llms.openai_like import OpenAILike\n\n# Replace with your actual model name and API base URL\n# If your API doesn't require an API key, set it to a dummy string like 'fake'\n# Set is_chat_model and context_window to match your model's capabilities\nllm = OpenAILike(\n    model=\"my-openai-compatible-model\",\n    api_base=os.environ.get(\"OPENAI_COMPATIBLE_API_BASE\", \"http://localhost:8000/v1\"),\n    
api_key=os.environ.get(\"OPENAI_COMPATIBLE_API_KEY\", \"fake-api-key\"),\n    is_chat_model=True,\n    context_window=4096 # Adjust based on your model's context window\n)\n\nresponse = llm.complete(\"Tell me a short story about a brave knight.\")\nprint(response.text)","lang":"python","description":"Demonstrates how to initialize and use the `OpenAILike` LLM to complete a prompt with an OpenAI-compatible API endpoint. Environment variables are used for API configuration, with fallbacks for local testing."},"warnings":[{"fix":"Upgrade other LlamaIndex packages to their explicit integration versions (e.g., `llama-index-core`, `llama-index-embeddings-openai`). Review LlamaIndex migration guides for detailed steps.","message":"LlamaIndex v0.10+ introduced a significant packaging refactor. While `llama-index-llms-openai-like` maintains its direct import path, ensure your other LlamaIndex components (core, embeddings, etc.) are updated to their respective integration packages or use `llama-index-core` to avoid conflicts.","severity":"breaking","affected_versions":">=0.10.0 (LlamaIndex ecosystem)"},{"fix":"Always set `api_base` to the exact endpoint URL of your OpenAI-compatible service and provide an `api_key` string. Consult your LLM provider's documentation for correct values.","message":"`api_key` and `api_base` parameters for `OpenAILike` are crucial. `api_base` must point to the correct OpenAI-compatible API endpoint, and `api_key` must be provided, even if it's a dummy string (e.g., 'fake') for APIs that don't require authentication. 
Incorrect values will lead to connection or authentication errors.","severity":"gotcha","affected_versions":"All"},{"fix":"Configure `context_window`, `is_chat_model`, and `is_function_calling_model` to accurately reflect the capabilities and requirements of your specific OpenAI-compatible model for optimal performance and correct API calls.","message":"Parameters like `context_window`, `is_chat_model`, and `is_function_calling_model` directly influence how `OpenAILike` interacts with the underlying LLM. Incorrect settings (e.g., `is_chat_model=False` for a chat-only API) can cause unexpected behavior or errors.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Run `pip install llama-index-llms-openai-like` to install the package.","cause":"The `llama-index-llms-openai-like` package is not installed in the current Python environment.","error":"ModuleNotFoundError: No module named 'llama_index.llms.openai_like'"},{"fix":"Verify that `api_base` is set to the *full* endpoint URL, including any necessary API version paths (e.g., `http://localhost:11434/v1`). Ensure `is_chat_model` is correctly set (`True` for chat models, `False` for completion models) and that the `model` name is recognized by your local server. Compare your Python request with a successful `curl` command if available.","cause":"The `api_base` URL or the request format (headers, body, endpoint path) configured in `OpenAILike` does not correctly match the local OpenAI-compatible server's expectations. This often happens with local models (e.g., Ollama) if the `/v1` or `/v1/chat/completions` path isn't correctly appended or if `is_chat_model` is misconfigured.","error":"NotFoundError: 404 page not found (when using OpenAILike with local LLMs like Ollama)"},{"fix":"If your project indirectly depends on `llama-index-llms-openai`, install it with `pip install llama-index-llms-openai`. 
If you rely solely on the `openai-like` integration, confirm that no lingering imports or configuration still reference the OpenAI-specific package.","cause":"Another LlamaIndex integration or an older LlamaIndex version may implicitly import `llama_index.llms.openai` when only `llama-index-llms-openai-like` is installed, or `llama-index-llms-openai` may be a required dependency that is missing.","error":"ModuleNotFoundError: No module named 'llama_index.llms.openai'"}]}
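The quickstart's environment-variable fallback pattern can be factored into a small helper. A minimal sketch using only the standard library; the helper name `resolve_openai_like_config` and the local default URL are illustrative assumptions, while the environment variable names follow the quickstart:

```python
import os

# Hypothetical helper mirroring the quickstart's configuration pattern:
# read the OpenAI-compatible endpoint and key from the environment,
# falling back to local-testing defaults when they are unset.
def resolve_openai_like_config():
    api_base = os.environ.get("OPENAI_COMPATIBLE_API_BASE", "http://localhost:8000/v1")
    # Many OpenAI-compatible servers ignore the key, but OpenAILike
    # still expects a string, so a dummy value is used as the fallback.
    api_key = os.environ.get("OPENAI_COMPATIBLE_API_KEY", "fake-api-key")
    return {"api_base": api_base, "api_key": api_key}

config = resolve_openai_like_config()
print(config["api_base"])
```

The resulting dict can then be unpacked into the constructor, e.g. `OpenAILike(model="my-openai-compatible-model", is_chat_model=True, **config)`, keeping endpoint details out of application code.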