LlamaIndex LangChain LLMs Integration
The `llama-index-llms-langchain` package bridges LangChain and LlamaIndex: it wraps a LangChain Large Language Model (LLM) instance so that it conforms to LlamaIndex's own `LLM` interface and can be used anywhere LlamaIndex expects one. At the time of writing the current version is `0.8.0`, with updates tracking LlamaIndex core releases.
Common errors
- `ModuleNotFoundError: No module named 'llama_index.llms.langchain'`
  - Cause: the `llama-index-llms-langchain` package is not installed, or the import path is incorrect.
  - Fix: install the package with `pip install llama-index-llms-langchain`. If it is installed, verify the import path is `from llama_index.llms.langchain import LangChainLLM`.
- `ImportError: cannot import name 'LangChainLLM' from 'llama_index.integrations.llms.langchain'`
  - Cause: an outdated import path is being used with LlamaIndex v0.10 or later.
  - Fix: update the import to `from llama_index.llms.langchain import LangChainLLM`.
- `AttributeError: 'ChatOpenAI' object has no attribute 'complete'` (or `'chat'`)
  - Cause: a raw LangChain LLM object is being used where LlamaIndex expects its own `LLM` interface; the `LangChainLLM` wrapper was not applied.
  - Fix: wrap the LangChain LLM instance with `LangChainLLM` before passing it to LlamaIndex components, e.g. `lc_llm = ChatOpenAI(...); llm = LangChainLLM(llm=lc_llm)`.
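The last error above is an interface mismatch, and the wrapper resolves it with a plain adapter pattern. A toy sketch of that pattern (hypothetical classes, not the real libraries) shows why calling `complete` on the raw object fails while calling it on the wrapper succeeds:

```python
# Hypothetical stand-ins for illustration only: a LangChain-style chat model
# exposes invoke(), while LlamaIndex components call complete()/chat().
class FakeLangChainChatModel:
    def invoke(self, prompt: str) -> str:  # LangChain-style entry point
        return f"echo: {prompt}"


class FakeLangChainLLMAdapter:
    """Adapter translating a LlamaIndex-style call into a LangChain-style one."""

    def __init__(self, llm: FakeLangChainChatModel):
        self.llm = llm

    def complete(self, prompt: str) -> str:  # LlamaIndex-style entry point
        return self.llm.invoke(prompt)


raw = FakeLangChainChatModel()
wrapped = FakeLangChainLLMAdapter(raw)
print(hasattr(raw, "complete"))       # the raw model lacks complete() -> AttributeError if called
print(wrapped.complete("hi"))         # the adapter forwards the call
```

This is the shape of the real fix: never hand LlamaIndex the raw `ChatOpenAI` object, always the `LangChainLLM`-wrapped one.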
Warnings
- Breaking: LlamaIndex v0.10+ introduced a significant package restructuring. Integration packages like `llama-index-llms-langchain` moved from the `llama_index.integrations` namespace to dedicated namespaces such as `llama_index.llms` and `llama_index.embeddings`.
- Gotcha: this package (`llama-index-llms-langchain`) only provides the wrapper. You still need to install the specific LangChain provider packages (e.g., `langchain-openai`, `langchain-anthropic`, `langchain-google-genai`) for the LLMs you intend to use.
- Gotcha: ensure compatible versions of `llama-index-core` and `langchain` (and its sub-packages such as `langchain-core` and `langchain-openai`). Version mismatches can lead to `ImportError` or `AttributeError`.
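When debugging a suspected version mismatch, it helps to dump the installed versions of every package in the chain before anything else. A minimal stdlib-only sketch:

```python
# Print installed versions of the packages that commonly drift out of sync.
# Uses only the standard library (importlib.metadata).
from importlib.metadata import version, PackageNotFoundError


def installed_version(package: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return version(package)
    except PackageNotFoundError:
        return "not installed"


for pkg in ("llama-index-core", "llama-index-llms-langchain",
            "langchain", "langchain-core", "langchain-openai"):
    print(f"{pkg}: {installed_version(pkg)}")
```

Including this output in bug reports makes `ImportError`/`AttributeError` issues much faster to triage.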
Install
- Base: `pip install llama-index-llms-langchain llama-index-core`
- With the OpenAI provider: `pip install llama-index-llms-langchain llama-index-core langchain-openai`
Imports
- `LangChainLLM`
  - Current (v0.10+): `from llama_index.llms.langchain import LangChainLLM`
  - Deprecated (pre-v0.10): `from llama_index.integrations.llms.langchain import LangChainLLM`
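For code that must load under both the current and the pre-v0.10 layouts, a defensive import fallback is one option. A sketch (the `import_path` variable is illustrative, not part of either library):

```python
# Try the current (v0.10+) import path first, then the legacy one.
try:
    from llama_index.llms.langchain import LangChainLLM  # current path
    import_path = "new"
except ImportError:
    try:
        from llama_index.integrations.llms.langchain import LangChainLLM  # legacy path
        import_path = "old"
    except ImportError:
        LangChainLLM = None  # neither layout is installed
        import_path = "unavailable"

print(f"LangChainLLM import path: {import_path}")
```

If `import_path` ends up `"unavailable"`, install the package as shown in the Install section.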
Quickstart
```python
import os

from langchain_openai import ChatOpenAI
from llama_index.core import Settings
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.langchain import LangChainLLM

# 1. Initialize a LangChain LLM instance.
# Requires 'langchain-openai' (pip install langchain-openai) and the
# OPENAI_API_KEY environment variable. os.environ.get lets the script load
# in environments without the key (API calls will still fail without one).
lc_llm = ChatOpenAI(
    temperature=0.0,
    model="gpt-3.5-turbo",
    api_key=os.environ.get("OPENAI_API_KEY", "test_key"),
)

# 2. Wrap the LangChain LLM with LlamaIndex's LangChainLLM wrapper.
llm = LangChainLLM(llm=lc_llm)

# 3. Use the wrapped LLM with LlamaIndex: set it globally via Settings,
# or pass it directly to individual components.
Settings.llm = llm

# Example: generate a completion.
response = Settings.llm.complete("Tell me a short story about a magical cat.")
print(response.text)

# Example: generate a chat response.
chat_response = Settings.llm.chat([
    ChatMessage(role=MessageRole.USER, content="What is the capital of France?")
])
print(chat_response.message.content)
```