{"id":8288,"library":"llama-index-embeddings-langchain","title":"LlamaIndex Embeddings Langchain Integration","description":"The `llama-index-embeddings-langchain` library provides an integration layer to use LangChain's embedding models within the LlamaIndex framework. It acts as a wrapper, allowing users to leverage the wide array of embedding models available in LangChain for LlamaIndex's indexing and retrieval functionalities. The current version is 0.5.0, and as part of the broader LlamaIndex ecosystem, it typically sees active development and frequent releases in conjunction with LlamaIndex core.","status":"active","version":"0.5.0","language":"en","source_language":"en","source_url":"https://github.com/run-llama/llama_index/tree/main/llama-index-integrations/embeddings/llama-index-embeddings-langchain","tags":["llama-index","langchain","embeddings","LLM","RAG","integration"],"install":[{"cmd":"pip install llama-index-embeddings-langchain","lang":"bash","label":"Install Package"},{"cmd":"pip install langchain-community","lang":"bash","label":"Install LangChain Community (for most models)"}],"dependencies":[{"reason":"Required for LlamaIndex's core embedding abstractions.","package":"llama-index-core","optional":false},{"reason":"Provides the base LangChain embedding models to be wrapped. Often `langchain-community` is specifically needed.","package":"langchain","optional":false},{"reason":"Many specific LangChain embedding models (e.g., HuggingFaceEmbeddings) are now located in `langchain-community`.","package":"langchain-community","optional":true}],"imports":[{"note":"The `LangchainEmbedding` wrapper is located in a specific submodule, not directly under `llama_index`.","wrong":"from llama_index import LangchainEmbedding","symbol":"LangchainEmbedding","correct":"from llama_index.embeddings.langchain import LangchainEmbedding"},{"note":"With recent LangChain refactorings, many common embedding models moved from `langchain.embeddings` to `langchain_community.embeddings`.","wrong":"from langchain.embeddings import HuggingFaceEmbeddings","symbol":"HuggingFaceEmbeddings","correct":"from langchain_community.embeddings import HuggingFaceEmbeddings"}],"quickstart":{"code":"from llama_index.embeddings.langchain import LangchainEmbedding\nfrom llama_index.core import Settings\nfrom langchain_community.embeddings import HuggingFaceEmbeddings\n\n# Ensure the necessary LangChain packages are installed\n# pip install langchain-community sentence-transformers\n\n# 1. Initialize a LangChain embedding model\n# Using a local model for demonstration, no API key needed\nlc_embed_model = HuggingFaceEmbeddings(model_name=\"sentence-transformers/all-MiniLM-L6-v2\")\n\n# 2. Wrap the LangChain embedding model with LlamaIndex's LangchainEmbedding wrapper\nembed_model = LangchainEmbedding(lc_embed_model)\n\n# 3. Set the global embedding model for LlamaIndex\nSettings.embed_model = embed_model\n\n# 4. Example usage: get an embedding\ntext = \"This is a test sentence for embedding.\"\nembedding = Settings.embed_model.get_text_embedding(text)\n\nprint(f\"Embedding length: {len(embedding)}\")\nprint(f\"First 10 dimensions of embedding: {embedding[:10]}\")\n\n# You can also use it directly without setting global settings\n# direct_embedding = embed_model.get_text_embedding(\"Another sentence.\")\n# print(f\"Direct embedding length: {len(direct_embedding)}\")","lang":"python","description":"This quickstart demonstrates how to integrate a LangChain embedding model, specifically a HuggingFace one, into LlamaIndex. It involves initializing the LangChain model, wrapping it with `LangchainEmbedding`, and then setting it as the global embedding model for LlamaIndex. This allows any LlamaIndex component (e.g., VectorStoreIndex) to use this embedding model."},"warnings":[{"fix":"Always consult the latest official LlamaIndex and LangChain documentation for correct import paths and recommended packages. Pin your library versions carefully in `requirements.txt`.","message":"Both LlamaIndex and LangChain are rapidly evolving libraries. Import paths and class locations within `langchain` (e.g., moving to `langchain-community`) have changed frequently, leading to `ImportError` or `ModuleNotFoundError` if versions are not compatible or imports are not updated.","severity":"breaking","affected_versions":"All versions, especially during major LangChain or LlamaIndex core updates."},{"fix":"Ensure that `langchain` and/or `langchain-community` are installed alongside `llama-index-embeddings-langchain`, as well as any specific dependencies for your chosen LangChain embedding model (e.g., `sentence-transformers` for `HuggingFaceEmbeddings`).","message":"The `llama-index-embeddings-langchain` package is merely a wrapper. You *must* install the underlying LangChain package that provides the actual embedding model you intend to use (e.g., `langchain-community` for `HuggingFaceEmbeddings`). Not installing this dependency will lead to runtime errors when the wrapped model is initialized.","severity":"gotcha","affected_versions":"All versions."},{"fix":"Check network connectivity. For `HuggingFaceEmbeddings`, you might need to pre-download the model or ensure your environment has access to Hugging Face Hub. Increasing verbosity or checking logs for download progress can also help.","message":"When using local HuggingFace embedding models via LangChain, ensure the model weights are downloaded correctly and that there are no network issues if it's the first time using a specific model. Indefinite loading times can indicate a problem with the model download.","severity":"gotcha","affected_versions":"All versions using local HuggingFace models."}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Change your import statement to `from llama_index.embeddings.langchain import LangchainEmbedding`.","cause":"The `LangchainEmbedding` class is not directly available from the top-level `llama_index` package. It resides within the `llama_index.embeddings.langchain` submodule.","error":"ImportError: cannot import name 'LangchainEmbedding' from 'llama_index'"},{"fix":"Install or update LangChain: `pip install langchain` or `pip install langchain-community`. If the issue persists, try pinning an older, known-compatible version of `langchain` (e.g., `langchain==0.0.153` as seen in older issues) and then gradually upgrade.","cause":"This error typically means the `langchain` package (or `langchain-community` for newer versions) is not installed or the installed version is incompatible with LlamaIndex's bridge.","error":"ModuleNotFoundError: No module named 'langchain.embeddings.base'"},{"fix":"This is often handled gracefully by the `LangchainEmbedding` wrapper, falling back to sync. If you strictly require async, ensure your specific LangChain embedding model explicitly supports the `aembed_query` and `aembed_documents` methods.","cause":"Some underlying LangChain embedding models might not have asynchronous embedding methods implemented, leading to this error when LlamaIndex attempts to use them asynchronously. The `LangchainEmbedding` wrapper handles this by falling back to synchronous calls, but you might see warnings.","error":"AttributeError: 'HuggingFaceEmbeddings' object has no attribute 'aembed_query'"}]}