LangChain Community Integrations

Version 0.4.1 | verified Tue May 12 | auth: no | python install: verified | quickstart: stale

LangChain Community provides a collection of third-party integrations for the LangChain ecosystem. These integrations implement base interfaces defined in LangChain Core, enabling connectivity to various LLM providers, document loaders, vector stores, and other tools within any LangChain application. It is actively maintained and currently at version 0.4.1.

pip install langchain-community
error ModuleNotFoundError: No module named 'langchain_community'
cause The `langchain-community` package is not installed in your Python environment or your virtual environment is not active.
fix
pip install langchain-community
error ModuleNotFoundError: No module named 'langchain.llms'
cause Components like LLMs, Chat Models, and Vector Stores were moved from the main `langchain` package to `langchain-community` or specific partner packages (e.g., `langchain-openai`) in LangChain v0.2.0+ due to a modularization effort.
fix
Change the import path to `from langchain_community.llms import OpenAI` (or `from langchain_community.chat_models import ChatOpenAI`, `from langchain_community.vectorstores import FAISS`, etc.), and ensure `langchain-community` (and any relevant partner package such as `langchain-openai`) is installed.
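As a small sketch (standard library only; the probed module names are the real package layouts), you can check which layout is installed before choosing an import path:

```python
# Sketch: pick the right module to import OpenAI LLM classes from across the
# pre- and post-v0.2.0 package layouts. Uses only the standard library.
import importlib.util

def resolve_openai_llm_module() -> str:
    """Return the preferred module to import OpenAI classes from."""
    if importlib.util.find_spec("langchain_openai"):
        return "langchain_openai"            # dedicated partner package (preferred)
    if importlib.util.find_spec("langchain_community"):
        return "langchain_community.llms"    # community package fallback
    return "missing: pip install langchain-openai or langchain-community"

print(resolve_openai_llm_module())
```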
error AttributeError: module 'langchain' has no attribute 'verbose'
cause This attribute or similar top-level configuration/utility functions have been removed or refactored in newer versions of LangChain (post-v0.2.0 modularization).
fix
Remove the usage of `langchain.verbose`, or switch to the current equivalent: the `set_verbose`/`get_verbose` functions in `langchain.globals` (backed by `langchain_core.globals`), or configure logging directly.
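A hedged sketch of the replacement: the removed `langchain.verbose` attribute corresponds to setter/getter functions in `langchain_core.globals` (also re-exported as `langchain.globals`). The snippet is guarded so it degrades gracefully when the package is absent:

```python
# Sketch: enable verbose mode the post-v0.2.0 way. Returns the resulting
# verbose flag, or False when langchain-core is not installed.
def enable_verbose() -> bool:
    try:
        from langchain_core.globals import set_verbose, get_verbose
    except ImportError:
        return False                 # langchain-core not installed
    set_verbose(True)                # was: langchain.verbose = True
    return get_verbose()

print(enable_verbose())
```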
error LangChainDeprecationWarning: Importing vector stores from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain_community.vectorstores instead:
cause You are using an old import path for a component (like vector stores, document loaders, or embeddings) that has been moved to the `langchain-community` package as part of the LangChain v0.2.0+ modularization.
fix
Update your import statement; for example, change `from langchain.vectorstores import FAISS` to `from langchain_community.vectorstores import FAISS`.
error ImportError: cannot import name 'Chroma' from 'langchain_community.vectorstores'
cause The Chroma vector store integration has been moved from `langchain_community.vectorstores` to its own dedicated `langchain-chroma` package.
fix
pip install langchain-chroma
from langchain_chroma import Chroma
breaking Minor versions of `langchain-community` (e.g., 0.3.x to 0.4.x) may introduce breaking changes. Unlike `langchain` and `langchain-core`, it does not strictly adhere to semantic versioning, due to the nature of community contributions and third-party integrations.
fix Always check the changelog or release notes before upgrading minor versions of `langchain-community`. Pin your `langchain-community` version to prevent unexpected breakage.
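One way to pin (a sketch; adjust the version to the release you have actually tested against):

```
# requirements.txt -- pin langchain-community to a known-good release
langchain-community==0.4.1
# or allow only patch updates within a tested minor line:
# langchain-community>=0.4.1,<0.5
```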
breaking Many third-party integrations (e.g., specific Document Loaders, Vector Stores, LLMs) were moved from the main `langchain` package to `langchain-community` or dedicated provider packages (e.g., `langchain-openai`, `langchain-anthropic`) starting with LangChain v0.2. Attempting to import from old paths will result in `ImportError`.
fix Update your import statements: `from langchain.module import Class` becomes `from langchain_community.module import Class` or `from langchain_provider.module import Class`. Ensure the necessary integration package (e.g., `langchain-community`, `langchain-openai`) is installed.
gotcha Using specific integrations within `langchain-community` often requires installing additional, sometimes optional, Python packages (e.g., `beautifulsoup4` for `WebBaseLoader`, `faiss-cpu` for `FAISS`, `openai` for `langchain-openai` classes). A `ModuleNotFoundError` for a seemingly related package means a sub-dependency is missing.
fix Refer to the specific integration's documentation to identify and install its required dependencies (e.g., `pip install langchain-community[webloaders]` or `pip install beautifulsoup4`). If using a dedicated provider package, install it (e.g., `pip install langchain-openai`).
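The dependency check can be automated with a small standard-library sketch; the (module, pip-package) pairs below are the real names for the integrations mentioned above:

```python
# Sketch: report which optional sub-dependencies of common langchain-community
# integrations are missing, with the pip command needed to install each.
import importlib.util

OPTIONAL_DEPS = {
    "bs4": "beautifulsoup4",   # WebBaseLoader
    "faiss": "faiss-cpu",      # FAISS vector store
    "openai": "openai",        # OpenAI-backed classes
}

def missing_optional_deps() -> dict:
    """Map missing importable-module name -> pip install hint."""
    return {
        mod: f"pip install {pkg}"
        for mod, pkg in OPTIONAL_DEPS.items()
        if importlib.util.find_spec(mod) is None
    }

print(missing_optional_deps())
```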
python  os / libc      status  wheel install  import  disk
3.9     alpine (musl)  wheel   -              1.78s   211.2M
3.9     alpine (musl)  -       -              1.71s   211.2M
3.9     slim (glibc)   wheel   19.9s          1.61s   218M
3.9     slim (glibc)   -       -              1.52s   218M
3.10    alpine (musl)  wheel   -              1.45s   206.3M
3.10    alpine (musl)  -       -              1.36s   205.7M
3.10    slim (glibc)   wheel   17.2s          1.08s   210M
3.10    slim (glibc)   -       -              1.00s   210M
3.11    alpine (musl)  wheel   -              1.68s   225.5M
3.11    alpine (musl)  -       -              1.81s   224.8M
3.11    slim (glibc)   wheel   14.8s          1.47s   229M
3.11    slim (glibc)   -       -              1.49s   228M
3.12    alpine (musl)  wheel   -              1.80s   211.3M
3.12    alpine (musl)  -       -              1.98s   210.6M
3.12    slim (glibc)   wheel   12.9s          1.80s   214M
3.12    slim (glibc)   -       -              1.90s   214M
3.13    alpine (musl)  wheel   -              1.77s   210.5M
3.13    alpine (musl)  -       -              1.78s   209.7M
3.13    slim (glibc)   wheel   13.1s          1.61s   213M
3.13    slim (glibc)   -       -              1.70s   213M

This quickstart demonstrates how to load documents using `TextLoader` from `langchain-community`, create a simple vector store with `FAISS` and `FakeEmbeddings` (both from `langchain-community`), and then use an OpenAI Chat Model (from `langchain-openai`) with a basic RAG (Retrieval Augmented Generation) chain. It highlights using components from `langchain-community` alongside a common LLM provider package.

import os
from langchain_community.document_loaders import TextLoader
from langchain_community.embeddings import FakeEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI # Requires 'pip install langchain-openai'

# Create a dummy text file for demonstration
with open("example.txt", "w") as f:
    f.write("LangChain is a framework for developing applications powered by large language models (LLMs).")
    f.write("\nIt enables applications that are context-aware and can reason over data.")

# Set your OpenAI API key in the environment before running, e.g.:
#   export OPENAI_API_KEY="sk-..."
os.environ.setdefault("OPENAI_API_KEY", "sk-YOUR_OPENAI_KEY_HERE")  # placeholder fallback

# 1. Load documents using a loader from langchain-community
loader = TextLoader("example.txt")
documents = loader.load()
print(f"Loaded {len(documents)} document(s).")

# 2. Create embeddings (using a fake one for simplicity in 'community' quickstart)
# For real use, install a provider package like `langchain-openai` and use its embeddings.
embeddings = FakeEmbeddings(size=256)  # 'size' (the embedding dimension) is a required argument

# 3. Create a vector store from documents and embeddings
vectorstore = FAISS.from_documents(documents, embeddings)
print("Vector store created.")

# 4. Expose the vector store as a retriever for the RAG chain
retriever = vectorstore.as_retriever()

# 5. Define a Chat Model (from a dedicated provider package, e.g., langchain-openai)
# Ensure OPENAI_API_KEY is set in your environment.
chat_model = ChatOpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

# 6. Create a prompt template
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are an AI assistant. Answer the question based ONLY on the provided context."),
    ("human", "Context: {context}\nQuestion: {question}")
])

# 7. Build a RAG chain. RunnablePassthrough forwards the raw input (the
# question string) into the prompt unchanged; the retriever fills {context}.
from langchain_core.runnables import RunnablePassthrough

chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | chat_model
    | StrOutputParser()
)

# 8. Invoke the chain
question = "What is LangChain's primary purpose?"
response = chain.invoke(question)
print(f"\nQuestion: {question}")
print(f"Answer: {response}")

# Clean up the dummy file
os.remove("example.txt")