LangChain Core Library
Version 1.2.23 · verified Tue May 12 · auth: no · install: stale · quickstart: stale
LangChain Core is a Python library designed to facilitate the development of applications using Large Language Models (LLMs) through composability. The current version is 1.2.23, and it follows a regular release cadence with frequent updates and improvements.
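Composability here means chaining processing steps with a pipe-style operator and invoking the result as one unit. A toy stand-in (illustration only, not `langchain_core`'s actual API) shows the idea:

```python
# Toy stand-in for pipe-style composition (not the real LangChain classes).
class Step:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # left | right: feed left's output into right
        return Step(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

chain = Step(str.strip) | Step(str.upper)
print(chain.invoke("  hello "))  # HELLO
```

The real library builds the same pattern around its Runnable abstractions; the sketch only conveys why chains are declared with `|` and executed with `invoke`.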
pip install langchain-core

Common errors
error: ModuleNotFoundError: No module named 'langchain_community'
cause: The LangChain library has been modularized. Components that were previously part of the main `langchain` package are now distributed in separate packages such as `langchain-community` and `langchain-openai`, and must be installed explicitly.
fix: Install the missing package with pip. For `langchain_community`, run `pip install langchain-community` (and similarly for other integration packages such as `langchain-openai`).

error: AttributeError: module 'langchain' has no attribute 'debug'
cause: The global `debug` setting, previously accessible as `langchain.debug`, has been deprecated or moved. LangChain now encourages `set_debug(True)` or LangSmith for tracing and debugging.
fix: Instead of `langchain.debug = True`, use `from langchain_core.globals import set_debug; set_debug(True)`, or enable tracing with LangSmith by setting the relevant environment variables (`LANGCHAIN_TRACING_V2=true`, `LANGCHAIN_API_KEY`, `LANGCHAIN_PROJECT`).

error: AttributeError: 'tuple' object has no attribute 'invoke'
cause: This error commonly occurs when an LLM instance (e.g., `ChatOpenAI`) is accidentally wrapped in a single-element tuple by a trailing comma in its declaration, so direct method calls like `invoke()` fail.
fix: Remove the trailing comma when initializing the LLM instance so that it is an object, not a tuple. For example, change `llm = ChatOpenAI(model='gpt-4o'),` to `llm = ChatOpenAI(model='gpt-4o')`.

error: UnprocessableEntityError: Failed to deserialize the JSON body into the target type: prompt: invalid type: sequence, expected a string at line 1 column 3
cause: This error typically arises when an LLM or chain expects a single string as the prompt input but receives a list of message objects (a 'sequence'), or vice versa. It often comes from mixing a classic `OpenAI` LLM (string in) with a `ChatOpenAI` model (list of messages in).
fix: Ensure the prompt matches the input type the model expects. For a chat model such as `ChatOpenAI`, pass a list of messages (e.g., `[HumanMessage(content='your prompt')]`); for an older `OpenAI` text-completion model, pass a plain string.

Warnings
breaking: Deprecation of Pydantic 1.x support
fix: Upgrade to Pydantic 2.x to maintain compatibility.

gotcha: Import paths have changed in recent versions
fix: Update import statements to reflect the new module structure.
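The trailing-comma error listed under Common errors is plain Python rather than anything LangChain-specific; a stand-in class (hypothetical, standing in for `ChatOpenAI`) reproduces it without any installs:

```python
# Stand-in for a chat model; only the .invoke() method matters for this demo.
class FakeChatModel:
    def invoke(self, prompt):
        return f"echo: {prompt}"

llm = FakeChatModel(),          # trailing comma: llm is a 1-element tuple
print(type(llm).__name__)       # tuple
try:
    llm.invoke("hi")
except AttributeError as e:
    print(e)                    # 'tuple' object has no attribute 'invoke'

llm = FakeChatModel()           # no comma: a real instance
print(llm.invoke("hi"))         # echo: hi
```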
Install compatibility (stale; last tested: 2026-05-12)
python  os / libc      wheel  install  import  disk
3.9     alpine (musl)  -      -        -       -
3.9     slim (glibc)   -      -        -       -
3.10    alpine (musl)  -      -        -       -
3.10    slim (glibc)   -      -        -       -
3.11    alpine (musl)  -      -        -       -
3.11    slim (glibc)   -      -        -       -
3.12    alpine (musl)  -      -        -       -
3.12    slim (glibc)   -      -        -       -
3.13    alpine (musl)  -      -        -       -
3.13    slim (glibc)   -      -        -       -
Imports
- SomeClass
from langchain_core import SomeClass
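Because of the modularization split described under Common errors, a missing import usually maps to a specific pip package. A small guard can surface the right install command; `import_or_hint` is a hypothetical helper, not part of LangChain:

```python
import importlib

def import_or_hint(module_name, pip_name):
    """Import a modularized LangChain package, or return the pip command that installs it."""
    try:
        importlib.import_module(module_name)
        return "ok"
    except ModuleNotFoundError:
        return f"pip install {pip_name}"

# e.g. import_or_hint("langchain_community", "langchain-community")
print(import_or_hint("os", "os"))  # ok (stdlib module, always importable)
```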
Quickstart (stale; last tested: 2026-04-23)
import os

from langchain_core import SomeClass  # placeholder name; substitute the class you need

# Initialize the class with parameters drawn from the environment
some_instance = SomeClass(param1=os.environ.get('PARAM1'), param2=os.environ.get('PARAM2'))

# Use the instance to perform tasks
result = some_instance.perform_task()
print(result)
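The `invalid type: sequence, expected a string` error from the Common errors section is a type mismatch at the prompt boundary. A stand-in completion function (hypothetical, not the real OpenAI client) shows the failure mode:

```python
# Stand-in for a classic string-in completion endpoint (not the real client).
def complete(prompt):
    if not isinstance(prompt, str):
        # Mirrors the server-side deserialization failure for a list of messages
        raise TypeError("invalid type: sequence, expected a string")
    return prompt.upper()

print(complete("hello"))  # HELLO
try:
    complete([{"role": "user", "content": "hello"}])  # chat-style payload: wrong shape here
except TypeError as e:
    print(e)  # invalid type: sequence, expected a string
```

The fix is the same as in the table above: match the payload shape to the model type, a plain string for completion models, a list of messages for chat models.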