DeepSeek Integration for LangChain

1.0.1 · active · verified Mon Apr 13

langchain-deepseek is an official LangChain partner package that integrates DeepSeek's large language models (LLMs) into the LangChain ecosystem. It exposes DeepSeek models through the standard LangChain chat-model interface, so developers can use them for chat completions inside chains, agents, and other LangChain applications. The current version is 1.0.1, and the package follows the rapid release cadence typical of LangChain partner libraries, with versions often aligned to `langchain-core` updates.

Warnings

Install
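The package is published on PyPI under the name `langchain-deepseek` and pulls in `langchain-core` as a dependency:

```shell
# Install (or upgrade to) the latest release of the integration package
pip install -U langchain-deepseek
```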

Imports
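The primary entry point is the `ChatDeepSeek` chat model class, imported from the package's top level:

```python
# Chat model integration exported by langchain-deepseek
from langchain_deepseek import ChatDeepSeek
```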

Quickstart

This quickstart demonstrates how to initialize the `ChatDeepSeek` model and invoke a simple chat completion. It assumes the `DEEPSEEK_API_KEY` environment variable is set, which is the recommended secure practice; the key can also be passed directly to the constructor via the `api_key` parameter. The model accepts a list of `langchain_core` message objects, such as `HumanMessage`, and returns an `AIMessage` response.

import os
from langchain_deepseek import ChatDeepSeek
from langchain_core.messages import HumanMessage

# Ensure your DeepSeek API key is set as an environment variable
# or passed directly to the ChatDeepSeek constructor.
# os.environ["DEEPSEEK_API_KEY"] = "your-deepseek-api-key"

deepseek_api_key = os.environ.get("DEEPSEEK_API_KEY")

if not deepseek_api_key:
    raise ValueError("DEEPSEEK_API_KEY environment variable not set.")

# "model" is required; "deepseek-chat" is the general-purpose chat model.
chat_model = ChatDeepSeek(model="deepseek-chat", api_key=deepseek_api_key)

messages = [
    HumanMessage(content="Tell me a short story about a brave knight.")
]

response = chat_model.invoke(messages)
print(response.content)
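Because `ChatDeepSeek` implements the standard LangChain chat-model interface, it also supports streaming via `stream()`, which yields message chunks as tokens arrive instead of waiting for the full completion. A minimal sketch, assuming the same `DEEPSEEK_API_KEY` setup as above:

```python
import os

from langchain_deepseek import ChatDeepSeek

# Reuses the environment-variable setup from the quickstart above.
chat_model = ChatDeepSeek(
    model="deepseek-chat",
    api_key=os.environ["DEEPSEEK_API_KEY"],
)

# stream() yields AIMessageChunk objects; printing each chunk's content
# as it arrives produces incremental, token-by-token output.
for chunk in chat_model.stream("Tell me a short story about a brave knight."):
    print(chunk.content, end="", flush=True)
```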
