DeepSeek API
DeepSeek has NO official Python SDK. The correct integration pattern is to use the openai package with base_url='https://api.deepseek.com'. The deepseek PyPI package (1.0.0) is a stub placeholder; do not use it. The DeepSeek API is fully OpenAI-compatible. Two primary models: deepseek-chat (V3.2, general purpose) and deepseek-reasoner (R1, chain-of-thought reasoning).
Warnings
- breaking No official DeepSeek Python SDK exists. The 'deepseek' PyPI package (1.0.0) is an empty stub with no functionality. LLMs hallucinate 'import deepseek' or 'from deepseek import Client' — neither works.
- breaking R1 reasoning model name is 'deepseek-reasoner' not 'deepseek-r1'. Wrong model name returns 404 or model not found error.
- gotcha deepseek-reasoner returns an extra reasoning_content field on the message object. Standard OpenAI response parsing misses the chain-of-thought output.
- gotcha base_url must be 'https://api.deepseek.com' — no trailing slash, no /v1 suffix needed. The /v1 path exists for OpenAI compatibility but has no relationship to model version.
- gotcha API key env var is DEEPSEEK_API_KEY not OPENAI_API_KEY. OpenAI SDK will silently use OPENAI_API_KEY if DEEPSEEK_API_KEY is not explicitly passed.
- gotcha deepseek-reasoner does not handle system prompts the same way as deepseek-chat. System prompts are accepted but may be converted to user messages internally, so behavior differs between the two models.
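The reasoning_content gotcha above can be handled with a small accessor that works for both models. This is a minimal sketch; split_reasoning is a hypothetical helper name, and it relies only on getattr with a None default, since deepseek-chat messages carry no reasoning_content attribute.

```python
def split_reasoning(message):
    """Return (reasoning, answer) from a chat completion message.

    deepseek-reasoner attaches a non-standard reasoning_content field to the
    message; deepseek-chat does not, so getattr with a None default covers both.
    """
    reasoning = getattr(message, 'reasoning_content', None)  # None for deepseek-chat
    answer = message.content
    return reasoning, answer
```

Usage: reasoning, answer = split_reasoning(response.choices[0].message) — callers can branch on reasoning being None instead of special-casing the model name.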
Install
pip install openai
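After installing, the API key can be resolved explicitly before constructing the client, to avoid the silent OPENAI_API_KEY fallback noted in the warnings. A minimal sketch; deepseek_api_key is a hypothetical helper name:

```python
import os

def deepseek_api_key() -> str:
    """Return DEEPSEEK_API_KEY, failing loudly instead of letting the
    OpenAI SDK silently fall back to OPENAI_API_KEY."""
    key = os.environ.get('DEEPSEEK_API_KEY')
    if not key:
        raise RuntimeError(
            'DEEPSEEK_API_KEY is not set; refusing to fall back to OPENAI_API_KEY'
        )
    return key
```

Pass the result explicitly: OpenAI(api_key=deepseek_api_key(), base_url='https://api.deepseek.com').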
Imports
- DeepSeek via OpenAI SDK
from openai import OpenAI
import os

client = OpenAI(
    api_key=os.environ['DEEPSEEK_API_KEY'],
    base_url='https://api.deepseek.com'
)

response = client.chat.completions.create(
    model='deepseek-chat',
    messages=[
        {'role': 'system', 'content': 'You are a helpful assistant'},
        {'role': 'user', 'content': 'Hello'}
    ],
    stream=False
)

print(response.choices[0].message.content)
- deepseek-reasoner (R1 thinking model)
from openai import OpenAI
import os

client = OpenAI(
    api_key=os.environ['DEEPSEEK_API_KEY'],
    base_url='https://api.deepseek.com'
)

response = client.chat.completions.create(
    model='deepseek-reasoner',  # R1 thinking model
    messages=[{'role': 'user', 'content': 'Solve: 2x + 5 = 13'}]
)

# Access the chain-of-thought separately from the final answer
reasoning = response.choices[0].message.reasoning_content
answer = response.choices[0].message.content
print(reasoning)
print(answer)
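With stream=True, the chain-of-thought is assumed to arrive on delta.reasoning_content, mirroring the non-streaming message shape; verify this field name against the official DeepSeek docs. collect_stream is a hypothetical accumulator, sketched here without a network call:

```python
def collect_stream(chunks):
    """Accumulate a streamed deepseek-reasoner response into two strings.

    Assumes each streaming delta carries either reasoning_content
    (chain-of-thought tokens) or content (final-answer tokens).
    """
    reasoning, answer = [], []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        r = getattr(delta, 'reasoning_content', None)
        if r:
            reasoning.append(r)
        c = getattr(delta, 'content', None)
        if c:
            answer.append(c)
    return ''.join(reasoning), ''.join(answer)
```

Usage: pass the iterator returned by client.chat.completions.create(model='deepseek-reasoner', messages=[...], stream=True) directly to collect_stream.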
Quickstart
# pip install openai
from openai import OpenAI
import os
client = OpenAI(
    api_key=os.environ['DEEPSEEK_API_KEY'],
    base_url='https://api.deepseek.com'
)
response = client.chat.completions.create(
    model='deepseek-chat',
    messages=[
        {'role': 'system', 'content': 'You are a helpful assistant'},
        {'role': 'user', 'content': 'What is the capital of France?'}
    ]
)
print(response.choices[0].message.content)
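For multi-turn conversations, only the final answer should go back into the message history; per DeepSeek's reasoning-model guidance, reasoning_content from deepseek-reasoner must not be included in subsequent requests. append_assistant_turn is a hypothetical helper, sketched under that assumption:

```python
def append_assistant_turn(messages, message):
    """Append an assistant reply to the running history for a follow-up call.

    Only message.content is echoed back; any reasoning_content on a
    deepseek-reasoner message is deliberately dropped from the context.
    """
    messages.append({'role': 'assistant', 'content': message.content})
    return messages
```

Usage: append_assistant_turn(history, response.choices[0].message), then append the next {'role': 'user', ...} turn and call client.chat.completions.create again with the same history list.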