Ollama Python Library

0.6.1 · active · verified Sat Feb 28

Official Python client for Ollama, the local LLM runtime. It wraps the Ollama REST API with native Python types. Requires Ollama to be installed and running locally (or pointed at https://ollama.com for cloud models). It is not an inference provider itself — it is a client for the runtime.

Warnings

Install

pip install ollama

Imports

from ollama import chat, Client, AsyncClient

Quickstart

Basic chat with a locally running model

from ollama import chat

response = chat(
    model='gemma3',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}]
)
print(response.message.content)
