{"id":52,"library":"ollama","title":"Ollama Python Library","description":"Official Python client for Ollama — the local LLM runtime. Wraps the Ollama REST API with native Python types. Requires Ollama to be installed and running locally (or pointed at https://ollama.com for cloud models). Not an inference provider — it's a runtime client.","status":"active","version":"0.6.1","language":"python","source_language":"en","source_url":"https://github.com/ollama/ollama-python","tags":["llm","local-inference","ollama","python","runtime-client","openai-compatible","self-hosted"],"install":[{"cmd":"pip install ollama","lang":"bash","label":"Official Ollama Python client"}],"dependencies":[{"reason":"The Python library is a client for the Ollama runtime. Ollama must be separately installed and running — pip install alone is not sufficient.","package":"ollama (the runtime)","optional":false}],"imports":[{"note":"messages must be a list of dicts with role and content keys. Model name must match an already-pulled model.","wrong":"import ollama\nollama.chat('gemma3', 'Hello')","symbol":"chat","correct":"from ollama import chat\nresponse = chat(model='gemma3', messages=[{'role': 'user', 'content': 'Hello'}])\nprint(response.message.content)"},{"note":"Use Client() for custom host, headers, or timeouts. Default host is http://localhost:11434.","wrong":"from ollama import Client\nclient = Client()\nclient.chat('gemma3', 'Hello')","symbol":"Client (remote/custom host)","correct":"from ollama import Client\nclient = Client(host='http://localhost:11434')\nresponse = client.chat(model='gemma3', messages=[{'role': 'user', 'content': 'Hello'}])"}],"quickstart":{"code":"from ollama import chat\n\nresponse = chat(\n    model='gemma3',\n    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}]\n)\nprint(response.message.content)","lang":"python","description":"Basic chat with a locally running model"},"warnings":[{"fix":"Install the Ollama runtime separately from https://ollama.com/download, then run 'ollama serve' (or the desktop app). Verify with: curl http://localhost:11434","message":"pip install ollama installs only the Python client — not the Ollama runtime. If the runtime is not installed and running, all calls raise ConnectionError immediately.","severity":"breaking","affected_versions":"all"},{"fix":"Run 'ollama pull <model>' before use, or catch ResponseError and call ollama.pull(model) programmatically.","message":"Models must be explicitly pulled before use. Calling chat() with a model that hasn't been pulled raises a ResponseError with status_code 404.","severity":"breaking","affected_versions":"all"},{"fix":"Use stream=False when tools are involved. Track https://github.com/ollama/ollama/issues/9084 for resolution.","message":"Tool calling with stream=True is incomplete. When tools are provided, Ollama buffers the full response and returns it as a single block even with stream=True set — true incremental streaming does not occur.","severity":"breaking","affected_versions":"all (as of 0.12.x)"},{"fix":"Use pip install ollama (no hyphen suffix). Official package is at pypi.org/project/ollama.","message":"pip install ollama-python installs a different, unofficial third-party package with a completely different API (ModelManagementAPI, GenerateAPI classes). It is not the official client.","severity":"gotcha","affected_versions":"all"},{"fix":"Pass host explicitly: Client(host=os.environ.get('OLLAMA_HOST', 'http://localhost:11434'))","message":"The Ollama Python library defaults to http://localhost:11434. To connect to a remote Ollama instance, you must pass host explicitly via Client(host='http://<remote>:11434'). The OLLAMA_HOST env var is respected by the runtime but NOT automatically picked up by the Python client.","severity":"gotcha","affected_versions":"all"},{"fix":"from openai import OpenAI; client = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')","message":"The OpenAI-compatible endpoint is at http://localhost:11434/v1 — but this is a separate REST interface, not the native ollama Python library. Using openai SDK against this endpoint requires api_key='ollama' (any non-empty string) as a placeholder.","severity":"gotcha","affected_versions":"all"},{"fix":"Run 'ollama signin' first. For direct cloud API access, set host='https://ollama.com' and pass Authorization header with OLLAMA_API_KEY.","message":"Cloud models (e.g. gpt-oss:120b-cloud, deepseek-v3.1:671b-cloud) require ollama signin and ollama pull before use. They are routed through ollama.com and require an API key when accessed via the cloud API endpoint directly.","severity":"gotcha","affected_versions":">=0.6.0"}],"env_vars":null,"last_verified":"2026-05-12T05:59:58.075Z","next_check":"2026-05-28T00:00:00.000Z","problems":[{"fix":"Run `pip install ollama` to install the official Python client.","cause":"The 'ollama' Python package has not been installed in your current environment or there is a misconfiguration in your Python path.","error":"ModuleNotFoundError: No module named 'ollama'"},{"fix":"Ensure the Ollama server is running locally by executing `ollama serve` in your terminal and that no firewall is blocking port 11434. If running in a container or on a different host, configure the `OLLAMA_HOST` environment variable or `host` parameter in the `ollama.Client`.","cause":"The Ollama server is not running, is not accessible at the default address (localhost:11434), or a firewall is blocking the connection.","error":"httpx.ConnectError: [Errno 111] Connection refused"},{"fix":"Pull the desired model using the Ollama CLI: `ollama pull <model_name>` (e.g., `ollama pull llama3`). You can list available models with `ollama list`.","cause":"The specified model has not been downloaded and pulled to your local Ollama instance, or the model name is incorrect.","error":"'model' not found"},{"fix":"Rename your Python script file from `ollama.py` to something else (e.g., `my_ollama_app.py`).","cause":"This error typically occurs due to a circular import, most commonly when your Python script file is named `ollama.py`, which conflicts with the installed `ollama` library.","error":"AttributeError: partially initialized module 'ollama' has no attribute 'chat'"}],"ecosystem":"pypi","meta_description":null,"install_score":100,"install_tag":"verified","quickstart_score":0,"quickstart_tag":"stale","pypi_latest":null,"install_checks":{"last_tested":"2026-05-12","tag":"verified","tag_description":"installs cleanly on critical runtimes, fast import, recently tested","results":[{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":0.78,"mem_mb":15.8,"disk_size":"31.8M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":0.55,"mem_mb":15.8,"disk_size":"31M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":1.12,"mem_mb":17.3,"disk_size":"34.8M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":0.9,"mem_mb":17.3,"disk_size":"34M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":1.17,"mem_mb":17.1,"disk_size":"26.3M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":1.13,"mem_mb":17.1,"disk_size":"26M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":0.81,"mem_mb":16.5,"disk_size":"25.9M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":0.82,"mem_mb":16.5,"disk_size":"26M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":0.7,"mem_mb":15.8,"disk_size":"31.1M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":0.65,"mem_mb":15.8,"disk_size":"31M"}]},"quickstart_checks":{"last_tested":"2026-05-12","tag":"stale","tag_description":"widespread failures or data too old to trust","results":[{"runtime":"python:3.10-alpine","exit_code":1},{"runtime":"python:3.10-slim","exit_code":1},{"runtime":"python:3.11-alpine","exit_code":1},{"runtime":"python:3.11-slim","exit_code":1},{"runtime":"python:3.12-alpine","exit_code":1},{"runtime":"python:3.12-slim","exit_code":1},{"runtime":"python:3.13-alpine","exit_code":1},{"runtime":"python:3.13-slim","exit_code":1},{"runtime":"python:3.9-alpine","exit_code":1},{"runtime":"python:3.9-slim","exit_code":1}]}}