PastaWater Agent
A CLI coding assistant powered by Ollama on your own GPUs. It uses local LLMs for code generation, chat, and agentic workflows. Current version: 1.50.40. Distributed via PyPI.
pip install pw-agent

Common errors
error: ModuleNotFoundError: No module named 'pw_agent'
cause: The PyPI package name is pw-agent (hyphen), but the import name is pw_agent (underscore).
fix: Install with pip install pw-agent and import with from pw_agent import PastaWaterAgent.
error: OllamaError: model 'codellama' not found
cause: The requested model has not been pulled into the local Ollama instance.
fix: Run ollama pull codellama in a terminal before using the agent.
error: AttributeError: module 'pw_agent' has no attribute 'PastaWaterAgent'
cause: The class is imported incorrectly (e.g., import pw_agent followed by pw_agent.PastaWaterAgent) or an outdated version is installed.
fix: Import the class directly with from pw_agent import PastaWaterAgent; if the error persists, upgrade with pip install --upgrade pw-agent.
Warnings
breaking: In version 1.40.0 the constructor keyword model_name was renamed to model. Old code such as PastaWaterAgent(model_name='codellama') breaks.
fix: Use PastaWaterAgent(model='codellama').
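If your code has to run against both pre- and post-1.40.0 installs, one option is to pick the keyword by inspecting the constructor signature. This is a minimal sketch: make_agent is a hypothetical helper, and OldAgent/NewAgent are illustrative stand-ins for the two signatures, not the real library classes.

```python
import inspect

def make_agent(agent_cls, name):
    """Instantiate agent_cls with whichever keyword its constructor accepts."""
    params = inspect.signature(agent_cls).parameters
    keyword = "model" if "model" in params else "model_name"
    return agent_cls(**{keyword: name})

# Illustrative stand-ins for the pre- and post-1.40.0 signatures:
class OldAgent:
    def __init__(self, model_name):
        self.model = model_name

class NewAgent:
    def __init__(self, model):
        self.model = model

print(make_agent(OldAgent, "codellama").model)  # codellama
print(make_agent(NewAgent, "codellama").model)  # codellama
```

If you only support 1.40.0 and later, skip the helper and pass model= directly.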
gotcha: The agent requires a running Ollama server. If OLLAMA_HOST is unreachable, agent calls hang indefinitely instead of raising a timeout error.
fix: Set a reasonable timeout or verify connectivity before calling agent methods.
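One way to fail fast is a stdlib-only reachability probe before calling the agent. This is a sketch: ollama_reachable is a hypothetical helper (not part of pw-agent), and it assumes the host is given as a URL like the OLLAMA_HOST default.

```python
import socket
from urllib.parse import urlparse

def ollama_reachable(host_url, timeout=2.0):
    """Return True if a TCP connection to the Ollama host succeeds within timeout."""
    parsed = urlparse(host_url)
    port = parsed.port or 11434  # Ollama's default port
    try:
        with socket.create_connection((parsed.hostname, port), timeout=timeout):
            return True
    except OSError:
        return False

host = "http://localhost:11434"
if not ollama_reachable(host, timeout=2.0):
    print(f"Warning: Ollama server at {host} is unreachable")
```

A plain TCP connect only proves the port is open, not that Ollama is healthy, but it is enough to avoid an indefinite hang on a dead host.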
deprecated: Method generate has been deprecated in favor of chat. It still works but will be removed in v2.0.
fix: Use agent.chat() instead of agent.generate().
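Code that must tolerate both current and older installs can prefer chat() and fall back to generate(). A minimal sketch: ask is a hypothetical wrapper, and LegacyAgent is an illustrative stub, not the real class.

```python
def ask(agent, prompt):
    """Prefer the current chat() API; fall back to the deprecated generate()."""
    method = getattr(agent, "chat", None) or getattr(agent, "generate", None)
    if method is None:
        raise AttributeError("agent exposes neither chat() nor generate()")
    return method(prompt)

# Illustrative stub standing in for an older agent that only has generate():
class LegacyAgent:
    def generate(self, prompt):
        return f"echo: {prompt}"

print(ask(LegacyAgent(), "hello"))  # echo: hello
```

New code should call agent.chat() directly; the fallback only buys time until v2.0 removes generate().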
Imports
- PastaWaterAgent
  wrong: from pw_agent.pw_agent import PastaWaterAgent
  correct: from pw_agent import PastaWaterAgent
Quickstart
from pw_agent import PastaWaterAgent
import os

# Point the agent at a running Ollama server (defaults to the local daemon)
agent = PastaWaterAgent(
    ollama_host=os.environ.get('OLLAMA_HOST', 'http://localhost:11434'),
    model='codellama'
)

response = agent.chat("Write a Python function to compute Fibonacci numbers.")
print(response)