{"id":5653,"library":"lmnr-claude-code-proxy","title":"Laminar Claude Code Proxy","description":"A thin proxy server designed to route requests from Claude Code to various Large Language Model (LLM) providers, while incorporating Laminar tracing for observability. It is currently at version 0.1.19 and appears to follow an as-needed release cadence for updates and bug fixes.","status":"active","version":"0.1.19","language":"en","source_language":"en","source_url":"","tags":["claude-code","proxy","llm","tracing","observability","anthropic","openai","gemini"],"install":[{"cmd":"pip install lmnr-claude-code-proxy","lang":"bash","label":"Install with pip"}],"dependencies":[{"reason":"Common ASGI server for Python web applications; likely used internally to run the proxy server.","package":"uvicorn","optional":false},{"reason":"A common framework for building API servers in Python, often used with uvicorn for proxy implementations.","package":"fastapi","optional":true}],"imports":[{"note":"The exact import path for the main entry point is inferred from the typical Python package structure for a runnable application. Direct imports into user code might not be the primary interaction method.","symbol":"run_proxy_server","correct":"from lmnr_claude_code_proxy import run_proxy_server"}],"quickstart":{"code":"import os\nfrom lmnr_claude_code_proxy import run_proxy_server\n\n# Set required environment variables\nos.environ['OPENAI_API_KEY'] = os.environ.get('OPENAI_API_KEY', 'YOUR_OPENAI_API_KEY_HERE')  # Or another target LLM provider's API key\nos.environ['ANTHROPIC_BASE_URL'] = 'http://localhost:8082'  # Point Claude Code to the proxy\n\n# In a real scenario, you would run this in a separate process or via a CLI command.\n# For demonstration, we simulate starting the server. This function typically blocks.\n# run_proxy_server(host='0.0.0.0', port=8082)","lang":"python","description":"This quickstart demonstrates how to set up the necessary environment variables and conceptually run the proxy server. Typically, such a proxy runs as a standalone process, often via a CLI command or a simple Python script that calls a blocking `run` function. Users then configure their Claude Code client to point at the proxy's address (e.g., `export ANTHROPIC_BASE_URL=\"http://localhost:8082\"`)."},"warnings":[{"fix":"Monitor official Claude Code and Anthropic API documentation for changes, and update the `lmnr-claude-code-proxy` library to the latest compatible version.","message":"Breaking changes in the official Claude API or Claude Code client (e.g., the Anthropic Messages API format) can render the proxy non-functional, requiring updates to the proxy implementation. Historically, Claude Code has had frequent updates to its internal workings.","severity":"breaking","affected_versions":"All versions"},{"fix":"Store API keys securely using environment variables, secrets management systems, or `.env` files that are properly excluded from version control. Ensure any client-side `ANTHROPIC_BASE_URL` or similar configuration does not inadvertently expose credentials.","message":"Properly securing API keys and environment variables is critical. Proxies handle sensitive credentials, and misconfigurations (e.g., in `.env` files or project settings) have led to API key exfiltration vulnerabilities in related proxy projects (CVE-2026-21852).","severity":"gotcha","affected_versions":"All versions"},{"fix":"Monitor token usage closely. Consider setting cost controls or usage limits on the backend LLM providers. Be aware that features like `--resume` in Claude Code might still incur significant token usage through the proxy. Downgrading Claude Code to older, more stable versions (e.g., v2.1.34) has been a community workaround for some issues.","message":"Claude Code's internal token consumption and session management can be complex, leading to unexpected rate limit hits or high costs, even when using a proxy. Recent changes in Claude Code's behavior regarding prompt caching and session resume have exacerbated these issues.","severity":"gotcha","affected_versions":"All versions, particularly with newer Claude Code client versions (v2.1.69+)"},{"fix":"Thoroughly test model behavior when routed through the proxy, especially for complex tasks involving tool use, thinking blocks, or specific model features. Verify that expected responses are maintained across translation layers.","message":"The functionality and reliability of proxying Claude Code to non-Anthropic models (e.g., OpenAI, Gemini) depend heavily on the proxy's ability to accurately translate between API formats (the Anthropic Messages API to the target provider's API and back). Incomplete or incorrect translation can lead to reduced functionality or unexpected model behavior.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}