{"id":6034,"library":"portkey-ai","title":"Portkey.ai Python Client","description":"The Portkey.ai Python client library provides an interface to the Portkey API, a unified AI gateway for managing, monitoring, and routing large language model (LLM) requests. It offers features like observability, caching, load balancing, and prompt management across various LLM providers. The library is actively maintained, with frequent updates.","status":"active","version":"2.2.0","language":"en","source_language":"en","source_url":"https://github.com/Portkey-AI/portkey-python-sdk","tags":["AI","LLM","API Gateway","Observability","Prompt Management"],"install":[{"cmd":"pip install portkey-ai","lang":"bash","label":"Install core library"}],"dependencies":[],"imports":[{"note":"The primary synchronous client class is `Portkey`, imported directly from the `portkey_ai` package.","wrong":"import portkey_ai","symbol":"Portkey","correct":"from portkey_ai import Portkey"},{"note":"For asynchronous operations, `AsyncPortkey` is available.","symbol":"AsyncPortkey","correct":"from portkey_ai import AsyncPortkey"}],"quickstart":{"code":"import os\nfrom portkey_ai import Portkey\n\n# Set your Portkey API key as an environment variable: export PORTKEY_API_KEY=\"pk-sk-...\"\n# If using a direct provider (e.g., OpenAI) without Portkey Virtual Keys,\n# you might also need its API key, e.g.: export OPENAI_API_KEY=\"sk-...\"\n\nportkey_client = Portkey(\n    api_key=os.environ.get('PORTKEY_API_KEY', ''),\n    # Use a provider slug from your Portkey Model Catalog\n    # e.g., \"@openai-prod\" if configured in Portkey\n    provider=\"@openai-prod\"\n)\n\ntry:\n    response = portkey_client.chat.completions.create(\n        messages=[\n            {\"role\": \"user\", \"content\": \"What is the capital of France?\"}\n        ],\n        # Model name configured under the \"@openai-prod\" provider in Portkey\n        model=\"gpt-4o\"\n    )\n    print(response.choices[0].message.content)\nexcept Exception as e:\n    print(f\"An error occurred: {e}\")","lang":"python","description":"Initializes the Portkey client using the `PORTKEY_API_KEY` environment variable and directs requests through a configured provider slug (e.g., `@openai-prod`). It then makes a chat completion request using a model available via that provider."},"warnings":[{"fix":"Review the Portkey and OpenAI SDK changelogs for relevant updates. Test existing integrations thoroughly after upgrading to v2.0.0 or later. Consult Portkey documentation for updated usage patterns, especially for advanced configurations or when directly interacting with vendored client internals.","message":"Major version 2.0.0 introduced significant internal changes, including vendoring a specific version of the OpenAI SDK. While Portkey aims for compatibility, direct dependencies or specific behaviors of the underlying OpenAI client might have changed. This could require adjustments for complex integrations that rely on specific OpenAI client versions or internal workings.","severity":"breaking","affected_versions":">=2.0.0"},{"fix":"Ensure `PORTKEY_API_KEY` is set for the `Portkey` client. Manage LLM provider keys via Portkey's dashboard (recommended) and use provider slugs (e.g., `@openai-prod`) or `config` objects. Alternatively, provide provider-specific `Authorization` headers to the `Portkey` client for direct pass-through of provider keys.","message":"Portkey utilizes its own `PORTKEY_API_KEY` for authenticating with the Portkey gateway. Separately, your actual LLM provider API keys (e.g., OpenAI, Anthropic) must be configured within Portkey's 'Virtual Keys' or 'Model Catalog' dashboard, or provided via the `Authorization` parameter in the `Portkey` client constructor. Simply passing a provider's API key to `Portkey(api_key=\"...\")` is incorrect for LLM provider authentication.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Always use the full `@provider-slug/model-name` format for the `model` parameter when using the `provider` argument in the client constructor. Alternatively, define a `config` object for the `Portkey` client that specifies the desired routing strategy and provider details. Refer to the Portkey Model Catalog documentation for valid provider slugs and model names.","message":"When using the `provider` parameter, Portkey often expects models to be specified in a `@provider-slug/model-name` format (e.g., `@openai-prod/gpt-4o`). Simply providing `model=\"gpt-4o\"` without the appropriate provider slug prefix, or without a `config` object defining the routing, may lead to errors or incorrect routing.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-14T00:00:00.000Z","next_check":"2026-07-13T00:00:00.000Z"}