{"total":85,"docs":[{"id":"anthropic","title":"Anthropic Python SDK","library":"anthropic","status":"active","version":"0.84.0","language":"en","summary":"Official Python SDK for the Anthropic API providing access to Claude models. Current version is 0.84.0 (Feb 2026). Releases approximately weekly. Two separate packages exist: anthropic (core SDK) and claude-agent-sdk-python (agent workflows).","last_verified":"2026-02-28","tags":["anthropic","claude","llm","agents","python","nodejs"],"api":"https://checklist.day/api/registry/anthropic"},{"id":"autogen-agentchat","title":"AutoGen (Microsoft)","library":"autogen-agentchat","status":"active","version":"0.7.5","language":"en","summary":"Microsoft's framework for building multi-agent AI applications. As of v0.4+, AutoGen is a layered system: autogen-core (event-driven actor runtime), autogen-agentchat (high-level conversational agents), and autogen-ext (extensions for OpenAI, Azure, MCP, etc.). Completely async. The 0.2 API is a separate, deprecated architecture. NOT the same as AG2.","last_verified":"2026-02-28","tags":["agents","orchestration","multi-agent","autogen","microsoft","async","event-driven","python"],"api":"https://checklist.day/api/registry/autogen-agentchat"},{"id":"autogenstudio","title":"AutoGen Studio","library":"autogenstudio","status":"maintenance","version":"0.4.x (last stable PyPI release: May 2025)","language":"en","summary":"Microsoft's no-code/low-code web UI for prototyping AutoGen multi-agent workflows. Built on FastAPI (backend) and Gatsby/React (frontend). Stores agent configs, sessions, and workflows in a local SQLite or PostgreSQL database. 
Explicitly NOT production-ready — intended for rapid prototyping only.","last_verified":"2026-02-28","tags":["agents","orchestration","autogen","microsoft","no-code","ui","prototyping","python","maintenance-mode"],"api":"https://checklist.day/api/registry/autogenstudio"},{"id":"chromadb","title":"ChromaDB","library":"chromadb","status":"active","version":"1.5.1","language":"en","summary":"Open-source embedded vector database for AI applications. Runs in-process (EphemeralClient, PersistentClient) or client-server mode (HttpClient). Handles embedding storage, metadata filtering, and similarity search. Supports pluggable embedding functions. Core backend rewritten in Rust in 1.x; also ships a lightweight HTTP-only client as the separate chromadb-client package.","last_verified":"2026-02-28","tags":["chromadb","vector-database","embeddings","rag","similarity-search","persistent","in-memory"],"api":"https://checklist.day/api/registry/chromadb"},{"id":"cohere","title":"Cohere Python SDK","library":"cohere","status":"active","version":"5.20.4","language":"python","summary":"Official Python SDK for Cohere API. Has two coexisting client classes: Client (v1, legacy) and ClientV2 (v2, current). Most LLM-generated code uses v1 patterns. The Generate endpoint is deprecated. model is now required in all v2 calls.","last_verified":"2026-02-28","tags":["cohere","llm","chat","embeddings","rerank","rag","command"],"api":"https://checklist.day/api/registry/cohere"},{"id":"crewai","title":"CrewAI","library":"crewai","status":"active","version":"1.10.0","language":"en","summary":"Python framework for orchestrating role-playing, autonomous AI agents. Agents collaborate via sequential or hierarchical processes to complete complex tasks. Built from scratch — independent of LangChain. 
Two core primitives: Crews (autonomous multi-agent collaboration) and Flows (event-driven, precise orchestration).","last_verified":"2026-02-28","tags":["agents","orchestration","multi-agent","crewai","python","llm","autonomous","flows"],"api":"https://checklist.day/api/registry/crewai"},{"id":"datasets","title":"Hugging Face Datasets","library":"datasets","status":"active","version":"4.6.0","language":"en","summary":"HuggingFace library for loading, processing, and sharing datasets for ML. Provides load_dataset() for one-line access to 100k+ public datasets on the Hub, plus local file loading (CSV, JSON, Parquet, Arrow, audio, image, etc.). Built on Apache Arrow for memory-efficient, zero-copy data access. Package name on PyPI is 'datasets' (not 'huggingface-datasets'). Import name is also 'datasets'. CRITICAL: datasets 4.0 (July 2025) removed dataset loading scripts and trust_remote_code entirely. Many older community datasets relying on .py loading scripts now fail with datasets>=4.","last_verified":"2026-02-28","tags":["datasets","huggingface","load-dataset","arrow","parquet","nlp","data-loading","streaming","preprocessing","rag"],"api":"https://checklist.day/api/registry/datasets"},{"id":"deepseek","title":"DeepSeek Python SDK","library":"openai","status":"active","version":"compatible with openai>=1.0.0","language":"en","summary":"DeepSeek has no official Python SDK. The canonical integration is via the OpenAI-compatible REST API using the openai package with a custom base_url. 
deepseek-chat and deepseek-reasoner are the two stable model IDs — both map to DeepSeek-V3.2 (non-thinking and thinking modes respectively).","last_verified":"2026-02-28","tags":["llm","deepseek","openai-compatible","reasoning","python","v3","chain-of-thought"],"api":"https://checklist.day/api/registry/deepseek"},{"id":"dspy-ai","title":"DSPy","library":"dspy-ai","status":"active","version":"3.1.3","language":"en","summary":"Stanford's framework for programming—not prompting—language models. Core primitives: Signatures (typed input/output specs), Modules (dspy.Predict, dspy.ChainOfThought, dspy.ReAct, etc.), and Optimizers (MIPROv2, GEPA, SIMBA, BootstrapFewShot, GRPO). Instead of hand-crafted prompts, you write compositional Python programs and let optimizers automatically tune instructions and few-shot demos against a metric. Supports any LiteLLM-compatible model.","last_verified":"2026-02-28","tags":["prompt-optimization","llm-programming","signatures","modules","optimizers","rag","stanford","research","python"],"api":"https://checklist.day/api/registry/dspy-ai"},{"id":"faiss-cpu","title":"FAISS","library":"faiss-cpu","status":"active","version":"1.13.2","language":"en","summary":"Facebook AI Research Similarity Search. C++ library with Python bindings for efficient similarity search and clustering of dense vectors. Supports flat (exact), IVF (approximate), HNSW, PQ, and many other index types. CPU-only via PyPI (faiss-cpu); GPU support requires conda or building from source. Officially maintained by Meta/Facebook AI Research; PyPI wheel builds maintained by the community (faiss-wheels project). 
Import is always 'import faiss' regardless of which package is installed.","last_verified":"2026-02-28","tags":["faiss","vector-search","similarity-search","embeddings","ann","hnsw","ivf","offline","library"],"api":"https://checklist.day/api/registry/faiss-cpu"},{"id":"fireworks-ai","title":"Fireworks AI Python SDK","library":"fireworks-ai","status":"active","version":"0.19.20","language":"en","summary":"Python SDK and OpenAI-compatible API for running open-source and proprietary LLMs on Fireworks AI infrastructure.","last_verified":"2026-02-28","tags":["llm","inference","openai-compatible","python","ai-sdk"],"api":"https://checklist.day/api/registry/fireworks-ai"},{"id":"groq","title":"Groq Python SDK","library":"groq","status":"active","version":"0.18.0","language":"python","summary":"Official Python SDK for GroqCloud API. OpenAI-compatible interface for ultra-low-latency LLM inference on Groq LPU hardware. Model IDs change frequently as models are deprecated and replaced with no versioned aliases.","last_verified":"2026-02-28","tags":["groq","llm","inference","fast","llama","openai-compatible","lpu"],"api":"https://checklist.day/api/registry/groq"},{"id":"guidance","title":"Guidance","library":"guidance","status":"active","version":"0.3.1","language":"en","summary":"Microsoft-backed constrained generation framework for LLMs. Programs interleave control flow with generation via lm += gen(...) syntax. Supports regex, CFGs, JSON schema, and select() constraints using a Rust-based llguidance engine. Backends: Transformers, llama.cpp, OpenAI, Azure AI. 
Model objects are immutable — each += produces a copy.","last_verified":"2026-02-28","tags":["guidance","constrained-generation","llm","structured-output","regex","json-schema","microsoft","llama-cpp","transformers"],"api":"https://checklist.day/api/registry/guidance"},{"id":"haystack-ai","title":"Haystack","library":"haystack-ai","status":"active","version":"2.24.x","language":"en","summary":"Open-source AI orchestration framework by deepset for building production-ready LLM applications in Python. Core abstraction is the Pipeline — a computation graph of Components (embedders, retrievers, generators, rankers, etc.) connected via explicit typed edges. Supports RAG, agents, semantic search, and multimodal workflows. Integrates with OpenAI, Anthropic, Mistral, Hugging Face, Azure, Bedrock, and 70+ vector databases and document stores.","last_verified":"2026-02-28","tags":["agents","rag","orchestration","pipelines","deepset","openai","huggingface","vector-store","python","production-stable"],"api":"https://checklist.day/api/registry/haystack-ai"},{"id":"helicone","title":"Helicone","library":"helicone","status":"active","version":"helicone-helpers 1.0.3 (optional stub)","language":"en","summary":"Open-source LLM observability platform using a proxy-based architecture. Unlike LangSmith or Langfuse, Helicone requires NO Python SDK install for core tracing — it works by routing requests through its AI gateway (https://ai-gateway.helicone.ai) via a base_url override on the OpenAI/Anthropic client. All logging happens at the proxy layer. The 'helicone' PyPI package (helicone-helpers) is a thin optional helper with minimal functionality. 
Primary integration is via HTTP headers and base_url, not a Python library.","last_verified":"2026-02-28","tags":["helicone","observability","llm-proxy","ai-gateway","llm-monitoring","proxy-based","openai-compatible","caching"],"api":"https://checklist.day/api/registry/helicone"},{"id":"huggingface-hub","title":"Hugging Face Hub","library":"huggingface-hub","status":"active","version":"1.5.0","language":"en","summary":"Official Python client for the Hugging Face Hub. Handles model/dataset/Space downloading, uploading, caching, repo management, and Hub API interactions. Used as a dependency by transformers, datasets, sentence-transformers, and most HF ecosystem packages. The CLI was previously huggingface-cli; in v1.x it was rebranded as hf (also available as a standalone pip install hf package for CLI-only use). Import name: huggingface_hub (underscore). Package name: huggingface-hub (hyphen). These are different from the hf CLI package.","last_verified":"2026-02-28","tags":["huggingface","huggingface-hub","model-download","model-hub","hf-cli","cache","upload","pretrained-models","gated-models"],"api":"https://checklist.day/api/registry/huggingface-hub"},{"id":"instructor","title":"Instructor","library":"instructor","status":"active","version":"1.14.5","language":"en","summary":"Structured data extraction from LLMs via Pydantic models. Patches or wraps provider clients (OpenAI, Anthropic, Gemini, Cohere, Mistral, Groq, Ollama, and 15+ others) to add response_model, automatic validation, and retry logic. Uses tool-calling or JSON mode depending on provider. Core interface: client.chat.completions.create(response_model=MyModel, ...) returns a validated Pydantic instance. 
Maintained by Jason Liu / jxnl.","last_verified":"2026-02-28","tags":["instructor","structured-outputs","pydantic","openai","anthropic","gemini","extraction","validation","llm","tool-calling"],"api":"https://checklist.day/api/registry/instructor"},{"id":"kakao","title":"Kakao API","library":"kakao","status":"active","version":"2025-latest","language":"en","summary":"Kakao REST API provides access to Kakao Login, KakaoTalk messaging, friends list, maps, and AI features. Primary documentation is in Korean. English docs exist but are incomplete and lag behind Korean versions. No official Python/Node package — all integrations use REST API directly with app keys.","last_verified":"2026-02-28","tags":["kakao","kakaotalk","oauth","korea","rest-api","social-login","messaging"],"api":"https://checklist.day/api/registry/kakao"},{"id":"lancedb","title":"LanceDB","library":"lancedb","status":"active","version":"0.29.2","language":"en","summary":"Embedded, serverless vector database built on the Lance columnar format (Apache Arrow-based). Runs in-process with no separate server required — data is stored on the local filesystem or object storage (S3, GCS, Azure). Supports vector similarity search, full-text search, SQL filtering, and automatic versioning. Also available as a managed cloud service (LanceDB Cloud). Python package is in 'Alpha' status on PyPI despite being production-used. Backed by pyarrow and pylance (the Lance Rust library, not Microsoft's Python language server).","last_verified":"2026-02-28","tags":["lancedb","lance","vector-search","embedded-database","serverless","arrow","rag","multimodal","ann","s3"],"api":"https://checklist.day/api/registry/lancedb"},{"id":"langchain","title":"LangChain","library":"langchain","status":"active","version":"1.0.x","language":"en","summary":"LangChain is an orchestration framework for building LLM-powered applications and agents. 
It provides abstractions for chaining model calls, tool use, memory, and multi-agent coordination. As of v1.0, LangGraph is the foundational runtime for stateful agent execution.","last_verified":"2026-02-28","tags":["langchain","llm","agents","orchestration","python","nodejs"],"api":"https://checklist.day/api/registry/langchain"},{"id":"langfuse","title":"Langfuse","library":"langfuse","status":"active","version":"3.14.5","language":"en","summary":"Open-source LLM observability and evaluation platform. Python SDK provides tracing via @observe decorator, OpenTelemetry integration, and a low-level client for manual trace/span management. Works with any LLM framework — not tied to LangChain. Self-hostable (Docker/Kubernetes) or cloud (EU/US regions). MAJOR VERSION NOTE: SDK was completely rewritten in v3 (released June 2025). v3 is OpenTelemetry-based with a new singleton client pattern. All v2 import paths, class names, and initialization patterns are broken in v3. pip install langfuse installs v3 as of Feb 2026.","last_verified":"2026-02-28","tags":["langfuse","tracing","observability","llm-monitoring","opentelemetry","evaluation","self-hosted","open-source"],"api":"https://checklist.day/api/registry/langfuse"},{"id":"langgraph","title":"LangGraph","library":"langgraph","status":"active","version":"1.0.x","language":"en","summary":"LangGraph is a low-level agent orchestration framework providing graph-based execution with durable state, built-in persistence, and human-in-the-loop patterns. As of v1.0 it is the foundational runtime for LangChain agents. 
Use LangChain for high-level agent abstractions, LangGraph for fine-grained control.","last_verified":"2026-02-28","tags":["langgraph","agents","orchestration","state-machine","python","nodejs","multi-agent"],"api":"https://checklist.day/api/registry/langgraph"},{"id":"langsmith","title":"LangSmith","library":"langsmith","status":"active","version":"0.7.7","language":"en","summary":"Official Python SDK for LangSmith — LangChain's observability, tracing, and evaluation platform. Instruments LLM calls, chains, agents, and arbitrary functions with the @traceable decorator or wrap_openai(). Traces are sent asynchronously to LangSmith cloud (or self-hosted). Also provides dataset management and evaluation (evaluate()) APIs. Usable standalone without LangChain. Releases multiple times per week. SECURITY NOTE: CVE-2026-25528 — SSRF via baggage header injection, fixed in 0.6.3. Any version 0.4.10–0.6.2 is vulnerable.","last_verified":"2026-02-28","tags":["langsmith","tracing","observability","langchain","evaluation","llm-monitoring","traceable","evals"],"api":"https://checklist.day/api/registry/langsmith"},{"id":"line","title":"LINE Messaging API","library":"line","status":"active","version":"3.21.0","language":"en","summary":"LINE Messaging API enables bot development for LINE, the dominant messaging platform in Japan, Thailand, and Taiwan with 196M+ monthly users. Official SDKs exist for Python and Node.js. Primary documentation is in Japanese — English docs are complete but may lag Japanese versions.","last_verified":"2026-02-28","tags":["line","messaging","bot","webhook","japan","thailand","taiwan","python","nodejs"],"api":"https://checklist.day/api/registry/line"},{"id":"litellm","title":"LiteLLM","library":"litellm","status":"active","version":"1.81.15","language":"en","summary":"Unified Python SDK and proxy gateway for calling 100+ LLM APIs in OpenAI-compatible format. 
Single interface for OpenAI, Anthropic, Bedrock, VertexAI, Groq, Mistral, Cohere, HuggingFace, vLLM, and more. Supports /chat/completions, /embeddings, /images, /audio, /rerank, streaming, async, cost tracking, fallbacks, and load balancing. Also ships as a deployable proxy server (AI Gateway) with budget controls, virtual keys, and logging. Releases multiple times per week — version numbers are high (v1.81+) due to frequent patch releases. NOT related to the 'litellm' PyPI stub that existed before BerriAI claimed the name.","last_verified":"2026-02-28","tags":["litellm","llm-gateway","openai-compatible","anthropic","bedrock","vertexai","multi-provider","proxy","cost-tracking","streaming"],"api":"https://checklist.day/api/registry/litellm"},{"id":"llama-parse","title":"LlamaParse","library":"llama-parse","status":"deprecated","version":"0.6.94","language":"en","summary":"GenAI-native cloud document parser by LlamaIndex for RAG-optimized output. Parses PDFs, PPTX, DOCX, XLSX, HTML and more into markdown, text, or structured JSON with accurate table extraction and multimodal support. Cloud API service — requires an API key from cloud.llamaindex.ai. NOT a local/offline tool. CRITICAL: The llama-parse package (and its successor llama-cloud-services) are DEPRECATED as of early 2026. The replacement is 'llama-cloud' (pip install llama-cloud), which targets LlamaParse API v2. The old packages are maintained until May 1, 2026 only.","last_verified":"2026-02-28","tags":["llamaparse","llama-parse","llama-cloud","document-parsing","pdf-parsing","rag","llamaindex","cloud-api","ocr","table-extraction"],"api":"https://checklist.day/api/registry/llama-parse"},{"id":"llamaindex","title":"LlamaIndex","library":"llamaindex","status":"active","version":"0.14.15","language":"en","summary":"LlamaIndex is a data framework for building LLM-powered agents over your data. Specializes in RAG pipelines, document parsing, and agent workflows. Core package is llama-index-core. 
Integrations are separate packages installed from LlamaHub.","last_verified":"2026-02-28","tags":["llamaindex","rag","agents","documents","python","llm","retrieval"],"api":"https://checklist.day/api/registry/llamaindex"},{"id":"mistral","title":"Mistral AI Python SDK","library":"mistral","status":"active","version":"1.12.4","language":"python","summary":"Official Python SDK for Mistral AI API. Has gone through two major breaking rewrites: v0 → v1 (MistralClient removed) and v1 → v2 (in pre-release on GitHub, not yet on PyPI). Most LLM-generated code references v0 patterns which no longer work.","last_verified":"2026-02-28","tags":["mistral","llm","chat","completions","agents","embeddings","ocr","france"],"api":"https://checklist.day/api/registry/mistral"},{"id":"ollama","title":"Ollama Python Library","library":"ollama","status":"active","version":"0.6.1","language":"en","summary":"Official Python client for Ollama — the local LLM runtime. Wraps the Ollama REST API with native Python types. Requires Ollama to be installed and running locally (or pointed at https://ollama.com for cloud models). Not an inference provider — it's a runtime client.","last_verified":"2026-02-28","tags":["llm","local-inference","ollama","python","runtime-client","openai-compatible","self-hosted"],"api":"https://checklist.day/api/registry/ollama"},{"id":"openai-agents","title":"OpenAI Agents SDK","library":"openai-agents","status":"active","version":"0.10.1","language":"python","summary":"Lightweight multi-agent orchestration framework by OpenAI. Production successor to Swarm. Provider-agnostic.","last_verified":"2026-02-28","tags":["agents","multi-agent","orchestration","openai","handoffs","guardrails","tracing","mcp","realtime"],"api":"https://checklist.day/api/registry/openai-agents"},{"id":"openai","title":"OpenAI Python SDK","library":"openai","status":"active","version":"2.x","language":"en","summary":"Official Python SDK for the OpenAI API. 
As of v2.x, the Agents SDK requires openai v2. The Assistants API is deprecated and will be removed August 26, 2026. Use the Responses API for new agent builds.","last_verified":"2026-02-28","tags":["openai","gpt","llm","agents","responses-api","python","nodejs"],"api":"https://checklist.day/api/registry/openai"},{"id":"openrouter","title":"OpenRouter Python SDK","library":"openrouter","status":"active","version":"0.7.11","language":"en","summary":"Official Python SDK for the OpenRouter API — a unified gateway to 300+ LLM models across providers (OpenAI, Anthropic, Google, Meta, Mistral, etc.) via a single OpenAI-compatible endpoint. Auto-generated from OpenRouter's OpenAPI spec and updated on every API change. Fully typed with Pydantic. SDK is explicitly in beta — breaking changes can occur between minor versions without a major version bump. IMPORTANT: There are multiple competing 'openrouter' packages on PyPI (openrouter, openrouter-client, python-open-router). Only 'openrouter' (by OpenRouter) is the official SDK.","last_verified":"2026-02-28","tags":["openrouter","llm-gateway","multi-model","openai-compatible","anthropic","model-routing","unified-api","beta"],"api":"https://checklist.day/api/registry/openrouter"},{"id":"outlines","title":"Outlines","library":"outlines","status":"active","version":"1.2.11","language":"en","summary":"Structured generation library by .txt (dottxt-ai). Guarantees schema-valid outputs at generation time via FSM-based logits masking — no post-processing or retries. Supports regex, JSON schema (Pydantic or raw), CFG, and multiple-choice constraints. Backends: Transformers, vLLM, llama.cpp, MLX, Ollama, OpenAI, Mistral, Gemini. Core FSM engine split into separate outlines-core package (Rust). 
Two coexisting APIs: legacy outlines.models + outlines.generate.* style, and new 1.x outlines.from_* + model(prompt, Schema) style.","last_verified":"2026-02-28","tags":["outlines","structured-generation","constrained-decoding","json-schema","regex","pydantic","vllm","transformers","llama-cpp","fsm"],"api":"https://checklist.day/api/registry/outlines"},{"id":"perplexityai","title":"Perplexity AI Python SDK","library":"perplexityai","status":"active","version":"0.30.1","language":"en","summary":"Official Python SDK for the Perplexity API — web-grounded chat completions with real-time search, citations, and reasoning.","last_verified":"2026-02-28","tags":["llm","search","rag","citations","python","openai-compatible"],"api":"https://checklist.day/api/registry/perplexityai"},{"id":"pgvector","title":"pgvector","library":"pgvector","status":"active","version":"0.8.2","language":"en","summary":"Open-source PostgreSQL extension for vector similarity search. Two components: (1) the server-side Postgres extension (C, compiled and installed into Postgres), and (2) the Python client package 'pgvector' on PyPI which provides ORM/adapter integrations for psycopg2, psycopg3, asyncpg, SQLAlchemy, Django, SQLModel, and Peewee. The extension name in SQL is 'vector' (CREATE EXTENSION vector), not 'pgvector'. Maintained by Andrew Kane. Current extension version: 0.8.2 (CVE security fix). Python client: 0.4.2.","last_verified":"2026-02-28","tags":["pgvector","postgres","postgresql","vector-search","similarity-search","embeddings","hnsw","ivfflat","rag","sql"],"api":"https://checklist.day/api/registry/pgvector"},{"id":"pinecone","title":"Pinecone Python SDK","library":"pinecone","status":"active","version":"8.1.0","language":"en","summary":"Official Python SDK for the Pinecone managed vector database service. Supports serverless and pod-based indexes, vector upsert/query/delete, metadata filtering, namespaces, and integrated inference (embedding + reranking). 
REST client by default; optional gRPC transport for performance. Async support via PineconeAsyncio. Package was renamed from pinecone-client to pinecone in v5.1.0.","last_verified":"2026-02-28","tags":["pinecone","vector-database","managed","serverless","rag","similarity-search","embeddings","cloud"],"api":"https://checklist.day/api/registry/pinecone"},{"id":"pymilvus","title":"PyMilvus","library":"pymilvus","status":"active","version":"2.6.8","language":"en","summary":"Official Python SDK for Milvus, the open-source vector database. Has two coexisting APIs: the modern MilvusClient (recommended, introduced in 2.3) and the legacy ORM API (connections.connect + Collection class). Both currently work but LLM-generated and tutorial code overwhelmingly uses the old ORM patterns. Milvus Lite (embedded SQLite-based local mode) is available via the milvus-lite extra on Linux/macOS only. Server version must be kept in sync with client minor version.","last_verified":"2026-02-28","tags":["milvus","pymilvus","vector-database","rag","embeddings","grpc","similarity-search","milvus-lite"],"api":"https://checklist.day/api/registry/pymilvus"},{"id":"qdrant-client","title":"Qdrant Python Client","library":"qdrant-client","status":"active","version":"1.17.0","language":"en","summary":"Official Python client for the Qdrant vector search engine. Supports both remote server (gRPC or REST) and local in-memory/on-disk mode without a running server. All data operations use query_points() as the unified interface as of 1.10+. Prior search/recommend/discover methods removed in 1.14.0. Bundles FastEmbed for optional local embedding generation. 
Async support via AsyncQdrantClient.","last_verified":"2026-02-28","tags":["qdrant","vector-database","similarity-search","rag","embeddings","grpc","in-memory","local-mode"],"api":"https://checklist.day/api/registry/qdrant-client"},{"id":"redisvl","title":"Redis Vector / RedisVL","library":"redisvl","status":"active","version":"0.14.1","language":"en","summary":"The official AI-native Python client for vector search on Redis. Wraps redis-py with a high-level interface for defining vector schemas, building HNSW/FLAT indexes, running hybrid search, semantic routing, LLM caching, and session memory. Requires Redis 7.2+ with Search & Query module, or Redis Stack (self-hosted), or Redis Cloud. The underlying redis-py client is a separate package ('redis') — redisvl depends on it. Import root is 'redisvl'. Maintained by Redis Inc.","last_verified":"2026-02-28","tags":["redis","redisvl","vector-search","embeddings","hnsw","rag","hybrid-search","llm-cache","semantic-router"],"api":"https://checklist.day/api/registry/redisvl"},{"id":"semantic-kernel","title":"Semantic Kernel","library":"semantic-kernel","status":"active","version":"1.39.4","language":"en","summary":"Microsoft's open-source SDK for integrating LLMs into applications. Model-agnostic (OpenAI, Azure OpenAI, Google, Hugging Face, Mistral, etc.), supports Python, C#, and Java. Provides plugins (tools), prompt templating, multi-agent orchestration (AgentChat framework), and vector store integrations for RAG. Production-stable as of v1.x. 
Currently the primary Microsoft-maintained AI framework (AutoGen is maintenance-only; Microsoft Agent Framework is the future successor).","last_verified":"2026-02-28","tags":["agents","orchestration","microsoft","azure","openai","rag","plugins","vector-store","python","production-stable"],"api":"https://checklist.day/api/registry/semantic-kernel"},{"id":"sentence-transformers","title":"Sentence Transformers","library":"sentence-transformers","status":"active","version":"5.2.3","language":"en","summary":"Framework for computing dense sentence/text/image embeddings using transformer models. Primary use cases: semantic search, semantic similarity, clustering, and reranking. Wraps transformers and provides SentenceTransformer (embedding), CrossEncoder (reranker), and SparseEncoder (sparse embedding) classes. 15,000+ pretrained models on HF Hub. Now officially maintained by Hugging Face (Tom Aarsen) after transfer from UKP Lab/TU Darmstadt. Package name: sentence-transformers (hyphen). Import name: sentence_transformers (underscore).","last_verified":"2026-02-28","tags":["sentence-transformers","embeddings","semantic-search","sbert","cosine-similarity","reranking","cross-encoder","rag","huggingface"],"api":"https://checklist.day/api/registry/sentence-transformers"},{"id":"tiktoken","title":"tiktoken","library":"tiktoken","status":"active","version":"0.12.0","language":"en","summary":"Fast BPE tokenizer from OpenAI, written in Rust. Used to count tokens and encode/decode text for OpenAI models. 3-6x faster than comparable Python tokenizers. Does NOT call any API — purely local computation. Requires a Rust compiler at build time on platforms without pre-built wheels. 
Package name and import name are both 'tiktoken'.","last_verified":"2026-02-28","tags":["tiktoken","openai","tokenizer","bpe","token-counting","gpt-4o","cl100k","o200k","context-window"],"api":"https://checklist.day/api/registry/tiktoken"},{"id":"together-ai","title":"Together AI Python SDK","library":"together-ai","status":"active","version":"1.4.1","language":"python","summary":"Official Python SDK for Together AI platform. OpenAI-compatible API for running 100+ open-source models. Has gone through two full rewrites: pre-v1 (2023) → v1.0 (April 2024) → v2.0 RC (2025, not yet GA on PyPI). Can also be used via OpenAI client with base_url override.","last_verified":"2026-02-28","tags":["together","llm","open-source","llama","inference","openai-compatible","fine-tuning"],"api":"https://checklist.day/api/registry/together-ai"},{"id":"transformers","title":"Hugging Face Transformers","library":"transformers","status":"active","version":"5.2.0","language":"en","summary":"The central model-definition framework for state-of-the-art ML models across text, vision, audio, video, and multimodal tasks. Provides pretrained model weights, tokenizers, pipelines, and training APIs. Interfaces with PyTorch (primary), with 400+ model architectures and 750k+ checkpoints on the Hub. MAJOR VERSION NOTE: v5 released late 2025 — first major release in 5 years. v5 is PyTorch-only (TensorFlow/Flax/JAX removed). pip install transformers installs v5 as of Feb 2026. v4 was the last stable version before this; v4.57.x is the last v4 release. Requires Python 3.10+ in v5.","last_verified":"2026-02-28","tags":["transformers","huggingface","pytorch","llm","bert","gpt","pipeline","embeddings","fine-tuning","inference","pretrained-models"],"api":"https://checklist.day/api/registry/transformers"},{"id":"weaviate-client","title":"Weaviate Python Client","library":"weaviate-client","status":"active","version":"4.20.0","language":"en","summary":"Official Python client for the Weaviate vector database. 
v4 is a complete rewrite from v3 — uses gRPC for data operations (significantly faster), strong typing, collection-centric API, and context manager connection lifecycle. v3 API (weaviate.Client class) removed from the v4 package as of late 2024. Compatible with Weaviate server >= 1.23.7. Supports local, cloud (Weaviate Cloud), and custom deployments.","last_verified":"2026-02-28","tags":["weaviate","vector-database","grpc","rag","similarity-search","embeddings","self-hosted","cloud"],"api":"https://checklist.day/api/registry/weaviate-client"},{"id":"xai-sdk","title":"xAI Grok Python SDK","library":"xai-sdk","status":"active","version":"1.7.0","language":"en","summary":"Official Python SDK for xAI's Grok models — gRPC-based client for chat, image generation, and agentic tool use. Also supports OpenAI-compatible REST API.","last_verified":"2026-02-28","tags":["llm","grok","openai-compatible","grpc","python","search","agentic"],"api":"https://checklist.day/api/registry/xai-sdk"},{"id":"arize-phoenix","title":"Arize Phoenix (arize-phoenix)","library":"arize-phoenix","status":"active","version":"13.3.0","language":"en","summary":"Open-source AI observability and evaluation platform from Arize AI. Phoenix provides tracing, evaluation, prompt management, datasets, and experiments for LLM applications. Built on OpenTelemetry and OpenInference. Can be run locally, self-hosted (Docker/K8s), or accessed via cloud at app.phoenix.arize.com. CRITICAL: Phoenix is a separate product from Arize AX (the enterprise cloud platform). They share the Arize brand but use different packages, credentials, and endpoints: Phoenix uses phoenix.otel + PHOENIX_API_KEY; Arize AX uses arize.otel + ARIZE_SPACE_ID + ARIZE_API_KEY. arize-phoenix is the full-platform bundle. For production deployments, the modular sub-packages (arize-phoenix-otel, arize-phoenix-client, arize-phoenix-evals) are preferred. LICENSE: Elastic License 2.0 (ELv2) — NOT MIT/Apache. 
Restrictions apply to providing Phoenix as a managed service to third parties.","last_verified":"2026-02-28","tags":["arize-phoenix","phoenix","tracing","observability","llm-monitoring","opentelemetry","openinference","evaluation","self-hosted","open-source","elv2"],"api":"https://checklist.day/api/registry/arize-phoenix"},{"id":"arize","title":"Arize (arize SDK)","library":"arize","status":"active","version":"8.2.0","language":"en","summary":"Python SDK for interacting with Arize AX — Arize's enterprise AI engineering platform. IMPORTANT: The 'arize' package covers two separate product tracks with distinct APIs that coexist in the same package. (1) arize.api.Client / arize.ml — the v7 legacy ML monitoring SDK for logging predictions, actuals, embeddings, and drift data to the Arize ML platform. (2) arize.ArizeClient — the v8 LLM/agent observability API for tracing spans, evaluations, and datasets. v7 and v8 are NOT interchangeable. v7 support ends June 1, 2026. Also note: 'arize-phoenix' is a SEPARATE package — Phoenix is Arize's open-source observability tool. Do not conflate arize (enterprise cloud) with arize-phoenix (OSS self-hosted).","last_verified":"2026-02-28","tags":["arize","observability","llm-monitoring","tracing","evaluation","ml-monitoring","opentelemetry","openinference","enterprise"],"api":"https://checklist.day/api/registry/arize"},{"id":"openllmetry","title":"OpenLLMetry (Standalone Instrumentors)","library":"openllmetry","status":"active","version":"0.52.5 (versioned in sync with traceloop-sdk)","language":"en","summary":"OpenLLMetry is a project name, not a single installable PyPI package. It refers to two related but distinct things that must not be confused: (1) The traceloop-sdk — the high-level Traceloop SDK (already documented separately) that wraps OpenLLMetry auto-instrumentation behind Traceloop.init(). 
(2) Standalone opentelemetry-instrumentation-* packages — individual OTel instrumentors for specific LLM providers/frameworks, published from the traceloop/openllmetry GitHub monorepo. These can be used WITHOUT traceloop-sdk in any existing OpenTelemetry setup. ECOSYSTEM CONFUSION: There is a second, competing instrumentation ecosystem called OpenInference (from Arize-ai/openinference), which publishes openinference-instrumentation-* packages. OpenLLMetry and OpenInference instrumentors cover the same providers (OpenAI, LangChain, etc.) but use different span attribute schemas and route to different preferred backends. They are NOT interchangeable.","last_verified":"2026-02-28","tags":["openllmetry","opentelemetry","instrumentation","tracing","llm-monitoring","openai-instrumentation","langchain-instrumentation","traceloop","otel"],"api":"https://checklist.day/api/registry/openllmetry"},{"id":"traceloop-sdk","title":"Traceloop SDK (OpenLLMetry)","library":"traceloop-sdk","status":"active","version":"0.52.5","language":"en","summary":"Python SDK for OpenLLMetry — an open-source, OpenTelemetry-based observability library for LLM applications built and maintained by Traceloop. Provides auto-instrumentation for 20+ LLM providers (OpenAI, Anthropic, Gemini, Bedrock, Ollama) and frameworks (LangChain, LlamaIndex, CrewAI), plus manual @workflow/@task/@agent decorators. KEY RELATIONSHIP: traceloop-sdk is the convenience wrapper around OpenLLMetry instrumentation packages. 'openllmetry' is NOT a separate installable package — it's the umbrella project name for the GitHub monorepo. The installable SDK is 'traceloop-sdk'. By default, traces are exported to Traceloop cloud, but the SDK is fully vendor-agnostic via TRACELOOP_BASE_URL — it can export to any OTLP backend (Datadog, Grafana, Jaeger, Honeycomb, etc.)
without using Traceloop's platform at all.","last_verified":"2026-02-28","tags":["traceloop","openllmetry","tracing","observability","opentelemetry","llm-monitoring","auto-instrumentation","otel"],"api":"https://checklist.day/api/registry/traceloop-sdk"},{"id":"grabfood-api-sdk-python","title":"GrabFood Partner API Python SDK","library":"grabfood-api-sdk-python","status":"active","version":"latest (no versioned releases — install from main branch)","language":"en","summary":"Official Python SDK for the GrabFood Partner API — enables food delivery merchants and platform partners in Southeast Asia to manage orders, store hours, menus, and campaigns programmatically. Auto-generated from Grab's OpenAPI spec using OpenAPITools PythonClientCodegen. IMPORTANT: This SDK covers GrabFood only (merchant/food delivery). Grab rides/mobility has no official Python SDK. The SDK is not published on PyPI — install via GitHub only.","last_verified":"2026-02-28","tags":["grab","grabfood","southeast-asia","sea","food-delivery","partner-api","singapore","indonesia","thailand","payments","merchant-api"],"api":"https://checklist.day/api/registry/grabfood-api-sdk-python"},{"id":"line-pay","title":"LINE Pay Online API","library":"line-pay","status":"active","version":"0.2.0","language":"en","summary":"LINE Pay is a mobile payment service by LY Corporation, integrated with the LINE messaging platform. Active in Taiwan and Thailand. Japan service terminated April 30, 2025 (merged into PayPay). No official Python SDK — integration requires raw HTTP with custom HMAC-SHA256 signature. 
The community SDK on PyPI (line-pay 0.2.0) is unmaintained since 2020.","last_verified":"2026-03-01","tags":["payment","japan","taiwan","thailand","line","qr-code","mobile-payment","asia"],"api":"https://checklist.day/api/registry/line-pay"},{"id":"naver-openapi","title":"Naver Open API (Search, Papago, Clova)","library":"naver-openapi","status":"active","version":"N/A (REST API — no versioned Python package)","language":"en","summary":"Naver's consumer-facing Open API suite — provides programmatic access to South Korea's dominant search engine (70%+ market share), Papago translation, Clova OCR/AI services, Naver Maps, and Naver Login (OAuth). No official Python SDK exists — integration is done via direct HTTP requests using the requests library. Auth uses Client ID and Client Secret passed as HTTP headers. All APIs are registered and managed via the Naver Cloud Platform console. IMPORTANT: Naver Open API is Korea-first — registration UI, documentation, and error messages are primarily in Korean. Daily quota resets at 00:00 KST (UTC+9), not UTC.","last_verified":"2026-03-01","tags":["naver","korea","search","papago","translation","clova","ocr","korean-search","naver-login","oauth","korean-market"],"api":"https://checklist.day/api/registry/naver-openapi"},{"id":"ncloud-sdk-python","title":"Naver Cloud Platform Python SDK","library":"ncloud-sdk-python","status":"active","version":"2.8.1 (ncloud meta-package)","language":"en","summary":"Official Python SDK for Naver Cloud Platform (NCP) — South Korea's primary cloud provider, equivalent to AWS in the Korean market. Provides programmatic access to NCP infrastructure services: servers, VPC, CDN, DNS, Object Storage, and more. The SDK is split into service-specific packages (ncloud-server, ncloud-vserver, ncloud-vpc, ncloud-cdn, etc.) — install only the services you need. IMPORTANT: NCP has two distinct environments — Classic and VPC — with separate SDK modules, different API endpoints, and incompatible APIs. 
ncloud-server targets Classic, ncloud-vserver targets VPC. New accounts should use VPC.","last_verified":"2026-03-01","tags":["naver","ncloud","korea","cloud","infrastructure","vpc","server","cdn","korean-cloud","aws-alternative"],"api":"https://checklist.day/api/registry/ncloud-sdk-python"},{"id":"wechatpayv3","title":"WeChat Pay Python SDK","library":"wechatpayv3","status":"active","version":"2.0.1","language":"en","summary":"Community Python SDK for WeChat Pay API v3 — the de-facto standard for WeChat Pay v3 integration in Python. Covers direct merchant mode and service provider (partner) mode. Handles RSA signature generation, platform certificate auto-download and rotation, sensitive field encryption, and callback verification/decryption automatically. IMPORTANT: Multiple competing packages exist on PyPI (wechat_pay, pywechatpay, wechatpy, pywe-pay). Only 'wechatpayv3' targets the v3 API and is actively maintained. NOTE: This is a community SDK — WeChat Pay has no official Python SDK.","last_verified":"2026-02-28","tags":["wechat-pay","wechatpay","china","payments","v3","qr-code","jsapi","miniprogram","cnypayments","asian-payments"],"api":"https://checklist.day/api/registry/wechatpayv3"},{"id":"wechatpy","title":"WeChat Mini Program Server API","library":"wechatpy","status":"active","version":"1.8.18","language":"en","summary":"Server-side Python SDK for WeChat (Weixin) Mini Programs. Handles the backend half of Mini Program interactions: login verification (jscode2session), access token management, template messages, and encrypted data decryption. The dominant community SDK is wechatpy — no official Tencent Python SDK exists. 
Mini Program frontend runs in WeChat's JavaScript runtime (WXML/WXSS); Python handles server-side API calls only.","last_verified":"2026-03-01","tags":["wechat","miniprogram","weixin","china","tencent","login","openid","session","mobile-payment"],"api":"https://checklist.day/api/registry/wechatpy"},{"id":"banana-dev","title":"Banana Dev","library":"banana-dev","status":"abandoned","version":"6.3.0","language":"en","summary":"Python client for Banana's serverless GPU inference platform. Platform permanently shut down March 31, 2024. Package is abandoned. All API calls fail — the backend no longer exists. Do not use.","last_verified":"2026-03-01","tags":["gpu","serverless","inference","abandoned","shutdown","python"],"api":"https://checklist.day/api/registry/banana-dev"},{"id":"browserbase","title":"Browserbase Python SDK","library":"browserbase","status":"active","version":"1.4.0","language":"en","summary":"Python SDK for Browserbase, a serverless headless browser platform. Provides remote browser sessions accessible via Playwright/Selenium CDP connections. Has undergone a complete SDK rewrite at v1.0.0 (October 2024) — the old high-level API (browserbase.load(), browserbase.screenshot()) is gone. Current SDK is a low-level Stainless-generated client focused on session lifecycle management.","last_verified":"2026-03-01","tags":["browser","headless","playwright","scraping","agents","CDP","cloud"],"api":"https://checklist.day/api/registry/browserbase"},{"id":"composio","title":"Composio","library":"composio","status":"active","version":"composio: latest (new SDK); composio-openai: 0.11.1 (legacy framework packages)","language":"en","summary":"Tool integration platform for AI agents. Provides 250+ pre-built tool integrations (GitHub, Gmail, Slack, Notion, etc.) with managed OAuth across frameworks including OpenAI, Anthropic, LangChain, CrewAI, and AutoGen. 
Has undergone a major v1→v3 SDK rewrite with significant breaking changes — the majority of tutorials and LLM-generated code targets the old v1 API.","last_verified":"2026-03-01","tags":["tools","integrations","agents","oauth","agentic","python","ai-tools"],"api":"https://checklist.day/api/registry/composio"},{"id":"e2b-code-interpreter","title":"E2B Code Interpreter","library":"e2b-code-interpreter","status":"active","version":"2.4.1","language":"en","summary":"Secure cloud sandboxes for executing AI-generated code. Two packages: e2b-code-interpreter (Jupyter/stateful code execution) and e2b (core sandbox SDK). v0.x is fully deprecated — all versions below 1.0 show deprecation warnings and the API is incompatible with v1.x.","last_verified":"2026-03-01","tags":["sandbox","code-execution","agents","python","jupyter","cloud"],"api":"https://checklist.day/api/registry/e2b-code-interpreter"},{"id":"fastmcp","title":"FastMCP (standalone)","library":"fastmcp","status":"active","version":"3.0.2","language":"en","summary":"The dominant Python framework for building MCP servers and clients. FastMCP 1.0 was incorporated into the official mcp package in 2024, but the standalone project continued evolving independently. Now maintained by Prefect (PrefectHQ/fastmcp), v3.0 shipped February 2026 as a major breaking rewrite. ~1M daily downloads. Powers ~70% of MCP servers across all languages. Critical: minor versions may contain breaking changes — pin aggressively.","last_verified":"2026-03-01","tags":["mcp","model-context-protocol","tools","agents","servers","python","protocol"],"api":"https://checklist.day/api/registry/fastmcp"},{"id":"flyctl","title":"Fly.io (flyctl + Machines API)","library":"flyctl","status":"active","version":"flyctl — see fly version update for current (auto-updates by default)","language":"en","summary":"Fly.io is a compute platform for deploying containerized apps globally. 
There is no official Fly.io Python SDK — interaction is via the flyctl CLI, fly.toml config file, and the Machines REST API. The unofficial fly-python-sdk on PyPI (pip install fly-python-sdk) is a third-party project last updated October 2023 and should not be relied upon.","last_verified":"2026-03-01","tags":["platform","deployment","containers","serverless","cli","infra","cloud"],"api":"https://checklist.day/api/registry/flyctl"},{"id":"letta","title":"MemGPT / Letta","library":"letta","status":"renamed","version":"0.16.4","language":"en","summary":"Stateful agent framework formerly known as MemGPT. Renamed to Letta in September 2024. PyPI package moved from memgpt to letta. Two packages: letta (full server runtime) and letta-client (API client only). Agents are server-side persistent resources, not in-process Python objects.","last_verified":"2026-03-01","tags":["memory","agents","stateful","letta","memgpt","python","llm"],"api":"https://checklist.day/api/registry/letta"},{"id":"mcp","title":"MCP Python SDK","library":"mcp","status":"active","version":"1.26.0","language":"en","summary":"Official Python SDK for the Model Context Protocol (MCP), maintained by Anthropic. Used to build MCP servers (exposing tools, resources, prompts to LLMs) and MCP clients. Two important ecosystem distinctions: (1) mcp package bundles FastMCP 1.0 via mcp.server.fastmcp; (2) standalone 'fastmcp' package on PyPI is a separate, more feature-rich framework that diverged from the bundled version. v2 of the mcp package is in pre-alpha on main branch — v1.x is stable.","last_verified":"2026-03-01","tags":["mcp","model-context-protocol","tools","agents","protocol","anthropic","servers"],"api":"https://checklist.day/api/registry/mcp"},{"id":"mem0ai","title":"Mem0","library":"mem0ai","status":"active","version":"1.0.4","language":"en","summary":"Memory layer for AI agents and assistants. PyPI package is mem0ai but imports as mem0. 
Two distinct usage modes: Memory class (OSS, self-hosted) and MemoryClient class (managed platform). v1.0.0 introduced breaking changes to response format.","last_verified":"2026-03-01","tags":["memory","agents","llm","python","personalization","oss"],"api":"https://checklist.day/api/registry/mem0ai"},{"id":"modal","title":"Modal","library":"modal","status":"active","version":"1.3.3","language":"en","summary":"Serverless cloud infrastructure for AI workloads. Run Python functions on GPUs with sub-second cold starts. v1.0 released May 2025 with multiple breaking API changes. Rapid release cadence — multiple versions per week.","last_verified":"2026-03-01","tags":["serverless","gpu","cloud","agents","python","inference","deployment"],"api":"https://checklist.day/api/registry/modal"},{"id":"replicate","title":"Replicate Python Client","library":"replicate","status":"active","version":"1.0.4","language":"en","summary":"Python client for running ML models on Replicate's cloud API. The stable client (formerly v0.x) is now at 1.x on PyPI. A v2.0 beta (replicate-python-beta) exists as a separate package with a redesigned API. replicate.stream() is deprecated in v2. Billing changed to prepaid credit for new accounts in July 2025.","last_verified":"2026-03-01","tags":["ml","inference","cloud","python","models","gpu","api"],"api":"https://checklist.day/api/registry/replicate"},{"id":"runpod","title":"RunPod Python SDK","library":"runpod","status":"active","version":"1.8.1","language":"en","summary":"Python SDK for RunPod's GPU cloud platform. Serves two distinct purposes: (1) a serverless worker SDK for building handlers deployed inside RunPod containers, and (2) an API client for managing pods and endpoints remotely. 
These two use cases have different entry points and patterns.","last_verified":"2026-03-01","tags":["gpu","serverless","cloud","inference","python","workers","pods"],"api":"https://checklist.day/api/registry/runpod"},{"id":"zep-cloud","title":"Zep","library":"zep-cloud","status":"active","version":"3.16.0","language":"en","summary":"Long-term memory and context platform for AI assistants. Two separate PyPI packages: zep-cloud (managed platform) and zep-python (OSS/Community Edition). Not interchangeable. zep-python v2.x main branch now ships the Cloud SDK — OSS users need the oss branch build.","last_verified":"2026-03-01","tags":["memory","agents","llm","python","context","chat-history"],"api":"https://checklist.day/api/registry/zep-cloud"},{"id":"lemonsqueezy","title":"Lemon Squeezy","library":"lemonsqueezy (no official Python SDK)","status":"deprecated","version":"REST API v1 (no official Python SDK)","language":"en","summary":"Merchant of record platform for digital products. Acquired by Stripe in October 2024. Being absorbed into 'Stripe Managed Payments' (announced May 2025, private preview). No official Python SDK exists — only community-maintained libraries. REST API uses JSON:API spec with Bearer auth.","last_verified":"2026-03-01","tags":["payments","merchant-of-record","digital-products","stripe","billing","python"],"api":"https://checklist.day/api/registry/lemonsqueezy"},{"id":"paddle-python-sdk","title":"Paddle","library":"paddle-python-sdk","status":"active","version":"1.13.0","language":"en","summary":"Official Python SDK for Paddle Billing (the new Paddle platform). PyPI package is paddle-python-sdk, imports as paddle_billing. Requires Python >=3.11. Note: there are two Paddle platforms — Paddle Classic (legacy) and Paddle Billing (current). 
This SDK is Paddle Billing only.","last_verified":"2026-03-01","tags":["payments","paddle","billing","subscriptions","merchant-of-record","python"],"api":"https://checklist.day/api/registry/paddle-python-sdk"},{"id":"plaid-python","title":"Plaid","library":"plaid-python","status":"active","version":"38.3.0","language":"en","summary":"Official Python client for the Plaid API (bank account linking, transactions, identity, income verification). Auto-generated from OpenAPI spec. Updated monthly, major versions contain breaking changes. Only supports the 2020-09-14 API version (no version selection in client — API version is fixed per SDK release).","last_verified":"2026-03-01","tags":["payments","banking","fintech","plaid","open-banking","python","transactions"],"api":"https://checklist.day/api/registry/plaid-python"},{"id":"stripe","title":"Stripe","library":"stripe","status":"active","version":"14.2.0","language":"en","summary":"Official Python SDK for the Stripe Payments API. Actively maintained by Stripe. Versioned on a date-based API release cycle (e.g. 2026-02-25.clover). SDK major versions track breaking Python API changes separately from API version pinning. Current API version: 2026-02-25.clover.","last_verified":"2026-03-01","tags":["payments","stripe","billing","subscriptions","python","api"],"api":"https://checklist.day/api/registry/stripe"},{"id":"adyen","title":"Adyen","library":"Adyen","status":"active","version":"14.0.0","language":"en","summary":"Official Python API library for Adyen payment processing. Auto-generated from OpenAPI spec. PyPI package name is 'Adyen' (capital A). Uses X-API-Key auth, not OAuth. 
Each major version pins updated API versions and may contain breaking changes from codegen corrections.","last_verified":"2026-03-01","tags":["payments","adyen","enterprise","python","api"],"api":"https://checklist.day/api/registry/adyen"},{"id":"boto3","title":"AWS SDK for Python (boto3)","library":"boto3","status":"active","version":"1.42.59","language":"en","summary":"Official AWS SDK for Python. Extremely stable API — no breaking changes since v1.0. Two client styles: low-level client (boto3.client()) and high-level resource (boto3.resource()). Credential resolution follows a fixed chain. Region must be specified or set in environment — no global default. Released daily with new AWS service additions.","last_verified":"2026-03-01","tags":["aws","cloud","s3","boto3","python","sdk","infrastructure"],"api":"https://checklist.day/api/registry/boto3"},{"id":"google-cloud-storage","title":"Google Cloud Python (google-cloud-storage / google-cloud-aiplatform)","library":"google-cloud-storage","status":"active","version":"google-cloud-storage: 2.x / google-cloud-aiplatform: 1.139.0","language":"en","summary":"Google Cloud's Python client libraries are split into per-service packages — there is no single 'google-cloud-python' package. Install only what you need: google-cloud-storage, google-cloud-bigquery, google-cloud-aiplatform, etc. Auth uses Application Default Credentials (ADC). Service account JSON is the most common auth method outside GCP. All packages share google-auth as the credential layer.","last_verified":"2026-03-01","tags":["gcp","google-cloud","storage","bigquery","vertex-ai","python","sdk"],"api":"https://checklist.day/api/registry/google-cloud-storage"},{"id":"postmarker","title":"Postmark","library":"postmarker","status":"maintenance","version":"1.0","language":"en","summary":"Python client for the Postmark transactional email API. No official Python SDK exists — postmarker is the primary community library (listed by Postmark on their developer docs). 
Maintained by Dmitry Dygalo. v1.0 released but project marked inactive on Snyk (no new PyPI releases in 12+ months as of mid-2025). Django email backend included.","last_verified":"2026-03-01","tags":["email","postmark","transactional-email","python","community"],"api":"https://checklist.day/api/registry/postmarker"},{"id":"resend","title":"Resend","library":"resend","status":"active","version":"2.23.0","language":"en","summary":"Official Python SDK for the Resend transactional email API. Active, frequently updated. v2.0 (May 2024) added TypedDict-based type hinting. Module-level api_key pattern is the primary auth method. No class instantiation — all methods are called as class methods on resend.Emails, resend.Contacts, etc.","last_verified":"2026-03-01","tags":["email","resend","transactional-email","python","api"],"api":"https://checklist.day/api/registry/resend"},{"id":"sendgrid","title":"SendGrid","library":"sendgrid","status":"maintenance","version":"6.12.5","language":"en","summary":"Official Python SDK for the Twilio SendGrid Email API v3. Owned by Twilio. Two send patterns exist: Mail helper class and raw dict. v6.x is a breaking change from v5.x. Free plan retired May 2025. SDK last meaningfully updated Sep 2025 — in maintenance mode.","last_verified":"2026-03-01","tags":["email","sendgrid","twilio","transactional-email","python","api"],"api":"https://checklist.day/api/registry/sendgrid"},{"id":"twilio","title":"Twilio","library":"twilio","status":"active","version":"9.10.2","language":"en","summary":"Official Python helper library for the Twilio REST API (SMS, voice, WhatsApp, email, video). v9.0 (2024) was a full OpenAPI-generated rewrite. v6.0 removed TwilioRestClient. All pre-v6 import patterns are broken. 
Actively maintained.","last_verified":"2026-03-01","tags":["sms","voice","communications","twilio","whatsapp","python","api"],"api":"https://checklist.day/api/registry/twilio"},{"id":"vonage","title":"Vonage","library":"vonage","status":"active","version":"4.7.2","language":"en","summary":"Official Python SDK for Vonage APIs (formerly Nexmo) — SMS, Voice, WhatsApp, Verify, Video. v4.0 (2023) is a complete architectural rewrite from v3. Old nexmo package and old vonage v3 patterns are entirely broken in v4. Now a monorepo: top-level vonage package plus separate vonage_sms, vonage_voice, vonage_verify etc. packages installed as dependencies. All responses are Pydantic models.","last_verified":"2026-03-01","tags":["sms","voice","vonage","nexmo","communications","python","api"],"api":"https://checklist.day/api/registry/vonage"},{"id":"azure-identity","title":"Azure SDK for Python (Identity)","library":"azure-identity","status":"active","version":"1.25.2","language":"en","summary":"Microsoft's official credential library for Azure SDK authentication. Part of the Azure SDK for Python ecosystem. Current version is 1.25.2 (Feb 2026). No single 'azure' package — install per service (azure-identity, azure-storage-blob, azure-keyvault-secrets, etc.).","last_verified":"2026-03-01","tags":["azure","microsoft","identity","authentication","cloud","sdk","python"],"api":"https://checklist.day/api/registry/azure-identity"},{"id":"linear-python","title":"Linear API (Python — no official SDK)","library":"linear-python","status":"abandoned","version":"0.2.2","language":"en","summary":"Linear has NO official Python SDK. All Python options are community packages. Three competing packages exist on PyPI: linear-python (0.2.2), linear-api (0.2.0), and linear-py (0.0.6). None are canonical. The recommended approach for non-trivial use is direct GraphQL via httpx/requests against api.linear.app/graphql. 
Linear's official SDKs are TypeScript-only (@linear/sdk).","last_verified":"2026-03-01","tags":["linear","project-management","graphql","api","python","issues","tickets"],"api":"https://checklist.day/api/registry/linear-python"},{"id":"notion-client","title":"Notion SDK for Python (notion-client)","library":"notion-client","status":"active","version":"3.0.0","language":"en","summary":"Official community Python client for the Notion API. Current version is 3.0.0 (Jun 2025). Package is notion-client; imports as notion_client. Maintained by the open-source community with Notion's backing. Separate unofficial package notion-py (Jamaica) targets undocumented internal API v3 — not the same as the official API client.","last_verified":"2026-03-01","tags":["notion","api","productivity","database","python"],"api":"https://checklist.day/api/registry/notion-client"},{"id":"pyairtable","title":"pyAirtable","library":"pyairtable","status":"active","version":"3.3.0","language":"en","summary":"Community Python client for the Airtable API. Current version is 3.3.0 (Nov 2025). Previously known as airtable-python-wrapper — renamed at v1.0.0. The old package (airtable-python-wrapper, frozen at 0.15.3) still exists on PyPI but points to legacy 0.x API. No official Airtable Python SDK exists.","last_verified":"2026-03-01","tags":["airtable","api","database","nocode","python","pydantic"],"api":"https://checklist.day/api/registry/pyairtable"}]}