Arize Phoenix (arize-phoenix)

13.3.0 · active · verified Sat Feb 28

Open-source AI observability and evaluation platform from Arize AI. Phoenix provides tracing, evaluation, prompt management, datasets, and experiments for LLM applications. Built on OpenTelemetry and OpenInference. Can be run locally, self-hosted (Docker/K8s), or accessed via the cloud at app.phoenix.arize.com.

CRITICAL: Phoenix is a separate product from Arize AX (the enterprise cloud platform). They share the Arize brand but use different packages, credentials, and endpoints: Phoenix uses phoenix.otel + PHOENIX_API_KEY; Arize AX uses arize.otel + ARIZE_SPACE_ID + ARIZE_API_KEY. arize-phoenix is the full-platform bundle; for production deployments, the modular sub-packages (arize-phoenix-otel, arize-phoenix-client, arize-phoenix-evals) are preferred.

LICENSE: Elastic License 2.0 (ELv2), NOT MIT/Apache. Restrictions apply to providing Phoenix as a managed service to third parties.
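A quick way to keep the two credential schemes straight. The env-var names come from the description above; the arize.otel call is sketched only for contrast and its exact signature is an assumption:

```python
# Phoenix (open source) -- phoenix.otel + PHOENIX_API_KEY:
#   from phoenix.otel import register
#   tracer_provider = register(project_name='my-app')
#
# Arize AX (enterprise cloud) -- arize.otel + space ID / API key pair
# (sketch; exact signature is an assumption):
#   from arize.otel import register
#   tracer_provider = register(space_id='...', api_key='...', project_name='my-app')

# The two env-var sets are disjoint -- a cheap CI guard against mixing credentials:
PHOENIX_VARS = {'PHOENIX_COLLECTOR_ENDPOINT', 'PHOENIX_API_KEY'}
ARIZE_AX_VARS = {'ARIZE_SPACE_ID', 'ARIZE_API_KEY'}
assert PHOENIX_VARS.isdisjoint(ARIZE_AX_VARS)
```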

Warnings

- Phoenix is not Arize AX: different packages (phoenix.otel vs arize.otel), different credentials (PHOENIX_API_KEY vs ARIZE_SPACE_ID + ARIZE_API_KEY), different endpoints. Never mix them.
- ELv2 license, not MIT/Apache: providing Phoenix as a managed service to third parties is restricted.

Install

pip install arize-phoenix                 # full platform bundle (server + client + evals)
# Modular sub-packages (preferred in production app code):
pip install arize-phoenix-otel arize-phoenix-client arize-phoenix-evals
# Per-framework instrumentors, e.g.:
pip install openinference-instrumentation-openai

Imports

import phoenix as px                      # full platform: local server, legacy client
from phoenix.otel import register         # tracing setup
from phoenix.client import Client         # REST API client
from phoenix.evals import llm_classify, OpenAIModel

Quickstart

For local dev, px.launch_app() starts the Phoenix server in-process. For production deployments, run Phoenix as a separate service (Docker/K8s) and use the modular sub-packages in your app code. Never mix Phoenix and Arize AX credentials.
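For the self-hosted path, the official Docker image can run Phoenix as its own service. A minimal sketch (standard Phoenix ports: 6006 for the UI and OTLP/HTTP collector, 4317 for OTLP/gRPC; adjust volumes and auth for your setup):

```shell
# Run Phoenix as a standalone service (sketch; add persistent storage for real use).
# 6006 = UI + OTLP/HTTP collector, 4317 = OTLP/gRPC collector.
docker run -d --name phoenix \
  -p 6006:6006 -p 4317:4317 \
  arizephoenix/phoenix:latest
# App code then points at it: PHOENIX_COLLECTOR_ENDPOINT=http://<host>:6006
```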

# === OPTION A: Local dev (run Phoenix server + instrument your app) ===

# 1. Launch Phoenix server locally
import phoenix as px
px.launch_app()  # Opens UI at http://localhost:6006

# 2. Instrument with OTel
from phoenix.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

tracer_provider = register(
    project_name='my-app',
    # auto_instrument=True  # Only works if openinference-instrumentation-* pkgs are installed
)
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

import openai
client = openai.OpenAI()
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)
# Trace visible in Phoenix UI at http://localhost:6006

# === OPTION B: Production (app code → deployed Phoenix server) ===
import os
# Point at your Phoenix instance: Phoenix Cloud (shown) or self-hosted, e.g. http://phoenix.internal:6006
os.environ['PHOENIX_COLLECTOR_ENDPOINT'] = 'https://app.phoenix.arize.com/s/my-space'
os.environ['PHOENIX_API_KEY'] = 'your-phoenix-api-key'

from phoenix.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

tracer_provider = register(project_name='prod-app', batch=True)  # batch=True → batched span export
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# === Phoenix Client (REST API) ===
from phoenix.client import Client
ph = Client()  # reads PHOENIX_BASE_URL and PHOENIX_API_KEY from env

# Query traces (DataFrame consumed by the evals example below)
spans_df = ph.spans.query(project_name='my-app')

# === Evaluations ===
from phoenix.evals import (
    HALLUCINATION_PROMPT_RAILS_MAP,
    HALLUCINATION_PROMPT_TEMPLATE,
    OpenAIModel,
    llm_classify,
)

model = OpenAIModel(model='gpt-4o')
results = llm_classify(
    dataframe=spans_df,                      # must contain the template's input columns
    model=model,
    template=HALLUCINATION_PROMPT_TEMPLATE,  # built-in template object, not a string name
    rails=list(HALLUCINATION_PROMPT_RAILS_MAP.values()),  # ['factual', 'hallucinated']
)
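Eval labels can be attached back to the traced spans so they appear alongside traces in the UI. A sketch using the bundled legacy client (requires a running Phoenix server, and assumes `results` is indexed by span ID, which llm_classify preserves from a spans DataFrame):

```python
import phoenix as px
from phoenix.trace import SpanEvaluations

# Log the eval results as span-level annotations (sketch; assumes `results`
# carries the span IDs of the evaluated spans in its index).
px.Client().log_evaluations(
    SpanEvaluations(eval_name='Hallucination', dataframe=results)
)
```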
