Prompt Flow Tracing

1.18.4 · active · verified Wed Apr 15

The `promptflow-tracing` package provides tracing capabilities for Prompt Flow, capturing and visualizing the internal execution of both DAG and Flex flows. It is compatible with OpenTelemetry and offers observability for LLM-based applications, including those built on frameworks such as Langchain or OpenAI. The library is actively developed with frequent releases.

Warnings

Install
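The package can be installed from PyPI under its own name; `promptflow-devkit` is optional but, as noted in the quickstart below, provides the local trace UI (standard pip invocation assumed):

```shell
# Core tracing package
pip install promptflow-tracing

# Optional: enables the local trace UI printed to the console
pip install promptflow-devkit
```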

Imports

Quickstart

This quickstart demonstrates how to enable tracing for LLM calls (e.g., OpenAI) using `start_trace()` and how to trace custom functions with the `@trace` decorator. Upon execution, if `promptflow-devkit` is installed, a local URL to the trace UI will be printed to the console, allowing visualization of the captured traces.

import os
from openai import OpenAI
from promptflow.tracing import start_trace, trace

# Ensure OPENAI_API_KEY is set in your environment
# or pass it explicitly to OpenAI(api_key=...) if not using env var.
# For Azure OpenAI, set AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_VERSION, AZURE_OPENAI_DEPLOYMENT_NAME

# Start tracing. This will instrument supported libraries like OpenAI.
start_trace()

client = OpenAI()  # Reads OPENAI_API_KEY from the environment by default

@trace
def poetic_explanation(concept: str) -> str:
    try:
        completion = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "You are a poetic assistant, skilled in explaining complex programming concepts with creative flair."}, 
                {"role": "user", "content": f"Compose a short poem that explains the concept of {concept} in programming."}
            ]
        )
        return completion.choices[0].message.content
    except Exception as e:
        print(f"Error calling OpenAI: {e}")
        return "Failed to get a poetic explanation."


if os.environ.get('OPENAI_API_KEY'): # Only run if API key is present
    print("--- Tracing LLM call with start_trace() ---")
    poem = poetic_explanation("recursion")
    print(poem)
    print("\nCheck your console for a URL to the trace UI (requires promptflow-devkit).")
else:
    print("Skipping quickstart: OPENAI_API_KEY environment variable not set.")
