Prompt Flow Devkit

1.18.4 · active · verified Wed Apr 15

Prompt Flow is a development tool designed to streamline the end-to-end development cycle of Large Language Model (LLM)-based AI applications, covering ideation, prototyping, testing, evaluation, and production deployment. It provides a Python SDK and CLI for building, testing, and managing AI workflows. The current version is 1.18.4, with frequent minor releases delivering new features and improvements.

Warnings

Install
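Prompt Flow is distributed on PyPI; the base SDK and `pf` CLI install with pip (the `azure` extra shown here is an assumption, covering the optional Azure ML integration):

```shell
pip install promptflow
# optionally, with Azure ML workspace support:
pip install "promptflow[azure]"
```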

Imports
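The names used in the quickstart below come from the top-level `promptflow` package:

```python
from promptflow import tool      # decorator that registers a plain Python function as a flow tool
from promptflow import PFClient  # client for managing flows, connections, and runs
```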

Quickstart

This quickstart demonstrates how to define a Python function as a Prompt Flow tool using the `@tool` decorator. Tools are the building blocks of Prompt Flow applications; full flow orchestration and interaction with LLMs via connections is typically handled through the `PFClient` or the CLI.

import os
from promptflow import tool

@tool
def summarize_text(text: str, max_length: int = 100) -> str:
    """
    A simple tool that summarizes text by truncating it.
    In a real Prompt Flow, this might invoke an LLM via a connection.
    """
    if len(text) <= max_length:
        return text
    return text[:max_length - 3] + "..."

# You can test a tool directly as a Python function:
long_text = "This is a very long piece of text that needs to be summarized. It contains many words and details."
print(f"Original: {long_text}")
print(f"Summarized: {summarize_text(long_text, max_length=30)}")

# To manage flows, connections, and runs against an Azure ML workspace,
# initialize the client with your workspace details:
# from azure.identity import DefaultAzureCredential
# from promptflow.azure import PFClient
#
# client = PFClient(
#     credential=DefaultAzureCredential(),
#     subscription_id=os.environ.get("AZURE_SUBSCRIPTION_ID", ""),
#     resource_group_name=os.environ.get("AZURE_RESOURCE_GROUP_NAME", ""),
#     workspace_name=os.environ.get("AZURE_WORKSPACE_NAME", ""),
# )
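In a full project, a tool like `summarize_text` is wired into a flow via a `flow.dag.yaml` file alongside the Python source. A minimal sketch, assuming the tool lives in a file named `summarize.py` (node and file names here are hypothetical):

```yaml
inputs:
  text:
    type: string
outputs:
  summary:
    type: string
    reference: ${summarize.output}
nodes:
- name: summarize
  type: python
  source:
    type: code
    path: summarize.py
  inputs:
    text: ${inputs.text}
    max_length: 100
```

The flow directory can then be exercised locally with the CLI, e.g. `pf flow test --flow . --inputs text="some long text"`.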
