Roboflow Inference

1.2.2 · active · verified Fri Apr 17

Roboflow Inference is a Python library that allows developers to deploy computer vision models to various devices and environments with minimal machine learning knowledge. It simplifies the process of performing inference on models hosted by Roboflow or running locally. The library is actively maintained with frequent releases, currently at version 1.2.2.

Install

The HTTP client used in the quickstart below ships in the inference-sdk package on PyPI (the full inference package additionally supports running models locally):

pip install inference-sdk

Imports

import os
from inference_sdk import InferenceHTTPClient

Quickstart

This quickstart demonstrates object detection using Roboflow's hosted inference service. It initializes `InferenceHTTPClient` from the `inference_sdk` package with an API key, then sends an image URL for inference. Ensure the `ROBOFLOW_API_KEY` and `ROBOFLOW_PROJECT_VERSION` environment variables are set before running it.
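The required variables can be exported in your shell before running the script; the values below are placeholders, so substitute your own from the Roboflow dashboard:

```shell
# Placeholder values; replace with your own credentials and model ID.
export ROBOFLOW_API_KEY="YOUR_API_KEY"
export ROBOFLOW_WORKSPACE="YOUR_WORKSPACE_ID"
export ROBOFLOW_PROJECT_VERSION="your-project/1"
```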

import os
from inference_sdk import InferenceHTTPClient  # the HTTP client lives in the inference-sdk package

# IMPORTANT: Set ROBOFLOW_API_KEY, ROBOFLOW_WORKSPACE, and ROBOFLOW_PROJECT_VERSION
# as environment variables for actual use. Get them from your Roboflow dashboard.
# For local testing, you may uncomment and set these directly:
# os.environ["ROBOFLOW_API_KEY"] = "YOUR_API_KEY" 
# os.environ["ROBOFLOW_WORKSPACE"] = "YOUR_WORKSPACE_ID"
# os.environ["ROBOFLOW_PROJECT_VERSION"] = "YOUR_PROJECT_ID/YOUR_VERSION" # e.g., "my-project/1"

api_key = os.environ.get("ROBOFLOW_API_KEY", "")
workspace = os.environ.get("ROBOFLOW_WORKSPACE", "")
project_version = os.environ.get("ROBOFLOW_PROJECT_VERSION", "your_project/1") # Replace with your actual project/version

if not api_key:
    print("WARNING: ROBOFLOW_API_KEY environment variable not set. Inference may fail.")
if not workspace:
    print("WARNING: ROBOFLOW_WORKSPACE environment variable not set. This may not be critical for HTTPClient but is for other features.")
if project_version == "your_project/1":
    print("WARNING: ROBOFLOW_PROJECT_VERSION environment variable not set. Using placeholder.")


try:
    # Initialize the client for cloud inference
    client = InferenceHTTPClient(
        api_url="https://detect.roboflow.com", # Or https://infer.roboflow.com for multi-model workflows
        api_key=api_key
    )

    # Example image (replace with a real image path or URL)
    image_url = "https://i.ibb.co/L5hY63C/roboflow-example.jpg"

    # Perform inference
    print(f"Performing inference on {image_url} using model {project_version}...")
    result = client.infer(
        image_url,  # accepts an image URL, local path, PIL image, or numpy array
        model_id=project_version,
        # Confidence and NMS overlap thresholds are not infer() kwargs; configure them via
        # client.configure(InferenceConfiguration(confidence_threshold=0.5, iou_threshold=0.3))
    )

    print("\nInference successful:")
    # infer() returns the parsed JSON response as a plain dict
    # import json; print(json.dumps(result, indent=2))

    # Accessing structured predictions
    predictions = result.get("predictions", []) if isinstance(result, dict) else []
    if predictions:
        print(f"Found {len(predictions)} predictions.")
        for i, pred in enumerate(predictions[:3]):  # Print details for first 3 predictions
            print(
                f"  Prediction {i+1}: Class='{pred['class']}', "
                f"Confidence={pred['confidence']:.2f}, "
                f"Box=({pred['x']},{pred['y']},{pred['width']},{pred['height']})"
            )
    else:
        print("No predictions found or unexpected result structure.")

except Exception as e:
    print(f"\nAn error occurred during inference: {e}")
    if "401: Unauthorized" in str(e) or "authentication" in str(e):
        print("HINT: Check your ROBOFLOW_API_KEY. It might be missing or invalid.")
    elif "404: Not Found" in str(e) and ("Model" in str(e) or "project" in str(e)):
        print("HINT: Check your ROBOFLOW_PROJECT_VERSION. The model might not exist or the version is wrong.")
    else:
        print("HINT: Refer to the Roboflow Inference documentation for troubleshooting.")
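Roboflow detection responses describe each box by its center point (`x`, `y`) plus `width` and `height`. When downstream code expects corner-format boxes, a small stdlib-only helper can convert and filter the predictions; this is an illustrative sketch, not part of inference-sdk, and the sample response values are made up:

```python
# Illustrative helper (not part of inference-sdk): converts Roboflow's
# center-based boxes (x, y = box center) to (x_min, y_min, x_max, y_max)
# corner format and drops predictions below a confidence threshold.
def to_corner_boxes(predictions, min_confidence=0.5):
    boxes = []
    for pred in predictions:
        if pred["confidence"] < min_confidence:
            continue
        half_w = pred["width"] / 2
        half_h = pred["height"] / 2
        boxes.append({
            "class": pred["class"],
            "confidence": pred["confidence"],
            "x_min": pred["x"] - half_w,
            "y_min": pred["y"] - half_h,
            "x_max": pred["x"] + half_w,
            "y_max": pred["y"] + half_h,
        })
    return boxes

# Sample dict shaped like a Roboflow detection response (values are invented).
sample = {
    "predictions": [
        {"class": "dog", "confidence": 0.91, "x": 100, "y": 80, "width": 40, "height": 60},
        {"class": "cat", "confidence": 0.30, "x": 10, "y": 10, "width": 8, "height": 8},
    ]
}
print(to_corner_boxes(sample["predictions"]))
```

With the sample above, the low-confidence "cat" prediction is filtered out and the "dog" box is returned as corners (80, 50) to (120, 110).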
