Roboflow Inference (GPU)

1.2.2 · active · verified Sun Apr 12

Roboflow Inference provides a robust framework for deploying computer vision models across various devices and environments without requiring deep machine learning expertise. This GPU-specific variant leverages CUDA for accelerated inference. Currently at version 1.2.2, the library maintains an active release cadence with frequent patches and minor updates.

Warnings

Install
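The Install section is empty in the source; assuming the standard PyPI package name for the GPU build (`inference-gpu`), installation would typically be:

```shell
# GPU build of Roboflow Inference (assumes a CUDA-compatible driver is installed)
pip install inference-gpu
```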

Imports

Quickstart

This quickstart demonstrates how to load a Roboflow model using `get_roboflow_model` and perform inference on an image. It highlights the use of environment variables for API keys and model identification, which is recommended for security and flexibility. Replace 'YOUR_API_KEY' and 'YOUR_PROJECT/YOUR_VERSION' with your actual credentials and model details.

import os
from inference import get_roboflow_model

# Set your Roboflow API key, project ID, and version
ROBOFLOW_API_KEY = os.environ.get('ROBOFLOW_API_KEY', 'YOUR_API_KEY')
ROBOFLOW_MODEL_ID = os.environ.get('ROBOFLOW_MODEL_ID', 'YOUR_PROJECT/YOUR_VERSION')

# Ensure API key is set
if ROBOFLOW_API_KEY == 'YOUR_API_KEY':
    print("Warning: ROBOFLOW_API_KEY not set. Using placeholder.")

# Load the model
print(f"Loading model: {ROBOFLOW_MODEL_ID}")
model = get_roboflow_model(model_id=ROBOFLOW_MODEL_ID, api_key=ROBOFLOW_API_KEY)

# Run inference on an image. Replace 'test_image.jpg' with a real image path,
# a URL, a PIL Image, or a numpy array. A sample image can be downloaded
# from Roboflow Universe for testing.
try:
    results = model.infer("test_image.jpg")
    print("Inference successful!")
    # print(results)
except FileNotFoundError:
    print("Error: test_image.jpg not found. Provide a valid image path for inference.")
except Exception as e:
    print(f"An error occurred during inference: {e}")

# You can also infer on a PIL Image or numpy array:
# from PIL import Image
# import numpy as np
# dummy_image = Image.new('RGB', (640, 480), color='red')
# results = model.infer(np.array(dummy_image))
# print("Inference successful with dummy image!")
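Object-detection predictions in Roboflow's JSON layout describe boxes by center point plus size (`x`, `y`, `width`, `height`, in pixels). A small helper, hypothetical and independent of the library, converts one such prediction to corner coordinates for drawing or cropping:

```python
def to_corners(pred):
    """Convert a center-based box dict (x, y, width, height) to (x1, y1, x2, y2)."""
    half_w, half_h = pred["width"] / 2, pred["height"] / 2
    return (pred["x"] - half_w, pred["y"] - half_h,
            pred["x"] + half_w, pred["y"] + half_h)

# Example with a mocked prediction in Roboflow's JSON layout
sample = {"x": 320.0, "y": 240.0, "width": 100.0, "height": 80.0,
          "class": "dog", "confidence": 0.91}
print(to_corners(sample))  # (270.0, 200.0, 370.0, 280.0)
```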
