ControlNet Auxiliary Models

0.0.10 · active · verified Thu Apr 16

ControlNet Auxiliary Models (controlnet-aux) packages lllyasviel's ControlNet annotators for installation from PyPI. It offers preprocessors (e.g., Canny, OpenPose, MiDaS) that generate hint images such as edge maps, depth maps, and pose skeletons, which are used to guide image generation with ControlNet models. The library is currently at version 0.0.10, released on May 8, 2025, with an irregular release cadence driven primarily by upstream ControlNet updates.

Install
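The package is published on PyPI as `controlnet-aux` (imported as `controlnet_aux`). A typical install looks like:

```shell
pip install controlnet-aux
```

Some annotators depend on extra backends that are not pulled in automatically; if a processor reports a missing module at load time, install that dependency separately.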

Quickstart

This quickstart demonstrates how to use the unified `Processor` class to load and apply a ControlNet annotator (e.g., OpenPose) to an input image. The `Processor` dynamically loads the necessary models based on the `processor_id` you specify.

import requests
from PIL import Image
from io import BytesIO
from controlnet_aux.processor import Processor

# Load an image from a URL
url = "https://huggingface.co/lllyasviel/sd-controlnet-openpose/resolve/main/images/pose.png"
response = requests.get(url, timeout=30)
response.raise_for_status()  # fail fast if the download did not succeed
img = Image.open(BytesIO(response.content)).convert("RGB")

# Instantiate the processor for a specific ControlNet annotator (e.g., 'openpose')
# Common processor_ids include: 'canny', 'depth_leres', 'openpose', 'scribble_hed', 'lineart_realistic', 'dwpose'
processor_id = 'openpose'
processor = Processor(processor_id)

# Process the image
processed_image = processor(img, to_pil=True)

# Display or save the processed image
# processed_image.show() # Uncomment to display
print(f"Image processed successfully using {processor_id}. Output image size: {processed_image.size}")
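To build intuition for what a "hint image" is without downloading any annotator weights, the sketch below approximates an edge hint using only Pillow. Note this is not controlnet_aux's Canny annotator (which wraps OpenCV's Canny detector); Pillow's `FIND_EDGES` filter is used here purely as a stand-in to show the shape of the output a ControlNet expects.

```python
# Conceptual sketch only: illustrates an "edge hint image" with Pillow,
# NOT the library's Canny annotator (which uses OpenCV's cv2.Canny).
from PIL import Image, ImageDraw, ImageFilter

# Build a simple test image containing a visible shape
img = Image.new("RGB", (64, 64), "black")
ImageDraw.Draw(img).rectangle([16, 16, 48, 48], fill="white")

# Edges are detected on a grayscale copy, then converted back to RGB,
# since ControlNet hint images are expected to be 3-channel
edges = img.convert("L").filter(ImageFilter.FIND_EDGES)
hint = edges.convert("RGB")

print(hint.size, hint.mode)  # (64, 64) RGB
```

The real annotators follow the same contract: a PIL (or array) image in, a same-geometry RGB hint image out, ready to pass to a ControlNet pipeline.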
