NudeNet

v3.4.2 · verified Fri May 01 · python

A lightweight deep-learning library for Python that detects nudity in images. As of version 3.4.2 (model weights are versioned and updated separately), the package can label images as 'safe' or 'unsafe' and supports both CPU and GPU inference. The v3 series introduced a new detector API with breaking import changes. Releases are intermittent, with version bumps tied to model-weight updates.

pip install nudenet
error ModuleNotFoundError: No module named 'nudenet'
cause Library not installed, or installed into a different Python environment than the one running the script.
fix
Run pip install nudenet (or python -m pip install nudenet to target the active interpreter). If behind a proxy, use pip install nudenet --proxy=....
error AttributeError: module 'nudenet' has no attribute 'Detector'
cause Accessing the v3 Detector class on a v2 installation, which only provides NudeDetector (or the reverse mismatch).
fix
Check the installed version with pip show nudenet. For v2 use from nudenet import NudeDetector; for v3 use from nudenet import Detector.
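Where code must run against either major version, a version-agnostic import helps. A minimal compatibility shim, assuming the v2/v3 class split described above:

try:
    from nudenet import Detector  # v3 class name per this document
except ImportError:
    # fall back to the v2 class under the same local name
    from nudenet import NudeDetector as Detector

Note that the two classes expose different methods (classify() in v2, detect() in v3 per the breaking changes below), so call sites still need version-aware handling.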
error OSError: [Errno 2] No such file or directory: '.../.nudenet/nudenet.onnx'
cause Model weights not downloaded.
fix
Initialize the detector, which triggers the download, or manually fetch the weights from https://github.com/notAI-tech/nudenet/releases and place them in ~/.nudenet/.
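In offline or locked-down environments, it can help to verify the cache before constructing the detector. A minimal check, assuming the default cache path from the error above:

import os

weights = os.path.expanduser('~/.nudenet/nudenet.onnx')
if not os.path.exists(weights):
    raise SystemExit('weights missing: download them from the GitHub '
                     'releases page and place the file at ' + weights)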
error ImportError: cannot import name 'NudeDetector' from 'nudenet'
cause Using the v2 import (NudeDetector) against a v3 installation, which only exports Detector.
fix
Use from nudenet import Detector for v3. If you need the v2 API, pin the old release: pip install nudenet==2.0.9.
breaking Import path changed in v3: Use `from nudenet import Detector` instead of `from nudenet import NudeDetector`.
fix Change import to `from nudenet import Detector`. The NudeDetector class was removed.
breaking Method `classify()` replaced with `detect()` in v3. The new method returns a different data structure (list of dicts vs simple labels).
fix Use `detect()`. If you need a simple safe/unsafe verdict, call `detect()` and check whether any detection has a label starting with 'EXPOSED_' (see the sketch below). The old classify() returned {'safe': prob, 'unsafe': prob}.
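A migration sketch for the classify() to detect() change, assuming the v3 output format shown at the end of this document (the image path is a placeholder):

from nudenet import Detector

detector = Detector()
detections = detector.detect('photo.jpg')  # list of {'box', 'score', 'label'} dicts

# recover the old binary verdict from the new per-detection labels
is_unsafe = any(d['label'].startswith('EXPOSED_') for d in detections)
print('unsafe' if is_unsafe else 'safe')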
gotcha Weights are downloaded on first import/detector init. The download may fail in restricted environments or behind proxies.
fix Manually download the weights from the GitHub releases (e.g., 'v3.4-weights') and place them in the default cache directory (~/.nudenet/), or set the NUDENET_DIR environment variable to point at an alternative location.
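For pre-seeded deployments, the cache location can be overridden before import. A sketch assuming the NUDENET_DIR variable mentioned above; the directory path is hypothetical:

import os

# point the library at a pre-seeded weights directory (hypothetical path)
os.environ['NUDENET_DIR'] = '/opt/models/nudenet'

from nudenet import Detector  # import after setting the variable so it takes effect

detector = Detector()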
gotcha The default detector uses a model that only detects female breast exposure ('EXPOSED_BREAST_F'). It does not detect male nudity or other body parts in the default config.
fix Use a custom model if broader coverage is needed, and be aware of the default label scope; the library ships separate models for other categories, but they are not loaded by default.
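One way to confirm the scope in practice is to inspect which labels the default model actually emits on known test images. A minimal sketch, assuming the v3 API described in this document (the image path is a placeholder):

from nudenet import Detector

detector = Detector()
labels = {d['label'] for d in detector.detect('sample.jpg')}
print(labels)  # with the default model, expect at most {'EXPOSED_BREAST_F'}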
gotcha The package includes multiple model variants (e.g., 'v1', 'v2', 'v3'). Loading the wrong version can give poor accuracy.
fix Specify the model path explicitly during Detector initialization: `Detector(model_path='path/to/model.onnx')`.

Initialize the detector and run detection on an image. The detect method returns a list of detections with bounding boxes, scores, and labels.

from nudenet import Detector

# Initialize detector (auto-downloads weights)
detector = Detector()

# Run detection on an image
result = detector.detect('path/to/image.jpg')
print(result)
# Example output: [{'box': [0.0, 0.0, 100, 100], 'score': 0.99, 'label': 'EXPOSED_BREAST_F'}]
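
A short follow-up that filters detections by confidence, using the output format shown above; the 0.8 cutoff is an arbitrary example:

THRESHOLD = 0.8  # arbitrary example cutoff
flagged = [d for d in result if d['score'] >= THRESHOLD]
print('unsafe' if flagged else 'safe')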