{"id":23535,"library":"detoxify","title":"Detoxify","description":"Detoxify is a Python library for detecting toxic comments using pre-trained transformer models. It provides a simple interface to score text along labels such as toxicity, severe toxicity, obscene, threat, insult, and identity attack. The latest version is 0.5.2. It requires Python >=3.7 and is built on Hugging Face transformers.","status":"active","version":"0.5.2","language":"python","source_language":"en","source_url":"https://github.com/unitaryai/detoxify","tags":["toxic-comment","moderation","nlp","transformers","pytorch","safety"],"install":[{"cmd":"pip install detoxify","lang":"bash","label":"pip install (without GPU)"},{"cmd":"pip install detoxify[gpu]","lang":"bash","label":"pip install (with GPU support)"}],"dependencies":[{"reason":"Core dependency for loading and running pre-trained models.","package":"transformers","optional":false},{"reason":"PyTorch backend for the transformer models; Detoxify is PyTorch-based and does not offer a TensorFlow backend.","package":"torch","optional":false}],"imports":[{"note":"Detoxify is a class inside the module, not the module itself.","wrong":"import Detoxify","symbol":"Detoxify","correct":"from detoxify import Detoxify"}],"quickstart":{"code":"from detoxify import Detoxify\n\n# Load the 'original' toxic-comment model\nmodel = Detoxify('original')\n\nresult = model.predict('This is a terrible, horrible example!')\nprint(result)\n# Example output: {'toxicity': 0.9, 'severe_toxicity': 0.1, ...}\n\n# predict() also accepts a list of strings for batch scoring,\n# returning a dict mapping each label to a list of scores\nresults = model.predict(['first comment', 'second comment'])","lang":"python","description":"Loads the 'original' toxic comment model, predicts toxicity scores for a sample sentence, and shows batch prediction over a list of sentences."},"warnings":[{"fix":"Specify a model name explicitly: Detoxify('unbiased')","message":"The default model 'original' may be outdated. For better performance, use the 'unbiased' or 'multilingual' models. See https://github.com/unitaryai/detoxify#available-models.","severity":"gotcha","affected_versions":">=0.1.0"},{"fix":"Apply your own threshold to the \"toxicity\" score (e.g. treat score >= 0.5 as toxic), and calibrate separately if you need true probabilities.","message":"The output scores are sigmoid outputs in [0, 1], not calibrated probabilities, and the library does not apply any classification threshold for you. Do not interpret them as calibrated probabilities.","severity":"gotcha","affected_versions":"all"},{"fix":"Instantiate Detoxify('original') once while online so the weights are cached locally, or load a pre-downloaded checkpoint via the checkpoint argument.","message":"First run will download the model weights (~500 MB), which may take time and requires an internet connection.","severity":"gotcha","affected_versions":"all"},{"fix":"Switch to Detoxify('unbiased') or Detoxify('multilingual').","message":"The 'original' model is the oldest checkpoint and can reproduce unintended identity bias from its training data; prefer 'unbiased' or 'multilingual' for new projects.","severity":"deprecated","affected_versions":">=0.4.0"}],"env_vars":null,"last_verified":"2026-05-01T00:00:00.000Z","next_check":"2026-07-30T00:00:00.000Z","problems":[{"fix":"Run 'pip install detoxify' in your environment.","cause":"The package is not installed or the module name is misspelled.","error":"ModuleNotFoundError: No module named 'detoxify'"},{"fix":"Ensure internet is available on first run, or instantiate Detoxify('original') once while connected so the weights are cached for offline use.","cause":"The model is not cached locally and there is no internet connection to download it.","error":"OSError: Can't load the model 'original'. File not found or no internet."},{"fix":"Use 'from detoxify import Detoxify' to import the class.","cause":"Incorrect import pattern: used 'import Detoxify' instead of 'from detoxify import Detoxify'.","error":"ModuleNotFoundError: No module named 'Detoxify'"}],"ecosystem":"pypi","meta_description":null,"install_score":null,"install_tag":null,"quickstart_score":null,"quickstart_tag":null}