{"id":5943,"library":"gliner","title":"GLiNER: Generalist & Lightweight Named Entity Recognition","description":"GLiNER is a Python library providing a Generalist and Lightweight Model for Named Entity Recognition (NER). It's built on a bidirectional transformer encoder (BERT-like) and can identify any entity type using zero-shot capabilities. It serves as an efficient alternative to traditional NER models and large language models (LLMs) for resource-constrained environments, offering competitive performance on CPUs and consumer hardware. The library currently stands at version 0.2.26 with frequent updates and an active development cadence.","status":"active","version":"0.2.26","language":"en","source_language":"en","source_url":"https://github.com/urchade/GLiNER","tags":["nlp","ner","named-entity-recognition","machine-learning","transformers","zero-shot","data-science","artificial-intelligence"],"install":[{"cmd":"pip install gliner","lang":"bash","label":"Install core library"},{"cmd":"pip install 'gliner[gpu]' # for GPU-backed ONNX runtime","lang":"bash","label":"Install with GPU support (optional)"}],"dependencies":[{"reason":"Required for model interaction","package":"huggingface_hub","optional":false},{"reason":"Core runtime for inference","package":"onnxruntime","optional":false},{"reason":"Tokenization","package":"sentencepiece","optional":false},{"reason":"Deep learning framework","package":"torch","optional":false},{"reason":"Progress bars","package":"tqdm","optional":false},{"reason":"Core transformer models, with version constraints","package":"transformers","optional":false},{"reason":"Optional, for GPU acceleration with ONNX","package":"onnxruntime-gpu","optional":true},{"reason":"Optional, for distributed training/inference","package":"accelerate","optional":true},{"reason":"Optional, for Chinese tokenization","package":"jieba3","optional":true}],"imports":[{"symbol":"GLiNER","correct":"from gliner import GLiNER"}],"quickstart":{"code":"from gliner import GLiNER\n\n# Initialize GLiNER with a pre-trained model\n# Available models: \"urchade/gliner_base\", \"urchade/gliner_small-v2.1\", \"urchade/gliner_medium-v2.1\", \"urchade/gliner_large-v2.1\"\nmodel = GLiNER.from_pretrained(\"urchade/gliner_medium-v2.1\")\n\ntext = \"\"\"Cristiano Ronaldo dos Santos Aveiro (Portuguese pronunciation: [kɾiʃˈtjɐnu ʁɔˈnaldu]; born 5 February 1985) is a Portuguese professional footballer who plays as a forward for and captains both Saudi Pro League club Al Nassr and the Portugal national team. Widely regarded as one of the greatest players of all time, Ronaldo has won five Ballon d'Or awards, a record three UEFA Men's Player of the Year Awards, and four European Golden Shoes, the most by a European player.\"\"\"\n\n# Define the entity labels you want to extract\nlabels = [\"person\", \"team\", \"organization\", \"location\", \"award\", \"nationality\", \"sport\", \"date\"]\n\n# Predict entities\nentities = model.predict_entities(text, labels)\n\n# Print the extracted entities\nfor entity in entities:\n    print(f\"Text: {entity['text']}, Label: {entity['label']}, Score: {entity['score']:.2f}\")","lang":"python","description":"This quickstart demonstrates how to load a pre-trained GLiNER model and use it to extract named entities from a given text based on a custom list of labels. The `predict_entities` method returns a list of dictionaries, each containing the extracted text, its predicted label, and a confidence score."},"warnings":[{"fix":"Carefully manage your Python environment with virtual environments or Conda. For complex dependency trees, consider containerization (e.g., Docker) or specialized tools like `sie-bundles` for isolating incompatible `transformers` versions.","message":"GLiNER requires `transformers>=4.51.3,<5`. This specific version constraint can lead to conflicts when integrating GLiNER into projects that use other machine learning libraries (e.g., `sentence-transformers`) which might require `transformers>=4.57` (or `transformers 5.x`).","severity":"gotcha","affected_versions":"0.2.23 and later"},{"fix":"If deploying to HuggingFace Inference Endpoints, pin your `gliner` version to `<0.2.23` (e.g., `gliner==0.2.22`). Alternatively, ensure your deployment setup explicitly supports `transformers 5.x` and a compatible `huggingface-inference-toolkit` version.","message":"Starting from `v0.2.23`, GLiNER's updated `transformers` dependency (`>=4.57.3`, pulling `transformers 5.x`) breaks deployment on HuggingFace Inference Endpoints due to incompatibility with the `huggingface-inference-toolkit`.","severity":"breaking","affected_versions":"0.2.23 and later"},{"fix":"It is strongly recommended to upgrade to `gliner>=0.2.25` to benefit from the fixes addressing these import and loading issues.","message":"Older versions of GLiNER (prior to `0.2.25`) contained bugs that could cause crashes related to `apex.amp` imports if `torch` was not installed with CUDA, or issues with ONNX model loading when a `torch` file was unexpectedly absent.","severity":"gotcha","affected_versions":"<0.2.25"}],"env_vars":null,"last_verified":"2026-04-14T00:00:00.000Z","next_check":"2026-07-13T00:00:00.000Z"}