Treelite Runtime
treelite-runtime 3.9.1 · verified Mon Apr 27
Treelite runtime is the lightweight inference engine for tree models compiled by Treelite. It loads a compiled shared library (.so on Linux, .dll on Windows) and runs predictions. The current version on PyPI is 3.9.1; note that the Treelite compiler on GitHub has advanced to 4.x, so treelite-runtime may lag behind it.
Install
pip install treelite-runtime
Common errors
error ModuleNotFoundError: No module named 'treelite_runtime'
cause The package 'treelite-runtime' is not installed.
fix
Run: pip install treelite-runtime
error FileNotFoundError: [Errno 2] No such file or directory: 'mymodel.so'
cause The compiled model file path is incorrect or not generated.
fix
Ensure the .so file exists and the path is correct. Generate it with the Treelite compiler, e.g. model.export_lib(toolchain='gcc', libpath='mymodel.so')
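A minimal sketch of the compile step, assuming treelite 3.x and an XGBoost model saved as JSON; the file paths and the helper name build_shared_library are hypothetical, and the import is deferred so the sketch reads even without treelite installed:

```python
def build_shared_library(model_file: str, libpath: str = "./mymodel.so") -> str:
    """Compile a saved XGBoost JSON model into a Treelite shared library.

    Sketch under treelite 3.x assumptions; 'model_file' and 'libpath'
    are hypothetical paths supplied by the caller.
    """
    import treelite  # deferred so this sketch can be read without treelite installed

    model = treelite.Model.load(model_file, model_format="xgboost_json")
    # toolchain must name a compiler available on this host (gcc/clang/msvc)
    model.export_lib(toolchain="gcc", libpath=libpath, params={"parallel_comp": 4})
    return libpath
```

The resulting libpath is what you later hand to Predictor.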
error OSError: /path/to/mymodel.so: cannot open shared object file
cause The shared library is missing dependencies or architecture mismatch.
fix
Check platform compatibility (compile on the same OS and architecture as the deployment host). Run ldd mymodel.so to verify that all dependencies resolve.
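This load error can be reproduced and inspected from Python with ctypes; the helper below is a generic dlopen check, not part of the treelite API, and roughly mirrors what happens when Predictor first opens the .so file:

```python
import ctypes


def try_load(libpath: str) -> str:
    """Attempt to open a shared library and return a diagnostic string.

    Surfaces the same OSError (with the loader's message) that a
    Predictor constructor would hit on a broken or mismatched .so file.
    """
    try:
        ctypes.CDLL(libpath)
        return "ok"
    except OSError as exc:
        return f"failed: {exc}"


print(try_load("./mymodel.so"))
```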
Warnings
gotcha The treelite-runtime package version (3.9.1) is older than the latest treelite compiler (4.x). Ensure compatibility between compiler and runtime versions.
fix Use matching versions: treelite==3.9.1 and treelite-runtime==3.9.1, or upgrade both to 4.x when treelite-runtime updates.
gotcha Predictor expects a compiled model file (e.g., .so, .dll), not a Treelite model object. Trying to pass a treelite.Model object directly raises TypeError.
fix Compile the model first using treelite compiler, then pass the file path to Predictor.
gotcha The Predictor class is not thread-safe. Calling predict concurrently may cause data races.
fix Create separate Predictor instances per thread, or use locks.
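The lock option can be sketched with a thin wrapper; LockedPredictor is a hypothetical name, and it works with any object exposing predict(), not just treelite_runtime.Predictor:

```python
import threading


class LockedPredictor:
    """Serialize predict() calls on a shared predictor with a lock.

    For higher throughput, prefer one Predictor per thread (e.g. stored
    in threading.local()) instead of funnelling all calls through a lock.
    """

    def __init__(self, predictor):
        self._predictor = predictor
        self._lock = threading.Lock()

    def predict(self, batch):
        with self._lock:  # only one thread runs predict at a time
            return self._predictor.predict(batch)
```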
Imports
- TreeliteRuntimeError
wrong: from treelite import TreeliteRuntimeError
correct: from treelite_runtime import TreeliteRuntimeError
- Predictor
from treelite_runtime import Predictor
Quickstart
from treelite_runtime import Predictor, DMatrix
import numpy as np

# Requires a compiled shared library generated by the treelite compiler
predictor = Predictor('./mymodel.so', nthread=1)

# Inputs must be wrapped in a DMatrix before calling predict
batch = DMatrix(np.random.rand(5, predictor.num_feature))
result = predictor.predict(batch)
print(result)