AI Edge LiteRT Nightly

Version 2.2.0.dev20260430 · verified Fri May 01

Nightly build of the LiteRT runtime for on-device machine learning, targeting TensorFlow Lite models on mobile and edge devices. Currently at version 2.2.0.dev20260430.

pip install ai-edge-litert-nightly
error ModuleNotFoundError: No module named 'tflite_runtime'
cause tflite_runtime is the old package name; ai-edge-litert-nightly installs its module as ai_edge_litert.
fix
Install ai-edge-litert-nightly and change imports to from ai_edge_litert import ...
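Before changing imports, it can help to check which runtime is actually installed. A minimal sketch using only the standard library (the package/module names are those stated above):

```python
import importlib.util

def has_litert() -> bool:
    """True if the ai_edge_litert module (from ai-edge-litert-nightly) is importable."""
    return importlib.util.find_spec("ai_edge_litert") is not None

def has_old_tflite() -> bool:
    """True if the superseded tflite_runtime package is still installed."""
    return importlib.util.find_spec("tflite_runtime") is not None

if has_old_tflite() and not has_litert():
    # The old package alone will not work with this nightly; suggest the migration.
    print("Found tflite_runtime but not ai_edge_litert; run: pip install ai-edge-litert-nightly")
```

find_spec only probes for the module without importing it, so this check is safe to run even in environments where neither package is present.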
error ImportError: cannot import name 'Interpreter' from 'ai_edge_litert'
cause The API has changed; the Interpreter class has been renamed or replaced in this nightly.
fix
Use LiteRTModel instead: from ai_edge_litert import LiteRTModel
gotcha This is a nightly build; API may change without notice. Pin to a specific version for stability.
fix Use a stable release like ai-edge-litert (non-nightly) for production.
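If you do stay on the nightly, pinning to an exact build avoids surprise API changes. A requirements.txt fragment, assuming the version string shown above:

```
# requirements.txt — pin the nightly to a known-good build
ai-edge-litert-nightly==2.2.0.dev20260430
```

Upgrade the pin deliberately after testing, rather than tracking the latest nightly.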
deprecated The old tflite_runtime package is superseded; direct imports from tflite_runtime will not work with ai-edge-litert-nightly.
fix Migrate to ai_edge_litert imports, e.g., from ai_edge_litert import LiteRTModel.
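The rename between builds can be absorbed in one place. A hedged migration shim, assuming the class names noted above (LiteRTModel as the new name, Interpreter as the old one); it degrades gracefully when no runtime is installed:

```python
def resolve_model_class():
    """Return the best available LiteRT model class, or None.

    Tries the newer ai_edge_litert name first, then the older one.
    Both names follow the notes above and may differ between nightly
    builds; returns None when no runtime package is installed.
    """
    try:
        from ai_edge_litert import LiteRTModel  # newer nightly API (assumed)
        return LiteRTModel
    except ImportError:
        pass
    try:
        from ai_edge_litert import Interpreter  # older name, if still present
        return Interpreter
    except ImportError:
        return None

model_cls = resolve_model_class()
if model_cls is None:
    print("No LiteRT runtime found; install ai-edge-litert-nightly")
```

Call sites then use model_cls without caring which nightly build is installed.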

Basic model loading using the LiteRT API. Replace model.tflite with the path to your own model file.

from ai_edge_litert import LiteRTOptions, LiteRTModel

# Load a TFLite model from file
model_path = "model.tflite"  # placeholder; point this at your own model
options = LiteRTOptions()    # default runtime options
model = LiteRTModel(model_path, options)

# Run inference (simplified for illustration)
# input_data = ...
# output = model.run(input_data)
print("LiteRT model loaded successfully")