ONNX Runtime JavaScript Common API
onnxruntime-common is the foundational JavaScript API layer for the ONNX Runtime ecosystem, providing the shared types, interfaces, and core functionality used across platforms. It is consumed indirectly through platform-specific packages such as `onnxruntime-node`, `onnxruntime-web`, and `onnxruntime-react-native`, rather than installed directly by end-user applications. The current stable version is 1.24.3; the 1.24.x line has seen frequent patch releases covering bug fixes, security enhancements, and execution provider updates. Its primary value is a consistent programming model across JavaScript environments: platform-specific implementations are abstracted away, keeping machine learning inference code interoperable.
Common errors
- `Module not found: Can't resolve 'onnxruntime-common' in '...'` or similar import errors
  - Cause: importing `onnxruntime-common` directly without installing it, or, more commonly, using it as the primary entry point for ONNX Runtime functionality.
  - Fix: install the correct platform-specific package (e.g., `npm install onnxruntime-node`) and adjust your imports to use that package directly (e.g., `import * as ort from 'onnxruntime-node';`).
- `TypeError: ort.InferenceSession.create is not a function`
  - Cause: calling static methods such as `create` on the `InferenceSession` symbol imported from `onnxruntime-common`. The common package defines interfaces and abstract types, not concrete implementations with static factory methods.
  - Fix: import `InferenceSession` from the platform-specific package and call its `create` implementation (e.g., `import * as ort from 'onnxruntime-node'; const session = await ort.InferenceSession.create('model.onnx');`).
- `Error: ONNX Runtime requires an execution provider. Please ensure you have enabled at least one.`
  - Cause: not raised by `onnxruntime-common` itself, but propagated from platform-specific packages when no valid execution provider (e.g., WASM, WebGPU, CPU, CUDA) is found or explicitly configured during session creation.
  - Fix: pass session options that explicitly list execution providers when creating an `InferenceSession`, for example `{ executionProviders: ['webgpu', 'wasm'] }` on the web or `{ executionProviders: ['cuda', 'cpu'] }` in Node.js.
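The fixes above share a pattern: create sessions through a platform package and state execution providers explicitly. A minimal sketch of assembling those options — the helper name is illustrative and not part of any ONNX Runtime API; the provider names come from the fixes above:

```typescript
// Sketch only: provider names ('cuda', 'cpu', 'webgpu', 'wasm') follow the
// fixes above; this helper is illustrative, not part of the ONNX Runtime API.
type Platform = 'node' | 'web';

interface SessionOptionsLike {
  executionProviders: string[];
}

function makeSessionOptions(platform: Platform): SessionOptionsLike {
  // CUDA with CPU fallback on Node.js; WebGPU with WASM fallback in the browser.
  return platform === 'node'
    ? { executionProviders: ['cuda', 'cpu'] }
    : { executionProviders: ['webgpu', 'wasm'] };
}

// In a real application, pass the options to the platform package, e.g.:
//   import * as ort from 'onnxruntime-node';
//   const session = await ort.InferenceSession.create('model.onnx', makeSessionOptions('node'));
console.log(makeSessionOptions('node').executionProviders); // ['cuda', 'cpu']
```

Listing a fallback provider (CPU or WASM) after the preferred one keeps session creation from failing outright when the faster provider is unavailable.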
Warnings
- breaking Python 3.10 wheels are no longer published for ONNX Runtime. Users must upgrade their Python environment to 3.11+ when using Python bindings. Python 3.14 support has been added.
- breaking x86_64 binaries for macOS/iOS are no longer provided, and the minimum macOS version supported for ONNX Runtime has been raised to 14.0.
- breaking ONNX Runtime GPU packages now explicitly require CUDA 12.x. Packages built for CUDA 11.x are no longer published, potentially breaking existing setups.
- breaking The minimum supported Windows version for ONNX Runtime has been raised to 10.0.19041 (Windows 10, version 20H1).
- gotcha This `onnxruntime-common` package is designed as a shared API layer and is not intended for direct use. End-user applications should install and depend on platform-specific packages.
- security Critical security vulnerabilities, including heap out-of-bounds read/write issues in `GatherCopyData` and `RoiAlign`, were fixed in version 1.24.3.
Install
- npm install onnxruntime-common
- yarn add onnxruntime-common
- pnpm add onnxruntime-common
Imports
- InferenceSession: `import { InferenceSession } from 'onnxruntime-common';`
- Tensor: `import { Tensor } from 'onnxruntime-common';`
- SessionOptions: not a top-level export; it is a type on the `InferenceSession` namespace: `import { InferenceSession } from 'onnxruntime-common'; type SessionOptions = InferenceSession.SessionOptions;`
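A `Tensor` pairs a typed data array with a `dims` shape, and construction fails when the data length does not match the shape. A dependency-free sketch of that invariant — the function names here are illustrative, not part of the onnxruntime-common API; the real constructor call is shown in a comment:

```typescript
// Illustrative check mirroring the invariant a tensor constructor enforces:
// data.length must equal the product of dims. Not part of the real API.
function elementCount(dims: readonly number[]): number {
  return dims.reduce((size, d) => size * d, 1); // a scalar ([]) has 1 element
}

function checkTensorArgs(dims: readonly number[], data: Float32Array): void {
  if (elementCount(dims) !== data.length) {
    throw new Error(
      `data length ${data.length} does not match dims ${JSON.stringify(dims)}`
    );
  }
}

// A [2, 3] tensor needs exactly 6 elements; with onnxruntime-common installed,
// the equivalent real construction would be:
//   const t = new Tensor('float32', new Float32Array(6), [2, 3]);
checkTensorArgs([2, 3], new Float32Array(6)); // passes
```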