sne4onnx
A lightweight tool for extracting subgraphs (model slicing) from ONNX models. Useful when a model exceeds the 2 GB Protocol Buffers file size limit that breaks onnx-simplifier, or to split models into pieces of arbitrary size. Current version 2.0.1, released June 2024. Development is active.
pip install sne4onnx

Common errors
error AttributeError: module 'sne4onnx' has no attribute 'extraction' ↓
cause Incorrect import path; attempting to call extraction as a top-level attribute.
fix
Use `from sne4onnx import extraction`; the function is exported at the package top level, not as an attribute reachable through a bare `import sne4onnx`.
error KeyError: 'input_tensor_name' ↓
cause Providing tensor names instead of node (operation) names.
fix
Use node names (e.g., from model.graph.node[i].name). Convert tensor names to node names if necessary.
error onnx.onnx_cpp2py_export.checker.ValidationError: Model validation failed. ↓
cause Extracted subgraph may have dangling inputs or outputs that are not connected properly.
fix
Ensure input_op_names and output_op_names refer to nodes that form a valid subgraph. Use onnx.checker.check_model on the extracted model to debug.
Warnings
breaking In version 2.0.0, dependency on `onnx_graphsurgeon` was removed. Code that relied on internals of `onnx_graphsurgeon` or used sne4onnx together with it may break. ↓
fix Update to v2.0.0+ and remove any code that relies on sne4onnx pulling in `onnx_graphsurgeon`; it is no longer a dependency.
gotcha The extraction function expects operation (node) names, not tensor names. Providing tensor names may silently fail or produce unexpected subgraphs. ↓
fix Use node names (e.g., from model.graph.node[i].name) as input/output_op_names. Inspect the model's graph nodes to find correct names.
gotcha When using external data (large models with external data files), the library may corrupt the model if shape inference is run. Version 1.0.15 fixed this, but if you manually call `onnx.shape_inference.infer_shapes()` before extraction, you may encounter issues. ↓
fix Avoid calling `onnx.shape_inference.infer_shapes()` on models with external data before extraction. Let sne4onnx handle it.
gotcha The CLI `sne4onnx` arguments changed between versions. Short-form parameters were added in v1.0.10; long-form options (e.g. `--input_onnx_file_path`) continue to work. The v2.0.0 CLI is largely unchanged, but verify the syntax for your installed version. ↓
fix Use `sne4onnx -h` to see current CLI arguments. Prefer long-form options for scripts to avoid ambiguity.
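A long-form invocation along the lines of the fix above might look like this; the file paths and node names are placeholders, so confirm the exact flag names against `sne4onnx -h` for your installed version.

```shell
# Extract the subgraph bounded by two named nodes (names are placeholders)
sne4onnx \
  --input_onnx_file_path model.onnx \
  --input_op_names conv_0 \
  --output_op_names relu_5 \
  --output_onnx_file_path extracted.onnx
```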
Imports
- sne4onnx
import sne4onnx
Quickstart
import onnx
from sne4onnx import extraction
# Load an ONNX model
model = onnx.load('model.onnx')
# Define input and output operation (node) names for the subgraph
input_op_names = ['input_node']    # node names, not tensor names
output_op_names = ['output_node']
# Extract the subgraph
extracted = extraction(
    input_op_names=input_op_names,
    output_op_names=output_op_names,
    onnx_graph=model,
)
# Save the extracted model
onnx.save(extracted, 'extracted.onnx')