{"id":7464,"library":"onnxoptimizer","title":"ONNX Optimizer","description":"ONNX Optimizer provides a C++ library, with Python bindings, for performing arbitrary optimizations on ONNX models, offering a growing list of prepackaged optimization passes. It aims to share optimization work between the various ONNX backend implementations, making models more efficient for inference. The project is actively maintained, with minor releases typically every few months.","status":"active","version":"0.4.2","language":"en","source_language":"en","source_url":"https://github.com/onnx/optimizer","tags":["ONNX","optimizer","machine-learning","deep-learning"],"install":[{"cmd":"pip install onnxoptimizer","lang":"bash","label":"Install from PyPI"}],"dependencies":[{"reason":"Required for loading, manipulating, and saving the ONNX model files that onnxoptimizer operates on.","package":"onnx","optional":false},{"reason":"Required for building onnxoptimizer from source when pre-built wheels are not available for your system or architecture.","package":"protobuf","optional":true}],"imports":[{"note":"The ONNX optimizer was moved to its own package, 'onnxoptimizer', after ONNX version 1.9. Direct import from 'onnx' will fail.","wrong":"from onnx import optimizer","symbol":"optimize","correct":"import onnxoptimizer\noptimized_model = onnxoptimizer.optimize(model)"}],"quickstart":{"code":"import onnx\nimport onnxoptimizer\n\n# Create a dummy ONNX model for demonstration\n# In a real scenario, you would load an existing .onnx file\nfrom onnx import helper\nfrom onnx import TensorProto\n\ngraph_def = helper.make_graph(\n    [\n        helper.make_node(\"Add\", [\"input1\", \"input2\"], [\"output\"]), # Example operation\n    ],\n    \"simple_graph\",\n    [\n        helper.make_tensor_value_info(\"input1\", TensorProto.FLOAT, [1, 2]),\n        helper.make_tensor_value_info(\"input2\", TensorProto.FLOAT, [1, 2]),\n    ],\n    [\n        helper.make_tensor_value_info(\"output\", TensorProto.FLOAT, [1, 2]),\n    ],\n)\n\nmodel = helper.make_model(graph_def, producer_name='test-model')\n\n# Save the dummy model (optional, for verification)\nonnx.save(model, \"original_model.onnx\")\nprint(\"Original model saved to original_model.onnx\")\n\n# Optimize the model\n# You can specify passes, e.g., ['fuse_bn_into_conv']\n# By default, a set of common fusion and elimination passes is used.\noptimized_model = onnxoptimizer.optimize(model)\n\n# Save the optimized model\nonnx.save(optimized_model, \"optimized_model.onnx\")\nprint(\"Optimized model saved to optimized_model.onnx\")\n\n# Optional: verify the optimized model (requires onnxruntime)\ntry:\n    import onnxruntime as ort\n    sess_options = ort.SessionOptions()\n    _ = ort.InferenceSession(\"optimized_model.onnx\", sess_options)\n    print(\"Optimized model successfully loaded by ONNX Runtime.\")\nexcept ImportError:\n    print(\"onnxruntime not installed. Cannot verify optimized model.\")\nexcept Exception as e:\n    print(f\"Error loading optimized model with ONNX Runtime: {e}\")","lang":"python","description":"This quickstart creates a dummy ONNX model (in practice you would load an existing .onnx file), applies optimizations with `onnxoptimizer.optimize()`, and saves the resulting optimized model. It also includes an optional verification step using ONNX Runtime."},"warnings":[{"fix":"Install `onnxoptimizer` as a separate package (`pip install onnxoptimizer`) and update imports to `import onnxoptimizer` or `from onnxoptimizer import optimize`.","message":"The ONNX optimizer functionality was moved from the `onnx` package to the separate `onnxoptimizer` package. Direct imports like `from onnx import optimizer` fail for `onnx` versions 1.9.0 and later.","severity":"breaking","affected_versions":"onnx >= 1.9.0"},{"fix":"Ensure `protobuf` is installed before attempting to build from source (e.g., `pip install protobuf` or `sudo apt-get install libprotobuf-dev protobuf-compiler`).","message":"Building `onnxoptimizer` from source (e.g., when pre-built wheels are unavailable or for development) requires `protobuf` to be installed on your system.","severity":"gotcha","affected_versions":"All versions when building from source"},{"fix":"Inspect the optimized graph with a tool like Netron. If expected optimizations are not applied, consider simplifying the model or implementing custom passes.","message":"Some advanced or custom graph structures may not be optimized as expected; the optimizer relies on exact subgraph matching for certain transformations.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Ensure `cmake` is installed and properly configured, and that `protobuf` development files are available. For example, on Ubuntu: `sudo apt-get update && sudo apt-get install cmake build-essential libprotobuf-dev protobuf-compiler`.","cause":"This error typically occurs when installing from source (i.e., when `pip` builds the wheel) and the `cmake` build system or other required C++ build tools (such as the `protobuf` development headers) are missing from the system.","error":"subprocess.CalledProcessError: Command '['/usr/bin/cmake', '--build', '.', '--', '-j', '4']' returned non-zero exit status 2."},{"fix":"Install the `onnxoptimizer` package separately (`pip install onnxoptimizer`) and change your import statements from `from onnx import optimizer` to `import onnxoptimizer`.","cause":"Attempting to import the optimizer directly from the `onnx` package, which is no longer the correct path for `onnx` versions 1.9.0 and newer; the optimizer functionality was moved to the `onnxoptimizer` package.","error":"AttributeError: module 'onnx' has no attribute 'optimizer' or ImportError: cannot import name 'optimizer' from 'onnx'"},{"fix":"Check the expected input shapes of the optimized model via `onnx.load(\"optimized_model.onnx\").graph.input` and ensure your input data conforms to them. Use a tool like Netron to visually inspect the optimized graph.","cause":"Not an `onnxoptimizer` error itself, but a common issue when running optimized models with `onnxruntime`: the input shape passed to the runtime session does not match what the optimized model expects. This usually indicates a misunderstanding of the model's input shape after optimization, or a problem with how the model was optimized.","error":"[ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Invalid rank for input: float_input Got: 1 Expected: 2 Please fix either the inputs/outputs or the model."}]}