{"id":9187,"library":"pnnx","title":"PNNX","description":"PNNX (PyTorch to NCNN eXporter) is a Python command-line tool designed for PyTorch model interoperability, primarily converting PyTorch models into the NCNN deep learning inference framework format, or ONNX. It is maintained by Tencent and is part of the larger NCNN project. The current PyPI version is 20260409, with frequent updates often tied to the NCNN project's release cycle.","status":"active","version":"20260409","language":"en","source_language":"en","source_url":"https://github.com/Tencent/ncnn/tree/master/tools/pnnx","tags":["pytorch","ncnn","onnx","model conversion","deep learning","cli"],"install":[{"cmd":"pip install pnnx","lang":"bash","label":"Install pnnx"}],"dependencies":[{"reason":"Required for loading, tracing, and manipulating PyTorch models.","package":"torch","optional":false},{"reason":"Required if exporting models to ONNX format using the `--onnx` flag.","package":"onnx","optional":true}],"imports":[{"note":"While 'pnnx' can be imported as a Python package, its primary user interface is typically through the command line tool 'pnnx'.","symbol":"pnnx","correct":"import pnnx"}],"quickstart":{"code":"import torch\nimport os\n\n# 1. Define a simple PyTorch model\nclass MyModel(torch.nn.Module):\n    def __init__(self):\n        super(MyModel, self).__init__()\n        self.conv = torch.nn.Conv2d(3, 32, 3, padding=1)\n        self.relu = torch.nn.ReLU()\n\n    def forward(self, x):\n        return self.relu(self.conv(x))\n\nmodel = MyModel()\ndummy_input = torch.rand(1, 3, 64, 64)\n\n# 2. Trace the model using torch.jit.trace\n# This creates a TorchScript module which pnnx can convert.\ntraced_model = torch.jit.trace(model, dummy_input)\n\n# 3. 
Save the traced model to a file\nmodel_path = \"my_model.pt\"\ntraced_model.save(model_path)\n\nprint(f\"PyTorch model saved to {model_path}\")\nprint(\"\\nNow, open your terminal and run the following command to convert the model:\")\nprint(f\"pnnx {model_path} inputshape=[{','.join(map(str, dummy_input.shape))}]\")\nprint(\"\\nThis generates NCNN model files (e.g., my_model.ncnn.param and my_model.ncnn.bin) next to the input model, along with PNNX intermediate files.\")\n\n# pnnx can also produce an ONNX export; see `pnnx --help` for the available output options.\n","lang":"python","description":"This quickstart demonstrates how to define a simple PyTorch model, trace it with `torch.jit.trace`, save it, and then use the `pnnx` command-line tool to convert it to the NCNN format. Ensure you have `torch` installed alongside `pnnx`."},"warnings":[{"fix":"Refer to the GitHub README or `pnnx --help` for correct command-line usage. Programmatic use of the Python API is less documented.","message":"PNNX is primarily a command-line tool. While it can also be used as a Python package, most users will interact with it via the `pnnx` shell command for model conversion.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Ensure your PyTorch model is TorchScript-compatible. Use `torch.jit.script` for models with control flow, or refactor models to use static shapes and operations where possible. Thoroughly test the traced model before conversion.","message":"PNNX relies on `torch.jit.trace` or `torch.jit.script` for model ingestion. Models with dynamic control flow (e.g., `if` statements, dynamic loops) or shape-dependent operations may not be traced correctly and will fail conversion; `torch.jit.trace` records only a single execution path.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Manually install PyTorch in your environment via `pip install torch` (or `pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118` for CUDA-enabled builds) before using pnnx.","message":"The `pnnx` PyPI package does not list `torch` as a dependency, so `pip install pnnx` will not automatically install PyTorch. PyTorch is nevertheless essential for PNNX to function, since PNNX processes PyTorch models.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Check the GitHub repository's release notes for the exact dated version you use to understand changes and new features. Be aware that updates are frequent.","message":"PNNX's versioning scheme is date-based (e.g., `20260409`) rather than semantic (e.g., `1.0.0`), which can make tracking breaking changes or specific feature availability less intuitive.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Install PyTorch: `pip install torch` (or the appropriate command for your system/CUDA version).","cause":"PyTorch is not installed in the environment where pnnx is being run.","error":"ModuleNotFoundError: No module named 'torch'"},{"fix":"Ensure `pip install pnnx` completed successfully. Verify that your environment's Python scripts directory (e.g., `~/.local/bin` or `venv/bin`) is included in your system's PATH.","cause":"The 'pnnx' executable is not in your system's PATH, or pnnx was not installed correctly.","error":"pnnx: command not found"},{"fix":"Consult `pnnx --help` for the correct syntax and available options. Ensure `inputshape` is correctly formatted, e.g., `inputshape=[1,3,224,224]`.","cause":"Incorrect command-line arguments were passed to pnnx. Common mistakes include a missing `inputshape` argument or incorrectly formatted option values.","error":"Failed to parse input arguments"},{"fix":"Simplify your PyTorch model, remove dynamic elements, or ensure all operations are TorchScript-compatible. Consider using `torch.jit.script` for models with control flow, or debug the tracing process in PyTorch directly.","cause":"The PyTorch model provided to `torch.jit.trace` or to `pnnx` cannot be traced into a TorchScript graph. This often happens with models using dynamic control flow or unsupported operations.","error":"RuntimeError: Tracing a graph failed! Ensure the input to the trace is a Python function or a 'torch.nn.Module' and that the trace is a valid 'torch.jit.ScriptModule'."}]}