onnx-connx is a tool that provides:
- an ONNX model to CONNX model compiler
- an ONNX backend using the CONNX engine
pip install git+https://github.com/tsnlab/onnx-connx
python -m onnx_connx --help  # to get the help message
python -m onnx_connx [onnx model] [output dir]  # to convert an ONNX model to CONNX
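Before converting, it may help to validate the input model with the standard onnx Python API. A minimal sketch (the path `model.onnx` is a placeholder):

```python
import onnx

# Load the ONNX model and validate it before feeding it to the converter.
model = onnx.load('model.onnx')  # placeholder path
onnx.checker.check_model(model)

# Optional: print a readable summary of the graph.
print(onnx.helper.printable_graph(model.graph))
```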
- python3
- protobuf-compiler # to run bin/dump utility
- poetry
$ sudo apt install python3 python3-pip
$ poetry install
There are two ways to dump an onnx or pb file to text.
onnx-connx$ bin/dump [onnx path]  # dumps the onnx or pb file to text using protoc
onnx-connx$ onnx2connx -d [onnx path]  # dumps the onnx or pb file to text using onnx_connx
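If you prefer to stay in Python, a roughly equivalent text dump of a .onnx model can be produced with the onnx package (a sketch, not one of the utilities above):

```python
import sys
import onnx

# Load the .onnx model and print its protobuf text form,
# similar to what the dump utilities above produce.
model = onnx.load(sys.argv[1])
print(model)
```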
onnx-connx$ cp [connx binary path] onnx_connx/connx
onnx-connx$ pytest
onnx-connx$ cp [connx binary path] onnx_connx/connx
onnx-connx$ bin/test [onnx version]
| ONNX Version | Passed test cases |
|---|---|
| 1.5.0 | 217 |
| 1.6.0 | 243 |
| 1.7.0 | 259 |
| 1.8.1 | 260 |
| 1.9.0 | 268 |
| 1.10.2 | 278 |
| 1.11.0 | 278 |
# First, Install onnx-connx
pip install -e .
# Run benchmark for specific operator test case
bin/benchmark-operator xor
# Results are stored in "results" directory (by default)
If you want to use a specific version of connx, you need to install it separately.
onnx-connx$ git clone https://github.com/tsnlab/connx && cd connx
onnx-connx/connx$ pip install .
Or you can install connx from git directly using pip.
$ pip install git+https://github.com/tsnlab/connx
The connx backend compiles the ONNX model to CONNX and runs it with the connx engine.
The connx binary must be in the onnx_connx package directory, in the current directory, or in a directory listed in the PATH environment variable.
connx-run [onnx model] [[input tensor] ...]
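The backend can also be used from Python through the standard ONNX backend API. A minimal sketch, assuming the backend class is exposed as `onnx_connx.backend.Backend` (check the package for the exact module path) and using the MNIST model from the example below:

```python
import numpy as np
import onnx
from onnx_connx.backend import Backend  # assumed module path

# Compile the ONNX model to CONNX and prepare it for execution.
model = onnx.load('examples/mnist/model.onnx')
rep = Backend.prepare(model)

# Run one dummy input; the 1x1x28x28 float32 shape is an assumption about the MNIST model.
x = np.zeros((1, 1, 28, 28), dtype=np.float32)
outputs = rep.run([x])
print(outputs[0].shape)
```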
onnx-connx$ cd examples
onnx-connx/examples$ ./download.sh
onnx-connx/examples$ cd ..
onnx-connx$ connx-run examples/mnist/model.onnx examples/mnist/input_0.pb
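The input_0.pb file is expected to be a serialized onnx TensorProto (the format used for ONNX test data); a small sketch for inspecting it from Python:

```python
import onnx
from onnx import numpy_helper

# Parse the serialized TensorProto and convert it to a numpy array.
tensor = onnx.TensorProto()
with open('examples/mnist/input_0.pb', 'rb') as f:
    tensor.ParseFromString(f.read())

array = numpy_helper.to_array(tensor)
print(tensor.name, array.dtype, array.shape)
```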
See CONTRIBUTING.md
CONNX is licensed under GPLv3. See LICENSE