tflite2onnx - Convert TensorFlow Lite models to ONNX


tflite2onnx converts TensorFlow Lite (TFLite) models (*.tflite) to ONNX models (*.onnx), with data layout and quantization semantics properly handled (see the introduction blog for details).

If you’d like to convert a TensorFlow model (frozen graph *.pb, SavedModel, or whatever) to ONNX, try tf2onnx. Alternatively, you can first convert it to a TFLite (*.tflite) model and then convert the TFLite model to ONNX, as sketched below.
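For that two-step route, a minimal sketch might look like the following, assuming a SavedModel as the starting point and placeholder paths throughout:

import tensorflow as tf
import tflite2onnx

# Convert the TensorFlow SavedModel to TFLite first.
converter = tf.lite.TFLiteConverter.from_saved_model('/path/to/saved_model')
with open('/path/to/model.tflite', 'wb') as f:
    f.write(converter.convert())

# Then convert the TFLite model to ONNX.
tflite2onnx.convert('/path/to/model.tflite', '/path/to/model.onnx')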

Usage

Install via pip: pip install tflite2onnx. After installation, you may try either the Python interface or the command line tool.

Python interface

import tflite2onnx

tflite_path = '/path/to/original/tflite/model'
onnx_path = '/path/to/save/converted/onnx/model'

# Read the TFLite model and write the converted ONNX model to onnx_path.
tflite2onnx.convert(tflite_path, onnx_path)

tflite2onnx now supports explicit layout; check the test example.
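A hedged sketch of explicit layout usage, assuming convert() accepts an optional mapping from tensor name to a (source, target) layout pair; the tensor name below is hypothetical and the exact argument format is shown in the test example:

import tflite2onnx

tflite_path = '/path/to/original/tflite/model'
onnx_path = '/path/to/save/converted/onnx/model'

# Hypothetical tensor name mapped to its source and target layouts;
# see the test example for the exact argument format.
explicit_layouts = {'input': ['NHWC', 'NCHW']}

tflite2onnx.convert(tflite_path, onnx_path, explicit_layouts)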

Command line

tflite2onnx /path/to/original/tflite/model /path/to/save/converted/onnx/model

Documentation

Contributing

License

Apache License Version 2.0.