tflite2onnx - Convert TensorFlow Lite models to ONNX
tflite2onnx converts TensorFlow Lite (TFLite) models (*.tflite) to ONNX models (*.onnx), with data layout and quantization semantics properly handled (check the introduction blog for details).
If you’d like to convert a TensorFlow model (frozen graph, SavedModel, or whatever) to ONNX, try tf2onnx. Or, you can first convert it to a TFLite (*.tflite) model, and then convert the TFLite model to ONNX.
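If you arrive at a *.tflite file via such a pipeline, a quick sanity check can catch truncated or mislabeled files before conversion: TFLite models are FlatBuffers carrying the file identifier `TFL3` at byte offset 4. A minimal sketch (`looks_like_tflite` is an illustrative helper, not part of tflite2onnx):

```python
def looks_like_tflite(path):
    """Heuristic check: TFLite models are FlatBuffers whose file
    identifier 'TFL3' sits at byte offset 4 of the file."""
    with open(path, 'rb') as f:
        header = f.read(8)
    return len(header) == 8 and header[4:8] == b'TFL3'
```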
Install via pip:

pip install tflite2onnx

After installation, you can use either the Python API or the command-line tool. Python API example:

import tflite2onnx

tflite_path = '/path/to/original/tflite/model'
onnx_path = '/path/to/save/converted/onnx/model'

tflite2onnx.convert(tflite_path, onnx_path)
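The Python API lends itself to batch conversion. The sketch below converts every *.tflite file in a directory; `onnx_path_for` and `convert_all` are illustrative helper names, not part of the tflite2onnx API, and the code assumes the package is installed:

```python
from pathlib import Path

def onnx_path_for(tflite_path):
    """Derive an output path by swapping the .tflite suffix for .onnx."""
    return str(Path(tflite_path).with_suffix('.onnx'))

def convert_all(tflite_dir):
    """Convert every .tflite model under tflite_dir, writing each
    .onnx file next to its source model."""
    import tflite2onnx  # assumes: pip install tflite2onnx
    for model in sorted(Path(tflite_dir).glob('*.tflite')):
        tflite2onnx.convert(str(model), onnx_path_for(model))
```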
tflite2onnx now supports explicit layout; check the documentation for details.
Or, use the command-line tool:

tflite2onnx /path/to/original/tflite/model /path/to/save/converted/onnx/model
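The CLI can also be scripted from Python. This sketch wraps the invocation above with subprocess; `build_cmd` and `convert_via_cli` are hypothetical helper names of mine, and the input/output paths are assumed to be positional arguments as shown:

```python
import shutil
import subprocess

def build_cmd(tflite_path, onnx_path):
    # The CLI takes the input and output model paths as positional arguments.
    return ['tflite2onnx', tflite_path, onnx_path]

def convert_via_cli(tflite_path, onnx_path):
    """Run the tflite2onnx CLI, raising if it is missing or fails."""
    if shutil.which('tflite2onnx') is None:
        raise RuntimeError("tflite2onnx not found; install with 'pip install tflite2onnx'")
    subprocess.run(build_cmd(tflite_path, onnx_path), check=True)
```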
- Introduction blog - the background, design and implementation
- Release note
- Supported operators (ongoing status issue)
- How to enable a new operator
- If you think something is wrong, report bugs.
- If some operators are not supported yet, you may request a new operator.
- It would be great if you could help enable new operators; see How to enable a new operator to get started.
- Feel free to open discussions if you have any ideas to improve this tool.
Apache License Version 2.0.