tflite2onnx - Convert TensorFlow Lite models to ONNX
tflite2onnx converts TensorFlow Lite (TFLite) models (`*.tflite`) to ONNX models (`*.onnx`), with data layout and quantization semantics properly handled (check the introduction blog for details).
If you'd like to convert a TensorFlow model (frozen graph, SavedModel, or whatever) to ONNX, try `tf2onnx`. Alternatively, you can first convert it to a TFLite (`*.tflite`) model, and then convert the TFLite model to ONNX.
Microsoft implemented another TensorFlow Lite to ONNX model converter in `tf2onnx` in Feb 2021 (we open sourced `tflite2onnx` in May 2020). `tf2onnx` seems to be able to convert quantized models just as we do, and it also seems able to convert RNN networks, which we don't support yet. Please try `tf2onnx` if `tflite2onnx` is missing any functionality you need.
Install via pip:

```shell
pip install tflite2onnx
```
Or install from source to get the latest features (please try it out in a virtualenv):

- Download the repo: `git clone https://github.com/jackwish/tflite2onnx.git && cd tflite2onnx`
- Build the package:
- Install the built package: `pip install assets/dist/tflite2onnx-*.whl`
Or you can just add the code tree to your `$PYTHONPATH`. (The command line tool is not available in this mode.)
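A minimal sketch of the `$PYTHONPATH` approach, assuming the repo was cloned into the current directory as `tflite2onnx` (adjust the path to match your checkout):

```shell
# Make an un-installed checkout importable by prepending it to PYTHONPATH.
# Assumes the clone lives at ./tflite2onnx (adjust as needed).
export PYTHONPATH="$PWD/tflite2onnx:$PYTHONPATH"
echo "$PYTHONPATH"
```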
```python
import tflite2onnx

tflite_path = '/path/to/original/tflite/model'
onnx_path = '/path/to/save/converted/onnx/model'
tflite2onnx.convert(tflite_path, onnx_path)
```
`tflite2onnx` now supports explicit layout; check the API documentation for details.
```shell
tflite2onnx /path/to/original/tflite/model /path/to/save/converted/onnx/model
```
- Release note
- Contribution guide
- Introduction blog - the background, design and implementation
- How to enable a new operator
- Data layout semantic
- If something seems wrong to you, report bugs.
- If some operators are not supported yet, you may request a new operator.
- It would be great if you could help enable new operators; see How to enable a new operator to get started.
- Feel free to open any other related discussions.
Check contribution guide for more.
Apache License Version 2.0.