Introducing TFLite Parser Python Package

I have been maintaining the TFLite parser Python package since September 2019. With it, people can parse TensorFlow Lite (TFLite) models (*.tflite) with one single import tflite. This article introduces the idea behind it.

Why A Parser Package

TFLite models are represented in the FlatBuffers format, which can be seen as a simplified, high-performance version of Protocol Buffers designed for mobile devices.

To construct or read such models (which are typical FlatBuffers files), a program needs the helper classes (C++/Java/C#/Go/Python…) generated by the FlatBuffers compiler (flatc) from a schema file. As the core functionality of TFLite is written in C++, the TensorFlow package doesn't ship with interfaces to parse TFLite models (*.tflite); instead, TFLite interprets the models and runs inference itself.
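Even without the generated helpers, the FlatBuffers layout makes a *.tflite file easy to recognize: the first four bytes hold the little-endian offset to the root table, and bytes 4–8 carry the file identifier, which the TFLite schema declares as "TFL3". A minimal sketch (the function name and the fabricated header are made up for illustration, not a real model):

```python
def looks_like_tflite(buf: bytes) -> bool:
    """Check the FlatBuffers file identifier of a serialized TFLite model.

    Bytes 0-4 are the little-endian offset to the root table; the TFLite
    schema declares file_identifier "TFL3", stored at bytes 4-8.
    """
    return len(buf) >= 8 and buf[4:8] == b"TFL3"

# A fabricated 8-byte header just to exercise the check (not a real model).
fake_header = (16).to_bytes(4, "little") + b"TFL3"
print(looks_like_tflite(fake_header))               # True
print(looks_like_tflite(b"\x00\x00\x00\x00GLTF"))   # False
```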

There are many questions asking how to parse TFLite models in Python:

The answer is always the same: generate the parser package from the schema file yourself. So why not create one once and share it with everyone?

How It Works

Well, the mechanism is trivial. As I'd like to maintain a package that can parse models generated by different versions of TensorFlow, the tflite package is versioned just like TensorFlow.

For a given *.tflite model, install the matching package from the version mapping table with a single command like the ones below, and you can play around with it without the tedious building process. That's it!

pip install tensorflow==2.1.0
pip install tflite==2.1.0
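The matching rule can be sketched as a tiny helper (a hypothetical function; the authoritative pairing is the version mapping table on the project site — this sketch simply assumes the tflite package mirrors TensorFlow's major.minor.patch numbering, as in the commands above):

```python
import re

def matching_tflite_version(tf_version: str) -> str:
    """Return the tflite parser version matching a TensorFlow version.

    Hypothetical helper: assumes the parser follows TensorFlow's
    major.minor.patch numbering, dropping any pre-release suffix.
    Consult the project's version mapping table for the real pairing.
    """
    m = re.match(r"(\d+)\.(\d+)\.(\d+)", tf_version)
    if not m:
        raise ValueError(f"unrecognized TensorFlow version: {tf_version!r}")
    return ".".join(m.groups())

print(matching_tflite_version("2.1.0"))      # 2.1.0
print(matching_tflite_version("1.15.0rc2"))  # 1.15.0
```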

I’d like to keep the package simple, such that parsing TFLite is its only target.

For more usage details, please visit the project site and the documentation site.

Make It Better

Actually, before I created this package, people had already shared similar packages, such as this one. As anyone can easily create such a package, why should I introduce another?

Firstly, people did not put effort into maintaining those packages, but left them as newly created. Such a package usually requires several setup steps before use: downloading it from GitHub or elsewhere, configuring PYTHONPATH, or building it from source. I found that annoying, because you need to search for the manuals.

Secondly, these packages are not user-friendly. As the Python source is generated by the FlatBuffers compiler, the modules and classes are not well structured, so we may need many imports just to parse a simple convolution operation. Below is a code segment from TVM. That is frustrating.

def convert_conv(self, op, conv_type):
    """convolution implementation."""
    try:
        from tflite.BuiltinOptions import BuiltinOptions
        from tflite.ActivationFunctionType import ActivationFunctionType
        from tflite.TensorType import TensorType
        from tflite.Operator import Operator
        from tflite.Conv2DOptions import Conv2DOptions
        from tflite.DepthwiseConv2DOptions import DepthwiseConv2DOptions
        from tflite.Padding import Padding
    except ImportError:
        raise ImportError("The tflite package must be installed")

    assert isinstance(op, Operator)
    input_tensors = self.get_input_tensors(op)
    assert len(input_tensors) >= 2, "input tensors length should be >= 2"

This package introduces an easy approach: start parsing with one single import tflite, as a Python package should allow. We achieve this by importing every class and function from all modules - simple but useful. Check the MobileNet parsing example, which demonstrates the essentials of parsing a TFLite model.
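The packaging trick itself can be sketched in a few lines (using toy module and class names, not the real tflite layout): flatc emits one module per class, and the package's __init__ re-exports every symbol to package level, so one import is enough.

```python
import sys
import tempfile
from pathlib import Path

# Build a toy package on disk that mimics flatc's output layout:
# one module per generated class.
root = Path(tempfile.mkdtemp())
pkg = root / "toyflite"
pkg.mkdir()
(pkg / "Model.py").write_text("class Model:\n    pass\n")
(pkg / "Operator.py").write_text("class Operator:\n    pass\n")

# The one-import trick: __init__.py pulls each class up to package level.
(pkg / "__init__.py").write_text(
    "from .Model import Model\n"
    "from .Operator import Operator\n"
)

sys.path.insert(0, str(root))
import toyflite  # a single import now exposes every class

print(hasattr(toyflite, "Model"), hasattr(toyflite, "Operator"))  # True True
```

With this layout, users write toyflite.Model and toyflite.Operator directly instead of one import per class, which is exactly what makes the TVM segment above so verbose.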

The tools used to build the package have also been upgraded, so that we can manage it fairly easily with limited human resources.

Thirdly, the package is released via PyPI, where users can install it via pip, while the others require a manual setup process.

Any updates will be posted at the project home. Enjoy hacking!

黎明灰烬

Creative Commons License