I have been planning to write a series of articles on Neural Network Quantization.
As we will see throughout the series, plenty of research and material has appeared on this topic in recent years, driven by the rapid evolution of Deep Learning. Meanwhile, industry keeps improving the quantization support in its products. However, in most documents you can find on Neural Network Quantization, the authors rush into the details of their own work so quickly that newcomers can hardly grasp even the basics. That is not a healthy state of affairs for such a fast-growing field.
Basically, this series is planned to discuss every aspect of Neural Network Quantization, and I will try to keep it as easy to understand as I can. Theory, arithmetic, mathematics, research, and implementation may all be addressed, so the content could be large, and the series may require significant time and effort to complete. I will try my best to present the topics in a sound logical order. So, be patient. :)
The series assumes that readers are familiar with Machine Learning, Neural Networks, and Deep Learning. If not, Andrew Ng’s Machine Learning course is a very good start. Google also provides a tremendous amount of material on many topics, mostly TensorFlow based. If you are interested in more academic or theoretical treatments of Deep Learning, try Goodfellow’s Deep Learning book and Fei-Fei Li’s Convolutional Neural Networks for Visual Recognition course.
This series will not focus on model design, which produces architectures such as MobileNet and ShuffleNet. Instead, we will pay attention to how the arithmetic actually happens in frameworks - so software engineers may find this series more interesting than algorithm engineers or researchers will.
Lastly, why am I writing in English? Well, in an active research field like Neural Networks, it is hard to find Chinese translations of terminology that meet the standard of faithfulness, expressiveness, and elegance (信雅达). People working in this field are usually able to read English papers without any trouble.
So this is the opening. Thanks for reading.