Laurence Moroney explains TensorFlow Lite:
TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size. TensorFlow Lite also supports hardware acceleration with the Android Neural Networks API.
It’s designed to be low-latency, with optimized kernels for mobile apps, pre-fused activations and much more. It’s also *really* easy to use, and there’s a great demo app that will get you up and running with image classification from the device camera on both Android and iOS.
It comes in two parts:
- A set of tools that you can use to prepare your models for use on mobile. These let you freeze your model to make it smaller, and then optimize and convert it, in a process also called flattening the model, so that it will run happily on mobile (sketched in code after this list).
- A mobile runtime with an easy API that lets you pass data to the model and get classifications back.
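Here is a minimal sketch of the tooling side in Python. Modern TensorFlow 2 folds the freeze/optimize/convert steps into the `tf.lite.TFLiteConverter` API, so a single call produces the flattened `.tflite` file; the MobileNetV2 model, the output filename, and the quantization flag below are illustrative assumptions, not prescriptions from the original text.

```python
# Minimal sketch: convert a trained Keras model into the compact
# TFLite FlatBuffer format ("flattening" the model).
# MobileNetV2 is an illustrative stand-in for your own trained model.
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optional: shrink the model further via default optimizations
# (e.g. weight quantization).
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("mobilenet_v2.tflite", "wb") as f:
    f.write(tflite_model)
```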
You don’t build the neural network on a phone, but the fact that you can run one on your phone is pretty crazy.
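Running one looks roughly like this. On Android or iOS you would drive the interpreter from Java/Kotlin or Swift, but the Python `tf.lite.Interpreter` below mirrors the same pass-data-in, get-classifications-back flow; the model filename and the dummy input frame are illustrative assumptions.

```python
# Minimal sketch of the runtime API: load a .tflite model, pass data
# in, and read classifications back. The random array stands in for a
# real camera frame.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="mobilenet_v2.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Shape the input to whatever the model expects (for MobileNetV2,
# [1, 224, 224, 3] float32).
frame = np.random.rand(*input_details[0]["shape"]).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

scores = interpreter.get_tensor(output_details[0]["index"])
print("Predicted class index:", int(np.argmax(scores)))
```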