One of the many announcements from I/O 2017 was TensorFlow Lite for machine learning on mobile devices. As of today, the Android- and iOS-optimized version of the ML library is available as a developer preview.
Google calls this Lite version an evolution of TensorFlow Mobile, and expects it to become the company's recommended solution for machine learning on mobile and embedded devices as it matures. Android engineering vice president Dave Burke notes that it is a “crucial step toward enabling hardware-accelerated neural network processing across Android’s diverse silicon ecosystem.”
Though still under “active development,” with a “large” scope planned over time, this initial release focuses on performance for a few key models. Namely, it ships optimized and pre-trained versions of:
- MobileNet: A class of vision models able to identify objects across 1,000 different classes, specifically designed for efficient execution on mobile and embedded devices
- Inception v3: An image recognition model, similar in functionality to MobileNet, that offers higher accuracy but also has a larger size
- Smart Reply: An on-device conversational model that provides one-touch replies to incoming conversational chat messages. First-party and third-party messaging apps use this feature on Android Wear.
TensorFlow Lite was redesigned from scratch to focus on three areas:
- Lightweight: Enables inference of on-device machine learning models with a small binary size and fast initialization/startup
- Cross-platform: A runtime designed to run on many different platforms, starting with Android and iOS
- Fast: Optimized for mobile devices, including dramatically improved model loading times, and supporting hardware acceleration
TensorFlow Lite also takes advantage of the fact that mobile devices increasingly feature “purpose-built custom hardware to process ML workloads more efficiently,” like the Pixel Visual Core. To that end, it supports the Android Neural Networks API introduced with the Android 8.1 developer preview. Even without accelerator hardware, TensorFlow Lite can fall back and run on the CPU.
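As a rough sketch of what this looks like from an app developer's perspective, the preview's Java API centers on an Interpreter class that runs a `.tflite` model. The snippet below is illustrative, not from the announcement: the class name, model file, and tensor shapes (MobileNet's 224×224 RGB input and 1,000 class scores) are assumptions, and the `setUseNNAPI` call assumes the preview's API for opting into Neural Networks API acceleration.

```java
import org.tensorflow.lite.Interpreter;

import java.io.File;

// Hypothetical wrapper around the TensorFlow Lite Java API.
// Names and shapes are illustrative, not taken from the article.
public class TfLiteClassifier {
    private final Interpreter interpreter;

    public TfLiteClassifier(File modelFile) {
        interpreter = new Interpreter(modelFile);
        // Request Android Neural Networks API acceleration (Android 8.1+).
        // When no accelerator hardware is present, TensorFlow Lite
        // falls back to running the model on the CPU.
        interpreter.setUseNNAPI(true);
    }

    // Run one inference: a 224x224 RGB image in, 1,000 class scores out
    // (the shapes a MobileNet image classifier typically expects).
    public float[][] classify(float[][][][] input) {
        float[][] output = new float[1][1000];
        interpreter.run(input, output);
        return output;
    }
}
```

On-device, the model file would typically be bundled as an app asset; the key point is that the same interpreter code runs whether or not the device has dedicated ML hardware.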