The Data Daily

Google enables machine learning on mobile with TensorFlow Lite - SiliconANGLE

Last updated: 02-11-2019

Google Inc. is launching a lightweight version of its open-source TensorFlow machine learning library for mobile platforms. Announced at Google’s I/O developer conference, TensorFlow Lite is available for both Android and iOS developers in preview.

TensorFlow is an open-source software library, first released by Google in 2015, that makes it easier for developers to design, build, and train deep learning models. It works by passing data through successive layers as part of the learning process. TensorFlow can be thought of as a kind of artificial brain through which complex data structures, or "tensors," flow. Google says this process is a central aspect of deep learning and can be used to enhance many technology products.

In a blog post, Google’s TensorFlow team said this new Lite version can be seen as an evolution of its TensorFlow Mobile application programming interface, and is now the company’s recommended solution for deploying machine learning models on mobile and embedded devices.

TensorFlow Lite is a “crucial step toward enabling hardware-accelerated neural network processing across Android’s diverse silicon ecosystem,” said Android engineering vice president Dave Burke.

Because it's still under development, TensorFlow Lite supports only a limited set of machine learning models at present, including MobileNet and Inception v3 for object identification with computer vision, and Smart Reply for natural language processing, which provides one-touch replies to incoming chat messages. It's also possible for developers to deploy custom models trained on their own datasets, Google said.
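The custom-model workflow described above can be sketched with TensorFlow's own tooling. This is a minimal, hedged example assuming the modern `tf.lite` API (which postdates the preview covered in this article): a tiny Keras model stands in for a real trained model such as MobileNet, gets converted to TensorFlow Lite's compact format, and is then run with the lightweight interpreter.

```python
import numpy as np
import tensorflow as tf

# A tiny stand-in model; in practice this would be a model like
# MobileNet or one trained on the developer's own dataset.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Convert the model to the compact FlatBuffer format TensorFlow Lite uses.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Run on-device-style inference with the lightweight interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.ones((1, 4), dtype=np.float32))
interpreter.invoke()
probs = interpreter.get_tensor(out["index"])  # softmax output, shape (1, 2)
```

On a phone, the same converted model file would be loaded by the Android or iOS interpreter rather than the Python one shown here.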

The company also said that more models and functionality will be added in the future according to users' needs.

The Internet giant added that TensorFlow Lite was rebuilt from scratch to make it as lightweight as possible, enabling on-device inference of machine learning models with a small binary size. It has also been "fast optimized" for mobile devices, with improved model loading times and support for hardware acceleration. Lastly, TensorFlow Lite takes advantage of "purpose-built custom hardware to process ML workloads more efficiently," Google said.
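The "as lightweight as possible" goal is visible in the converter itself. As a hedged illustration using the current `tf.lite` API rather than the preview-era one, the sketch below converts the same model twice, once with post-training quantization enabled, and compares the resulting binary sizes; the quantized FlatBuffer is noticeably smaller because weights are stored in 8-bit form.

```python
import tensorflow as tf

# A small stand-in model with enough weights for the size difference to show.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(100,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Baseline conversion: float32 weights.
baseline = converter.convert()

# Post-training (dynamic range) quantization: weights stored as int8.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quantized = converter.convert()

smaller = len(quantized) < len(baseline)  # quantized binary is smaller
```

Smaller binaries mean faster model loading and less memory pressure on the phone, which is exactly the trade-off the article describes.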

The developer preview of TensorFlow Lite can be downloaded from GitHub now.
