The future of Android will be a lot smarter, thanks to new programming tools that Google unveiled on Wednesday. The company introduced TensorFlow Lite, a version of its machine learning framework designed to run on smartphones and other mobile devices, during the keynote address at its Google I/O developer conference.
“TensorFlow Lite will leverage a new neural network API to tap into silicon-specific accelerators, and over time we expect to see [digital signal processing chips] specifically designed for neural network inference and training,” said Dave Burke, Google’s vice president of engineering for Android. “We think these new capabilities will help power a next generation of on-device speech processing, visual search, augmented reality, and more.”
The Lite framework will be made part of the open source TensorFlow project soon, and the neural network API will come to the next major release of Android later this year.
The framework has serious implications for what Google sees as the future of mobile hardware. AI-focused chips could make it possible for smartphones to handle more advanced machine learning computations without consuming as much power. With more applications using machine learning to provide intelligent experiences, making that sort of work easier to do on device is key.
Right now, building advanced machine learning into applications, especially when it comes to training models, requires an amount of computational power that typically calls for beefy hardware, plenty of time and plenty of energy. That’s not really practical for consumer smartphone applications, which means they often offload that processing to massive datacenters, sending images, text and other data in need of processing over the internet.
Processing that data in the cloud comes with several downsides, according to Patrick Moorhead, principal analyst at Moor Insights and Strategy: Users must be willing to transfer their data to a company’s servers, and they have to be in an environment with rich enough connectivity to ensure the operation is low-latency.
There’s already one mobile processor with a machine learning-specific DSP on the market today. The Qualcomm Snapdragon 835 system-on-a-chip sports the Hexagon DSP, which supports TensorFlow. DSPs are also used to provide functionality like recognizing the “OK, Google” wake phrase for the Google Assistant, according to Moorhead.
Users should expect to see more machine learning acceleration chips in the future, Moorhead said. “Ever since Moore’s Law slowed down, it’s been a heterogeneous computing model,” he said. “We’re using different kinds of processors to do different types of things, whether it’s a DSP, whether it’s a [field-programmable gate array], or whether it’s a CPU. It’s almost like we’re using the right golf club for the right hole.”
Google is already investing in ML-specific hardware with its line of Tensor Processing Unit chips, which are designed to accelerate both the training of new machine learning algorithms and data processing using existing models. On Wednesday, the company announced the second version of that hardware, which is built to speed up both machine learning training and inference.
Google also isn’t the only company with a smartphone-focused machine learning framework. Facebook showed off a mobile-oriented ML framework called Caffe2Go last year, which is used to power applications like the company’s live style transfer feature.