
Intel’s impressive ‘vision processing unit’ powers the on-device machine learning in Google Clips

Google Clips was one of the few genuine surprises at Google’s October 4th event earlier this week. Designed to unobtrusively capture moments, it has a strong focus on privacy thanks to on-device machine learning. This cloud-free processing is enabled in part by a chip that Intel’s Movidius group calls a “vision processing unit.”

This Myriad 2 VPU is an ultralow-power chip that lets “Google’s advanced machine learning algorithms run in real-time directly on the Clips camera” (via The Verge). As a result, the device never has to talk to the cloud to identify the faces it should photograph. Likewise, the process of training Clips on who to take photos of — by pressing the manual shutter button — happens entirely offline.
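To make the cloud-free pipeline concrete, here is a minimal sketch of an on-device capture loop. This is purely illustrative and not Google’s actual code: `local_face_score` is a hypothetical stand-in for the face-recognition model that, on Clips, would run on the Myriad 2. The key point it demonstrates is that both inference and the keep/discard decision happen locally, so no frame ever needs to leave the device.

```python
import random


def local_face_score(frame):
    """Stand-in for the on-device face-recognition model.

    On Clips, inference like this runs on the Myriad 2 VPU; here we
    return a deterministic pseudo-score so the sketch is runnable.
    """
    random.seed(frame["id"])
    return random.random()


def capture_loop(frames, threshold=0.7):
    """Keep only the frames the local model scores highly.

    Nothing is uploaded: scoring and the keep/discard decision
    both happen on the device itself.
    """
    return [f for f in frames if local_face_score(f) >= threshold]


frames = [{"id": i} for i in range(10)]
kept = capture_loop(frames)
print(f"{len(kept)} clips kept on device, 0 frames sent to the cloud")
```

In the real product the threshold and model are far more sophisticated (and trained per-user via the manual shutter), but the control flow above captures the privacy property: the cloud is simply not in the loop.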

Besides facial recognition, other applications for the Myriad 2 include 3D depth-sensing, gesture/eye tracking, and pose estimation, as well as being capable of running embedded deep neural networks.

Google’s director of Machine Intelligence Blaise Agüera y Arcas noted that its “partnership with Movidius has meant that we can port some of the most ambitious algorithms to a production device years before we could have with alternative silicon offerings.”

Dropping the need for constant communication with the cloud improves battery life and reduces latency. On the privacy front, the Myriad 2 also allows Clips’ “motion photos” to remain on the device’s 16GB of storage until a user approves them for sharing or backup to Google Photos.

Google had been using Movidius’ chips long before Intel acquired the company in September 2016 for a rumored $400 million. The first Project Tango reference phone in 2014 used the first-generation Myriad 1 to power its computer vision capabilities.

Earlier in 2016, before the acquisition, Google and Movidius signed a partnership “to accelerate adoption of machine intelligence applications in mobile devices.”

On-device machine learning was a clear trend at Google’s October 4th event, and the Pixel 2’s ability to recognize songs playing around you, completely offline, is equally impressive.

https://www.youtube.com/watch?v=GEy-vtev1Bw




Author: Abner Li

Editor-in-chief. Interested in the minutiae of Google and Alphabet. Tips/talk: abner@9to5g.com