Last week, Google revealed that the Pixel 2 and Pixel 2 XL contain a “custom-designed co-processor” for machine learning and image processing. This Pixel Visual Core is not yet enabled, but once activated it should speed up image processing, among other tasks. A new report today reveals that Google worked with Intel on their first consumer chip.
Google confirmed to CNBC this afternoon that Intel aided in the development of the Pixel Visual Core, noting that no other existing chip delivered what was needed. The confirmation comes after teardowns last week revealed Intel-like serial numbers on the chipset, hinting at its origin.
The two companies have increasingly been collaborating. Google Clips features a “visual processing unit” from Intel-acquired Movidius for on-device machine learning, while Waymo is working with Intel on self-driving cars.
The Pixel Visual Core is built around a Google-designed Image Processing Unit (IPU) tuned for high performance at minimal power, made up of eight custom cores and paired with a compiler that optimizes code for the underlying hardware. Google says the chip should allow HDR+ to run 5x faster while using less than a tenth of the power.
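Taken together, those two figures compound. As a rough back-of-envelope sketch (assuming “power” refers to instantaneous power draw rather than total energy, and that the speedup and the power reduction apply independently, neither of which the article spells out), the implied energy cost per HDR+ shot drops by roughly 50x:

```python
# Back-of-envelope combination of the article's two headline figures.
# Assumptions (not stated in the article): "5x faster" refers to runtime,
# "a tenth of the power" to instantaneous draw, and the two scale independently.
baseline_power = 1.0             # normalized power draw on the main application processor
baseline_time = 1.0              # normalized runtime per HDR+ shot

ipu_power = baseline_power / 10  # "less than a tenth of the power"
ipu_time = baseline_time / 5     # "5x faster"

# Energy = power x time, so the two savings multiply.
savings = (baseline_power * baseline_time) / (ipu_power * ipu_time)
print(f"Implied energy per HDR+ shot: ~{savings:.0f}x lower")  # ~50x
```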
At the moment the chip lies dormant, but Google is initially targeting the Visual Core at HDR+ image capture in Android 8.1. It’s also programmable, with machine learning among its other possible applications.
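For context, HDR+ is a burst-photography technique: the camera captures several short exposures, aligns and merges them to suppress noise, then tone-maps the result. The NumPy sketch below is a deliberately simplified illustration of that class of workload; the function names are hypothetical, alignment is omitted, and it is not Google’s HDR+ pipeline or the Visual Core’s actual programming model.

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of already-aligned frames to reduce noise.

    frames: list of HxWx3 uint8 arrays. Averaging N frames cuts random
    noise by roughly sqrt(N), which is the core idea behind burst merging.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

def tone_map(merged, gamma=2.2):
    """Apply a simple global gamma curve to brighten shadows."""
    normalized = merged / 255.0
    return np.clip((normalized ** (1.0 / gamma)) * 255.0, 0, 255).astype(np.uint8)

# Usage with synthetic noisy frames standing in for a real burst.
rng = np.random.default_rng(0)
scene = rng.integers(0, 64, size=(480, 640, 3))            # dark "ground truth" scene
burst = [np.clip(scene + rng.normal(0, 10, scene.shape), 0, 255).astype(np.uint8)
         for _ in range(8)]
result = tone_map(merge_burst(burst))
```

The appeal of a dedicated IPU is that this kind of uniform, per-pixel arithmetic parallelizes far more efficiently across many simple custom cores than on a general-purpose CPU.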
Google has been surprisingly quiet about their first consumer chipset. The Tensor Processing Units for machine learning and the Titan chip for securing servers received much more fanfare. In contrast, the Pixel Visual Core went unmentioned at the October 4th event and was only announced via a blog post last week as review embargoes lifted.
Image via iFixit