Google is able to take such fantastic photos on the Pixel 2 because of a fine-tuned mix of physical hardware (camera lens and sensor), software (enabling HDR+), and machine learning. Even in their current form, the new phones take some of the best photos available on a smartphone.

But the HDR+ process on the Pixel 2 and 2 XL uses a lot of processing power to capture stunning photos with almost no latency, so Google built the Pixel Visual Core, its first consumer-facing system-on-a-chip, which will help the phones process photos faster and more efficiently.

The Pixel Visual Core is built mainly around eight Image Processing Unit (IPU) cores, each designed to deliver as much performance as possible while drawing minimal power. In Google’s own words, each core has 512 arithmetic logic units, the chip is capable of over 3 trillion operations per second, and it can run HDR+ 5x faster with less than 1/10th the energy usage compared to the application processor.
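Google’s quoted figures roughly check out on the back of an envelope. The sketch below multiplies the stated core and ALU counts by an assumed clock speed (Google has not published one; ~730 MHz at one operation per ALU per cycle is a hypothetical figure chosen to match the "over 3 trillion" claim):

```python
# Back-of-envelope check of the quoted Pixel Visual Core figures.
# CORES and ALUS_PER_CORE come from Google's stated specs; the clock
# speed is an assumption, not an official number.
CORES = 8
ALUS_PER_CORE = 512
ASSUMED_CLOCK_HZ = 730e6  # hypothetical clock, one op per ALU per cycle

total_alus = CORES * ALUS_PER_CORE            # 4,096 ALUs across the chip
ops_per_second = total_alus * ASSUMED_CLOCK_HZ

print(f"{total_alus} ALUs -> {ops_per_second / 1e12:.1f} trillion ops/sec")
```

Under those assumptions the chip lands right at about 3 trillion operations per second, consistent with Google’s claim.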

Google points out that it gets this sort of performance and efficiency out of the Pixel Visual Core because the phone’s software handles more of its functionality than it would on a standard processor. This, in turn, makes the chip harder for developers to program directly, so Google will use a custom compiler that optimizes code written in Halide (used for image processing) and TensorFlow (used for machine learning).
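To give a sense of the kind of work being compiled down to the chip: HDR+ captures a burst of frames and merges them per pixel to suppress noise. The toy sketch below shows only the simplest form of that idea, plain averaging of pre-aligned grayscale frames; the real HDR+ pipeline does robust alignment and weighted merging, and this is not Google’s implementation:

```python
# Toy illustration of per-pixel burst merging, the core idea behind HDR+.
# Simplified sketch: plain averaging of already-aligned grayscale frames.
def merge_burst(frames):
    """Average a burst of equally sized grayscale frames pixel by pixel."""
    height, width = len(frames[0]), len(frames[0][0])
    merged = [[0.0] * width for _ in range(height)]
    for frame in frames:
        for y in range(height):
            for x in range(width):
                merged[y][x] += frame[y][x] / len(frames)
    return merged

# Three noisy 2x2 "frames" of the same scene; averaging cancels the noise.
burst = [
    [[100, 98], [101, 99]],
    [[102, 100], [99, 101]],
    [[98, 102], [100, 100]],
]
print(merge_burst(burst))  # every pixel settles near the true value, ~100
```

Because this inner loop runs the same arithmetic over every pixel, it maps naturally onto thousands of parallel ALUs, which is exactly the workload the IPU cores are built for.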

Below is an image of the Pixel Visual Core with marked identifiers:

The chip also means that third-party applications will soon be able to use the Pixel 2’s superb HDR+ functionality to capture breathtaking photographs.

HDR+ is just the first bit of camera software to take full advantage of the Pixel Visual Core. Over time, we should see new features and functionality added to the processor, enabling Google and third parties to deliver further improvements to Pixel 2 owners.

Google says that it will be turning on the Pixel Visual Core in the coming weeks via an OTA update that will also bring a developer preview of Android 8.1 Oreo. Once it’s a bit more polished, Google will enable the Pixel Visual Core for all developers by exposing it through the Android Camera API.

Below are some sample photos showing what scenes look like with and without the Pixel Visual Core processing of HDR+. All of these images have been captured using a third-party camera application.

Check out 9to5Google on YouTube for more news:

FTC: We use income earning auto affiliate links. More.

You’re reading 9to5Google — experts who break news about Google and its surrounding ecosystem, day after day. Be sure to check out our homepage for all the latest news, and follow 9to5Google on Twitter, Facebook, and LinkedIn to stay in the loop. Don’t know where to start? Check out our exclusive stories, reviews, how-tos, and subscribe to our YouTube channel.

About the Author

Justin Duino

I’m a writer for 9to5Google with a background in IT and Android development. Follow me on Twitter to read my ramblings about tech. Tips are always welcome.