
Google AI explains how astrophotography with Pixel 4 Night Sight works

After Pixel launches, Google often details key features that were made possible by internal research teams. Today, the Google AI blog published an explainer on the Pixel 4’s astrophotography capabilities.

Google “started to investigate taking photos in very dark outdoor environments with the goal of capturing the stars” following Night Sight’s launch last year. As a benchmark, the company’s engineers used the constellation Sagittarius. One insight gained was that viewers do not tolerate “motion-blurred stars that look like short line segments.” That said, swaying trees and drifting clouds are acceptable if the image is otherwise sharp.

To mitigate this, we split the exposure into frames with exposure times short enough to make the stars look like points of light. Taking pictures of real night skies, we found that the per-frame exposure time should not exceed 16 seconds.
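
The 16-second figure relates to how far a star drifts during a single frame: the sky rotates at roughly 15 arcseconds per second, so a long exposure smears each star across neighboring pixels. Below is a sketch of that calculation; the focal length and pixel pitch are hypothetical phone-camera numbers rather than Google’s published specs, and the post says the actual limit was found empirically.

```python
import math

# Sidereal rate: stars near the celestial equator drift about
# 15.04 arcseconds per second of clock time as the Earth rotates.
SIDEREAL_RATE_ARCSEC_PER_S = 15.04

def max_exposure_s(focal_length_mm: float, pixel_pitch_um: float,
                   max_trail_px: float = 1.0) -> float:
    """Longest exposure before a star trails more than max_trail_px pixels."""
    # Angle subtended by one sensor pixel, converted to arcseconds.
    pixel_angle_rad = (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)
    pixel_angle_arcsec = math.degrees(pixel_angle_rad) * 3600
    return max_trail_px * pixel_angle_arcsec / SIDEREAL_RATE_ARCSEC_PER_S

# Hypothetical optics: 4.4 mm focal length, 1.4 micron pixels.
print(round(max_exposure_s(4.4, 1.4), 1))  # ~4.4 s for a one-pixel trail
```

Allowing a trail of a few pixels rather than exactly one puts the result in the same ballpark as the 16-second cap the team reports.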

Another human consideration is that “few are willing to wait more than four minutes for a picture.” That sets the Pixel 4’s upper limit: up to 15 frames of 16 seconds each, or four minutes in total, while the Pixel 3 and 3a are capped at one minute.
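
Combined, the per-frame cap and the patience limit fix the capture schedule. A minimal sketch of how the two constraints could be reconciled (the function and names here are illustrative, not Google’s code):

```python
import math

MAX_FRAME_EXPOSURE_S = 16     # keeps stars point-like, per the post
MAX_TOTAL_CAPTURE_S = 4 * 60  # few people wait more than four minutes

def plan_capture(requested_total_s: float) -> tuple[int, float]:
    """Split a requested total exposure into equal-length frames that
    respect both the per-frame and the total-capture limits."""
    total = min(requested_total_s, MAX_TOTAL_CAPTURE_S)
    frames = math.ceil(total / MAX_FRAME_EXPOSURE_S)
    return frames, total / frames

# A full four-minute capture yields the Pixel 4's 15 x 16 s schedule:
print(plan_capture(240))  # (15, 16.0)
```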

Night Sight also has to take into account “additional issues that are unique to low-light photography,” including dark current and hot pixels, scene composition, and autofocus. One critique of Night Sight is that images sometimes look too bright, which can confuse viewers about the time of day in a shot. Google’s solution involves identifying and “selectively darkening the sky in photos of low-light scenes.”
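
Dark current and hot pixels, for instance, show up as bright specks that stand out from their surroundings in a long exposure. A crude single-frame stand-in for that kind of rejection, replacing outliers with a local median, is sketched below; the threshold and the specific filter are my assumptions, not Night Sight’s published pipeline.

```python
import numpy as np
from scipy.ndimage import median_filter

def suppress_hot_pixels(frame: np.ndarray, threshold: float = 4.0) -> np.ndarray:
    """Replace pixels that stand far above their 3x3 neighborhood,
    a rough stand-in for hot-pixel rejection."""
    local_median = median_filter(frame, size=3)
    residual = frame - local_median
    noise_scale = np.median(np.abs(residual)) + 1e-6  # robust noise estimate
    hot = residual > threshold * noise_scale
    out = frame.copy()
    out[hot] = local_median[hot]
    return out
```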

To do this, we use machine learning to detect which regions of an image represent sky. An on-device convolutional neural network, trained on over 100,000 images that were manually labeled by tracing the outlines of sky regions, identifies each pixel in a photograph as ‘sky’ or ‘not sky.’
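
In code, per-pixel “sky / not sky” labeling amounts to a fully convolutional network that outputs one probability per pixel. The toy PyTorch model below illustrates the idea only; the post does not publish the architecture of Google’s on-device network.

```python
import torch
import torch.nn as nn

class TinySkySegmenter(nn.Module):
    """Toy fully-convolutional net that scores each pixel as sky / not-sky.
    Illustrative stand-in, not Google's on-device model."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # one logit per pixel
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        # Sigmoid turns logits into a per-pixel sky probability mask.
        return torch.sigmoid(self.net(rgb))

mask = TinySkySegmenter()(torch.rand(1, 3, 64, 64))  # values in [0, 1]
```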

The same sky segmentation is also used for noise reduction in the sky, and to “selectively increase contrast to make features like clouds, color gradients, or the Milky Way more prominent.”
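
Once a soft sky mask exists, those adjustments reduce to blending a modified sky back into the frame. A minimal numpy sketch, with arbitrary darkening and contrast factors (Google doesn’t publish its tuning, and a denoised sky could be blended in the same way):

```python
import numpy as np

def adjust_sky(image: np.ndarray, sky_mask: np.ndarray,
               darken: float = 0.7, contrast: float = 1.3) -> np.ndarray:
    """Blend per-pixel adjustments into an RGB image in [0, 1] using a
    soft sky mask in [0, 1]. Factors are illustrative, not Google's."""
    sky = image * darken                              # selectively darken sky
    pivot = sky.mean()                                # pivot for contrast stretch
    sky = np.clip((sky - pivot) * contrast + pivot, 0.0, 1.0)
    mask = sky_mask[..., None]                        # broadcast over channels
    return image * (1 - mask) + sky * mask
```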

Google ends its Pixel 4 astrophotography explainer by noting that there is always “room for improvement,” and previewing what could change in the future:

While we can capture a moonlit landscape, or details on the surface of the moon, the extremely large brightness range, which can exceed 500,000:1, so far prevents us from capturing both in the same image. Also, when the stars are the only source of illumination, we can take clear pictures of the sky, but the landscape is only visible as a silhouette.
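
For scale, a 500,000:1 brightness range spans roughly 19 photographic stops, since each stop doubles the light:

```python
import math
print(math.log2(500_000))  # ~18.93 stops of dynamic range
```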

