Google Explains How Astrophotography Works On Pixel 4


Google has published a blog post explaining how astrophotography works on the Pixel 4. The Google AI blog describes how the company began exploring ways to capture the stars with a smartphone camera following the launch of Night Sight last year, and what the effort yielded: by extending the total exposure time to up to four minutes, the Pixel 4 can take sharp, clear pictures of the stars.

"We started to investigate taking photos in very dark outdoor environments with the goal of capturing the stars. We realized that high-quality pictures would require exposure times of several minutes," write Florian Kainz and Kiran Murthy, Software Engineers, Google Research.

How astrophotography works

Using the Sagittarius constellation as a benchmark, Google's engineers began experimenting with exposure times to capture high-quality images of a moonless sky soon after Night Sight was released. Extending the exposure time increases the total amount of light captured, producing clearer images in low-light conditions. However, long exposures can also cause blur due to motion in the scene or unsteady hands. Google says viewers do not tolerate "motion-blurred stars that look like short line segments."
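To get a feel for why a fixed camera streaks the stars, the drift can be estimated from Earth's rotation. The sketch below is a rough back-of-the-envelope calculation; the focal length and pixel pitch are illustrative guesses, not the Pixel 4's actual optics.

```python
def star_trail_pixels(exposure_s, focal_mm, pixel_um):
    """Approximate star-trail length in pixels for a camera on a tripod.

    The sky rotates 360 degrees per sidereal day (~86164 s), so a star
    near the celestial equator drifts about 15 arcseconds per second.
    """
    drift_arcsec = exposure_s * 360.0 * 3600.0 / 86164.0
    # Plate scale: arcseconds of sky per pixel = 206265 * pixel_size / focal_length
    plate_scale = 206265.0 * (pixel_um * 1e-3) / focal_mm
    return drift_arcsec / plate_scale

print(star_trail_pixels(16, 4.4, 1.4))   # a few pixels: stars stay point-like
print(star_trail_pixels(240, 4.4, 1.4))  # tens of pixels: obvious trails
```

With these assumed optics, a 16-second frame smears a star across only a few pixels, while a single four-minute exposure would drag it across dozens, which matches Google's decision to cap per-frame exposure rather than total exposure.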



To overcome this, Google's engineers split the exposure into frames with "exposure times short enough to make the stars look like points of light". They found that the per-frame exposure time should not exceed 16 seconds. While capturing more frames produces brighter pictures, Google also had to consider how long a photographer is willing to wait for a shot; apparently, not many are willing to wait more than four minutes. Google therefore set four minutes (15 frames of 16 seconds each) as the Pixel 4's upper limit for a single Night Sight image. The Pixel 3 and 3a, meanwhile, have a maximum exposure of one minute.
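The payoff of splitting one long exposure into many short ones is that the aligned frames can be merged to suppress noise. A minimal sketch of the idea, using a plain average (Google's actual merge is more sophisticated):

```python
import numpy as np

def merge_frames(frames):
    """Average a stack of aligned short-exposure frames.

    Averaging N frames keeps the brightness of a single frame but
    reduces random noise by roughly a factor of sqrt(N).
    """
    return np.stack(frames).astype(np.float64).mean(axis=0)

# Simulate 15 noisy captures of the same scene, mirroring the
# 15 x 16-second exposures described above.
rng = np.random.default_rng(0)
scene = np.full((8, 8), 100.0)
frames = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(15)]
merged = merge_frames(frames)
```

Each simulated frame deviates noticeably from the true scene, but the merged result sits much closer to it, which is why 15 short frames can stand in for one four-minute exposure without the star trails.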

Other low-light photography issues

Google says low-light photography has several other unique issues, such as dark current and hot pixels, scene composition, autofocus, and sky processing. Dark current is one of the main sources of noise in image sensors. During a long exposure, dark current causes pixels to look as if they were exposed to a small amount of light, even when none is present. Some pixels also exhibit a higher dark current than their neighbors.


In long exposure images, these "warm pixels," as well as defective "hot pixels," are visible as tiny bright dots. These dots are concealed by replacing their value with the average of their neighboring pixels. Google says this causes a loss of image information but does not noticeably affect image quality.
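The concealment step can be illustrated with a simple stand-in: flag any pixel far brighter than the average of its eight neighbors and replace it with that average. The threshold and looping strategy here are illustrative choices, not Google's actual detector.

```python
import numpy as np

def conceal_outlier_pixels(img, threshold=3.0):
    """Replace pixels much brighter than their 8 neighbors with the
    neighbor average, hiding hot/warm pixels at a small cost in detail."""
    out = img.astype(np.float64).copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = out[y - 1:y + 2, x - 1:x + 2]
            neighbor_avg = (patch.sum() - out[y, x]) / 8.0
            if out[y, x] > threshold * max(neighbor_avg, 1e-6):
                out[y, x] = neighbor_avg
    return out

# A dark frame with one "hot" pixel in the middle.
frame = np.full((5, 5), 10.0)
frame[2, 2] = 200.0
cleaned = conceal_outlier_pixels(frame)
```

After the pass, the bright dot takes on the value of its surroundings, as in Google's before/after comparison.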

Image: Left: warm pixels visible. Right: warm pixels concealed.

To overcome scene composition issues in extremely low light, Night Sight displays a post-shutter viewfinder. Once the first frame has been captured, it is immediately displayed on the screen. This allows the photographer to adjust the composition to capture the desired scene.

Image: Left: live Night Sight viewfinder. Right: post-shutter viewfinder.

Night Sight on the Pixel 4 also switches to post-shutter autofocus. When the shutter button is pressed, the camera captures two autofocus frames with exposure times of up to one second. These help detect image details in low light but do not contribute to the final image.


Google also notes that low-light images often look much brighter than the original scene, which can confuse viewers about the time of day a photo was taken. To counter this, Night Sight uses machine learning to selectively darken the sky in low-light images: an on-device convolutional neural network, trained on over 100,000 images, detects which regions of an image represent the sky.

Image: Left half: without sky processing. Right half: with sky processing.

Sky processing is also used for sky-specific noise reduction and to selectively increase contrast to "make features like clouds, color gradients, or the Milky Way more prominent."
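Once a per-pixel sky mask exists, selective darkening reduces to a weighted blend between the image and a dimmed copy of itself. A minimal sketch, assuming the mask is a float array in [0, 1] (in the real pipeline it would come from the segmentation network):

```python
import numpy as np

def darken_sky(img, sky_mask, factor=0.6):
    """Blend a darkened copy of the image into sky regions.

    sky_mask holds per-pixel weights in [0, 1]; 1.0 means "sky",
    so sky pixels are scaled toward factor while the landscape
    is left untouched.
    """
    img = img.astype(np.float64)
    return img * (1.0 - sky_mask) + img * factor * sky_mask

# Toy example: top row is sky, bottom row is landscape.
image = np.full((2, 2), 100.0)
mask = np.array([[1.0, 1.0], [0.0, 0.0]])
result = darken_sky(image, mask)
```

Because the mask is soft rather than binary, the blend also degrades gracefully along the horizon, avoiding a hard seam between sky and landscape.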

There is always room for improvement

With these Google Camera enhancements and a phone on a tripod, Google was able to capture sharp pictures of star-filled skies. "As long as there is at least a small amount of moonlight, landscapes will be clear and colorful," Google says.


However, there is always room for improvement. As Google says:

"While we can capture a moonlit landscape, or details on the surface of the moon, the extremely large brightness range, which can exceed 500,000:1, so far prevents us from capturing both in the same image. Also, when the stars are the only source of illumination, we can take clear pictures of the sky, but the landscape is only visible as a silhouette."

