Astrophotography: Google reveals how the Pixel 4 nails those night shots

Google’s astrophotography mode, which launched recently on the Pixel 4 before arriving for Pixel 2 and 3 devices, has proved a hit with fans of the night sky.

Using Google’s Camera app, the astrophotography mode lets you capture stunning shots of the stars that would usually require photography equipment far bulkier and pricier than a smartphone.

With so much interest in the feature, the tech giant this week decided to offer some insight into how it works, explaining some of its smarts in a blog post.

The astrophotography mode is essentially a more advanced version of Night Sight, the powerful low-light feature that launched with the Pixel 3 in 2018.

“This year’s version of Night Sight pushes the boundaries of low-light photography with phone cameras,” Google’s photography team wrote in the post. “By allowing exposures up to 4 minutes on Pixel 4, and 1 minute on Pixel 3 and 3a, the latest version makes it possible to take sharp and clear pictures of the stars in the night sky or of night-time landscapes without any artificial light.”

The post covers a fair bit of ground, including how the feature helps to avoid camera shake and blurring from in-scene motion by splitting long exposures into multiple frames before automatically aligning them to create a sharp image.

For the astrophotography mode, the Pixel 4 caps each frame’s exposure at 16 seconds and captures a maximum of 15 frames. Longer exposures would create so-called “star trails” caused by the celestial bodies “moving” through the sky. While some astrophotographers like to capture images with star trails, Google’s feature aims to create pictures that make the stars “look like points of light.”
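The idea behind splitting one long exposure into a burst of short frames can be illustrated with a toy averaging sketch. This is not Google’s pipeline — real frames must first be aligned to compensate for shake and sky rotation, which is assumed already done here — but it shows why averaging many short exposures cuts random sensor noise while keeping stars as points. Note that 15 frames of 16 seconds each add up to the 4-minute total mentioned earlier.

```python
import numpy as np

EXPOSURE_PER_FRAME_S = 16   # per-frame cap on Pixel 4
MAX_FRAMES = 15             # 15 x 16 s = 240 s = 4 minutes total

def stack_frames(frames):
    """Average a burst of already-aligned frames to reduce noise.

    Averaging N frames suppresses random per-pixel sensor noise by
    roughly a factor of sqrt(N), without the star trails a single
    240-second exposure would produce.
    """
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

# Toy demo: a constant "scene" plus independent noise in each frame.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 100.0)
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(MAX_FRAMES)]
stacked = stack_frames(frames)
# Residual noise in the stack is roughly 10 / sqrt(15), i.e. ~2.6
```

The key design point is that alignment happens before averaging; averaging unaligned frames would smear the stars just like a single long exposure.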

Google’s piece also explains how the software deals with what are known as warm and hot pixels, tiny bright dots that can appear with longer exposures captured by digital camera sensors.

According to Google, warm and hot pixels can be identified “by comparing the values of neighboring pixels within the same frame and across the sequence of frames recorded for a photo, and looking for outliers.” Once located, the pixel is then concealed by replacing its value with the average of its neighbors. “Since the original pixel value is discarded, there is a loss of image information, but in practice this does not noticeably affect image quality,” Google said.
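The hot-pixel concealment Google describes — flag a pixel as an outlier against its neighborhood, then overwrite it with the average of its neighbors — can be sketched in a few lines. This single-frame toy omits the cross-frame comparison the post mentions, and the threshold value is an arbitrary assumption.

```python
import numpy as np

def conceal_hot_pixels(img, threshold=50):
    """Replace outlier ("hot") pixels with the mean of their 8 neighbors.

    Detection compares each pixel to its neighborhood; a real pipeline
    also compares values across the burst of recorded frames. The
    original pixel value is discarded, as the post notes.
    """
    out = img.astype(np.float64).copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = out[y - 1:y + 2, x - 1:x + 2].copy()
            center = patch[1, 1]
            patch[1, 1] = np.nan          # exclude the pixel itself
            neighbor_mean = np.nanmean(patch)
            if abs(center - neighbor_mean) > threshold:
                out[y, x] = neighbor_mean  # conceal the outlier
    return out

# Demo: a dark frame with a single hot pixel in the middle.
img = np.full((5, 5), 10.0)
img[2, 2] = 255.0
cleaned = conceal_hot_pixels(img)
# cleaned[2, 2] is now the neighbor average, 10.0
```

As the post says, discarding the original value loses a little image information, but replacing one pixel with its local average is visually unnoticeable in practice.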

The piece goes on to describe how the software brightens the display to aid composition, and how it ensures sharp focusing in challenging low-light conditions.

The post also explains how the software uses machine learning to reduce noise and selectively darken the sky, giving a more realistic impression of the scene at the time and making the stars, and the rest of the image, really pop.
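The selective darkening step boils down to applying a correction only where a segmentation mask says “sky.” Google derives that mask with an on-device machine-learning model; in this heavily simplified sketch the mask is supplied directly and the darkening is a plain multiplicative scale, both of which are illustrative assumptions.

```python
import numpy as np

def darken_sky(img, sky_mask, factor=0.6):
    """Selectively darken pixels flagged as sky.

    sky_mask is a boolean array the same shape as img. In the real
    feature it comes from an ML sky-segmentation model; the scale
    factor here is an arbitrary choice for illustration.
    """
    out = img.astype(np.float64).copy()
    out[sky_mask] *= factor
    return out

# Toy example: treat the top half of the frame as sky.
img = np.full((4, 4), 200.0)
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :] = True
result = darken_sky(img, mask)
# Sky rows are scaled to 120; the foreground stays at 200
```

The point of masking is that a night sky brightened by long exposure can be pulled back down without dimming the landscape in the foreground.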

You can find the full blog post on Google’s AI Blog.

Trevor Mogg
Contributing Editor