A Googler Explains Exactly How the Pixel 4 Captures the Night Sky

We went stargazing with product manager Alex Schiffhauer to learn more about the new phone’s astrophotography mode

Aaron Gell
OneZero

--

The stars were up there — that we knew. We just couldn’t see them, much less take their picture.

In my pocket was a brand-new Pixel 4, the one with the Snapdragon 855 processor, the f/2.4 telephoto lens with its 16MP sensor, and an enhanced Night Sight mode developed explicitly for capturing celestial bodies. I’d made the trip to Goat Rock Beach, in a state park some two and a half hours north of Google’s Mountain View offices, to meet Alex Schiffhauer, the 28-year-old product manager for the Pixel’s computational photography.

The idea was to give the new phone’s astrophotography capabilities a workout, but the National Oceanic and Atmospheric Administration had let us down. Despite the agency’s prediction of clear skies, you couldn’t see more than 10 feet in any direction.

The Pixel’s camera may be good, but it proved no match for that Bodega Bay fog, especially when you factored in the smoke drifting over from the Kincade Fire.

We finally found a nice patch of open sky 45 miles to the northeast, in a picturesque spot overlooking Lake Sonoma. There, with…
