A Googler Explains Exactly How the Pixel 4 Captures the Night Sky
We went stargazing with product manager Alex Schiffhauer to learn more about the new phone’s astrophotography mode
--
The stars were up there — that we knew. We just couldn’t see them, much less take their picture.
In my pocket was a brand-new Pixel 4, the one with the Snapdragon 855 processor, the f/2.4 telephoto lens, the 16MP sensor, and an enhanced Night Sight mode developed explicitly for capturing celestial bodies. I’d made the trip to Goat Rock Beach, in a state park some two and a half hours north of Google’s Mountain View offices, to meet Alex Schiffhauer, the 28-year-old product manager for the Pixel’s computational photography.
The idea was to give the new phone’s astrophotography capabilities a workout, but the National Oceanic and Atmospheric Administration had let us down. Despite the agency’s prediction of clear skies, we couldn’t see more than 10 feet in any direction.
The Pixel’s camera may be good, but it proved no match for that Bodega Bay fog, especially when you factored in the smoke drifting over from the Kincade fire.
We finally found a nice patch of open sky 45 miles to the northeast, in a picturesque spot overlooking Lake Sonoma. There, with our Pixels set to Night Sight and trained on the heavens, capturing four-minute shots, we talked about the history of photography, Google’s rivalry with Apple, what the company does with all that data from our picture-taking, the death of art, and, of course, the nature of reality itself.
This interview has been edited and condensed for clarity.
OneZero: [Looking up at the sky.] That was a long drive, but wow, okay. Damn.
Alex Schiffhauer: It’s intense, isn’t it? But no Milky Way — it looks like it, but I think that’s clouds. That’s probably why there’s nobody here. The last time I came here I got yelled at because I took out my drone and the amateur astronomers were not happy.
Have you tried astro mode?