Your Phone’s Camera Isn’t as Good as You Think

Software — not a better lens — is where the real innovation is happening

At this point, nobody needs to tell you that your smartphone is a pretty great camera. That is, as long as you don’t look too closely at those pixels. Once you zoom in on your phone’s screen, give the photo a crop, or make a large physical print of your favorite shots, things start to get a bit fuzzier. You might notice that images aren’t quite as sharp as you thought, the group shot from the bar last night is actually pretty grainy, or there just isn’t as much detail as you had imagined.

Making your pictures look great on bigger screens isn’t as simple as slapping a “better” camera on the back of your phone. The biggest limitation, according to photographer Jeff Carlson, is that as smartphones continue to get thinner and lighter, fitting a larger sensor and lens into a shrinking frame becomes more difficult. Because the sensor and lenses have to be so small, they aren’t able to capture as much light as, say, a high-end Canon or Nikon DSLR, which means the camera draws in less data. And that erodes the sharpness of your shots.

This won’t change overnight, but photographer Erin Lodi notes that mirrorless cameras like the Sony α6000 have dramatically shrunk in size compared to old-school DSLRs, even as their sensors remain far larger than anything in a phone.

“Some of these tiny mirrorless cameras and even smaller DSLRs aren’t that much bigger than a smartphone,” Lodi tells OneZero. “I’m sure they’ll keep getting down further in size.”

Size is only part of the problem. To get around the pesky limits of physics, companies like Apple and Google have started employing algorithms in their camera apps that fill the gaps left by these smaller cameras. Instead of just snapping a single shot when you click the shutter button, most modern smartphones will capture a series of images a few milliseconds apart and stitch parts of them together to construct what the software thinks will be the “best” picture.

This usually results in a pretty image that looks fine on Instagram, but it still presents problems if you want to view your picture on a larger screen or print. When you zoom in on an enlarged image, you’ll notice that the edges of your subject won’t be as clearly defined as with a traditional camera, and the overall image won’t be as crisp. “That’s mostly because those software features are sort of mixing and blending and making guesses about what the final image should be,” Carlson says. “It fudges and tries to do its best, but it’s not going to completely accurately represent what was there, so it fills things in.”
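
Neither Apple nor Google publishes exactly how its pipeline works, but the trade-off Carlson describes is easy to see in a toy sketch. The NumPy snippet below is an illustration only, not any phone’s actual algorithm: it simulates a burst of slightly misaligned, noisy frames of a hard edge. Averaging the burst slashes the noise, but the misalignment smears the edge, which is the kind of softness you notice once you zoom in.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "scene": a sharp vertical edge, dark on the left, bright on the right.
scene = np.zeros((64, 64))
scene[:, 32:] = 1.0

def capture(noise=0.2, shift=0):
    """Simulate one handheld frame: a tiny sideways shake plus sensor noise."""
    return np.roll(scene, shift, axis=1) + rng.normal(0, noise, scene.shape)

# A single shot is noisy...
single = capture()

# ...while averaging a burst of eight slightly misaligned frames cuts the
# noise dramatically, at the cost of smearing the edge a little.
burst = [capture(shift=int(rng.integers(-1, 2))) for _ in range(8)]
stacked = np.mean(burst, axis=0)

print("noise in a single frame:", np.std(single[:, :16]))   # about 0.2
print("noise in the stack:     ", np.std(stacked[:, :16]))  # roughly 0.2 / sqrt(8)
```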

While that fudging will generally give you a more than passable image, Carlson says the frames are usually shot at different exposures to capture different aspects of the scene you’re photographing, and that inevitably results in some shortcomings. “Where they blend, it mixes tones so you don’t get this Frankenstein patchwork of these different exposures,” he says.
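
The blending Carlson describes is closest in spirit to classic exposure fusion. The sketch below is a deliberately stripped-down, hypothetical version of that idea, not any vendor’s real pipeline: each pixel gets a smooth weight that favors well-exposed values, so dark and bright frames mix gradually instead of meeting at hard seams.

```python
import numpy as np

def fuse_exposures(frames, sigma=0.25):
    """Blend differently exposed frames (values in 0..1) with smooth weights.

    Pixels near mid-gray count as "well exposed" and get high weight; blown
    highlights and crushed shadows get low weight. Because the weights vary
    smoothly, the exposures mix gradually instead of forming hard seams.
    """
    frames = np.stack([np.asarray(f, dtype=float) for f in frames])
    weights = np.exp(-((frames - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0) + 1e-8
    return (weights * frames).sum(axis=0)

# A dark frame that preserves highlight detail and a bright frame that
# preserves shadow detail, fused into one balanced gradient.
ramp = np.linspace(0.0, 1.0, 256)
dark, bright = 0.6 * ramp, 0.4 + 0.6 * ramp
fused = fuse_exposures([dark, bright])
print(fused.min(), fused.max())  # a smooth ramp, no abrupt jump where the exposures hand off
```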

The final image isn’t perfect, but it is consistent, and most people will be perfectly satisfied with the quality of their pictures. While physics puts a limit on the lenses smartphones can use, the camera’s software is getting better and better, and improvements aren’t showing any sign of slowing down.

“As more data gets fed to the algorithms and the algorithms get better, there will be more options,” Carlson says. “So, when you take a picture, the sensor or server or processor will be able to see what’s in the image, identify them, and know what color and exposure the items should be.”

It’s not only smartphone camera apps that employ this kind of machine learning. Adobe Lightroom’s Auto Tone feature performs a similar function on existing photographs. By pulling from its database of anonymously sourced photos, the software is able to identify objects and settings in the photo to determine how to adjust the exposure and tone. Some phones do this by default to some degree, but Carlson expects vast improvements in the coming years.
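
Adobe hasn’t published the internals of Auto Tone, and the machine-learning version is far beyond a few lines of code, but the mechanical idea of “adjusting exposure and tone” can be sketched crudely. The snippet below is a naive, hypothetical stand-in: it sets black and white points from the histogram, then nudges the mid-tones toward a target brightness.

```python
import numpy as np

def naive_auto_tone(img, low_pct=1, high_pct=99, target_mean=0.45):
    """Crude auto-tone for an image with values in 0..1.

    Nothing like Lightroom's ML-driven Auto Tone; it only shows the two
    knobs such features ultimately turn: black/white points and exposure.
    """
    img = np.asarray(img, dtype=float)
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = np.clip((img - lo) / max(hi - lo, 1e-8), 0.0, 1.0)
    gamma = np.log(target_mean) / np.log(max(stretched.mean(), 1e-8))
    return stretched ** gamma   # brightens or darkens mid-tones as needed

# An underexposed gradient ends up near the target brightness.
underexposed = np.linspace(0.02, 0.4, 1000)
print(naive_auto_tone(underexposed).mean())   # close to 0.45
```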

But you don’t have to wait for smarter smartphone cameras. Photographer Austin Mann, who does an annual review of the latest iPhone’s camera, says there are a few things you can do right now to spruce up your photography game.

To get the best resolution out of your phone’s camera, opt for shooting panoramas instead of traditional shots. On the latest iPhone models, panoramas pack a whopping 63 megapixels into each shot, which should be plenty for that landscape photo you want to mount over the fireplace.

That won’t work as well for shots with moving subjects. In those cases, Mann suggests shooting in RAW format, which captures more data than the standard JPEG format. You’ll need a dedicated app for this, like Manual Camera for Android or Halide for iOS, plus a good understanding of how to manually control a camera’s exposure. Mann also recommends Photoshop’s resampling feature, which can increase an image’s size and pixels per inch while minimizing the drop in quality.
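
Photoshop’s resampling is proprietary, but if you’d rather experiment for free, the sketch below does a rough analogue with the Pillow imaging library in Python: enlarge the image with Lanczos interpolation and tag it with a print-friendly resolution. The filenames are placeholders, and the result won’t match Photoshop’s newer detail-preserving modes.

```python
from PIL import Image  # the Pillow imaging library

# Placeholder filename; swap in one of your own photos.
img = Image.open("phone_shot.jpg")
w, h = img.size

# Double the pixel dimensions with Lanczos interpolation, a common
# high-quality resampling filter for enlargements.
upscaled = img.resize((w * 2, h * 2), resample=Image.LANCZOS)

# Save with a 300 pixels-per-inch tag so print layouts treat it accordingly.
upscaled.save("phone_shot_2x.jpg", dpi=(300, 300), quality=95)
```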

If all of this isn’t good enough, well, the next upgrade is right around the corner.
