At this point, nobody needs to tell you that your smartphone is a pretty great camera. That is, as long as you don’t look too closely at those pixels. Once you zoom in on your phone’s screen, give the photo a crop, or make a large physical print of your favorite shots, things start to get a bit fuzzier. You might notice that images aren’t quite as in focus as you thought, the group shot from the bar last night is actually pretty grainy, or there just isn’t as much detail as you had imagined.
Making your pictures look great on bigger screens isn’t as simple as slapping a “better” camera on the back of your phone. The biggest limitation, according to photographer Jeff Carlson, is that as smartphones continue to get thinner and lighter, fitting a larger sensor and lens into a shrinking frame becomes more difficult. Because the sensor and lenses have to be so small, they aren’t able to capture as much light as, say, a high-end Canon or Nikon DSLR, which means the camera draws in less data. And that erodes the sharpness of your shots.
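To get a feel for the gap, here's a back-of-the-envelope sketch in Python. The sensor dimensions are assumed, illustrative figures (a typical 1/2.55-inch smartphone sensor versus a 36×24mm "full-frame" DSLR sensor), not specs for any particular camera — but the rough conclusion holds: the bigger sensor has dozens of times more area to collect light.

```python
# Rough illustration of the light-gathering gap between a phone
# sensor and a full-frame DSLR sensor. Dimensions are approximate.
phone_w, phone_h = 5.6, 4.2   # mm, typical 1/2.55" smartphone sensor
ff_w, ff_h = 36.0, 24.0       # mm, "full-frame" DSLR sensor

phone_area = phone_w * phone_h   # ~23.5 mm^2
ff_area = ff_w * ff_h            # 864 mm^2

ratio = ff_area / phone_area
print(f"The full-frame sensor has roughly {ratio:.0f}x the area")
```

All else being equal, that area difference translates directly into how much light each sensor can gather per exposure, which is why small sensors struggle in dim bars.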
The result is usually a pretty image that looks fine on Instagram, but one that presents problems if you want to view it on a larger screen or make a print.
This won’t change overnight, but photographer Erin Lodi notes that mirrorless cameras like the Sony α6000 have dramatically shrunk the camera body compared to old-school DSLRs while still packing a far larger sensor than any phone.
“Some of these tiny mirrorless cameras and even smaller DSLRs aren’t that much bigger than a smartphone,” Lodi tells OneZero. “I’m sure they’ll keep coming down in size.”
Size is only part of the problem. To get around the pesky limits of physics, companies like Apple and Google have started employing algorithms in their camera apps that fill the gaps left by these smaller cameras. Instead of just snapping a single shot when you click the shutter button, most modern smartphones will capture a series of images a few milliseconds apart and stitch parts of them together to construct what the software thinks will be the “best”…
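The burst-and-merge idea can be sketched in a few lines of Python. This is a deliberately simplified stand-in — real pipelines like Google's HDR+ align frames and merge them with far more sophistication — but even a plain average of several noisy frames of the same scene shows why combining shots beats a single exposure: independent sensor noise largely cancels out, shrinking roughly with the square root of the number of frames.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a burst: the same "scene" captured several times,
# each frame corrupted by independent sensor noise.
scene = rng.uniform(0, 255, size=(4, 4))
burst = [scene + rng.normal(0, 20, size=scene.shape) for _ in range(8)]

# Merging (here, a plain average) cancels much of the random noise.
merged = np.mean(burst, axis=0)

single_err = np.abs(burst[0] - scene).mean()
merged_err = np.abs(merged - scene).mean()
print(f"single-frame error: {single_err:.1f}, merged error: {merged_err:.1f}")
```

In practice the merged error comes out several times smaller than the single-frame error, which is the same trick, writ small, that lets a phone's tiny sensor punch above its weight in low light.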