Should Apple Scan Our Phones for Abuse Imagery?

Protecting our children is the right thing to do, but is this the way to do it?

Lance Ulanoff
OneZero


Apple, the company that famously fought the FBI on backdoor access to anyone’s encrypted iPhone, may be working on a plan to automatically scan photos in the cloud and on your phone for child abuse images.

This, on the face of it, sounds like a solid plan: automated technology that could help authorities get ahead of those who might seek to harm children, or who are actively doing so. That’s almost fist-pumping stuff.

However, the technology, which reportedly will use artificial intelligence (AI) trained on a database of 200,000 images from the National Center for Missing & Exploited Children, does raise some interesting and potentially concerning issues.
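
The details of Apple’s system haven’t been published, but one common way systems like this compare photos against a known database is by fingerprinting each image rather than “looking at” it the way a human would. Here’s a minimal, hypothetical sketch of that idea; the fingerprint function and placeholder database entry are my own assumptions, and a real system would use a perceptual hash so that resized or re-encoded copies still match.

```python
# Hypothetical sketch of database matching -- not Apple's actual
# implementation, whose details are unpublished.
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # A real system would use a perceptual hash, which stays stable
    # across crops and re-encodes; SHA-256 only catches exact copies.
    return hashlib.sha256(image_bytes).hexdigest()

# Fingerprints derived from a reference database of known images
# (e.g., the NCMEC set mentioned above) -- placeholder value here.
KNOWN_FINGERPRINTS = {
    "0" * 64,  # stand-in for a real fingerprint
}

def photo_matches_database(image_bytes: bytes) -> bool:
    """Return True if this photo's fingerprint appears in the known set."""
    return image_fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```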

AI and its cohort, machine learning, are powerful tools known for accurately identifying needles in a universe of haystacks. They’re also known for being misled by biased training data and, yes, for producing false positives. And there’s the larger issue of a potential law-enforcement activity running automatically on the phones in our pockets.
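
On the false-positive point, a bit of back-of-the-envelope arithmetic (using my own illustrative numbers, not any published figures) shows why scale is the problem: even a classifier that is wrong only one time in a thousand, run across a billion innocent photos, wrongly flags a million of them.

```python
# Back-of-the-envelope false-positive arithmetic. Both numbers below
# are illustrative assumptions, not published figures.
photos_scanned = 1_000_000_000   # say, a billion innocent photos
false_positive_rate = 0.001      # a classifier wrong 0.1% of the time

false_flags = int(photos_scanned * false_positive_rate)
print(f"Innocent photos incorrectly flagged: {false_flags:,}")
# Innocent photos incorrectly flagged: 1,000,000
```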

As I was exploring this issue online, some pointed out that Twitter had already implemented a similar system as early as 2013. There is, obviously, a…

--

Lance Ulanoff is a tech expert, journalist, social media commentator, amateur cartoonist, and robotics fan.