Apple’s CSAM Detection System and Its Messaging Are a Work in Progress

Apple has been talking and talking and talking about how its upcoming system for protecting children will also leave our privacy intact

Lance Ulanoff
OneZero
4 min read · Aug 13, 2021

Photo by Laurenz Heymann on Unsplash

What a week it’s been for Apple. It’s trying to do something important: protect children from abuse. But by revealing its plans well in advance of launching the tools and technology, the normally unflappable Cupertino tech giant opened a Pandora’s box of questions and concerns.

Since then, it’s been on an information-sharing offensive, speaking to the media about the intent and specific technical underpinnings of its CSAM Detection technology, and even having high-level executives sit down for on-the-record one-on-ones to eradicate misinformation and calm everyone down.

It may or may not have worked.

For me, it’s been a learning experience, as I try to understand the intricacies of not one but two systems that will identify harmful or illicit images: one covering images shared through the iPhone’s Messages app, the other covering images uploaded to Apple’s iCloud Photos. The technology and the response differ between the two systems, but by introducing the concepts together, Apple, in its own view, created confusion. As…
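To keep the two systems straight, here is a deliberately oversimplified sketch of the distinction. Every name, signature, and stand-in value below is hypothetical, invented purely for illustration; Apple’s actual approach uses an on-device machine-learning classifier for Messages and NeuralHash threshold matching for iCloud Photos, neither of which is reproduced here.

```swift
import Foundation

// Hypothetical sketch only: these functions are not Apple's APIs or algorithms.

// 1) Messages (communication safety): an on-device classifier evaluates an
//    incoming or outgoing image for nudity and, if flagged, blurs it and warns
//    the child. There is no comparison against any external database, and
//    nothing is sent to Apple.
func shouldWarnAboutMessageImage(_ imageData: Data) -> Bool {
    let classifierSaysExplicit = false // stand-in for an on-device ML model's output
    return classifierSaysExplicit
}

// 2) iCloud Photos (CSAM detection): a perceptual hash of each photo queued
//    for upload is matched against hashes of already-known CSAM supplied by
//    child-safety organizations; only accounts crossing a match threshold are
//    ever reviewed.
func matchesKnownCSAMHash(_ photoHash: String, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(photoHash) // simplified stand-in for NeuralHash matching
}
```

The point of the contrast: the first system classifies new images locally and only ever talks to the child (and, for young children, their parents), while the second only looks for matches against known material on its way to Apple’s servers.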
