We built a system like Apple’s to flag child sexual abuse material — and concluded the tech was dangerous
Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple’s own employees have been expressing alarm. The company insists reservations about the system are rooted in “misunderstandings.”
We disagree. We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous.