Apple Offers Additional Explanation About Why It Dropped Plan to Detect CSAM in iCloud Photos

Apple has offered additional explanation of its decision to abandon its much-maligned plan to detect Child Sexual Abuse Material (CSAM) in photos stored in iCloud Photos.

Apple shared a statement with Wired, reproduced below, in response to a demand from child safety group Heat Initiative that the company “detect, report, and remove” CSAM from iCloud and provide more tools for users to report such content to the company:

“Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote in the company’s response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit,” Neuenschwander wrote. “It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

In August 2021, Apple announced that iOS and iPadOS would use new applications of cryptography to help limit the spread of CSAM online while preserving user privacy. Apple said CSAM detection would provide valuable information to law enforcement about collections of CSAM in iCloud Photos.

The Messages app would add new tools to warn children and their parents when they received or sent sexually explicit photos.

When a child received this type of content, the photo would be blurred and the child would be warned, presented with helpful resources, and reassured that it was okay if they did not want to view the photo. As an additional precaution, the child could also be told that, to make sure they were safe, their parents would get a message if they did view it. Similar protections would apply if a child attempted to send sexually explicit photos: the child would be warned before the photo was sent, and the parents could receive a message if the child chose to send it.

While Apple had initially announced that CSAM detection would be included as part of an update to iOS 15 and iPadOS 15 by the end of 2021, the company postponed the feature based on “feedback from customers, advocacy groups, researchers, and others.” The plans were criticized by a wide range of individuals and groups, including the Electronic Frontier Foundation (EFF), security researchers, policy groups, university researchers, and politicians, and were even criticized internally by some Apple employees.

Chris Hauk