
Apple Provides More Detailed Overview of Security and Privacy of CSAM Detection System

Apple today shared a document providing a more detailed overview of the new child safety features coming later this year. The document includes information about the system's security and privacy requirements, threat model considerations, and more.

The new document comes as Apple faces widespread criticism, including from its own employees, over its plans to begin scanning iPhone users’ photo libraries for child sexual abuse material (CSAM). Employees are said to be speaking out internally over concerns that the technology could be misused to scan for other types of content.

The document attempts to address these concerns, providing more detail on how the system will work, including the disclosure that Apple will likely set an initial match threshold of 30 known CSAM images before an iCloud account is flagged for review by an Apple employee.
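Apple's technical summary describes enforcing this threshold cryptographically (via threshold secret sharing), so the server learns nothing about an account's matches until the threshold is exceeded. The Swift sketch below illustrates only the account-level flagging behavior, not the cryptography, and uses hypothetical names (`AccountMatchState`, `recordMatch`) that do not come from Apple's documentation:

```swift
// Illustrative toy model of the threshold rule: an account is queued
// for human review only once 30 known-CSAM matches have accumulated.
// Apple's real system uses threshold secret sharing, so no party can
// simply count matches like this; all names here are hypothetical.

let matchThreshold = 30  // initial threshold Apple says it will likely use

struct AccountMatchState {
    var matchedImageCount = 0  // matches against the known-CSAM database

    /// Record one new match and report whether the account has now
    /// crossed the review threshold.
    mutating func recordMatch() -> Bool {
        matchedImageCount += 1
        return matchedImageCount >= matchThreshold
    }
}

var state = AccountMatchState()
for _ in 1...matchThreshold {
    if state.recordMatch() {
        print("Account flagged for review by an Apple employee")
    }
}
// Prints exactly once, on the 30th match.
```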

Apple also notes that the on-device database of known CSAM images includes only images that were provided by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government. In a memo obtained by Bloomberg’s Mark Gurman, Apple said it will have an independent auditor review the system as well.
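That inclusion rule amounts to an intersection requirement across independent sources. The following sketch, assuming hypothetical types and names (`HashSubmission`, `buildOnDeviceDatabase`) that are not part of any Apple API, shows how a hash could qualify only when vouched for by organizations in at least two distinct jurisdictions:

```swift
// Hypothetical sketch of the inclusion rule described above: a hash
// enters the on-device database only if child safety organizations in
// at least two separate sovereign jurisdictions supplied it. Types and
// names are assumptions for illustration, not Apple's actual API.

struct HashSubmission {
    let imageHash: String     // perceptual hash of a known CSAM image
    let organization: String
    let jurisdiction: String
}

func buildOnDeviceDatabase(from submissions: [HashSubmission]) -> Set<String> {
    // Collect the distinct jurisdictions that vouched for each hash.
    var jurisdictionsByHash: [String: Set<String>] = [:]
    for s in submissions {
        jurisdictionsByHash[s.imageHash, default: []].insert(s.jurisdiction)
    }
    // Keep only hashes backed by two or more jurisdictions, so no
    // single government controls what enters the database.
    return Set(jurisdictionsByHash.filter { $0.value.count >= 2 }.keys)
}

// Illustrative data: only "a1b2" is vouched for by organizations in
// two separate jurisdictions, so only it qualifies.
let submissions = [
    HashSubmission(imageHash: "a1b2", organization: "OrgUS", jurisdiction: "US"),
    HashSubmission(imageHash: "a1b2", organization: "OrgUK", jurisdiction: "UK"),
    HashSubmission(imageHash: "c3d4", organization: "OrgUS", jurisdiction: "US"),
]
print(buildOnDeviceDatabase(from: submissions))  // ["a1b2"]
```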

Apple says that despite the criticism and pushback it is experiencing over the new features, it will stick to its timeframe of making them available on the iPhone, iPad, and Mac with software updates this year in the U.S.