Apple Unveils New Child Safety Features – Will Scan Users’ Photo Libraries for Known Sexual Abuse Material

Apple today offered a preview of new child safety features coming later this year. The features will arrive via software updates across the Cupertino firm’s platforms and will be US-only at launch, with expansion to other regions planned later.

New communication tools will allow parents to play a more informed role in helping their children navigate the world of online communication. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

The Messages app will add new tools to warn children and their parents when a child receives or sends sexually explicit photos.

When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
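At a high level, the flow Apple describes is an on-device check followed by a blur, a warning, and an optional parental notification. The Swift sketch below illustrates that flow only conceptually; the classifier, the `ContentAssessment` type, and the account and notification names are hypothetical placeholders, since Apple has not published the underlying API.

```swift
import Foundation

// Hypothetical result of the on-device check; Apple has not published its API.
enum ContentAssessment {
    case safe
    case sensitive
}

// Stand-in for the on-device machine learning model described above.
// In Apple's design, the image never leaves the device for this check.
func assessImage(_ imageData: Data) -> ContentAssessment {
    // Placeholder: a real implementation would run an on-device ML model here.
    return .sensitive
}

struct ChildAccount {
    let parentalNotificationsEnabled: Bool
}

// Sketch of the described flow for a photo received in Messages.
func handleIncomingPhoto(_ imageData: Data, for account: ChildAccount) {
    switch assessImage(imageData) {
    case .safe:
        print("Show the photo normally.")
    case .sensitive:
        print("Blur the photo, warn the child, and offer helpful resources.")
        print("Reassure the child it is okay not to view the photo.")
        if account.parentalNotificationsEnabled {
            print("Explain that parents will be notified if the photo is viewed.")
        }
    }
}

// Example usage with placeholder data.
handleIncomingPhoto(Data(), for: ChildAccount(parentalNotificationsEnabled: true))
```

A similar check runs in the opposite direction before a child sends an explicit photo, with the warning shown prior to sending.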

New technology in iOS and iPadOS will allow Apple to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.
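Conceptually, detection works by comparing a fingerprint of each photo against a database of fingerprints of known CSAM images supplied by NCMEC. Apple’s actual design relies on a perceptual hash and cryptographic matching techniques that keep both the database and non-matching photos private; the Swift sketch below substitutes an exact SHA-256 digest and a plain in-memory set purely to illustrate the matching idea, and every name and value in it is hypothetical.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only: Apple's system uses a perceptual hash and private
// set intersection rather than exact digests and a local set.

// Fingerprints of known images, as provided by NCMEC (hypothetical values).
let knownImageDigests: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Compute a stand-in fingerprint for a photo about to be stored in iCloud Photos.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Returns true if the photo's fingerprint matches the known-image database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownImageDigests.contains(digest(of: imageData))
}

// Example: empty Data hashes to the well-known SHA-256 digest of empty input,
// so this placeholder check reports a match.
print(matchesKnownImage(Data()))  // true
```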

For more information, visit the Apple website.