European Commission to Release Draft Law Enforcing Mandatory Detection of Child Sexual Abuse Material

The European Commission will release a draft law this week that would require Apple, Google, and other tech firms to identify, remove, and report to law enforcement illegal images of child abuse found on their platforms.

A leak of the proposal, published by Politico, shows that the EC believes the voluntary measures taken by Apple and other companies have “proven insufficient” to address the increasing sharing of child sexual abuse content, which is why the commission wants to make detection of such material mandatory.

The industry is waiting to see how stringent the new rules will be, and how detection can be accomplished without tech companies having to scan all user content, a practice the Court of Justice of the European Union ruled illegal in 2016.

Meanwhile, privacy groups and tech companies worry that the EU draft law could require the creation of “backdoors” into encrypted messaging services, whose contents are currently inaccessible to Apple and other hosting platforms. The fear, of course, is that such backdoors could be misused by governments as well as by bad actors.
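End-to-end encryption is what puts those messages out of the platforms’ reach: the decryption keys exist only on the users’ devices, never on the relay server. Below is a minimal sketch of that property using the PyNaCl library; the library choice and variable names are purely illustrative, not anything specified by the draft law.

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; the platform never
# sees the private halves.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"hello, bob")

# The relay server only ever handles `ciphertext`, which it cannot read.
# Bob decrypts on his own device with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello, bob"
```

A “backdoor” in this setting means giving some third party access to plaintext, whether by escrowing keys or by scanning content outside the encrypted channel, which is why critics argue it cannot be limited to a single purpose.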

Lawmakers often use the “it’s to protect the children” gambit to push through legislation that gives them broader monitoring access to their citizens’ online activities.

The EC’s Home Affairs Commissioner Ylva Johansson has stated that technical solutions exist to keep conversations safe while still finding illegal content, but cybersecurity experts disagree.

“The EU shouldn’t be proposing things that are technologically impossible,” Ella Jakubowska told Politico. Jakubowska is a policy adviser at European Digital Rights (EDRi), a network of 45 non-governmental organizations (NGOs).

“The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented,” said Jakubowska.
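The “technical solution” usually discussed in this context is client-side scanning: content is checked on the sender’s device, where it is still plaintext, before encryption. A hedged sketch of the idea follows; every function name here is hypothetical, and the stand-in “encryption” is a placeholder for a real end-to-end layer. Critics’ point is visible in the code itself: whoever controls the flagged-hash list controls what gets reported.

```python
import hashlib

# Hypothetical list of hashes the provider pushes to every device.
# Whoever controls this list controls what gets flagged.
FLAGGED_HASHES: set[str] = {"<placeholder hash>"}

def send_message(plaintext: bytes, encrypt) -> bytes:
    """Client-side scanning sketch: scan on-device, then encrypt.

    `encrypt` stands in for the usual end-to-end encryption step; the scan
    sees the plaintext only because it runs before encryption, on the
    user's own device.
    """
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in FLAGGED_HASHES:
        report_match(digest)  # hypothetical reporting hook to the provider
    return encrypt(plaintext)

def report_match(digest: str) -> None:
    # Stand-in for whatever reporting channel a real system would use.
    print(f"match reported: {digest}")

# Demo with a dummy "encryption" function (byte reversal); a real client
# would call its end-to-end encryption layer here.
sealed = send_message(b"hello", encrypt=lambda p: p[::-1])
```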

Not everyone in the EU agrees with the proposal. Reacting to the leak, centrist Renew Europe MEP Moritz Körner told Politico that the Commission’s proposal would mean “the privacy of digital correspondence would be dead.”

Apple faced controversy last year surrounding its own plan to search for CSAM (child sexual abuse material) on iPhones and iPads.

Researchers at Princeton University who had built a similar image scanning system warned Apple about the technology the Cupertino firm planned to use to scan iPhone users’ photo libraries for CSAM, calling it “dangerous.”

Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a researcher at the Princeton University Center for Information Technology Policy, wrote an op-ed for The Washington Post discussing their experience building image detection technology.

Apple’s plan to detect known CSAM images stored in iCloud Photos has proven to be controversial and has raised concerns from security researchers, academics, privacy groups, and others about the system potentially being abused by governments as a form of mass surveillance.
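Detection of “known” images works by matching uploads against a database of hashes of previously identified material, rather than by analyzing image content from scratch. Apple’s announced system used a proprietary perceptual hash (NeuralHash) with additional cryptographic machinery so that matches were only revealed past a threshold; the sketch below shows only the basic matching idea, using an ordinary cryptographic hash for simplicity, and the database entry is a placeholder.

```python
import hashlib

# Hypothetical database of hashes of known images, as distributed by a
# child-safety organization such as NCMEC. The placeholder value below is
# simply the SHA-256 of the string "test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this exact file appears in the known-image set.

    Real systems use perceptual hashes (Apple's NeuralHash, Microsoft's
    PhotoDNA) so that resized or re-encoded copies still match; a plain
    cryptographic hash like SHA-256 only matches byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

The surveillance concern raised by researchers follows directly from this design: nothing in the matching step is specific to CSAM, so the same mechanism flags whatever hashes are placed in the database.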

Apple employees have also reportedly raised concerns internally over the company’s plan.

Mayer and Kulshrestha said they were disturbed by the possibility that governments could use such a system to detect content other than CSAM.