Apple has announced that it is completely abandoning its plans to scan users’ iCloud photo libraries for child sexual abuse images. A win for privacy advocates, a loss for child protection services.
Apple is finally abandoning the unloved project it launched in August 2021. The company is not, however, dropping its child-protection measures: it will now focus its efforts on communication safety in its Messages app.
Abandoning a difficult plan
In August 2021, Apple announced a new detection and prevention initiative targeting child sexual abuse content. With the support of child protection organizations, the company planned to introduce a feature that would automatically scan photo libraries stored on iCloud for such material.
While law enforcement and child protection services viewed the project favorably, Apple faced immense discontent from its users. Long regarded as a champion of data protection, the company was accused of prying directly into people’s private lives.
The backlash was such that Apple paused its plans less than a month after announcing them, saying at the time that it needed to collect feedback and improve what needed improving. Finally, a year later, the company decided to end the project for good. The decision was communicated to Wired and accompanied by the rollout of end-to-end encryption for iCloud backups.
Plans for the future?
It is a double blow for American authorities in particular, who lose hope of a powerful tool to assist their work and now face a feature that makes investigations considerably more complicated. End-to-end encryption prevents anyone other than the account holder from accessing their data. But that doesn’t mean Apple is abandoning its mission to protect children: the company now wants to put more effort into communication safety.
Apple’s Messages app contains a number of opt-in tools that warn and protect children. For example, if the system detects that a child is receiving or attempting to send a photo likely to contain nudity, the iPhone or iPad blurs the image and warns the child. Apple also wants to extend this system to videos.