
Apple is erasing traces of this controversial feature

This year, online platforms have multiplied announcements concerning child protection. Apple wanted to go further than other companies by developing a feature planned for iCloud that would have allowed it to detect child pornography uploaded to its online storage service.

In essence, the mechanism devised by Apple does not compare the photos themselves. It compares the hashes (a kind of digital fingerprint) of user images with those of images included in known databases of illegal content.

In principle, this should therefore ensure better protection of children without compromising user privacy. Moreover, these scans were to be performed locally on the device, not on Apple's servers.

“Apple’s method of detecting known CSAM (editor’s note: known illegal content) is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations,” the Cupertino company had indicated.
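To illustrate the general idea, here is a minimal sketch in Swift of checking an image against a database of known hashes. It is a deliberate simplification: it uses an ordinary SHA-256 digest and a hypothetical `loadKnownHashes()` helper, whereas Apple's actual system relies on a perceptual hash (NeuralHash) and a blinded, on-device matching protocol.

```swift
import Foundation
import CryptoKit

// Hypothetical loader for the database of known image hashes
// (in Apple's design, this database ships inside the operating system).
func loadKnownHashes() -> Set<String> {
    // Placeholder: an empty set stands in for the real database.
    return []
}

let knownHashes = loadKnownHashes()

// Hash an image file and check whether it appears in the known database.
// Note: SHA-256 only matches bit-identical files; Apple's NeuralHash is a
// perceptual hash designed to survive resizing or re-encoding.
func imageMatchesKnownDatabase(at url: URL) throws -> Bool {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

Because only digests are compared, the device never has to expose the photos themselves in order to perform the check.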

A project criticized by Edward Snowden

But even if Apple’s intentions were good, the project raised many concerns. Experts feared that this system would end up being hijacked as a tool for censorship or mass surveillance.

In a newsletter published in August, the whistleblower wrote that by developing this feature, Apple had declared war on our privacy. Once this precedent is set, governments could demand that Apple use the same mechanism to search for other types of content. As for effectiveness, Edward Snowden noted that a pedophile would only have to turn off iCloud synchronization so that their photos are not scanned.

Researchers at Princeton University had also raised concerns about potential abuse if Apple launched this feature.

Apple has backtracked, and now the company wants to erase the traces?

Faced with the pressure, Apple had to revise its plans. In September, the Cupertino company put the deployment of this anti-child-pornography feature on hold.

And today we learn that Apple has decided to remove all documents relating to the project from its website. As explained by the media outlet The Verge, Apple has updated the web page dedicated to child safety features, removing all references to the CSAM detection tool (CSAM is how this illegal content is referred to in the United States). The update was reportedly made between December 10 and 13.

Has Apple decided to abandon this project for good? Apparently not. Although the firm seems to want to make us forget this controversy, it affirms that its position remains the same as in September, when it decided to suspend the rollout.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in September.
