Apple will scan all iPhones in the US for images of child sexual abuse, a move that has drawn applause from child protection groups but also raised concern among some security experts that the system could be misused by governments seeking to surveil their citizens.
The tool, called “neuralMatch”, is designed to detect known images of child sexual abuse and will scan photos before they are uploaded to iCloud. If a match is found, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and a report sent to the National Center for Missing and Exploited Children. The system will not flag images that are not already in the center’s database of known child pornography.
Parents taking innocent pictures of a baby in the bathtub presumably need not worry. But some researchers say that the tool, which does not “see” such images but only the mathematical “fingerprints” that represent them, could be put to more nefarious uses. Matthew Green, a leading cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly harmless images engineered to trigger matches for child pornography, fooling Apple’s algorithm and alerting law enforcement. “The researchers were able to do this quite easily,” Green said. Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says, ‘Here is a list of files we want you to scan’?” Green asked. “Will Apple say no? I hope they say no, but their technology won’t say no.”
Tech companies including Microsoft, Google and Facebook have for years shared “fingerprints” of known images of child sexual abuse. Apple has used them to scan for child pornography in files stored in its iCloud service, which is not as securely encrypted as data kept on the device itself. Apple has been under government pressure for years to allow greater surveillance of encrypted data. Introducing the new safety measures required Apple to perform a delicate balancing act between cracking down on child exploitation and upholding its commitment to protecting the privacy of its users.
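The matching scheme described above can be sketched in heavily simplified form. Real systems use *perceptual* hashes (such as Apple’s NeuralHash or Microsoft’s PhotoDNA) that tolerate resizing and re-encoding; the cryptographic hash below is only a stand-in, and all function and variable names are illustrative, not Apple’s actual API:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. A real deployment would use a perceptual
    hash so that near-duplicates of an image still match."""
    return hashlib.sha256(image_bytes).hexdigest()

# Database of fingerprints of *known* abusive images, shared among
# platforms; the images themselves are never distributed, only hashes.
known_fingerprints = {fingerprint(b"known-bad-image")}

def flag_for_review(image_bytes: bytes) -> bool:
    """Return True only if the image matches a known fingerprint.
    Novel images never match, which is why the system cannot report
    pictures that are not already in the database."""
    return fingerprint(image_bytes) in known_fingerprints

print(flag_for_review(b"known-bad-image"))  # True: matches the database
print(flag_for_review(b"family-photo"))     # False: not in the database
```

Note that a match here only *flags* the image; as described above, a human reviewer confirms it before any report is filed.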
Apple said the changes will roll out this year as part of updates to its operating software for the iPhone, Mac and Apple Watch. “Apple’s expanded protection for children is a game changer,” John Clark, president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for kids.” Julia Cordua, CEO of Thorn, said Apple’s technology balances “the need for privacy with digital safety for children.” Thorn, a non-profit organization founded by Demi Moore and Ashton Kutcher, uses technology to protect children from sexual abuse by identifying victims and working with technology platforms. But in a sharp criticism, the Center for Democracy and Technology, a Washington-based non-profit, called on Apple to abandon the changes, saying they “destroy the company’s guarantee of end-to-end encryption.”