In an unexpected privacy reversal, Apple is reportedly planning an iPhone update with an artificial intelligence (AI) system called “neuralMatch” that would allow it to “continuously scan photos that are stored on a US user’s iPhone and [photos that] have also been uploaded to its iCloud back-up system.”
The system would scan iPhones and iCloud storage for images of child sexual abuse in an attempt to help law enforcement track down abusers.
While this effort is extremely well-intentioned, it opens a dangerous Pandora’s Box of security and privacy concerns.
According to a report from the Financial Times, the system would “proactively alert a team of human reviewers if it believes illegal imagery is detected,” and those reviewers would contact law enforcement once the images were verified.
No details have been provided as to how they would be “verified.”
The neuralMatch system was apparently trained using a database of images from the National Center for Missing and Exploited Children. The system will reportedly be limited, at least initially, to iPhones in the United States.
Apple has reportedly been briefing security researchers on neuralMatch, and those researchers are in turn raising serious privacy concerns.
Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured Big Tech firms such as Apple, Microsoft, Google, and Facebook for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.
The company has famously stood up to law enforcement in the past to protect users’ privacy. In 2016 Apple refused to unlock an iPhone belonging to one of the shooters in the San Bernardino terror attack.
At the time, CEO Tim Cook said that the government’s request was “chilling” and would have far-reaching consequences that could effectively create a backdoor for more government surveillance.
In that instance the FBI eventually turned to an Australian security firm, Azimuth, to unlock the phone.
Still, Apple already scans user files stored in its iCloud service, which is not encrypted as securely as its messages, for such images. According to Apple, however, neuralMatch, which will access user phones and messages, will detect known images of child sexual abuse without decrypting people’s messages.
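Apple has not published neuralMatch’s internals, but the general idea of matching photos without decrypting them can be sketched with a generic perceptual hash. The toy Python example below uses a simple “average hash” purely as a stand-in for Apple’s undisclosed algorithm; the database entries, threshold, and function names here are illustrative assumptions, not details from the report.

```python
# A minimal sketch of hash-based matching, NOT Apple's actual system.
# The idea: compute a short "fingerprint" of each photo and compare it
# against fingerprints of known abuse imagery, so the matcher never
# needs the decrypted image content itself.

from typing import List

def average_hash(gray: List[List[int]]) -> int:
    """Toy perceptual hash over an 8x8 grayscale grid (assumed already
    downsampled): one bit per pixel, set when the pixel is brighter
    than the image's mean."""
    flat = [px for row in gray for px in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for px in flat:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical database entry; a real system would hold fingerprints
# derived from a vetted image database.
KNOWN_BAD_HASHES = {0x8F3A5C7E90112233}

def is_flagged(photo_hash: int, threshold: int = 4) -> bool:
    """Flag a photo if its fingerprint is near any known-bad one."""
    return any(hamming(photo_hash, bad) <= threshold
               for bad in KNOWN_BAD_HASHES)

# Usage: only the fingerprint is ever compared, never the pixels.
photo = [[(r * c) % 256 for c in range(8)] for r in range(8)]
print("flagged:", is_flagged(average_hash(photo)))
```

Because the matcher compares short fingerprints rather than pixels, a scheme along these lines can claim to detect known images without ever reading decrypted photo or message content.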
Nevertheless, PBS reported that researchers say the tool could still be used for other purposes, such as government surveillance of dissidents or protesters. Not only that, but it could be used to frame innocent people. PBS added:
Matthew Green of Johns Hopkins, a top cryptography researcher, was concerned that it could be used to frame innocent people by sending them harmless but malicious images designed to appear as matches for child porn, fooling Apple’s algorithm and alerting law enforcement. “This is a thing that you can do,” said Green. “Researchers have been able to do this pretty easily.”
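Green’s collision concern can be illustrated with the same kind of toy average hash (again, a generic stand-in, not Apple’s actual algorithm): a perceptual hash keeps only a coarse brightness pattern, so two images that look nothing alike can be engineered to share a fingerprint, and a harmless-looking image could then trigger a match against a flagged one. The contrived 8x8 example below constructs exactly such a collision.

```python
# Contrived collision demo with a toy average hash. Real perceptual
# hashes are harder targets, but the principle is the same: the hash
# discards almost all image detail, so different-looking images can
# be made to share a fingerprint.

def average_hash(gray):
    flat = [px for row in gray for px in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for px in flat:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

# Image A: a hard-edged black (0) / white (255) checkerboard.
img_a = [[255 if (r + c) % 2 else 0 for c in range(8)] for r in range(8)]

# Image B: near-uniform gray (127 vs. 129), visually nothing like A,
# but each pixel sits just above/below the mean in the same pattern.
img_b = [[129 if (r + c) % 2 else 127 for c in range(8)] for r in range(8)]

assert average_hash(img_a) == average_hash(img_b)  # identical fingerprints
print("collision:", hex(average_hash(img_a)))
```

This toy attack is far easier than colliding a production hash, but it shows the mechanism behind Green’s warning that researchers “have been able to do this pretty easily.”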
In light of the Snowden revelations of illegal surveillance operations, the U.S. government has already shown that it is capable of abusing mass-surveillance and tracking technology.
This latest Apple development only adds to these serious concerns.