Apple’s Well-Intentioned but Extremely Dangerous New Threat to iPhone Users’ Privacy

In an unexpected privacy reversal, Apple is reportedly planning an iPhone update that would add a "neuralMatch" artificial-intelligence (AI) system to "continuously scan photos that are stored on a US user's iPhone and [photos that] have also been uploaded to its iCloud back-up system."

The system would scan iPhones and iCloud backups for images of child sexual abuse in an effort to help law enforcement track down child sex abusers.

While this effort is extremely well-intentioned, it opens a dangerous Pandora’s Box of security and privacy concerns.

According to a report from the Financial Times, the system would “proactively alert a team of human reviewers if it believes illegal imagery is detected” and they would alert law enforcement once the images were verified.

No details have been provided as to how they would be “verified.”

The neuralMatch system was apparently trained to scan using a database from the National Center for Missing and Exploited Children. This system will reportedly be limited, at least initially, to iPhones in the United States.

Apple has reportedly been briefing security researchers on neuralMatch, and those researchers are in turn raising serious privacy concerns.

Apple was one of the first major companies to embrace "end-to-end" encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured Big Tech firms such as Apple, Microsoft, Google, and Facebook for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

The company has famously stood up to law enforcement in the past to protect its users' privacy. In 2016, Apple refused to unlock an iPhone belonging to the terrorist behind the San Bernardino attack.

At the time, CEO Tim Cook said that the government’s request was “chilling” and would have far-reaching consequences that could effectively create a backdoor for more government surveillance.

In that instance the FBI eventually turned to an Australian security firm, Azimuth, to unlock the phone.

Still, Apple already scans user files stored in its iCloud service, which is not encrypted as securely as its messages, for such images. According to Apple, however, neuralMatch, which will have access to users' phones and messages, will detect known images of child sexual abuse without decrypting people's messages.
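Conceptually, this kind of detection works by comparing a fingerprint of each photo against a database of fingerprints of already-known abuse images, rather than by inspecting the photo's contents directly. The sketch below is a simplified illustration only: Apple has not published neuralMatch's internals, the function names are invented, and a plain cryptographic hash is used as a stand-in for the perceptual, AI-derived hash the real system reportedly uses (which, unlike SHA-256, would also match resized or re-encoded copies of an image).

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of the raw image bytes.

    A stand-in for a perceptual hash: a cryptographic digest only
    matches byte-identical files, whereas a perceptual hash would
    also match visually similar copies.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes, known_hashes: set) -> bool:
    """Flag the image only if its fingerprint is in the database of
    known fingerprints; the image contents themselves are never sent."""
    return image_fingerprint(image_bytes) in known_hashes

# Hypothetical database of flagged fingerprints, then two candidate images.
known = {image_fingerprint(b"known-flagged-image-bytes")}
print(matches_known_database(b"known-flagged-image-bytes", known))  # True
print(matches_known_database(b"an-ordinary-photo", known))          # False
```

This design is also what makes the collision attack described below possible: an attacker who can craft an innocuous-looking image whose fingerprint lands in the database would trigger a false match.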

Nevertheless, PBS reported that researchers say the tool could be put to other purposes, such as government surveillance of dissidents or protesters, and could even be used to frame innocent people. PBS added:

Matthew Green of Johns Hopkins, a top cryptography researcher, was concerned that it could be used to frame innocent people by sending them harmless but malicious images designed to appear as matches for child porn, fooling Apple's algorithm and alerting law enforcement. "This is a thing that you can do," said Green. "Researchers have been able to do this pretty easily."

In light of the Snowden revelations of illegal government surveillance operations, the U.S. government has already shown that it is capable of abusing mass-surveillance and tracking technology.

This latest Apple development only adds to these serious concerns. ADN


