Apple Defends Privacy-Wrecking New Update that Scans Users’ Phones for Illegal Explicit Photos

Apple CEO Tim Cook and Austin Community College (ACC) President/CEO Dr. Richard Rhodes join Austin Mayor Steve Adler and State Senator Kirk Watson for an exciting announcement launching a new app development program at ACC on Friday, August 25, 2017, at the Capital Factory in downtown Austin, Texas. / Photo by Austin Community College via Flickr (CC BY 2.0: https://creativecommons.org/licenses/by/2.0/deed.en)

As a follow-up to our earlier piece on Apple’s well-intentioned but dangerous privacy-violating update, Apple is defending its planned scanning of phones for possible child sex abuse photos after an internal company memo called privacy advocates “screeching voices.”

On Monday, Apple responded to critics by stating that it will refuse any government demands to expand its new photo-scanning technology beyond the current plan of using it only to detect CSAM (child sexual abuse material).

However, the Big Tech giant’s promises likely won’t convince many privacy advocates.

As ADN reported:

Apple is reportedly planning an iPhone update with something called a “neuralMatch” Artificial Intelligence (AI) system to allow it to “continuously scan photos that are stored on a US user’s iPhone and [photos that] have also been uploaded to its iCloud back-up system.”

It would scan the iPhones and iCloud for images of child sexual abuse in an attempt to help law enforcement track down child sex abusers.

While this effort is extremely well-intentioned, it opens a dangerous Pandora’s Box of security and privacy concerns.

Ars Technica noted that:

Apple has faced days of criticism from security experts, privacy advocates, and privacy-minded users over the plan it announced Thursday, in which iPhones and other Apple devices will scan photos before they are uploaded to iCloud. Many critics pointed out that once the technology is on consumer devices, it won’t be difficult for Apple to expand it beyond the detection of CSAM in response to government demands for broader surveillance.

In a FAQ released today by Apple under the title “Expanded Protections for Children,” there is a question that asks, “Could governments force Apple to add non-CSAM images to the hash list?” Apple answered with this reply:

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC (National Center for Missing and Exploited Children) and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.
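For readers curious what detection against a “hash list” means in practice, here is a minimal, purely illustrative sketch. It is not Apple’s actual NeuralHash pipeline: every name, path, and hash value below is hypothetical, and a real system would use perceptual hashing rather than an exact file digest so that resized or re-encoded copies of a known image still match.

```python
# A minimal, hypothetical sketch of matching photos against a list of known
# image hashes before upload. This is NOT Apple's NeuralHash system; the hash
# function (SHA-256), the directory name, and KNOWN_HASHES are all invented
# for illustration.
import hashlib
from pathlib import Path

# Hypothetical digests of known CSAM images, as supplied by a child-safety
# organization such as NCMEC (placeholder value, not a real hash of anything).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_image(path: Path) -> str:
    """Return a hex digest of the file's bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_before_upload(photo_dir: Path) -> list[Path]:
    """Check every photo queued for cloud upload against the known-hash list."""
    return [p for p in sorted(photo_dir.glob("*.jpg")) if hash_image(p) in KNOWN_HASHES]

if __name__ == "__main__":
    matches = flag_before_upload(Path("pending_upload"))
    print(f"{len(matches)} photo(s) matched the known-hash list")
```

The critics’ point follows directly from a design like this: broadening what gets flagged requires nothing more than adding entries to the hash list, which is why promises to “refuse demands” rest on policy rather than on any technical barrier.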

Of course, none of this means that Apple lacks the ability to expand the technology’s uses. The system’s current design doesn’t prevent it from being redesigned and used for other purposes in the future. This entire effort itself is a major change for a company that has used privacy as a selling point for years and calls privacy a “fundamental human right.”

The real question about Apple succumbing to government demands to expand the scanning of users’ phones for other purposes is not “if” but “when.” ADN

Paul Crespo is the Managing Editor of American Defense News. A defense and national security expert, he served as a Marine Corps officer and as a military attaché with the Defense Intelligence Agency (DIA) at US embassies worldwide. Paul holds degrees from Georgetown, London, and Cambridge Universities. He is also CEO of SPECTRE Global Risk, a security advisory firm, and President of the Center for American Defense Studies, a national security think tank.

Comments
John Baughman
1 month ago

DUH! For the moment, Apple is not the only phone available; switch away if you are unhappy with Apple’s powers!

YumaJoy
1 month ago

Just one of many reasons I don’t want Apple products. It’s an invasion of privacy!!!!

Ray Jarman
1 month ago
Reply to YumaJoy

It is just one more reason to retain my flip phone. My travel, purchases in stores, and other activities are none of their or anyone’s business. I also use cash for purchases at most stores. Seldom do I use my debit card.

Richard
1 month ago

So if Apple, now a corrupt entity of government overreach, can SCAN an iPhone user’s photos for CSAM, what’s to say they can’t plant pictures on an iPhone user’s phone? More importantly, are Hunter Biden and Adam Schiff, two known pedophiles, going to be first on the list and brought to justice? Or will Apple erase their photos and clean up their phones’ memory and hard drives? I volunteer to be the first person checked by Apple, as long as Hunter Biden and Adam Schiff are 2nd and 3rd respectively. Check their laptop and computer hard drives as well. Come on Apple, let’s do this…there’ll be more celebrities on the list to check at a future date.

TruthBTold
1 month ago

Start with Hunter’s account; there is a treasure trove of child sexual abuse right there.

Mary Geiger
1 month ago

How long will it be before ALL phone companies are doing this? It’s great that they want to catch child abusers, but they have to do it in a normal and non-intrusive way. I myself would not trust any government-issued mandate that phone companies spy on citizens by any means possible. You have expert investigators who do this; why not use them to hunt down abusers instead of spying on innocent citizens? Are they getting lazy?

Ray Jarman
1 month ago

There was one factor missing from the report, which is that Apple intends to permit the CCP full access to this information in China. The same thing could, and in all probability will, occur in the US and other once-free societies.

Murphmeister
1 month ago

If I take a picture of a naked illegal crossing the border, will Apple scrub it?

Omega 2
1 month ago

TOTALLY “UNCONSTITUTIONAL” AS IT IS AN INVASION OF PRIVACY WITHOUT WARRANTS!

