Commentary

Apple The Autocrat: Tech Firm Announces Plan To Track Presumed Child Pornography Uploads

Apple, the company that purports to define privacy for the email business, has now decided that it will rid the world of child pornography. It has announced technology that will scan iPhones for illicit photographs. And it may report alleged offenders to law enforcement. 

Unlike other tech firms, which scan for such material after it has been uploaded to their servers, Apple will scan photos on the device as they are being uploaded to iCloud Photos and match them instantly against the National Center for Missing & Exploited Children’s database of known child sexual abuse material.

Repeat uploads will be subject to human review, and the account may be taken down and possibly referred to the police.   
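In outline, the mechanism works like this: hash each photo on the device, compare the hash against the list of known material, and escalate the account to human review once enough matches accumulate. The sketch below is a heavily simplified illustration of that flow, not Apple's actual protocol (Apple's system reportedly uses a perceptual hash, NeuralHash, with cryptographic blinding so the device never learns which photos matched); every name and number in it is an assumption.

```python
# Heavily simplified sketch of client-side hash matching. This is NOT
# Apple's actual protocol; all names and thresholds are assumptions.

import hashlib

MATCH_THRESHOLD = 30            # assumed number of matches before review
known_material_hashes = set()   # stand-in for the NCMEC hash database

def photo_hash(image_bytes: bytes) -> str:
    """Stand-in hash. A real system would use a perceptual hash that
    survives resizing and re-encoding, not an exact cryptographic one."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_on_upload(photos: list[bytes]) -> bool:
    """Return True if the account should be escalated to human review."""
    matches = sum(photo_hash(p) in known_material_hashes for p in photos)
    return matches >= MATCH_THRESHOLD
```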

Another feature of this program will notify parents if their under-13 child is sending or receiving sexually explicit images — again, in real time. 

The new program raises two questions. The first is privacy-related. 

Nobody approves of child pornography. But as the Electronic Frontier Foundation commented on Thursday, Apple is building a backdoor into its data storage and messaging systems.


“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” the EFF continues. “That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.” 
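To make that concrete: in a system like this, what gets scanned and whose accounts get scanned are, in the end, settings. The snippet below is purely illustrative (none of these flag names are real), but it shows why the EFF treats the expansion as a configuration change rather than new engineering.

```python
# Purely illustrative configuration; none of these names are real.
# The EFF's point: widening the scan is a settings change, not a rebuild.
SCAN_CONFIG = {
    "hash_database": "ncmec_csam_hashes",   # today: known CSAM only
    "accounts_in_scope": "child_accounts",  # today: children's accounts
}

# Under external pressure, the "slightest change" might look like:
# SCAN_CONFIG["hash_database"] = "government_banned_content_hashes"
# SCAN_CONFIG["accounts_in_scope"] = "all_accounts"
```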

Let’s just imagine that Putin gets hold of this technology: People uploading anti-government circulars could be arrested in minutes. 

“Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement,” the EFF warns. 

Then there is the parental alert feature. These notifications “give the sense that Apple is watching over the user’s shoulder — and in the case of under-13s, that’s essentially what Apple has given parents the ability to do,” the EFF argues.  

Presumably, it has never occurred to Apple and other tech firms that parental alerts of this sort could lead to violence against the child. 

The EFF concludes that when “Apple releases these ‘client-side scanning’ functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their privacy and security priorities in light of the changes, and possibly be unable to safely use what until this development is one of the preeminent encrypted messengers.”

The second question is: Who appointed Apple as a tin-horn deputy?

Apple employees are not law enforcement officials. Yet they could be sitting there reviewing child pornography, if the EFF is correct.  

This program gives the company power even beyond what it already has, making it a sort of shadow government. 

That’s assuming it even works. When Tumblr instituted a filter for sexual content in 2018, it “famously caught all sorts of other imagery in the net, including pictures of Pomeranian puppies, selfies of fully-clothed individuals, and more,” the EFF recalls.  

It puts one in mind of the 1980s scheme by the U.S. Postal Inspection Service to send direct mail offering child pornography to certain mailing lists, then arrest anyone who responded. It didn’t work — elderly ladies in Iowa were getting misaddressed kiddie porn catalogs.

Even well-intentioned efforts can go awry. 

 

 
