Apple's Planned iPhone Scans Pave Way For Worldwide Censorship, Groups Warn

Apple's plan to scan users' iPhones for contraband will lay a foundation for “censorship, surveillance and persecution on a global basis,” more than 90 organizations said Thursday in a letter to the company.

The groups -- including the ACLU, Center for Democracy & Technology, PEN America and Electronic Frontier Foundation -- are urging Apple CEO Tim Cook to ditch the company's controversial surveillance plan.

They join a host of other organizations, computer experts, privacy advocates -- and reportedly, Apple's own employees -- in criticizing the company's new technology, which is slated for inclusion in Apple's next mobile operating system, iOS 15, expected to ship this autumn.

Two weeks ago, Apple unveiled plans to search material on users' iPhones and iPads to combat child sex abuse. One prong of Apple's plan involves scanning children's iMessage accounts for nude photos, but parents can opt out of those scans.

Another component, which can't be deactivated, involves comparing images that users attempt to upload to iCloud with a database of known photos depicting the sexual abuse of children. To accomplish this, Apple will download hashed digital fingerprints of database photos to users' devices, then scan for matches among photos users want to place in iCloud.

If the software finds at least 30 matches, Apple will manually review the flagged photos and, if it determines they're illegal, will notify the National Center for Missing and Exploited Children, which will alert the authorities.
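To make the mechanics concrete, here is a minimal sketch in Swift of the matching-and-threshold logic described above. The names `fingerprint(of:)`, `shouldFlagForReview` and `matchThreshold` are illustrative inventions, and the exact SHA-256 comparison is a stand-in: Apple's actual system reportedly uses a perceptual hash (NeuralHash) plus cryptographic protocols that keep sub-threshold match counts hidden even from Apple, details this sketch does not attempt to reproduce.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified illustration -- not Apple's actual protocol.
// A real deployment would use a perceptual hash that survives resizing and
// re-encoding, and would hide match counts below the threshold; plain exact
// hashes are used here only to show the matching-and-threshold logic.

/// The threshold reported for Apple's system: 30 matches before human review.
let matchThreshold = 30

/// Compute an exact SHA-256 fingerprint of a photo's bytes.
func fingerprint(of photoData: Data) -> String {
    let digest = SHA256.hash(data: photoData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Compare photos queued for upload against the on-device hash database,
/// flagging the account for manual review once the threshold is reached.
func shouldFlagForReview(pendingUploads: [Data], knownHashes: Set<String>) -> Bool {
    let matchCount = pendingUploads
        .map { fingerprint(of: $0) }
        .filter { knownHashes.contains($0) }
        .count
    return matchCount >= matchThreshold
}
```

The key design point the critics seize on is visible even in this toy version: nothing in the matching logic is specific to child-abuse imagery. The behavior is determined entirely by whatever hash set is loaded into `knownHashes`.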

The ACLU and other groups signing Thursday's letter argue that this type of technology could easily be used to search people's phones for material other than photos depicting sex abuse.

“Once this capability is built into Apple products, the company and its competitors will face enormous pressure -- and potentially legal requirements -- from governments around the world to scan photos ... for other images a government finds objectionable,” the letter states. “Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud.”

Princeton University professor Jonathan Mayer and researcher Anunay Kulshrestha separately warned Thursday of that possibility.

Writing in the Washington Post, they said they independently designed a content-matching system similar to Apple's, but concluded it was too risky to deploy.

“Our system could be easily repurposed for surveillance and censorship,” they write. “The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.”
