The head of Facebook's WhatsApp on Friday publicly blasted Apple's decision to scan users' content for illegal images.
“Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy,” WhatsApp head Will Cathcart said on Twitter.
“People have asked if we'll adopt this system for WhatsApp. The answer is no,” he wrote.
Cathcart's post came one day after Apple said it would roll out software that scans users' devices for photos matching those in an existing database of known child sex-abuse images, and alerts outside organizations if users attempt to upload matched photos to iCloud.
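In broad strokes, the scheme described above is a hash-database lookup: an image is reduced to a fingerprint, and that fingerprint is checked against a set of fingerprints of known images. A minimal sketch of that idea follows; note that Apple's actual system uses a proprietary perceptual hash and cryptographic matching protocols, not the plain cryptographic hash and in-memory set shown here, and all names in this snippet are invented for illustration.

```python
import hashlib

# Hypothetical database of fingerprints of known images.
# In a real deployment this would be perceptual hashes supplied by a
# child-safety organization, not SHA-256 digests built locally.
KNOWN_HASHES = {
    hashlib.sha256(b"bytes-of-a-known-image").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint appears in the database.

    A cryptographic hash only matches byte-identical files; perceptual
    hashes (as reportedly used by Apple) also match visually similar
    images, which is why they are used for this purpose.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

The critics quoted below are objecting less to the lookup itself than to where it runs: on the user's device, against a database the user cannot inspect.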
iPhone and iPad users won't have any choice about downloading the surveillance software.
WhatsApp says it already scans "unencrypted information such as profile and group photos and user reports" for known "child exploitative imagery."
Apple also announced that its Messages service will scan messages sent to and from minors for sexually explicit content, though parents can turn that feature off.
News of Apple's plans roiled privacy advocates, including groups like the Electronic Frontier Foundation and Center for Democracy & Technology.
“Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement,” the Electronic Frontier Foundation said, in reference to the plan to scan on-device photos before they're uploaded to the cloud.
“Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy,” the group wrote Thursday afternoon.
Johns Hopkins cryptography professor Matthew Green, the security researcher who first reported news of Apple's plans, warned the move could have disastrous consequences for civil rights around the world.
“Regardless of what Apple's long term plans are, they've sent a very clear signal,” Green said in a Twitter post. “In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content. That’s the message they’re sending to governments, competing services, China, you.”
Edward Snowden, who famously exposed the National Security Agency's mass surveillance program, chimed in with similar arguments.
“No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this,” he tweeted. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs -- *without asking.*”
Cathcart echoed some of those concerns Friday.
“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” he wrote.