Apple Delays iPhone Scanning Plan

Faced with a backlash from civil rights groups and privacy advocates, Apple has decided to delay a controversial plan to scan users' devices for child sexual abuse images.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement provided to multiple news outlets. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

The original plan, unveiled last month, involved comparing images users attempt to upload to iCloud against a database of known photos depicting the sexual abuse of children. To accomplish this, Apple intended to download hashed digital fingerprints of the database photos to users' devices, then scan for matches among photos queued for upload to iCloud.

If the software found 30 matches, Apple would have manually reviewed the photos and, if it determined they were illegal, notified the National Center for Missing and Exploited Children, which works with the police.
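As a rough sketch of how such threshold matching works, the hypothetical Python below compares image fingerprints against an on-device set of known hashes and flags an account only after 30 matches. This is not Apple's actual implementation: the announced system paired a perceptual hash (NeuralHash) with cryptographic private set intersection and threshold secret sharing, so sub-threshold matches stayed unreadable even to Apple. The function names and the use of SHA-256 here are illustrative assumptions.

```python
import hashlib
from pathlib import Path

# Threshold taken from the reported plan: 30 matches before
# any human review would be triggered.
MATCH_THRESHOLD = 30

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. Apple's system used a perceptual hash
    (NeuralHash), which tolerates resizing and re-encoding; a plain
    cryptographic hash like SHA-256 does not, and is used here only
    to keep the sketch self-contained."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(upload_paths: list[Path], known_hashes: set[str]) -> int:
    """Count how many images queued for upload match the on-device
    database of known fingerprints."""
    matches = 0
    for path in upload_paths:
        if fingerprint(path.read_bytes()) in known_hashes:
            matches += 1
    return matches

def should_flag_for_review(upload_paths: list[Path], known_hashes: set[str]) -> bool:
    # Under the reported design, nothing would be surfaced until the
    # threshold was crossed; in the real system this gate was enforced
    # cryptographically, not by a simple counter as sketched here.
    return count_matches(upload_paths, known_hashes) >= MATCH_THRESHOLD
```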

Another prong of Apple's plan involved scanning children's iMessage accounts for nude photos, but parents could opt out of those scans.

News of Apple's scanning technology drew immediate protests from a wide range of security experts and civil rights groups.

“Once this capability is built into Apple products, the company and its competitors will face enormous pressure -- and potentially legal requirements -- from governments around the world to scan photos ... for other images a government finds objectionable,” the ACLU, Center for Democracy & Technology, PEN America, Electronic Frontier Foundation and dozens of other groups said in a letter sent to Apple CEO Tim Cook last month.

“Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them,” the groups continued. “And that pressure could extend to all images stored on the device, not just those uploaded to iCloud.”

It's not yet clear what revisions Apple is contemplating.

The advocacy group Fight for the Future said Friday it still plans to press the company to permanently abandon its plan for on-device scans.