
Apple has been scanning iCloud emails for Child Sexual Abuse
Material (CSAM) for roughly two years, according to a report this week in 9to5Mac.
“Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019,” writes Ben Lovejoy. “It has not, however, been scanning iCloud Photos or iCloud backups.”
The revelation follows an announcement by Apple that it will start scanning all photos as they are uploaded to iCloud Photos and matching them against the National Center for Missing & Exploited Children’s database of child sexual abuse material. Apple also said it would start alerting parents who opt in if their under-13 child sends or receives sexually explicit messages.
The Electronic Frontier Foundation charged that Apple is building a backdoor into its data storage and messaging systems.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” the EFF wrote. “That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
Lovejoy says Apple confirmed to him “that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019.”
“Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task,” Lovejoy
observes.
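To see why that would be trivial: with mail stored in the clear, checking an attachment against a list of known hashes amounts to computing one digest and doing a set lookup. The sketch below is purely illustrative and is not Apple’s implementation; real CSAM-detection systems rely on perceptual image hashes rather than the exact cryptographic hashing shown here, and KNOWN_HASHES and flag_matching_attachments are hypothetical names.

```python
# Illustrative only: a hypothetical server-side check, not Apple's system.
# Real deployments match perceptual image hashes (PhotoDNA-style),
# not exact cryptographic digests as shown here.
import email
import hashlib
from email import policy

# Hypothetical set of known-bad attachment hashes (hex-encoded SHA-256).
KNOWN_HASHES = {
    "0" * 64,  # placeholder entry for illustration
}

def flag_matching_attachments(raw_message: bytes) -> list[str]:
    """Return filenames of attachments whose digest appears in KNOWN_HASHES."""
    msg = email.message_from_bytes(raw_message, policy=policy.default)
    flagged = []
    for part in msg.iter_attachments():
        payload = part.get_payload(decode=True)  # decoded attachment bytes
        if payload is None:
            continue
        if hashlib.sha256(payload).hexdigest() in KNOWN_HASHES:
            flagged.append(part.get_filename() or "<unnamed attachment>")
    return flagged
```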
Lovejoy says the “controversy over Apple’s CSAM plans continues, with two Princeton academics stating that they prototyped a scanning system based on exactly the
same approach as Apple, but abandoned the work due to the risk of governmental misuse.”