
WhatsApp: Scanning iPhones for Child Sexual Abuse Images Is a Privacy Risk

An Apple effort to use iPhones to detect child sexual abuse imagery risks paving the way for widescale surveillance, according to rival WhatsApp. 

“I think this is the wrong approach and a setback for people’s privacy all over the world,” Will Cathcart, the head of the Facebook-owned WhatsApp, tweeted on Friday. 

For years now, companies including Facebook have been using algorithms to scan, detect, and remove child porn from social media, video sites, and cloud storage platforms. However, Apple said this week that it would use “on-device” processing on the iPhone itself to detect and flag child sexual abuse material (CSAM) as the files are uploaded to an iCloud account.
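To illustrate the general idea, here is a minimal Swift sketch of checking an image's fingerprint against a database of known material before upload. This is not Apple's actual implementation: Apple matches a perceptual "NeuralHash" using cryptographic private set intersection rather than a plain lookup, and the loadKnownHashes helper below is a hypothetical stand-in for the vetted database.

    import Foundation
    import CryptoKit

    // Hypothetical placeholder: a real system would ship a vetted set of
    // fingerprints supplied by child-safety organizations.
    func loadKnownHashes() -> Set<String> {
        return []
    }

    let knownBadHashes = loadKnownHashes()

    // Fingerprint an image by hashing its raw bytes into a hex string.
    // (A plain SHA-256 digest stands in here for a perceptual hash.)
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Called as a photo is queued for upload: the image is flagged only
    // if its fingerprint matches the database; its contents are never
    // otherwise inspected on the device.
    func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
        knownBadHashes.contains(fingerprint(of: imageData))
    }

The key design point is that matching runs against fingerprints of already-known images; the system does not attempt to judge previously unseen photos.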

The on-device processing prompted Cathcart to speak out against Apple’s upcoming system, which arrives in iOS 15.

“Instead of focusing on making it easy for people to report [CSAM] that’s shared with them, Apple has built software that can scan all the private photos on your phone—even photos you haven’t shared with anyone. That’s not privacy,” he tweeted. 

Still, it’s important to note that Apple’s system only targets images uploaded to iCloud Photos, at least for now. A related anti-child abuse function from Cupertino will also use on-device algorithms to identify and blur out sexual imagery on iMessages—but this will only apply to iPhones belonging to children.   

Nevertheless, Cathcart says the feature sets the stage for more surveillance on people’s personal hardware. “We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops, or phones globally for unlawful content. It’s not how technology built in free countries works,” he wrote in a follow-up tweet. 

Cathcart then reiterated the concerns other security researchers and privacy groups have with Apple’s approach: that the same scanning technologies can be expanded to scan for other kinds of content, giving law enforcement and governments a way to search inside people’s personal devices. “Will this system be used in China? What content will they consider illegal there and how will we ever know?” he asked. 

Cathcart added that WhatsApp won’t be implementing a similar system to stop CSAM due to the privacy risks. Instead, the messaging service has been focused on making it easy for users to flag illegal content through reporting functions within the app.

“We reported more than 400,000 cases to NCMEC [the National Center for Missing and Exploited Children] last year from WhatsApp, all without breaking encryption,” he added.

Apple did not immediately respond to a request for comment. But it’s no secret the company has had a testy relationship with Facebook. Most recently, Apple added new privacy controls to iPhones capable of undermining Facebook’s ability to serve targeted ads to users. 

In its defense, Apple has said its upcoming system will help law enforcement stop CSAM online while maintaining an iPhone owner’s privacy.

“The system is very accurate, with an extremely low error rate of less than one in one trillion accounts per year,” the company wrote in a support document, which adds: “The system is significantly more privacy-preserving than cloud-based scanning, as it only reports users who have a collection of known CSAM stored in iCloud Photos.”
