The controversy over Apple’s plan to protect kids by scanning your iPhone


Apple, the company that proudly touted its user privacy bona fides in its recent iOS 15 preview, has introduced a feature that seems to run counter to its privacy-first ethos: the ability to scan iPhone photos and alert the authorities if any of them contain child sexual abuse material (CSAM). While fighting child sexual abuse is objectively a good thing, privacy experts aren’t thrilled about how Apple is choosing to do it.

Apple’s new “expanded protections for children” might not be as bad as it seems if the company keeps its promises. But it’s also yet another reminder that we don’t own our data or devices, even the ones we physically possess. You can buy an iPhone for a considerable sum, take a photo with it, and put it in your pocket. And then Apple can figuratively reach into that pocket and into that iPhone to make sure your photo is legal.

Last week, Apple announced that the new technology to scan photos for CSAM will be installed on users’ devices with the upcoming iOS 15 and macOS Monterey updates. Scanning images for CSAM isn’t a new thing — Facebook and Google have been scanning images uploaded to their platforms for years — and Apple is already able to access photos uploaded to iCloud accounts. Scanning photos uploaded to iCloud in order to spot CSAM would make sense and be consistent with Apple’s competitors.

But Apple is doing something a bit different, something that feels more invasive even though Apple says it’s meant to be less so. The image scans will take place on the devices themselves, not on the servers to which you upload your photos. Apple also says it will use new tools in the Messages app that scan photos sent to or from children for sexual imagery, with an option to tell the parents of children ages 12 and under if they view those images. Parents can opt in to those features, and all the scanning happens on the devices.

In effect, a company that took not one but two widely publicized stances against the FBI’s demands that it create a back door into suspected terrorists’ phones has seemingly created a back door. It’s not immediately clear why Apple is making this move this way at this time, but it could have something to do with pending laws abroad and potential ones in the US. Currently, companies can be fined up to $300,000 if they find CSAM but do not report it to authorities, though they’re not required to look for CSAM.

Following backlash after its initial announcement of the new features, Apple on Sunday released an FAQ with a few clarifying details about how its on-device scanning tech works. Basically, Apple will download a database of hashes of known CSAM images from the National Center for Missing and Exploited Children (NCMEC) to all of its devices. The known CSAM has been converted into those strings of numbers, so the images themselves aren’t being downloaded onto your device. Apple’s technology scans photos in your iCloud Photo library and compares their hashes to the database. If it finds a certain number of matches (Apple has not specified what that number is), a human will review the matches and then report them to NCMEC, which will take it from there. The tool isn’t analyzing the photos to look for signs that they might contain CSAM, as the Messages tool appears to do; it’s only looking for matches to known CSAM.
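To make the basic idea concrete, here is a minimal sketch of threshold-based hash matching. It is an illustration only: Apple’s actual system uses a perceptual “NeuralHash” plus cryptographic protocols that keep results hidden until the threshold is crossed, none of which is reproduced here. The hash function (a plain SHA-256 of the file bytes), the example database, and the threshold value below are all hypothetical stand-ins.

```python
# Illustrative sketch only: a toy version of threshold-based hash matching.
# Not Apple's NeuralHash or its private set intersection protocol; the hash
# function, database contents, and threshold are hypothetical.

import hashlib
from pathlib import Path

# Hypothetical database of hashes of known images (strings of numbers,
# not the images themselves), analogous to the NCMEC-derived list.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Hypothetical threshold: nothing is flagged until this many matches accumulate.
MATCH_THRESHOLD = 30

def image_hash(path: Path) -> str:
    """Toy stand-in for a perceptual hash: here, just SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_paths: list[Path]) -> int:
    """Count how many photos hash to an entry in the known-hash database."""
    return sum(1 for p in photo_paths if image_hash(p) in KNOWN_HASHES)

def should_escalate(photo_paths: list[Path]) -> bool:
    """Escalate for human review only once the match count crosses the threshold."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

The point of the sketch is the shape of the system: photos are reduced to numbers, the numbers are compared against a fixed list, and only an above-threshold count of exact matches triggers any human involvement.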

“A thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor”

Additionally, Apple says that only photos you choose to upload to iCloud Photos are scanned. If you disable iCloud Photos, then your pictures won’t be scanned. Back in 2018, CNBC reported that there were roughly 850 million iCloud users, with 170 million of them paying for extra storage capacity (Apple offers all iPhone users 5GB of cloud storage for free). So a lot of people could be affected here.

Apple says this method has “significant privacy benefits” over simply scanning photos after they’ve been uploaded to iCloud. Nothing leaves the device or is seen by Apple unless there’s a match. Apple also maintains that it will only use a CSAM database and refuse any government requests to add any other types of content to it.

But privacy advocates think the new feature will open the door to abuses. Now that Apple has established that it can do this for some images, it’s almost certainly going to be asked to do it for other ones. The Electronic Frontier Foundation easily sees a future where governments pressure Apple to scan user devices for content that their countries outlaw, both in on-device iCloud photo libraries and in users’ messages.

“That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change,” the EFF said. “At the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”

The Center for Democracy and Technology said in a statement to Recode that Apple’s new tools were deeply concerning and represented an alarming change from the company’s previous privacy stance. It hoped Apple would reconsider the decision.

“Apple will no longer be offering fully end-to-end encrypted messaging through iMessage and will be undermining the privacy previously offered for the storage of iPhone users’ photos,” CDT said.

Will Cathcart, head of Facebook’s encrypted messaging service WhatsApp, blasted Apple’s new measures in a Twitter thread.

(Facebook and Apple have been at odds since Apple introduced its anti-tracking feature to its mobile operating system, which Apple framed as a way to protect its users’ privacy from companies that track their activity across apps, particularly Facebook. So you can imagine that a Facebook executive was quite happy for a chance to weigh in on Apple’s own privacy issues.)

And Edward Snowden expressed his thoughts in meme form.

Some experts think Apple’s move could be a good one — or at least, not as bad as it’s been made to seem. John Gruber wondered if this could give Apple a way to fully encrypt iCloud backups, shielding them from government surveillance, while also being able to say it is monitoring its users’ content for CSAM.

“If these features work as described and only as described, there’s almost no cause for concern,” Gruber wrote, acknowledging that there are still “completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future.”

Ben Thompson of Stratechery pointed out that this could be Apple’s way of getting out ahead of potential laws in Europe requiring internet service providers to look for CSAM on their platforms. In the US, lawmakers have tried to pass legislation that would supposedly require internet services to monitor their platforms for CSAM or else lose their Section 230 protections. It’s not inconceivable that they’ll reintroduce that bill or something similar this Congress.

Or maybe Apple’s motives are simpler. Two years ago, the New York Times criticized Apple, along with several other tech companies, for not doing as much as they could to scan their services for CSAM and for implementing measures, such as encryption, that made such scans impossible and CSAM harder to detect. The internet was now “overrun” with CSAM, the Times said.

Apple was okay with being accused of protecting dead terrorists’ data, but perhaps being seen as enabling child sexual abuse was a bridge too far.


Source: vox.com
