Apple’s controversial plan to try to curb child sexual abuse imagery

When Apple announced changes it plans to make to iOS devices in an effort to help curb child abuse by finding child sexual abuse material (CSAM), parts of its plan generated backlash.

First, it’s rolling out an update to its Search app and Siri voice assistant on iOS 15, watchOS 8, iPadOS 15, and macOS Monterey. When a user searches for topics related to child sexual abuse, Apple will redirect them to resources for reporting CSAM or for getting help with an attraction to such content.

But it’s Apple’s two other CSAM plans that have garnered criticism. One update will add a parental control option to Messages, sending an alert to parents if a child age 12 or younger views or sends sexually explicit pictures, and obscuring the images for any users under 18.

The one that’s proven most controversial is Apple’s plan to scan on-device images for CSAM before they are uploaded to iCloud, reporting potential matches to Apple’s moderators, who can then turn the images over to the National Center for Missing and Exploited Children (NCMEC). While Apple says the feature will protect users while still allowing the company to find illegal content, many Apple critics and privacy advocates say the provision amounts to a security backdoor, an apparent contradiction of Apple’s long-professed commitment to user privacy.

To stay up to speed on the latest news about Apple’s CSAM protection plans, follow our storystream, which we’ll update whenever there’s a new development. If you need a starting point, check out our explainer here.

  • Richard Lawler

    Dec 7, 2022

    Apple drops controversial plans for child sexual abuse imagery scanning

    Illustration by Alex Castro / The Verge

    Apple has ended the development of technology intended to detect possible child sexual abuse material (CSAM) while it’s stored on user devices, according to The Wall Street Journal.

    That plan was announced in August 2021, with an intended rollout as part of iOS 15, but backlash quickly followed as encryption and consumer privacy experts warned about the danger of creating surveillance systems that work directly from your phone, laptop, or tablet.

    Read Article >
  • Russell Brandom

    Aug 18, 2021

    Apple says collision in child-abuse hashing system is not a concern

    Illustration by Alex Castro / The Verge

    Researchers have produced a collision in iOS’s built-in hash function, raising new concerns about Apple’s CSAM-scanning system — but Apple says the finding does not threaten the integrity of the system.

    The flaw affects the hashing algorithm, called NeuralHash, which allows Apple to check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures.
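
    To make the collision concern concrete, here is a minimal sketch of perceptual-hash matching, with a toy “average hash” standing in for NeuralHash (whose internals Apple has not published). The function names and the max_distance parameter are illustrative assumptions, not Apple’s API.

```python
# Illustrative stand-in for a perceptual hash: NeuralHash is a proprietary
# neural-network hash, so this toy "average hash" only shows the general idea
# of matching an image's hash against a set of known hashes.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_database(image_hash, known_hashes, max_distance=0):
    """Exact-match lookup (distance 0). A 'collision' is two different images
    that nonetheless produce the same hash, so a benign image could match."""
    return any(hamming(image_hash, h) <= max_distance for h in known_hashes)

# Toy demo: a synthetic 8x8 "image" whose hash is already in the database.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
known = {average_hash(img)}
print(matches_known_database(average_hash(img), known))  # True
```

    In a threshold-based design like the one Apple describes, a single spurious match of this kind would not, on its own, trigger a report.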

    Read Article >
  • Adi Robertson

    Aug 13, 2021

    Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears

    Illustration by Alex Castro / The Verge

    Apple has filled in more details around its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system.

    Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn sharp criticism from some cryptography and privacy experts.
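
    The safeguard described above lends itself to a short illustration. In the sketch below, the database names, jurisdictions, and hashes are hypothetical; it only captures the stated rule that a hash must appear in lists from more than one jurisdiction before it counts.

```python
# Hypothetical sketch of the multi-database safeguard described above. The
# database names, jurisdictions, and hash values are made up for illustration;
# Apple has not published how it represents these lists internally.

KNOWN_HASH_DATABASES = {
    "child_safety_org_us": {0xA1B2, 0xC3D4, 0x9999},
    "child_safety_org_uk": {0xA1B2, 0x1234},
    "child_safety_org_de": {0xC3D4, 0x5678},
}

JURISDICTION = {
    "child_safety_org_us": "US",
    "child_safety_org_uk": "UK",
    "child_safety_org_de": "DE",
}

def flaggable(image_hash, min_jurisdictions=2):
    """Treat a hash as a CSAM match only if it appears in databases supplied
    from multiple jurisdictions, so no single government can unilaterally
    slip non-CSAM content into the matching set."""
    jurisdictions = {
        JURISDICTION[name]
        for name, hashes in KNOWN_HASH_DATABASES.items()
        if image_hash in hashes
    }
    return len(jurisdictions) >= min_jurisdictions

print(flaggable(0xA1B2))  # True: present in both the US and UK lists
print(flaggable(0x9999))  # False: only one jurisdiction's list contains it
```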

    Read Article >
  • Adi Robertson

    Aug 10, 2021

    Apple’s controversial new child protection features, explained

    Illustration by Alex Castro / The Verge

    Apple stakes its reputation on privacy. The company has promoted encrypted messaging across its ecosystem, encouraged limits on how mobile apps can gather data, and fought law enforcement agencies looking for user records. For the past week, though, Apple has been fighting accusations that its upcoming iOS and iPadOS release will weaken user privacy.

    The debate stems from an announcement Apple made on Thursday. In theory, the idea is pretty simple: Apple wants to fight child sexual abuse, and it’s taking more steps to find and stop it. But critics say Apple’s strategy could weaken users’ control over their own phones, leaving them reliant on Apple’s promise that it won’t abuse its power. And Apple’s response has highlighted just how complicated — and sometimes downright confounding — the conversation really is.

    Read Article >
  • Nilay Patel

    Aug 10, 2021

    Here’s why Apple’s new child safety features are so controversial

    Illustration by Alex Castro / The Verge

    Last week, Apple, without very much warning at all, announced a new set of tools built into the iPhone designed to protect children from abuse. Siri will now offer resources to people who ask for child abuse material or who ask how to report it. iMessage will now flag nudes sent or received by kids under 13 and alert their parents. Images backed up to iCloud Photos will now be matched against a database of known child sexual abuse material (CSAM) and reported to the National Center for Missing and Exploited Children (NCMEC) if more than a certain number of images match. And that matching process doesn’t just happen in the cloud — part of it happens locally on your phone. That’s a big change from how things normally work.

    Apple claims it designed what it says is a much more private process that involves scanning images on your phone. And that is a very big line to cross — basically, the iPhone’s operating system now has the capability to look at your photos and match them up against a database of illegal content, and you cannot remove that capability. And while we might all agree that adding this capability is justifiable in the face of child abuse, there are huge questions about what happens when governments around the world, from the UK to China, ask Apple to match up other kinds of images — terrorist content, images of protests, pictures of dictators looking silly. These kinds of demands are routinely made around the world. And until now, no part of that happened on your phone in your pocket. 

    Read Article >
  • Jon Porter

    Aug 9, 2021

    Apple pushes back against child abuse scanning concerns in new FAQ

    Illustration by Alex Castro / The Verge

    In a new FAQ, Apple has attempted to assuage concerns that its new anti-child abuse measures could be turned into surveillance tools by authoritarian governments. “Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it,” the company writes. 

    Apple’s new tools, announced last Thursday, include two features designed to protect children. One, called “communication safety,” uses on-device machine learning to identify and blur sexually explicit images received by children in the Messages app, and can notify a parent if a child age 12 or younger decides to view or send such an image. The second is designed to detect known CSAM by scanning users’ images if they choose to upload them to iCloud. Apple is notified if CSAM is detected, and it will alert the authorities once it verifies such material exists.
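
    As a rough sketch of how that first feature might be wired together on-device, consider the following; the confidence threshold, field names, and action labels are assumptions made for illustration, not details Apple has published.

```python
# Hypothetical sketch of the "communication safety" flow. Apple has not
# published its classifier or exact rules, so the threshold, field names,
# and action labels below are illustrative assumptions.

from dataclasses import dataclass

EXPLICIT_THRESHOLD = 0.9  # assumed confidence cutoff for the on-device model

@dataclass
class IncomingImage:
    explicit_score: float          # output of an on-device ML classifier
    child_age: int                 # age of the child account receiving it
    parent_notifications_enabled: bool

def handle_incoming_image(img: IncomingImage) -> list[str]:
    """Blur suspected explicit images for child accounts and, for children
    aged 12 or younger with notifications enabled, warn that a parent will
    be notified if the image is viewed anyway."""
    if img.explicit_score < EXPLICIT_THRESHOLD:
        return ["show"]
    actions = ["blur", "warn_child"]
    if img.child_age <= 12 and img.parent_notifications_enabled:
        actions.append("notify_parent_if_viewed")
    return actions

print(handle_incoming_image(IncomingImage(0.97, 11, True)))
# ['blur', 'warn_child', 'notify_parent_if_viewed']
```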

    Read Article >
  • Mitchell Clark

    Aug 7, 2021

    WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan

    Illustration by Alex Castro / The Verge

    The chorus of voices expressing concern and dismay over Apple’s new Child Safety measures grew louder over the weekend, as an open letter with more than 4,000 signatures made the rounds online. The Apple Privacy Letter asked the iPhone maker to “reconsider its technology rollout,” lest it undo “decades of work by technologists, academics and policy advocates” on privacy-preserving measures.

    Apple’s plan, which it announced on Thursday, involves taking hashes of images uploaded to iCloud and comparing them to a database that contains hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the analysis on-device while still allowing it to report users to the authorities if they’re found to be sharing child abuse imagery. Another prong of Apple’s Child Safety strategy involves optionally warning parents if a child under 13 years old sends or views photos containing sexually explicit content. An internal memo at Apple acknowledged that people would be “worried about the implications” of the systems.

    Read Article >
  • Apple reveals new efforts to fight child abuse imagery

    Illustration by Alex Castro / The Verge

    In a briefing on Thursday afternoon, Apple confirmed previously reported plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery, but clarified crucial details from the ongoing project. For devices in the US, new versions of iOS and iPadOS rolling out this fall have “new applications of cryptography to help limit the spread of CSAM [child sexual abuse material] online, while designing for user privacy.”

    The project is also detailed in a new “Child Safety” page on Apple’s website. The most invasive and potentially controversial implementation is the system that performs on-device scanning before an image is backed up to iCloud. From the description, scanning does not occur until a file is being backed up to iCloud, and Apple only receives data about a match if the cryptographic vouchers (uploaded to iCloud along with the image) for a particular account meet a threshold of matching known CSAM.
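
    The threshold mechanic is the interesting part: Apple should learn nothing about an account until enough of its vouchers correspond to known CSAM. One standard way to get that property is threshold secret sharing, sketched below; this is a conceptual stand-in rather than Apple’s implementation.

```python
# Conceptual sketch of the threshold property using bare-bones Shamir secret
# sharing over a prime field. This is not Apple's implementation: the real
# system layers additional cryptography (such as private set intersection)
# that is omitted here.

import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for a toy demo

def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them reconstruct it,
    while fewer reveal essentially nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

account_secret = 123456789          # stands in for per-account decryption material
shares = make_shares(account_secret, threshold=3, n=10)  # one share per matching voucher
print(reconstruct(shares[:2]) == account_secret)  # almost surely False: below threshold
print(reconstruct(shares[:3]) == account_secret)  # True: threshold reached
```

    Below the threshold, the collected shares reveal essentially nothing, which mirrors Apple’s claim that it only receives data about an account once the voucher threshold is met.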

    Read Article >
  • Jay Peters

    Aug 5, 2021

    Apple will scan photos stored on iPhones and iCloud for child abuse imagery

    The iPhone 12, in blue.
    Photo by Vjeran Pavic / The Verge

    Update August 5th, 3:21PM ET: Apple has announced more about what the Financial Times reported and revealed new tools coming to iMessage that warn children about sexually explicit photos. The new features will be coming later this year as updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. You can read more about them on Apple’s website. Our original article follows.

    Apple plans to scan photos stored on iPhones and iCloud for child abuse imagery, according to the Financial Times. The new system could help law enforcement in criminal investigations but may open the door to increased legal and government demands for user data.

    Read Article >