Sensitive content warning: child sexual abuse
646 Child Abuse Images Stored on 28-year-old's iCloud Account
On November 30, 2020, a 28-year-old British man was arrested after a search of his home uncovered 646 child abuse images on his phone and iCloud account, including images showing children in distress and pain. The man ultimately pleaded guilty to three counts of making an indecent photo, or pseudo-photo, of a child and one charge of possessing extreme pornography.
2,392 Abuse Images and Videos of a 5-year-old Found on iPhone
On October 17, 2019, a 32-year-old man was arrested and charged with 13 counts of sexual exploitation of a minor (possession, recording, and distribution). Detectives found numerous explicit images of young children on his cellphone, according to a probable cause statement submitted by police. After serving a search warrant on Apple, police found multiple messages in his iCloud data discussing a 5-year-old boy in a sexual context. Investigators also collected 2,392 images and videos of child sexual exploitation from the man's iCloud backups.
Child Sexual Abuse Videos Documented over Several Years Found on iPad and iCloud
On April 11, 2023, a 53-year-old man was sentenced to over 22 years in federal prison for production of child pornography. He was indicted on two counts of production of child pornography and one count of possession of child pornography after his fiancée found an iPad containing naked images of children, including of her seven-year-old daughter. After obtaining a search warrant, law enforcement searched the iPad and its associated iCloud account, where they found explicit images and videos of that same child, filmed by the defendant starting when she was four years old.
Child safety organizations, advocates, and people with lived experience support these baseline recommendations for Apple to better safeguard children.
The Heat Initiative conducted a national public opinion poll to understand the public’s interest in more proactive measures to stop the spread of child sexual abuse images and videos on tech platforms. Bellwether Research conducted a representative survey of 2,041 adults (18+), online, May 7-11, 2023. The full sample was balanced to approximate a target sample of adults in the United States based on the Census (CPS 2020).
We found that Americans overwhelmingly support technology companies adopting new policies and features to remove and report child sexual abuse material from their platforms:
On August 29, 2022, the eSafety Commissioner of Australia issued notices to seven online service providers, compelling them to report on their adherence to the Basic Online Safety Expectations concerning child sexual exploitation and abuse. The information gathered from these notices offers valuable insights that were not previously available through voluntary initiatives. The primary goal of this report is to enhance transparency and accountability among providers by shedding light on their actions, or lack thereof, in safeguarding children on their platforms.
The report highlights that Apple falls behind its peers in three fundamental online safety procedures.
NCMEC’s CyberTipline is the United States’ centralized reporting system for instances of online child exploitation, including child sexual abuse material, child sex trafficking, and online enticement. It receives reports both from the public and from online electronic service providers (ESPs). In 2022, the CyberTipline received an astonishing 32 million reports, with nearly 90% originating from only four electronic service providers: Facebook, Instagram, WhatsApp, and Google. These tech giants reported and took down more than 29 million images and videos depicting child sexual abuse, showing the significant impact that can be made when companies commit to protecting children on their platforms. In contrast, Apple’s contribution consisted of just 234 reports, highlighting the need for all tech companies to actively engage in ensuring the safety of children online. Read more about individual company performance in their annual CyberTipline Report.
The Heat Initiative commissioned a review of 94 publicly available cases in which child sexual abuse images and videos were found on Apple products. The majority of cases involved iCloud and iPhone and included abuse images of children ranging from infants and toddlers to teens. Review the research in this report.