Child sexual abuse material is stored on iCloud. Apple allows it.

Apple’s landmark 2021 announcement that it would detect child sexual abuse images and videos in iCloud was quietly rolled back, affecting the lives of children worldwide. With every day that passes, children suffer because of this inaction, which is why we are calling on Apple to deliver on its commitment.

We are calling on Apple to:

Create a robust reporting mechanism for users to report child sexual abuse images and videos to Apple.

Detect, report, and remove child sexual abuse images and videos from iCloud.

Join us in holding Apple accountable.

Read the Case Studies

Sensitive content warning: child sexual abuse

Read the coalition letter to Apple.

Child safety organizations, advocates, and people with lived experience support these baseline recommendations for Apple to better safeguard children.

Apple falls short in detecting child sexual abuse material.


National polls show that Americans overwhelmingly support technology companies adopting new policies and features to remove child sexual abuse material from their platforms and report it.


The Australian eSafety Commissioner’s Basic Online Safety Expectations report shows Apple falling behind.


Apple reported 234 pieces of child sexual abuse material in 2022, while four other tech companies reported 29 million.


The Heat Initiative commissioned a review of publicly available cases in which child sexual abuse images and videos were found on Apple products.

Join us

Provide your email address to stay informed and help us hold Apple accountable!