Trigger warning: child sexual abuse

We demand that Apple make its products safer for kids:

Let kids report abuse: Create easily accessible reporting in iMessage for kids and parents to report inappropriate images and harmful situations.

Require safer apps: Monitor the App Store to ensure exploitative and dangerous apps are unavailable to children and that only age-appropriate apps are advertised to kids.

Stop the spread of child sexual abuse: Stop the storage and spread of known child sexual abuse images and videos in iCloud.

Join us in holding Apple accountable.

Read the Case Studies

Sensitive content warning: child sexual abuse

Read the coalition letter to Apple.

Child safety organizations, advocates, and those with lived experience support these baseline recommendations for Apple to take action in better safeguarding children.

Apple falls short in detecting child sexual abuse material.

National polls show that Americans overwhelmingly support technology companies adopting new policies and features to remove and report child sexual abuse material from their platforms.

Australia's eSafety Commissioner's Basic Online Safety Expectations report shows Apple falling behind.

Apple reported 234 pieces of child sexual abuse material in 2022, while four other tech companies reported 29 million.

The Heat Initiative commissioned a review of publicly available cases in which child sexual abuse images and videos were found on Apple products.

Join us

Provide your email address to stay informed and help us hold Apple accountable!

All images are AI-generated and do not depict real people or real images in iCloud.
