Child safety watchdog accuses Apple of hiding real CSAM figures

A child protection organization says it has found more cases of abuse images on Apple platforms in the UK than Apple has reported globally.

[Image: smartphone displaying a sensitive content warning, resting on a white keyboard]
Apple cancelled its major CSAM proposals but introduced features such as automatic blocking of nudity sent to children

In 2022, Apple abandoned its plans for Child Sexual Abuse Material (CSAM) detection, after critics argued the system could ultimately be repurposed for surveillance of all users. The company switched instead to a set of features it calls Communication Safety, which blurs nude photos sent to children.

According to The Guardian newspaper, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) says Apple is vastly undercounting incidents of CSAM on services such as iCloud, FaceTime, and iMessage. All US technology firms are required to report detected cases of CSAM to the National Center for Missing & Exploited Children (NCMEC), yet in 2023 Apple made just 267 reports.
