News

Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, reportedly speaking out ...
Apple has responded to misconceptions and concerns about its photo scanning announcements by publishing a CSAM FAQ answering frequently asked questions about the features. While child safety ...
Apple has published an FAQ titled "Expanded Protections for Children," which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages ...
Multiple Apple employees have expressed concerns about the new CSAM scanning system in an internal Slack channel.
Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM). The proposed class action comes after ...
Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.
Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) with the iOS 15.2 release. Despite its imperfections, and Apple's silence on the matter, the technology appears inevitable.
Apple details reasons to abandon CSAM-scanning tool, more controversy ensues. Safety groups remain concerned about child sexual abuse material scanning and user reporting.
Apple intends to launch CSAM detection across all iPhones and iPads running iOS 15, but the report states that it is simple for images to evade detection and that the system could "raise strong privacy concerns" for users.
When Apple announced changes it planned to make to iOS devices in an effort to help curb child abuse by finding child sexual abuse material (CSAM), parts of its plan generated backlash.
CSAM scanning feature: Apple initially came under pressure and criticism after announcing plans to scan iOS users' iCloud Photo libraries for child sexual abuse images.