News

Apple employees are now joining the chorus of voices raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM (child sexual abuse material), reportedly speaking out ...
Apple has responded to misconceptions and concerns about its photo scanning announcements by publishing a CSAM FAQ – answering frequently asked questions about the features. While child safety ...
Multiple Apple employees have expressed concerns about the new CSAM scanning system in an internal Slack channel.
Apple Addresses CSAM Detection Concerns, Will Consider Expanding System on Per-Country Basis (August 6, 2021, by Joe Rossignol)
Apple details reasons to abandon CSAM-scanning tool, more controversy ensues: safety groups remain concerned about child sexual abuse material scanning and user reporting.
Apple intends to launch CSAM detection across all iPhones and iPads running iOS 15, but the report states that it is simple for images to evade detection and that the system could “raise strong privacy concerns” for users.
Apple initially came under pressure and criticism after announcing plans to scan iOS users’ iCloud Photo libraries for child sexual abuse images.
Apple has been accused of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity, made the accusation.
A new report claims that Apple's App Store allows children to access adult-only applications and that its existing CSAM policies are not effective.
Child safety group Heat Initiative has urged Apple to revive its anti-CSAM iCloud tool, which scanned for reported child abuse images.
A child protection charity claims Apple is behind many of its peers "in tackling child sexual abuse," accusing it of underreporting CSAM cases.