Apple To Refuse Government Demands To Expand Scanning Beyond Child Abuse
The child sexual abuse material (CSAM) detection system will have devices running iOS 15, iPadOS 15, watchOS 8, and macOS Monterey match images on the device against a list of known CSAM image hashes provided by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations before an image is stored in iCloud. If a hash match is made, metadata that Apple calls “safety vouchers” will be uploaded along with the image. Once an unspecified threshold of matches is reached, Apple will manually inspect the metadata and, if it determines the content is CSAM, disable the account and send a report to NCMEC....
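Apple has not published the internals of this pipeline (the actual system uses a proprietary perceptual hash and cryptographic threshold techniques), but the flow described above can be sketched roughly as follows. This is a minimal illustration only; every name in it (ImageHash, SafetyVoucher, AccountReviewQueue, voucherForUpload, the example digests and threshold) is a hypothetical stand-in, not Apple's API.

    import Foundation

    // Illustrative sketch of the described flow, not Apple's implementation.

    /// Stand-in for the opaque perceptual hash derived on-device.
    struct ImageHash: Hashable {
        let digest: String
    }

    /// The "safety voucher" metadata uploaded alongside a matched image.
    struct SafetyVoucher {
        let imageID: UUID
        let matchedHash: ImageHash
    }

    /// Hashes of known CSAM provided by NCMEC and other child-safety organisations.
    let knownCSAMHashes: Set<ImageHash> = [
        ImageHash(digest: "example-digest-1"),
        ImageHash(digest: "example-digest-2"),
    ]

    /// On-device step: before an image is stored in iCloud, match its hash
    /// against the known list and, on a match, produce a safety voucher.
    func voucherForUpload(imageID: UUID, hash: ImageHash) -> SafetyVoucher? {
        guard knownCSAMHashes.contains(hash) else { return nil }
        return SafetyVoucher(imageID: imageID, matchedHash: hash)
    }

    /// Server-side step: accumulate vouchers per account and flag the account
    /// for manual review only once the (unspecified) threshold is crossed.
    final class AccountReviewQueue {
        private var vouchersByAccount: [String: [SafetyVoucher]] = [:]
        private let threshold: Int

        init(threshold: Int) {
            self.threshold = threshold
        }

        /// Returns true when the account should be escalated to human review
        /// (and, if the reviewer confirms CSAM, disabled and reported to NCMEC).
        func record(_ voucher: SafetyVoucher, forAccount account: String) -> Bool {
            vouchersByAccount[account, default: []].append(voucher)
            return vouchersByAccount[account]!.count >= threshold
        }
    }

    // Example: a single matched image does not trigger review on its own.
    let queue = AccountReviewQueue(threshold: 3)
    if let voucher = voucherForUpload(imageID: UUID(),
                                      hash: ImageHash(digest: "example-digest-1")) {
        let needsReview = queue.record(voucher, forAccount: "user@example.com")
        print("Escalate for manual review:", needsReview) // false until the threshold is reached
    }

The key design point the article describes is that matching happens on the device before upload, while the threshold check and human review happen on Apple's side, so no single match by itself exposes an account to inspection.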