A new feature in iOS 15 and macOS Monterey will scan users' iCloud Photos for known child sexual abuse material (CSAM). Apple says the feature preserves privacy by keeping photos encrypted in transit and in the cloud. The scanning takes place on the user's device, comparing hashes of the user's photos against a database of hashes of known child abuse imagery provided by child protection organizations such as the National Center for Missing & Exploited Children (NCMEC).

A new technology developed by Apple, called NeuralHash, can match photos even after a certain amount of editing, and it works without placing any actual CSAM on the user's device. A cryptographic technique called threshold secret sharing ensures that Apple can decrypt and examine offending images uploaded to its cloud only after NeuralHash has flagged a threshold number of matching images in an account. When that happens, Apple can "manually verify the contents, disable a user's account and report the imagery to NCMEC, which is then passed to law enforcement." Apple promises that an appeal process will be available.

The feature will initially be active only in the US. Privacy experts have expressed concern that some governments may pressure Apple to search for other types of imagery, and privacy advocates warn that the system could be abused by third parties to implicate innocent people.
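NeuralHash itself is proprietary and neural-network based, so the sketch below is only a toy stand-in. It uses a simple "average hash" in Python to show the general idea behind perceptual hashing: the hash of a lightly edited photo stays close to the hash of the original, so images can be matched by comparing hashes against a database rather than comparing the photos themselves. The function names, the hash scheme, and the distance threshold here are illustrative and are not Apple's.

```python
# Toy perceptual hashing sketch. This is NOT NeuralHash; it is a minimal
# "average hash" that illustrates why small edits leave the hash nearly
# unchanged while unrelated images hash far apart.

def average_hash(pixels, size=8):
    """Compute a 64-bit perceptual hash from a grayscale pixel grid."""
    h, w = len(pixels), len(pixels[0])
    # Downsample to a size x size grid by block-averaging.
    blocks = []
    for by in range(size):
        for bx in range(size):
            ys = range(by * h // size, (by + 1) * h // size)
            xs = range(bx * w // size, (bx + 1) * w // size)
            vals = [pixels[y][x] for y in ys for x in xs]
            blocks.append(sum(vals) / len(vals))
    mean = sum(blocks) / len(blocks)
    # Each bit records whether a block is brighter than the image mean.
    bits = 0
    for v in blocks:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

def matches_database(image_hash, db_hashes, max_distance=4):
    """A photo 'matches' if its hash is close to any database entry."""
    return any(hamming(image_hash, h) <= max_distance for h in db_hashes)

# Example: a mild brightness shift barely moves the hash,
# while an unrelated image lands far away.
img = [[(x * y) % 256 for x in range(64)] for y in range(64)]
edited = [[min(255, p + 10) for p in row] for row in img]
other = [[(x + y) % 256 for x in range(64)] for y in range(64)]
db = [average_hash(img)]
print(matches_database(average_hash(edited), db))  # likely True
print(matches_database(average_hash(other), db))   # likely False
```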
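The threshold mechanism can be illustrated with Shamir's secret sharing, a standard form of threshold secret sharing. Apple's technical summary describes each matching photo's "safety voucher" as carrying one share of a per-account key, so the server cannot recover the key, and therefore cannot inspect anything, until it holds at least the threshold number of shares. The sketch below shows the principle only; the helper names and parameters are illustrative, not Apple's protocol.

```python
# Minimal Shamir threshold secret sharing sketch (requires Python 3.8+
# for pow(x, -1, m)). Toy parameters; not Apple's actual scheme.
import random

PRIME = 2**127 - 1  # a large prime field for the toy secret

def split_secret(secret, threshold, num_shares):
    """Split `secret` so that any `threshold` shares reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for a per-account decryption key
shares = split_secret(key, threshold=10, num_shares=30)

# Fewer than `threshold` shares reveal nothing about the key;
# any `threshold` of them recover it exactly.
assert reconstruct(shares[:10]) == key
assert reconstruct(random.sample(shares, 10)) == key
```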
from Phone Scoop - Latest News https://ift.tt/3s5VR0g