
Apple plans to scan devices for child abuse content
Apple disclosed that it plans to scan all U.S.-based devices to detect child sexual abuse material. The plan has drawn praise from child protection groups but serious concern from security experts, who warn that governments could repurpose the system for citizen surveillance.
To that end, Apple plans to begin client-side scanning of images on Apple devices, checking them against known child sexual abuse material before they are uploaded to iCloud. In addition, iMessage images sent to or from accounts of children under 13 will be analyzed so that parents can be alerted to sexually explicit content shared over the messaging platform. Apple will also update Siri and Search to intervene when users search for topics related to Child Sexual Abuse Material (CSAM), surfacing warnings and pointers to help resources.
Apple stated that the Messages feature, called Communication Safety, uses on-device machine learning to detect sexually explicit images. It is an opt-in setting that parents must enable through Family Sharing.
The tool designed to detect CSAM images is called NeuralHash. It performs on-device matching of image hashes against a database of known CSAM hashes provided by child safety organizations. Scanning happens automatically whenever iCloud Photos is turned on. If enough matches accumulate, a human reviewer examines the flagged images; once the match is confirmed, the account is disabled and a report is sent to the National Center for Missing and Exploited Children (NCMEC).
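The matching flow can be sketched very loosely in Python. This is an illustration only: Apple's actual NeuralHash is a perceptual hash produced by a neural network, and matching is performed through cryptographic private set intersection with a server-side threshold. The cryptographic hash function, the sample hash database, and the review threshold below are all placeholder assumptions.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash: a plain cryptographic hash of the bytes.
    # (A real perceptual hash tolerates resizing/re-encoding; SHA-256 does not.)
    return hashlib.sha256(image_bytes).hexdigest()


# Hashes of known images, as would be supplied by child safety organizations.
# These sample values are purely illustrative.
KNOWN_HASHES = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

# Accounts are escalated to human review only after several matches,
# mirroring the threshold idea in Apple's design (value here is invented).
REVIEW_THRESHOLD = 2


def scan_before_upload(images: list[bytes]) -> bool:
    """Return True if the account should be escalated to human review."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD
```

In the real system the device never learns whether an individual image matched; the private set intersection protocol keeps match results opaque until the threshold is crossed, which the simple set lookup above does not capture.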
These efforts have raised alarm among researchers. According to Matthew Green, a cryptography professor and security expert at Johns Hopkins University, the initiative could have serious political and safety implications, and could even be abused to frame innocent people by sending them images crafted to trigger CSAM matches.
Like Google and other tech giants, Apple already scans iCloud Mail for known child abuse imagery, but this new on-device approach has sparked debate about weakening encryption and eroding privacy in the digital era.
The Electronic Frontier Foundation (EFF) acknowledged that child abuse is a grave problem that many organizations besides Apple have fought for years, but warned that this step could undermine encryption protections and open the door to broader abuses.