April 16, 2024, 17:51

Apple Unveils Limits to Its Child Sex Abuse Scanner After Critics Question Users’ Privacy

Last week, Apple announced details of its “neuralMatch” software, designed to detect child sexual abuse material on US users’ devices.

Apple has unveiled limits to its new scanner for child abuse content after the technology was criticised over possible violations of user privacy.

The company will allow the “neuralMatch” software, designed to detect Child Sexual Abuse Material (CSAM), to scan photos uploaded to iCloud by US users. If a case of child sexual abuse imagery is confirmed, the National Center for Missing & Exploited Children (NCMEC) will be notified of the user’s account.

Many, however, were quick to claim that the system could be expanded to scan for images unrelated to child abuse, something that could ride roughshod over the privacy of Apple users.

An accompanying technical explainer describes how the matching works: the device compares photos against a database of known CSAM image hashes supplied by NCMEC, and “Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices”.
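To make the matching step concrete, the sketch below models it in Python under simplified assumptions: photos are reduced to opaque digests and checked for membership in a pre-hashed database. The real system relies on perceptual hashing and on-device cryptography that this sketch does not reproduce, and all names here (`known_hashes`, `photo_digest`, `matches_known_database`) are illustrative, not Apple’s.

```python
import hashlib

# Illustrative stand-in for the provider-supplied database of known-image
# hashes; per Apple's explainer, devices only ever hold an unreadable,
# transformed form of this set, never the source material.
known_hashes = {
    hashlib.sha256(b"example-known-image-1").hexdigest(),
    hashlib.sha256(b"example-known-image-2").hexdigest(),
}

def photo_digest(photo_bytes: bytes) -> str:
    """Reduce a photo to a fixed-length digest. A cryptographic hash is used
    here for simplicity; the real system uses a perceptual hash so that
    visually near-identical images still match."""
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_known_database(photo_bytes: bytes) -> bool:
    """Membership test of a photo's digest against the hashed database."""
    return photo_digest(photo_bytes) in known_hashes

print(matches_known_database(b"example-known-image-1"))  # True
print(matches_known_database(b"holiday-photo"))          # False
```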

According to the explainer, it would take 30 matching images for the system to activate, which means that “the possibility of any given account being flagged incorrectly is lower than one in one trillion”.
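The 30-image limit can be read as a simple gate: individual matches carry no consequence until an account crosses the threshold. The sketch below illustrates only that gating logic under that assumption; Apple’s actual system enforces the threshold cryptographically rather than with a plain counter, and the function name is hypothetical.

```python
MATCH_THRESHOLD = 30  # activation limit cited in Apple's explainer

def should_flag_account(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Return True only once the number of matched uploads for an account
    reaches the threshold; below it, matches are not actionable."""
    return match_count >= threshold

for count in (1, 29, 30):
    print(count, should_flag_account(count))
# 1 False, 29 False, 30 True
```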

The remarks follow a group of security and privacy tech advocates releasing an open letter, in which they warned that Apple’s new software “introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products”.

The letter came after the company said in a statement that, by rolling out the new system, it wants “[…] to help protect children from predators who use communication tools to recruit and exploit them”.

Source: sputniknews.com
