April 24, 2024, 10:41

Apple’s New Scanner Looking for Child Abuse Content Raises Concerns Around Users’ Privacy

The software, called “neuralMatch”, will detect images that fall under the category of child sexual abuse material and flag them for review by a human.

Apple will allow a tool designed to detect known images of child sexual abuse to scan photos uploaded to iCloud by US users.
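Apple has not published the internals of “neuralMatch”; reports describe it as matching uploaded photos against a database of hashes of already-known abuse images rather than interpreting new photos. A minimal sketch of that general hash-matching approach is below, using a toy average hash in place of Apple’s actual algorithm; the names `average_hash`, `KNOWN_HASHES`, and `matches_known_image` are illustrative and not Apple’s API.

```python
# Illustrative sketch of hash-based known-image matching; NOT Apple's
# actual neuralMatch code. A real system would use a robust perceptual
# hash, not this toy 64-bit average hash.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to 8x8 grayscale, then set one bit per pixel above the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of hashes of known images (placeholder values only).
KNOWN_HASHES = {0x8F3C000000000001}


def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Flag a photo whose hash is within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in KNOWN_HASHES)
```

A small Hamming-distance threshold is what lets such systems tolerate resizing or recompression of a known image while, in principle, leaving unrelated photos unmatched.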

If a case of child sexual abuse imagery is confirmed, the National Center for Missing and Exploited Children (NCMEC) will be notified of the user’s account.

Apple will introduce the safety features in three areas, one of which gives parents more insight into their children’s online activity. The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable to Apple, the tech giant explained.
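The key privacy claim here is that the screening runs entirely on the device: an image is scored locally and only a local warning is shown, so nothing about the message is sent to a server. A conceptual sketch of that flow follows; Apple’s real model, thresholds, and APIs are not public, and `sensitive_score`, `receive_image`, and `warn_threshold` are hypothetical stand-ins.

```python
# Conceptual sketch of on-device screening in a messaging flow
# (illustrative only; not Apple's Messages implementation).

def sensitive_score(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model scoring image sensitivity in [0, 1]."""
    ...  # a real implementation would run a local neural network here
    return 0.0


def receive_image(image_bytes: bytes, warn_threshold: float = 0.8) -> str:
    # The image is scored locally and never uploaded, so the message
    # content stays unreadable to the service provider.
    if sensitive_score(image_bytes) >= warn_threshold:
        return "blurred-with-warning"  # recipient sees a warning before viewing
    return "shown"
```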

John Clark, the president and CEO of NCMEC, called Apple’s expanded protection for children “a game changer”.

Despite such praise for “neuralMatch”, the technology has been criticised for potentially violating user privacy.

Experts from Johns Hopkins University and Stanford University have raised concerns that Apple’s scanning technology could disrupt user privacy.

They argued that the system could be expanded to scan for images unrelated to child abuse, which would violate the privacy of Apple users.

Parents who store photos of their children taking a bath, for example, where an element of nudity is present, may also worry that the software will flag their images as abuse content.

The new features will be introduced by Apple later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

Source: sputniknews.com
