
After Coming Under Intense Criticism, Apple Delays Plan to Scan Phones for Child Abuse Images

Apple said it had developed a tool designed to detect whether known child sexual abuse images had been uploaded to iCloud by users. But critics claimed the software, called neuralMatch, was an invasion of privacy and prone to error.

After an avalanche of criticism from privacy groups, Apple has backed down and delayed plans to introduce a “child safety feature” on iPhones.

Last month, Apple said the tool would check the iPhones and personal computers of customers in the United States to flag known child sexual abuse images.

Apple, which had planned to roll out the feature for iPhones, iPads, and Macs later this year, had insisted the software would not flag innocent images such as holiday snaps of young children in swimming costumes.
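
Apple has not published the system's internals, but reports described it as matching image fingerprints ("perceptual hashes") of photos against a database of hashes of known abuse images. As a rough, hypothetical sketch of that general technique, not Apple's actual algorithm, the matching logic might look something like this in Python; the hash function, threshold, and database here are all illustrative stand-ins:

```python
# Illustrative sketch only: a simple "average hash" matcher showing the
# general perceptual-hashing technique media reports attributed to
# neuralMatch. This is NOT Apple's algorithm; the hash function,
# threshold, and database below are all hypothetical stand-ins.
from PIL import Image

def average_hash(path, size=8):
    """Shrink to size x size, grayscale, and set one bit per pixel
    brighter than the mean: a crude 64-bit image fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known abuse images; in the
# reported design such hashes are supplied by child-safety bodies,
# never computed from users' own photos.
KNOWN_HASHES = set()

def matches_known_image(path, threshold=5):
    """Flag the image if its fingerprint is within `threshold` bits of
    any known fingerprint. The tolerance lets matching survive resizing
    and recompression, but it is also where false positives creep in."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

That distance threshold is the crux of the dispute: set it tight and altered copies slip through; set it loose and innocent images risk being flagged, which is the error-proneness critics warned about.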

But critics said the idea was the tip of the iceberg: it could allow repressive governments to scan phones for political images and cartoons, and could be used as a tool of censorship.

Apple had insisted it would allow security researchers to verify its claims, but on Friday, 3 September, the company relented and said it would put its plans on hold.

In a statement Apple said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Matthew Green, a cybersecurity researcher at Johns Hopkins University, welcomed the move.

Source: sputniknews.com
