Apple plans to scan U.S. iPhones for child abuse images


Apple intends to install software on American iPhones to search for images of child abuse, according to people briefed on its plans, raising alarm among security researchers who warn it could open the door to surveillance of the personal devices of millions of people.

Apple detailed its proposed system, known as “neuralMatch,” to some American academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicized more widely this week, they said.

The automated system would proactively alert a team of human reviewers if it believes illegal imagery has been detected; the reviewers would then contact police if the material can be verified. The scheme will initially be implemented only in the US.

Apple declined to comment.

The proposals are Apple’s attempt to find a compromise between its own promise to protect customers’ privacy and ongoing demands from governments, law enforcement agencies, and child safety advocates for more assistance in criminal investigations, including those into terrorism and child pornography.

The tension between law enforcement and tech companies like Apple and Facebook, which have defended their increasing use of encryption in their products and services, has only intensified since the iPhone maker went to court with the FBI in 2016 over access to the iPhone of a terrorism suspect following a shooting in San Bernardino, California.

Security researchers, while supportive of efforts to combat child abuse, are concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.

“It is an absolutely gruesome idea, because it will lead to mass distributed surveillance of . . . our phones and laptops,” said Ross Anderson, professor of security engineering at the University of Cambridge.

Although the system is currently trained to detect child sexual abuse, it could be adapted to search for any other specific images and text – for example, terrorist beheadings or anti-government posters at protests, the researchers say. Apple’s precedent could also increase pressure on other tech companies to use similar techniques.

“This will break the dam; governments will demand it of everyone,” said Matthew Green, a security professor at Johns Hopkins University, who is believed to be the first researcher to post a tweet about the topic.

Alec Muffett, a security researcher and privacy activist who previously worked at Facebook and Deliveroo, said Apple’s move was “tectonic” and a “huge, regressive step for individual privacy.”

“Apple is rolling back privacy to allow 1984,” he said.

Cloud-based photo storage systems and social media sites already scan images of child abuse, but that process becomes more complex when trying to access data stored on a personal device.

Apple’s system is less invasive in the sense that detection is done on the phone, and “only if there is a match is a notification sent to those searching,” said Alan Woodward, professor of computer security at the University of Surrey. “This decentralized approach is the best approach you could take if you go this route.”

Apple’s neuralMatch algorithm will continuously scan photos stored on a US user’s iPhone, as well as those uploaded to its iCloud backup system. Users’ photos, converted into a series of numbers through a process known as “hashing,” will be matched against a database of known images of child sexual abuse.

The system has been trained on 200,000 sexual abuse images collected by the National Center for Missing and Exploited Children, a US non-profit organization.
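To make the matching step concrete, here is a minimal sketch in Python under a simplified model of the reported system. neuralMatch is said to rely on a perceptual, neural-network-derived image hash; this example uses SHA-256 purely as a stand-in for turning an image into a number, and every name in it is hypothetical rather than taken from Apple.

```python
# A minimal sketch of hash-based matching, assuming a simplified model of the
# reported system. Apple's neuralMatch is said to use a perceptual,
# neural-network-derived hash; SHA-256 is used here only as a stand-in for
# "image -> series of numbers", and all names below are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical database of digests of known abuse imagery (e.g. supplied by NCMEC).
KNOWN_IMAGE_HASHES: set[str] = {
    "placeholder-digest-1",
    "placeholder-digest-2",
}

def image_hash(photo: Path) -> str:
    """Convert an image file into a fixed-length digest ('hashing')."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return the photos whose digests appear in the known-image database."""
    return [p for p in photo_dir.glob("*.jpg") if image_hash(p) in KNOWN_IMAGE_HASHES]
```

A perceptual hash, unlike the cryptographic one above, is designed so that visually similar images produce similar digests, which is why a real system would compare hashes approximately rather than with an exact set lookup.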

According to people briefed on the plans, every photo uploaded to iCloud in the US will receive a “security voucher” indicating whether or not it is suspicious. Once a certain number of photos are marked suspicious, Apple will allow all suspicious photos to be decrypted and, if they appear to be illegal, will pass them on to the appropriate authorities.
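The threshold behavior described here can be sketched in the same simplified terms. This is only an illustration of the reported design, not Apple's implementation; the threshold value, the voucher format, and all names are assumptions.

```python
# A minimal sketch of the reported "security voucher" threshold, assuming a
# simplified model; the real threshold and voucher contents have not been
# disclosed, and all names here are hypothetical.
from dataclasses import dataclass

REVIEW_THRESHOLD = 10  # hypothetical value, not disclosed by Apple

@dataclass
class SecurityVoucher:
    photo_id: str
    suspicious: bool  # outcome of the on-device hash match for this photo

def photos_to_review(vouchers: list[SecurityVoucher]) -> list[str]:
    """Photo IDs to decrypt for human review, only once the threshold is crossed."""
    suspect = [v.photo_id for v in vouchers if v.suspicious]
    if len(suspect) < REVIEW_THRESHOLD:
        return []  # below the threshold, nothing is decrypted or escalated
    return suspect
```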

© 2021 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.



