Apple will scan iPhones in the US for images of child sexual abuse

Apple has unveiled plans to scan U.S. iPhones for child sexual abuse images, drawing applause from child advocacy groups but raising concerns from some security researchers that the system could be misused, including by governments seeking to control their citizens.

The tool, called “neuralMatch,” is designed to detect known images of child sexual abuse and scans photos before they are uploaded to iCloud.

If it finds a match, the image will be viewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified.

Separately, Apple plans to scan encrypted user communications for sexual content as a child safety measure, which has also alarmed privacy advocates.

The detection system will flag only images that are already in the center’s database of known child pornography. Parents taking innocent photos of a child in the bath appear to have nothing to worry about.

But the researchers say the matching tool, which doesn’t “see” such images but only the mathematical “fingerprints” that represent them, could be used for more nefarious purposes.
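To illustrate the general idea of fingerprint matching, here is a minimal sketch in Python. It is not Apple’s neuralMatch: the database, function names and hash values are hypothetical, and a cryptographic hash stands in for the perceptual hashes that real systems such as PhotoDNA use so that resized or re-encoded copies still match.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known abusive images.
# Real systems use perceptual hashes that survive resizing and
# re-encoding; SHA-256 is used here only to keep the sketch
# self-contained and runnable.
KNOWN_FINGERPRINTS = {
    "placeholder_fingerprint_1",  # not real hash values
    "placeholder_fingerprint_2",
}

def fingerprint(image_path: Path) -> str:
    """Return a fingerprint derived from the image bytes; the image itself is never inspected."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag(image_path: Path) -> bool:
    """Flag an image only if its fingerprint is already in the known database."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```

The point of the sketch is the one the article makes: the matching step compares fingerprints against a fixed database, so whoever controls the database controls what gets flagged.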

Matthew Green, a leading cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to match child pornography. This could fool Apple’s algorithm and alert law enforcement.

“Researchers have been able to do this quite easily,” he said of the ability to fool such systems.

Possibility of abuse

Other potential abuses include government surveillance of dissidents or protesters. “What happens when the Chinese government says, ‘Here’s a list of files we want you to scan’?” Green asked. “Does Apple say no? I hope they say no, but their technology doesn’t say no.”

Tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as on-device data, for child pornography.

For years, Apple has come under pressure from the government to increase surveillance of encrypted data.

The new measures required Apple to strike a delicate balance between cracking down on child exploitation and maintaining its high-profile commitment to protecting the privacy of its users.

But the Electronic Frontier Foundation, an Internet civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Meanwhile, the computer scientist who more than a decade ago invented PhotoDNA, a technology used by law enforcement to detect child pornography on the Internet, acknowledged the potential for Apple’s system to be abused, but said the imperative of combating child sexual abuse far outweighs it.

“Is it possible? Of course. But does it bother me? No,” said Hany Farid, a researcher at the University of California, Berkeley, who argued that many other programs designed to protect devices from various threats have not seen “this type of mission creep.”

For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but also uses the system to detect malware and warn users not to click on malicious links.

“Game Changer”

Apple was one of the first major companies to implement “end-to-end” encryption, in which messages are scrambled so that only senders and recipients can read them. Law enforcement, however, has long pressed the company for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

Apple said the latest changes will be released this year as part of updates to its operating software for iPhone, Mac and Apple Watch.

“Apple’s extended child protection is a game-changer,” said John Clark, President and CEO of the National Center for Missing and Exploited Children. “Because so many people use Apple products, these new security measures could save the lives of children.”

Julia Cordua, CEO of Thorn, said Apple’s technology balances “the need for privacy with digital safety for kids.” Thorn is a nonprofit that uses technology to help protect children from sexual abuse, in part by working with technology platforms.

Security breach

But in a scathing criticism, the Washington-based nonprofit Center for Democracy and Technology urged Apple to abandon the changes, which it said would effectively destroy the company’s guarantee of “end-to-end encryption.”

Scanning messages for sexually explicit content on phones or computers effectively amounts to a security breach, the organization said.

The organization also questioned whether Apple’s technology can distinguish dangerous content from something as tame as art or a meme. Such technologies are notoriously error-prone, the CDT said in an emailed statement.

Apple denies that the changes amount to a backdoor that degrades its encryption. It says they are carefully considered innovations that protect user privacy rather than violate it.

Separately, Apple said its messaging app will use on-device machine learning to detect and blur sexually explicit photos on children’s phones and can also warn the parents of younger children via text message. Apple also said its software would “intervene” when users try to search for topics related to child sexual abuse.

To receive alerts about sexually explicit images on their children’s devices, parents must register their child’s phone. Children over the age of 13 can unregister, meaning teens’ parents will not receive notifications.

Apple has said none of the features will compromise the security of private messages or notify the police.