In Monitoring Sex Abuse of Children, Apple Is Caught Between Safety and Privacy


In 2021, Apple was embroiled in controversy over a plan to scan iPhones for child sexual abuse materials. Privacy experts warned that governments could abuse the system, and the backlash was so severe that Apple eventually abandoned the plan.

Two years later, Apple is facing criticism from child safety crusaders and activist investors who are calling on the company to do more to protect children from online abuse.

A child advocacy group called the Heat Initiative has raised $2 million for a new national advertising campaign calling on Apple to detect, report and remove child sexual abuse materials from iCloud, its cloud storage platform.

Next week, the group will release digital advertisements on websites popular with policymakers in Washington, such as Politico. It will also put up posters across San Francisco and New York that say: “Child sexual abuse material is stored on iCloud. Apple allows it.”

The criticism speaks to a predicament that has dogged Apple for years. The company has made protecting privacy a central part of its iPhone pitch to consumers. But that promise of security has helped make its services and devices, two billion of which are in use, useful tools for sharing child sexual abuse imagery.

The company is caught between child safety groups, who want it to do more to stop the spread of such materials, and privacy experts, who want it to maintain the promise of secure devices.

A group of two dozen investors with nearly $1 trillion in assets under management has also called on Apple to publicly report the number of abusive images that it catches across its devices and services.

Two investors — Degroof Petercam, a Belgian asset manager, and Christian Brothers Investment Services, a Catholic investment firm — will submit a shareholder proposal this month that would require Apple to provide a detailed report on how effective its safety tools are at protecting children.

“Apple seems stuck between privacy and action,” said Matthew Welch, an investment specialist at Degroof Petercam. “We thought a proposal would wake up management and get them to take this more seriously.”

Apple has been quick to respond to child safety advocates. In early August, its privacy executives met with the group of investors, Mr. Welch said. Then, on Thursday, the company responded to an email from the Heat Initiative with a letter that defended its decision not to scan iCloud. It shared the correspondence with Wired, a technology publication.

In Apple’s letter, Erik Neuenschwander, the director for user privacy and child safety, said the company had concluded that “it was not practically possible” to scan iCloud photos without “imperiling the security and privacy of our users.”

“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems,” Mr. Neuenschwander said.

Apple, he added, has created a new default feature for all child accounts that intervenes with a warning if they receive or try to send nude images. It’s designed to prevent the creation of new child sexual abuse material and limit the risk of predators coercing and blackmailing children for money or nude images. It has made those tools available to app developers as well.

In 2021, Apple said it would use technology called image hashes to spot abusive material on iPhones and in iCloud.
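Hash matching of this kind works by comparing a fingerprint of each photo against a database of fingerprints of already-known abusive images. The sketch below illustrates the general idea only; it uses an ordinary cryptographic hash for simplicity, whereas Apple's proposed system used a perceptual hash ("NeuralHash") designed to survive resizing and re-encoding, and the database entries here are hypothetical.

```python
import hashlib

# Hypothetical database of digests of known flagged images.
# (Real systems use perceptual hashes, not SHA-256, so that
# edited copies of an image still match.)
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", standing in for a known image
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_material(b"test"))        # True: digest is in the set
print(matches_known_material(b"other data"))  # False: no match
```

Because only fingerprints are compared, the matching service never needs the original known images; the privacy debate centers on where the comparison runs (on the device or in the cloud) and who controls the hash database.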

But the company failed to communicate that plan broadly with privacy experts, intensifying their skepticism and fueling concern that the technology could be abused by governments, said Alex Stamos, the director of the Stanford Internet Observatory at the Cyber Policy Center, who opposed the idea.

Last year, the company discreetly abandoned its plan to scan iCloud, catching child safety groups by surprise.

Apple has won praise from both privacy and child safety groups for its efforts to blunt the creation of new nude images on iMessage and other services. But Mr. Stamos, who applauded the company’s decision not to scan iPhones, said that it could do more to stop people from sharing problematic images in the cloud.

“You can have privacy if you store something for yourself, but if you share something with someone else, you don’t get the same privacy,” Mr. Stamos said.

Governments around the world are putting pressure on Apple to take action. Last year, Australia's eSafety commissioner issued a report criticizing Apple and Microsoft for failing to do more to proactively police their services for abusive material.

In the United States, the company made 121 reports in 2021 to the National Center for Missing and Exploited Children, a federally designated clearinghouse for abusive material. Google made 875,783 reports, while Facebook made 22 million. These reports do not always reflect truly abusive material; some parents have had their Google accounts suspended and have been reported to the police for images of their children that were not criminal in nature.

The Heat Initiative timed its campaign ahead of Apple’s annual iPhone unveiling, which is scheduled for Sept. 12. The campaign is being led by Sarah Gardner, who was previously the vice president for external affairs at Thorn, a nonprofit founded by Ashton Kutcher and Demi Moore to combat child sexual abuse online. Ms. Gardner raised money from a number of child safety supporters, including the Children’s Investment Fund Foundation and the Oak Foundation.

The group has built a website that documents law enforcement cases where iCloud has been named. The list will include child pornography charges brought against a 55-year-old in New York who had more than 200 images stored in iCloud.

Ms. Gardner said that the Heat Initiative planned to target advertising throughout the fall in areas where Apple customers and employees would encounter it. “The goal is to continue to run the tactics until Apple changes its policy,” Ms. Gardner said.

Kashmir Hill contributed reporting.


