Apple's Abandonment of iCloud CSAM Scanner Is Hurting Victims, Lawsuit Alleges


A class-action lawsuit filed in a Northern California district court alleges Apple's iCloud service has been used to spread child sexual-abuse materials, or CSAM. It also alleges that Apple's abandonment of software it began developing to scan for such materials amounts to the company knowingly allowing that material to proliferate.

In 2021, Apple began several initiatives to scan media on devices and on iCloud for CSAM. One of them would have checked files stored on iCloud. At the time, John Clark, the CEO and president of the National Center for Missing & Exploited Children, called the technology "a game changer."

But Apple paused development soon after, following complaints from privacy, security and digital rights groups. By late 2022, Apple had curtailed some of those plans altogether, including the scanning of images in iCloud.

Based on an interview with one of the plaintiffs, whose identity is being kept anonymous, the New York Times reported over the weekend that in some cases, victims have received multiple notices a day from law enforcement informing them that their images have been found in cases against predators. The 27-year-old plaintiff told the New York Times that she joined the suit because Apple gave victims false hope and that she thinks the company needs to change.

The suit includes 2,680 plaintiffs and could cost Apple more than $1.2 billion if a jury finds the company liable.

A separate but similar individual suit in North Carolina was filed in August on behalf of a 9-year-old sexual-assault victim. 

Apple didn't immediately respond to a request for comment about the suit, but the company released a statement over the weekend. Spokesperson Fred Sainz said: "Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users."

Sainz pointed to existing Apple features such as Communication Safety, which he said warns children when they receive or try to send content that includes nudity. "We remain deeply focused on building protections that help prevent the spread of CSAM before it starts," Sainz added.
