Apple Sued for Not Implementing CSAM Detection Feature on iCloud

Apple dropped its plan to scan iCloud photos for CSAM in 2021 and is now being sued over that decision.

Apple is being sued for dropping its plan to scan iCloud photos for child sexual abuse material (CSAM), a decision the company attributed to security and privacy concerns.

The tech giant quietly shelved the controversial scanning plan back in December 2021 but now faces a lawsuit accusing it of forcing victims to relive their trauma.

Apple first announced in August 2021 that it would add a CSAM scanning feature to all of its devices. After strong pushback from users and privacy experts, Apple acknowledged that its announcement had introduced “confusion” and released a white paper it hoped would better explain the plan to scan photo libraries for CSAM. Ultimately, the project never came to fruition.

TechCrunch reports that the lawsuit was filed under a pseudonym by a 27-year-old woman in U.S. District Court in Northern California. The woman says that a relative molested her when she was a small child and shared images of her online, and that law enforcement still contacts her nearly every day to notify her that someone has been charged with possessing those images.

The woman argues that Apple’s failure to implement the CSAM detection system means it sells defective products that harm victims of child sexual abuse. The suit seeks changes to Apple’s practices and compensation for a group of potentially 2,680 eligible victims. Each victim of child sexual abuse is entitled to a minimum of $150,000 in damages, and with such awards often tripled, Apple could face a bill of more than $1.2 billion if a jury finds the company liable.

The New York Times reports that Apple flags less CSAM than Facebook and Google, a practice the company defends by citing user privacy. Child safety groups have criticized that stance.

In a statement to The New York Times, Apple says it is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”


Image credits: Header photo is licensed via Depositphotos.
