Apple is facing a lawsuit over its decision not to implement a system for detecting child sexual abuse material (CSAM) in iCloud Photos, according to TechCrunch. The New York Times reports that the lawsuit accuses Apple of failing to take adequate steps to prevent the spread of this material, forcing victims to relive their trauma.
The complaint points to Apple's earlier promise of an advanced system to protect children and alleges that the company failed to deliver on it. In 2021, Apple announced plans to use digital signatures supplied by organizations such as the National Center for Missing and Exploited Children to identify known CSAM in users' iCloud libraries. However, Apple abandoned the plan after privacy and security experts warned that the technology could open the door to government surveillance.
The lawsuit was filed by a 27-year-old woman under a pseudonym. She says she was abused by a relative as a child and that images of the abuse were shared online. To this day, she receives notifications from law enforcement about cases involving people accused of possessing or sharing those images.
James Marsh, an attorney involved in the case, stated that approximately 2,680 victims may qualify for compensation as part of this lawsuit.
An Apple spokesperson told The New York Times that the company is actively working to combat these crimes while protecting the security and privacy of all its users.
In a related case, the guardian of a 9-year-old girl filed a lawsuit against Apple in August on the girl's behalf, likewise accusing the company of failing to address CSAM on iCloud.