A woman has sued Apple, claiming the company's abandonment of its CSAM (child sexual abuse material) detection plan left her vulnerable to the continued circulation of images of her abuse. The lawsuit, which could cover thousands of victims, alleges that Apple broke its promises to protect users and seeks substantial damages. It adds to Apple's growing legal troubles over child safety and online content moderation.