A woman has sued Apple, claiming the company’s abandonment of its planned child sexual abuse material (CSAM) detection system left her vulnerable to the continued circulation of images of her abuse. The lawsuit, which could involve thousands of victims, alleges that Apple broke its promises to protect users and seeks significant compensation. It adds to Apple’s growing legal troubles over child safety and online content moderation.
Over 270 security and privacy experts have issued an open letter raising concerns about a controversial European Union (EU) proposal that would require messaging platforms to scan citizens’ private communications for child sexual abuse material (CSAM). The experts argue that the proposal is technically flawed, would generate millions of false positives per day, and would undermine encryption and privacy protections. They also warn that it could set a dangerous precedent for internet filtering and surveillance. Despite these concerns, the EU is moving forward with the proposal, which is expected to be discussed further at a working party meeting on May 8.