
Apple sued for failing to use tools that could detect CSAM in iCloud

Apple has been sued by victims of child sexual abuse for failing to follow through on its plans to scan iCloud for child sexual abuse material (CSAM), The New York Times reports. In 2021, Apple announced that it was working on a tool that would flag photos depicting such abuse and notify the National Center for Missing and Exploited Children. But the company faced a swift backlash over the privacy implications of the technology, and eventually abandoned the plans.

The lawsuit, filed Saturday in Northern California, seeks more than $1.2 billion in damages for a potential group of 2,680 victims, according to the NYT. It says that, after Apple showed off its planned child safety tools, the company “failed to implement those designs or take any steps to detect and limit” CSAM on its devices, harming victims as the images continued to circulate. Engadget has reached out to Apple for comment.

In a statement to The New York Times regarding the case, Apple spokesperson Fred Sainz said: “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.” The case comes just a few months after Apple was accused of underreporting CSAM by the UK’s National Society for the Prevention of Cruelty to Children (NSPCC).
