Apple sued for alleged role in spreading child abuse images via iCloud: Report

Apple is facing a lawsuit filed by a 27-year-old woman who claims the tech giant failed to prevent the proliferation of child sexual abuse material (CSAM) stored on its iCloud service, according to The New York Times.

The plaintiff, using a pseudonym to protect her identity, has alleged that Apple abandoned a system it had developed to detect such content, breaking its promise to safeguard victims like her.

The lawsuit, filed in the US District Court for Northern California, accuses Apple of selling defective products by not implementing tools to identify and limit CSAM. The New York Times says that the suit seeks to change Apple’s practices and compensate up to 2,680 potential victims. If successful, the damages could exceed $1.2 billion.

The plaintiff’s abuse began in infancy when a relative molested her, took explicit photos, and shared them online. Years later, she learned that some of these images were found on a man’s MacBook and stored on Apple’s iCloud, according to the report.

In 2021, Apple announced NeuralHash, a tool designed to detect known CSAM in photos uploaded to iCloud by matching them against a database of image hashes, but the initiative was abandoned after backlash from privacy advocates who warned that the system could create vulnerabilities for government surveillance. Apple defended its decision, citing a commitment to user privacy, but the lawsuit argues this left victims unprotected.

Apple has reported significantly fewer CSAM instances than peers like Google and Facebook. In 2019, the company reported only 267 cases to the National Center for Missing & Exploited Children compared to millions filed by its competitors, The New York Times reported.

Fred Sainz, an Apple spokesperson, called CSAM “abhorrent” and stated the company is “urgently innovating to combat these crimes without compromising user privacy.” He highlighted tools like child safety warnings in the Messages app but did not address the abandoned NeuralHash system.

Legal experts told The New York Times that the case could challenge Apple’s reliance on Section 230 of the Communications Decency Act, which protects tech companies from liability for user content. Recent court rulings have narrowed this protection, increasing the likelihood of lawsuits like this one.

The plaintiff said she joined the lawsuit to push Apple to prioritise people over profit. “Apple’s inaction is heart-wrenching,” she said. The case underscores ongoing tensions between privacy concerns and the responsibility to prevent the spread of illegal material online.

