Apple faces lawsuit over discontinuation of CSAM detection

Apple is being sued by a woman, identified only by a pseudonym to protect her identity, who alleges the company failed victims of child sexual abuse by abandoning its plan to detect child sexual abuse material (CSAM).

The plan, first unveiled in 2021, was to use on-device technology to scan iCloud images for abusive content. Apple ultimately scrapped the CSAM detection feature in 2022, while retaining the nudity-detection feature in its Messages app, after experts and advocacy groups raised concerns that the implementation would make iPhones more vulnerable to cyber attacks and data breaches.

The lawsuit argues that Apple’s decision has enabled the continued distribution of illegal material online and broken the company’s promise to protect abuse survivors. The plaintiff, 27, who was abused as a child, said Apple’s removal of CSAM detection left her vulnerable to further harm. She said police told her that investigators had found images of her abuse stored in iCloud, recovered from a MacBook seized in Vermont. The plaintiff alleges that Apple sells “defective products” by failing to take adequate measures to protect victims, and she seeks to force changes to the company’s practices as well as compensation for those affected.

The lawsuit also points out that other technology giants, such as Google and Meta, already use CSAM-scanning tools that detect far more illegal content than Apple’s nudity-detection feature does. The plaintiff’s lawyers estimate that up to 2,680 victims may ultimately join the lawsuit, which could result in damages of more than $1.2 billion if Apple is found liable.

This is not Apple’s only legal battle over CSAM issues. In another case, filed in North Carolina, a nine-year-old victim and her family also accused the company of neglecting to address CSAM. The girl said that unknown individuals used iCloud links to send her CSAM videos and pressured her to create and post similar content.

Apple is seeking to dismiss the complaint by invoking Section 230 of federal law, which generally shields companies from liability for material uploaded by their users. But recent court decisions have suggested that those protections may not apply when companies fail to take any action against harmful content.

For its part, Apple has again defended its approach of fighting child exploitation without encroaching on users’ privacy. The company pointed to various technologies it has developed or improved over time, such as the aforementioned nudity-detection feature in Messages and a way for users to report harmful content.

The plaintiff’s attorney, Margaret Mabie, however, contends that these actions are insufficient. Mabie’s investigation uncovered more than 80 instances of the plaintiff’s images being shared, including by a man in California who stored thousands of illegal images on iCloud.

Source: First Post

HD News Desk

From local issues to national events and global affairs, Hindustan Dot's news desk covers the latest news and developments from India and the world.
