San Fran city attorney sues sites that ‘undress’ women with AI

2024-08-16 13:39

By Jesse Coghlan

AI-powered websites allowing users to create nonconsensual nude photos of women and girls were visited 200 million times in the first half of the year.

San Francisco’s City Attorney has filed a lawsuit against the owners of 16 websites that have allowed users to “nudify” women and young girls using AI.


The office of San Francisco City Attorney David Chiu said on Aug. 15 that it was suing the owners of 16 of the “most-visited websites” that allow users to “undress” people in a photo to create “nonconsensual nude images of women and girls.”


A redacted version of the suit filed in the city’s Superior Court alleges the site owners include individuals and companies from Los Angeles, New Mexico, the United Kingdom and Estonia who have violated California and United States laws on deepfake porn, revenge porn and child sexual abuse material.


The websites are far from obscure, either. The complaint claims they racked up 200 million visits in just the first half of the year.


One website boasts that it allows users to “see anyone naked.” Another says, “Imagine wasting time taking her out on dates when you can just use [the website] to get her nudes,” according to the complaint.

Source: SF City Attorney


The AI models used by the sites are trained on images of porn and child sexual abuse material, Chiu’s office said.


Essentially, someone can upload a picture of their target to generate a realistic, pornographic version of them. Some sites limit their generations to adults only, but others even allow images of children to be created.


Chiu’s office said the images are “virtually indistinguishable” from the real thing and have been used to “extort, bully, threaten, and humiliate women and girls,” many of whom have no way to control the fake images once they have been created.


Related: California AI bill aimed at preventing disasters draws ire from Silicon Valley


In February, AI-generated nude images of 16 eighth-grade students, typically 13 to 14 years old, were shared by students at a California middle school, the complaint said.


In June, ABC News reported that Victoria Police had arrested a teenager for allegedly circulating 50 images of students in grades nine to 12 at a school outside Melbourne, Australia.


“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” said Chiu.


“We all need to do our part to crack down on bad actors using AI to exploit and abuse real people, including children,” he added.


Chiu said that AI has “enormous promise” but that criminals are exploiting the technology, adding, “We have to be very clear that this is not innovation — this is sexual abuse.”


AI Eye: AI bubble not over yet despite entering “trough of disillusionment”
