San Francisco City Attorney Sues Sites That "Undress" Women With AI

Original link: https://www.zerohedge.com/technology/san-francisco-city-attorney-sues-sites-undress-women-ai

The San Francisco City Attorney's office has filed a lawsuit against the operators of 16 websites accused of creating explicit AI-generated images of women and minors from user-submitted content. The sites let users "undress" people in photos to produce nonconsensual explicit images of women and underage girls. The defendants, reportedly based in Los Angeles, New Mexico, the United Kingdom, and Estonia, are alleged to have violated California and U.S. laws on deepfake pornography, revenge pornography, and child sexual abuse material.

The sites are far from obscure, drawing more than 200 million visits in the first half of the year. They claim to let users see anyone naked, and some advertise themselves as a way to obtain intimate photos without having to go on dates. The AI models they use were trained on adult pornography and child abuse imagery, allowing anyone to generate realistic pornographic images of a chosen target. While some platforms restrict generation to images of adults, others allow depictions of minors. According to court records, the images are virtually indistinguishable from reality and have been used to extort, bully, harass, and humiliate victims, most of whom have no control over the images once they are created.

In February, AI-generated explicit images of eighth-grade students, aged 13 to 14, circulated among students at a California middle school. A subsequent report said Australian authorities detained a minor for distributing 50 AI-generated images of high school students. City Attorney David Chiu said AI-generated images of this kind amount to digital sexual assault, stressing the need to crack down on criminal misuse of AI technology and saying the city must act immediately before more harm is done.


Original Article

Authored by Jesse Coghlan via CoinTelegraph.com,

San Francisco’s City Attorney has filed a lawsuit against the owners of 16 websites that have allowed users to “nudify” women and young girls using AI.

The office of San Francisco City Attorney David Chiu on Aug. 15 said he was suing the owners of 16 of the “most-visited websites” that allow users to “undress” people in a photo to make “nonconsensual nude images of women and girls.”

A redacted version of the suit filed in the city’s Superior Court alleges the site owners include individuals and companies from Los Angeles, New Mexico, the United Kingdom and Estonia who have violated California and United States laws on deepfake porn, revenge porn and child sexual abuse material.

The websites are far from unknown, either. The complaint claims that they have racked up 200 million visits in just the first half of the year.

One website boasted that it allows its users to “see anyone naked.” Another says, “Imagine wasting time taking her out on dates when you can just use [the website] to get her nudes,” according to the complaint.

Source: SF City Attorney

The AI models used by the sites are trained on images of porn and child sexual abuse material, Chiu’s office said.

Essentially, someone can upload a picture of their target to generate a realistic, pornographic version of them. Some sites limit their generations to adults only, but others even allow images of children to be created.

Chiu’s office said the images are “virtually indistinguishable” from the real thing and have been used to “extort, bully, threaten, and humiliate women and girls,” many of whom have no ability to control the fake images once they’ve been created.

In February, AI-generated nude images of 16 eighth-grade students — who are typically 13 to 14 years old — were shared around by students at a California middle school, it said.

In June, ABC News reported Victoria Police arrested a teenager for allegedly circulating 50 images of grade nine to 12 students who attended a school outside Melbourne, Australia.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” said Chiu.

“We all need to do our part to crack down on bad actors using AI to exploit and abuse real people, including children,” he added.

Chiu said that AI has “enormous promise,” but there are criminals that are exploiting the technology, adding, “We have to be very clear that this is not innovation — this is sexual abuse.”
