Teens sue xAI over Grok's pornographic images of them

Original link: https://www.bbc.com/news/articles/cgk2lzmm22eo

## xAI sued over AI-generated explicit images

Elon Musk's AI company xAI is being sued by three young women who allege that the company, through its chatbot Grok, facilitated the creation and distribution of non-consensual sexually explicit images of them. The complaint says that Grok's "spicy mode", released last year, let users alter images and videos, including those of minors, to create deepfake pornography.

Lawyers argue that xAI knowingly released the feature to drive chatbot usage, putting profit ahead of safety. The plaintiffs discovered altered images of themselves circulating online, including on Discord, and are seeking damages and an order barring Grok's image-alteration features.

The case follows investigations by regulators in the UK, Europe and California into Grok's ability to sexualize real people. While X has since implemented measures to prevent "undressing" images, the complaint describes a wider network of abuse, in which one perpetrator was arrested after circulating hundreds of AI-generated images. Musk initially downplayed the problem, putting the blame on users, but the complaint paints a picture of deliberate risk-taking and negligence.

Original article

Teens sue Musk's xAI over Grok's pornographic images of them

Kali Hays, Technology reporter
Image: Elon Musk on a stage with his hands tented in front of his mouth as he listens to someone next to him (Reuters)

Elon Musk's artificial intelligence company is facing a lawsuit from teenagers who say the company facilitated child pornography by allowing the creation of sexually explicit images of them.

The lawsuit against xAI was filed Monday in a federal California court by three young women whose images and videos were altered by a Grok user without their knowledge to show them nude or in otherwise overtly sexual ways.

Grok is a chatbot developed by xAI and hosted on Musk's social media platform X. xAI did not respond to a request for comment made via its parent company.

The legal action is part of the fallout since last year's controversial release of new Grok features that X called "spicy" mode.

Lawyers for the young women said Grok's ability to alter images and video had been created and released by xAI solely to drive use of the chatbot and X.

They likened the way images of the young women were changed to "a rag doll brought to life through the dark arts".

"xAI—and its founder Elon Musk— saw a business opportunity," the complaint says. "They knew Grok could produce such results, including by using the images and videos of children, and publicly released it anyway."

The young women are seeking unspecified damages, as well as an immediate order barring Grok from creating such images.

"Their lives have been shattered by the devastating loss of privacy, dignity, and personal safety", lawyers for the young women said in their complaint.

Two of the teenagers behind the lawsuit are under the age of 18, but all three are withholding their names from the public in order to protect their privacy.

One of the young plaintiffs said she found out about the imagery after she received an anonymous message on Instagram pointing her toward images and videos, including her high school yearbook photo, which had been altered to show her in sexually explicit actions and full nudity.

The material was being shared on a Discord server, a private chat space on that platform, and included similar imagery that had also been altered using Grok of at least 18 other women who were minors, according to the complaint.

The other two women suing xAI also found fake sexually explicit imagery of themselves online, which was determined to have been created with Grok.

Grok was launched in 2023 by Musk's xAI. The company, along with X, is now part of SpaceX, which took over xAI last month.

Last year, xAI released what it called Grok Imagine or "spicy mode", with features that allowed Grok users to prompt it to create fake images that were more sexual in nature.

The mode even carried out the "undressing" of real people using images of them found online, from Taylor Swift to ordinary users.

In less than two weeks, Grok had created millions of sexualized images, including more than 20,000 of children, according to a sampling of the images conducted by the Center for Countering Digital Hate.

Musk initially downplayed Grok's ability to create fake sexualized content, saying in January he was "not aware of any naked underage images generated by Grok. Literally zero," and putting the blame on users of the feature.

"Obviously, Grok does not spontaneously generate images, it does so only according to user requests", Musk wrote on X.

As such online abuse continued this year, however, UK watchdog Ofcom, the European Commission and California each launched investigations into the feature's ability to create sexualized images of real people, particularly children.

By mid-January, X said that it would implement "technological measures" to stop Grok's ability to undress people in photos.

Eventually, the perpetrator behind the Discord server mentioned in the new lawsuit was arrested. He was not named in the lawsuit but is the subject of a separate police investigation.

That investigation discovered he had hundreds of AI-generated and altered sexual abuse images of minors, which were traded on the messaging platform Telegram and on the file-sharing platform Mega, according to the lawsuit.

