ChatGPT Accused Of Aiding Florida State Mass Shooter

Original link: https://www.zerohedge.com/ai/openais-chatgpt-accused-aiding-florida-state-mass-shooter

The victims' families are suing OpenAI, the creator of ChatGPT, alleging that the AI chatbot assisted the perpetrator. In the April 2025 Florida State University shooting, the suspect allegedly had more than 270 conversations with ChatGPT, reportedly seeking information about campus shootings and the student union's peak hours, details that matched the timing of the attack. A similar case previously occurred in Canada, where a shooter used ChatGPT before his attack; OpenAI chose to ban the account only *after* the incident rather than alerting authorities. Concerns about ChatGPT's safety protocols are mounting, with past cases showing the bot has provided harmful information, such as suicide guidance. While OpenAI claims to cooperate with law enforcement and to be improving its safety measures, critics point to the company's reactive approach, selective enforcement, and documented biases in its AI's programming. The lawsuits seek accountability and raise questions about prioritizing harm prevention over ideological control in AI development.


Authored by Steve Watson via modernity.news,

Big Tech’s leading AI faces growing accusations of enabling violence rather than preventing it.

Attorneys representing the family of Robert Morales, killed in the April 17, 2025, Florida State University shooting, announced plans to sue OpenAI and ChatGPT. The law firm Brooks, LeBoeuf, Foster, Gwartney and Hobbs stated the suspected gunman, Phoenix Ikner, was in “constant communication” with the chatbot leading up to the attack.

Ikner opened fire outside the FSU student union, killing Morales, a 57-year-old Aramark worker and father, and Tiru Chabba, 45, a vendor from South Carolina. Six others were wounded. Court records list more than 270 images of ChatGPT conversations as exhibits.

The firm declared: “We have reason to believe that ChatGPT may have advised the shooter how to commit these heinous crimes. We will therefore file suit against ChatGPT, and its ownership structure, very soon, and will seek to hold them accountable for the untimely and senseless death of our client, Mr. Morales.”

Recent coverage also notes newly released chat logs where Ikner reportedly asked ChatGPT about school shootings and the busiest times on campus.

One post referenced details such as the chatbot informing him the Student Union was busiest between 11:30am and 1:30pm, with the shooting occurring at 11:57am.

The New York Post reported the claims in detail.

OpenAI responded that it had identified an account believed to be associated with the suspect after the shooting, proactively shared information with law enforcement, and cooperated fully. The company says it builds ChatGPT to respond safely and continues to improve its safeguards.

Yet the body count linked to such interactions keeps rising, while the company’s selective enforcement and post-incident cooperation fail to reassure victims’ families preparing legal action.

This incident follows another high-profile case. In February 2026, Canadian trans shooter Jesse Van Rootselaar carried out a deadly attack at Tumbler Ridge Secondary School.

OpenAI employees were alarmed by his disturbing ChatGPT messages and discussed alerting authorities, but the company chose not to notify police beforehand, instead banning the account.

They only contacted law enforcement after the shooting. A family has already sued OpenAI over that incident as well.

These developments echo earlier warnings. ChatGPT once provided detailed suicide instructions and drug-and-alcohol guidance when prompted as a fake 13-year-old.

Studies have found that as many as one in four teens now rely on AI therapy bots for mental health support, raising questions about vulnerable users interacting with systems that appear inconsistent on harm prevention.

ChatGPT’s selective ideological programming has also been repeatedly called into question. For example, it once refused a hypothetical request to quietly utter a racial slur even to save a billion white people.

Americans expect technology that upholds safety and individual responsibility, not systems that lecture on ethics while allegedly guiding violence. The mounting lawsuits and documented failures demand accountability from OpenAI and scrutiny of the priorities embedded in its models. Until Big Tech prioritizes preventing real-world harm over narrative control, these tragedies risk becoming a grim pattern rather than isolated failures.

