Copilot is 'for entertainment purposes only', per Microsoft's terms of use

Original link: https://techcrunch.com/2026/04/05/copilot-is-for-entertainment-purposes-only-according-to-microsofts-terms-of-service/

Even the companies building AI tools, including Microsoft (Copilot), OpenAI, and xAI, caution users against blindly trusting their output. Their terms of service explicitly state that these AI systems are prone to errors and should not be relied on for important decisions. Microsoft, which is currently marketing Copilot to enterprise customers, originally included a disclaimer labeling the tool as "for entertainment purposes only" and advising users to use it "at your own risk." The company acknowledges that this wording is outdated and plans to update it. Similarly, OpenAI and xAI warn against treating their AI's responses as absolute truth or factual information. These disclaimers underscore a key point: while AI is advancing rapidly, it remains flawed, and its generated content requires critical evaluation, a view shared by skeptics *and* the AI developers themselves.

A recent TechCrunch article highlighting Microsoft's terms of use, which state that Copilot is "for entertainment purposes only," has sparked discussion on Hacker News. Users are debating Microsoft's current AI strategy, with some arguing the company has fallen behind despite its early investment in OpenAI. One commenter noted that simply testing Copilot directly reveals its limitations, reinforcing the "entertainment only" claim. Others argued that recognizing the possibility of an "AI winter" and scaling back investment could actually be a shrewd competitive move. The discussion took a lighthearted turn, speculating that businesses that *avoid* AI investment, such as a local ice cream shop, might ultimately benefit. The thread reflects growing skepticism about both the immediate usefulness and the long-term viability of current AI technology.

Original article

AI skeptics aren’t the only ones warning users not to unthinkingly trust models’ outputs — that’s what the AI companies say themselves in their terms of service.

Take Microsoft, which is currently focused on getting corporate customers to pay for Copilot. But it’s also been getting dinged on social media over Copilot’s terms of use, which appear to have been last updated on October 24, 2025.

“Copilot is for entertainment purposes only,” the company warned. “It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”

A Microsoft spokesperson told PCMag that the company will be updating what they described as “legacy language.”

“As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update,” the spokesperson said.

Tom’s Hardware noted that Microsoft isn’t the only company using this kind of disclaimer for AI. For example, both OpenAI and xAI caution users that they should not rely on their output as “the truth” (to quote xAI) or as “a sole source of truth or factual information” (OpenAI).
