(comments)

Original link: https://news.ycombinator.com/item?id=43384035

A Hacker News thread discusses an ft.com article about competition in the AI chip market, specifically how advances in "inference" are challenging Nvidia's dominance. User jsemrau highlights key quotes from Nvidia CEO Jensen Huang stressing the explosive growth in demand for inference compute (already 100x what it was when large language models first emerged). That surge is driven by the rapidly falling cost of serving LLM responses, thanks to more powerful chips, more efficient AI systems, and intense competition among AI developers such as Google, OpenAI and Anthropic. User bookofjoe provides a link to an archived version of the ft.com article.


Original text
How 'inference' is driving competition to Nvidia's AI chip dominance (ft.com)
5 points by bookofjoe 1 hour ago | 2 comments

jsemrau:

I think the key takeaway quotes are these:

“The amount of inference compute needed is already 100x more” than it was when large language models started out, Huang said on last month’s earnings call. “And that’s just the beginning.”

The cost of serving up responses from LLMs has fallen rapidly over the past two years, driven by a combination of more powerful chips, more efficient AI systems and intense competition between AI developers such as Google, OpenAI and Anthropic.
