Nvidia CEO Jensen Huang announces new AI chips: ‘We need bigger GPUs’

原始链接: https://www.cnbc.com/2024/03/18/nvidia-announces-gb200-blackwell-ai-chip-launching-later-this-year.html

This article covers Nvidia's announcement of a new AI chip family called Blackwell, launching later this year. Nvidia's current products, including the Hopper series, are in high demand, with supply falling short. The new chips promise significantly better AI performance and have attracted tech giants Microsoft, Meta, Oracle, and Amazon, which plan to resell access to them through their respective cloud services. Nvidia also introduced its NIM software, which makes it easier to deploy models across a range of Nvidia GPUs. This adds value for customers seeking on-premises solutions rather than buying third-party AI services.

The conference in question is Nvidia's GPU Technology Conference (GTC), aimed primarily at developers, researchers, and scientists who work with Nvidia technology. Jim Cramer, best known as the host of the financial television program "Mad Money," was also in attendance; his presence likely reflects an interest in potential Nvidia-related investment opportunities, or simply in the latest advances in computing. The tone and focus of his commentary, however, may differ considerably from the talks and discussions at the conference itself. While some critics argue that his humor and presentation style may not appeal to every audience, including members of Gen Z, the conference remains significant for those working in fields that depend on high-performance computing.

Original Article

Nvidia CEO Jensen Huang delivers a keynote address during the Nvidia GTC Artificial Intelligence Conference at SAP Center on March 18, 2024 in San Jose, California. 

Justin Sullivan | Getty Images

Nvidia on Monday announced a new generation of artificial intelligence chips and software for running artificial intelligence models. The announcement, made during Nvidia's developer's conference in San Jose, comes as the chipmaker seeks to solidify its position as the go-to supplier for AI companies.

Nvidia's share price is up five-fold and total sales have more than tripled since OpenAI's ChatGPT kicked off the AI boom in late 2022. Nvidia's high-end server GPUs are essential for training and deploying large AI models. Companies like Microsoft and Meta have spent billions of dollars buying the chips.

The new generation of AI graphics processors is named Blackwell. The first Blackwell chip is called the GB200 and will ship later this year. Nvidia is enticing its customers with more powerful chips to spur new orders. Companies and software makers, for example, are still scrambling to get their hands on the current generation of "Hopper" H100s and similar chips.

“Hopper is fantastic, but we need bigger GPUs,” Nvidia CEO Jensen Huang said on Monday at the company's developer conference in California.

Nvidia shares fell more than 1% in extended trading on Monday.

The company also introduced revenue-generating software called NIM that will make it easier to deploy AI, giving customers another reason to stick with Nvidia chips over a rising field of competitors.

Nvidia executives say that the company is becoming less of a mercenary chip provider and more of a platform provider, like Microsoft or Apple, on which other companies can build software.

"Blackwell's not a chip, it's the name of a platform," Huang said.

"The sellable commercial product was the GPU and the software was all to help people use the GPU in different ways," said Nvidia enterprise VP Manuvir Das in an interview. "Of course, we still do that. But what's really changed is, we really have a commercial software business now."

Das said Nvidia's new software will make it easier to run programs on any of Nvidia's GPUs, even older ones that might be better suited for deploying but not building AI.

"If you're a developer, you've got an interesting model you want people to adopt, if you put it in a NIM, we'll make sure that it's runnable on all our GPUs, so you reach a lot of people," Das said.

Meet Blackwell, the successor to Hopper

Nvidia CEO Jensen Huang compares the size of the new "Blackwell" chip versus the current "Hopper" H100 chip at the company's developer conference, in San Jose, California.

Nvidia

Amazon, Google, Microsoft, and Oracle will sell access to the GB200 through cloud services. The GB200 pairs two B200 Blackwell GPUs with one Arm-based Grace CPU. Nvidia said Amazon Web Services would build a server cluster with 20,000 GB200 chips.

Nvidia said that the system can deploy a 27-trillion-parameter model. That's much larger than even the biggest models, such as GPT-4, which reportedly has 1.7 trillion parameters. Many artificial intelligence researchers believe bigger models with more parameters and data could unlock new capabilities.
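To put the 27-trillion-parameter figure in perspective, here is a rough, illustrative back-of-envelope calculation of the weight memory alone that a model of that size would require at common numeric precisions. This is a sketch, not anything Nvidia has published; it ignores activations, optimizer state, and serving overhead such as KV caches.

```python
# Back-of-envelope weight-memory footprint for a 27-trillion-parameter
# model at common numeric precisions. Weights only; real deployments
# also need memory for activations and KV caches.

PARAMS = 27e12  # 27 trillion parameters, the figure cited by Nvidia

BYTES_PER_PARAM = {"fp32": 4, "fp16/bf16": 2, "fp8": 1}

for precision, nbytes in BYTES_PER_PARAM.items():
    terabytes = PARAMS * nbytes / 1e12
    print(f"{precision}: {terabytes:.0f} TB of weight memory")
```

Even at 8-bit precision, the weights alone run to tens of terabytes, which is why a model of this scale only becomes deployable across a large multi-GPU cluster rather than on any single accelerator.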

Nvidia didn't provide a cost for the new GB200 or the systems it's used in. Nvidia's Hopper-based H100 costs between $25,000 and $40,000 per chip, with whole systems that cost as much as $200,000, according to analyst estimates.

Nvidia will also sell B200 graphics processors as part of a complete system that takes up an entire server rack.

Nvidia inference microservice
