In Latest Shot At Nvidia, Google Unveils Two Chips For The Agentic Era

Original link: https://www.zerohedge.com/ai/google-unveils-two-chips-agentic-era

Google has unveiled its eighth-generation Tensor Processing Units (TPUs), splitting the lineup into two specialized chips: the TPU 8t for *training* AI models and the TPU 8i for *inference* (running deployed AI). The move takes direct aim at Nvidia's dominance of the AI chip market. According to Google, the chips are the result of a decade of development and deliver significant gains in performance and efficiency, improving performance per watt by 124% and 117% respectively, which matters given rising data center power costs. A key feature is increased on-processor data storage, which reduces latency for complex AI tasks. Google expects demand for both training and inference to grow, particularly with the rise of AI agents, which it says justifies the specialization. While Google will continue to offer Nvidia-based systems, its goal is to strengthen its AI infrastructure by tailoring chips to specific workloads, echoing efforts by other tech giants such as Microsoft and Amazon. The news lifted Google's stock modestly.

Related Articles

Original Article

In a blog post titled "Our Eighth Generation TPUs: Two Chips for the Agentic Era," Google Senior Vice President and Chief Technologist for AI and Infrastructure Amin Vahdat unveiled the latest generation of the company's in-house AI chips, splitting the lineup into two versions.

The TPU 8t is designed for training AI models, while the TPU 8i is built for inference, or running AI services once they are developed and deployed, taking direct aim at Nvidia.

"The culmination of a decade of development, TPU 8t and TPU 8i are custom-engineered to power the next generation of supercomputing with efficiency and scale," Vahdat wrote in the blog post.

He continued:

Hardware development cycles are much longer than software. With each generation of TPUs, we need to consider what technologies and demands will exist by the time they are brought to market.

Several years ago, we anticipated rising demand for inference from customers as frontier AI models are deployed in production and at scale.

And with the rise of AI agents, we determined the community would benefit from chips individually specialized to the needs of training and serving.

Vahdat pointed out that the new chips store more data directly on the processor, helping reduce delays and improve responsiveness, especially for more complex AI models that reason through tasks in steps. He also emphasized efficiency, saying the TPU 8t delivers 124% more performance per watt than the prior generation, while the TPU 8i improves that metric by 117%. This matters because surging power bills have swayed local politicians in many states and become a major roadblock in data center buildouts.
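For context, "124% more performance per watt" means the new chip delivers 2.24x the prior generation's efficiency, not 1.24x. A minimal sketch of that arithmetic, using a hypothetical normalized baseline of 1.0 (the percentages are from Google's claims; absolute figures were not disclosed):

```python
def perf_per_watt_gain(baseline, improvement_pct):
    """Return the new performance-per-watt figure given a
    percentage improvement over a baseline."""
    return baseline * (1 + improvement_pct / 100)

# Hypothetical normalized baseline of 1.0 for the prior TPU generation.
tpu_8t = perf_per_watt_gain(1.0, 124)  # 2.24x the prior generation
tpu_8i = perf_per_watt_gain(1.0, 117)  # 2.17x the prior generation
```

In other words, a workload that previously consumed a given power budget could, by these claims, run at more than double the throughput within the same budget.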

The announcement highlights Google's move to become a leader in producing in-house AI chips despite Nvidia's continued dominance. At the same time, Google said it will continue offering Nvidia-based systems to customers.

The broader message is that Google is trying to strengthen its AI infrastructure advantage by tailoring its chips more precisely for the distinct demands of training and inference.

Also, across the industry, Microsoft, Meta, Amazon, Apple, and others are pursuing custom AI chips for specialized workloads.

Shares of Google were modestly higher in premarket trading after the blog post, rising about 1.5%. Even so, the stock has traded sideways this year.

TPUs have powered Google's Gemini models for years, and the latest announcement signals a push to deliver greater scale, efficiency, and responsiveness across both training and serving workloads.
