After the AI boom: what might we be left with?

Original link: https://blog.robbowley.net/2025/10/12/after-the-ai-boom-what-might-we-be-left-with/

Some compare the current AI boom to the dotcom bubble, arguing that overbuilding isn't necessarily a bad thing, but the parallels only go so far. The dotcom era created long-lived, broadly reusable *open* infrastructure (fibre networks running TCP/IP). Today's AI investment, by contrast, flows mainly into *proprietary* systems: expensive, quickly obsolete GPUs and specialised data centres built for a handful of major vendors (Nvidia, Google, Amazon) and optimised for generative AI, lacking the general-purpose nature of the earlier infrastructure. If the bubble bursts, the risk is being left with short-lived hardware that can't be repurposed. Still, surplus compute *could* drive down the cost of computation and enable experimentation beyond AI, and second-hand markets and infrastructure upgrades (power, networking) would also be beneficial. Crucially, the internet's lasting impact came from *open standards*. AI's closed ecosystem, controlled by a few companies, risks locking the gains up in private hands. Real long-term value requires openness and interoperability, to turn today's infrastructure into a platform broadly available for future innovation.

After the AI boom: Hacker News discussion summary

A Hacker News discussion explored the potential consequences of the current AI wave. A central question is the lifespan and economic viability of the specialised hardware driving AI development (GPUs); some argue that rapid obsolescence and high costs could replay the dotcom bubble. Unlike the dotcom era, which left behind durable infrastructure such as fibre cables, AI's core assets may prove more ephemeral.

While acknowledging AI's potential, many commenters questioned the sustainability of current investment levels and the long-term profitability of large language models (LLMs). Others argued that efficient models will be able to run on consumer devices, and that even after a potential "bubble burst", innovation driven by hobbyists and small businesses will continue.

The debate centres on whether AI represents a fundamental shift (like the internet) or another over-hyped technology. Some foresee a post-scarcity future, while others expect a return to more practical, user-centred development, away from the current "add AI to everything" trend. A key question remains: if the massive injections of capital dry up, can the economics support AI's continued development?

Original article

Some argue that even if the current AI boom leads to an overbuild, it might not be a bad thing – just as the dotcom bubble left behind the internet infrastructure that powered later decades of growth.

It’s a tempting comparison, but the parallels only go so far.

The dotcom era’s overbuild created durable, open infrastructure – fibre networks and interconnects built on open standards like TCP/IP and HTTP. Those systems had multi-decade lifespans and could be reused for whatever came next. Much of the fibre laid in the 1990s still carries traffic today, upgraded simply by swapping out the electronics at each end. That overinvestment became the backbone of broadband, cloud computing, and the modern web.

Most of today’s AI investment, by contrast, is flowing into proprietary, vertically integrated systems rather than open, general-purpose infrastructure. The bulk of the money is being spent on extremely expensive GPUs with a one-to-three-year useful life: they become obsolete quickly and wear out under constant, high-intensity use. These chips aren’t general-purpose compute engines; they’re purpose-built for training and running generative AI models, tuned to the specific architectures and software stacks of a few major vendors such as Nvidia, Google, and Amazon.

These chips live inside purpose-built AI data centres – engineered for extreme power density, advanced cooling, and specialised networking. Unlike the general-purpose facilities of the early cloud era, these sites are tightly coupled to the hardware and software of whoever built them. Together, they form a closed ecosystem optimised for scale but hard to repurpose.

That’s why, if the AI bubble bursts, we could just be left with a pile of short-lived, highly specialised silicon and silent cathedrals of compute – monuments from a bygone era.

The possible upside

Still, there’s a more positive scenario.

If investment outruns demand, surplus capacity could push prices down, just as the post-dotcom bandwidth glut did in the early 2000s. Cheap access to this kind of compute might open the door for new experimentation – not just in generative AI, but in other high-compute domains such as simulation, scientific research, and data-intensive analytics. Even if the hardware is optimised for GenAI, falling prices could still make large-scale computation more accessible overall. A second-hand market in AI hardware could emerge, spreading access to powerful compute much more widely.

The supporting infrastructure – power grid upgrades, networking, and edge facilities – will hopefully remain useful regardless. And even if some systems are stranded, the talent, tooling, and operational experience built during the boom will persist, as it did after the dotcom crash.

Without openness, the benefits stay locked up

The internet’s long-term value came not just from cheap capacity, but from open standards and universal access. Protocols like TCP/IP and HTTP meant anyone could build on the same foundations, without permission or platform lock-in. That openness turned surplus infrastructure into a shared public platform, unlocking decades of innovation far beyond what the original investors imagined.

The AI ecosystem is the opposite: powerful but closed. Its compute, models, and APIs are owned and controlled by a handful of vendors, each defining their own stack and terms of access. Even if hardware becomes cheap, it won’t automatically become open. Without shared standards or interoperability, any overbuild risks remaining a private surplus rather than a public good.

So the AI boom may not leave behind another decades-long backbone like the internet’s fibre networks. But it could still seed innovation if the industry finds ways to open up what it’s building – turning today’s private infrastructure into tomorrow’s shared platform.
