Beyond The Data Center: Goldman's Silicon Valley Field Trip Finds AI Moving From Chips To Workflows

Original link: https://www.zerohedge.com/ai/beyond-data-center-goldmans-silicon-valley-field-trip-finds-ai-moving-chips-workflows

Goldman analysts recently returned from Silicon Valley, where they assessed real-world adoption of generative AI. Their findings suggest a shift is underway: from massive infrastructure investment (data centers and GPUs) toward actual AI *applications* and vertical software. Importantly, while capex continues to grow as usage expands, the cost of using large language models (LLMs) is falling, aided by academic research. Although declining software development costs are intensifying competition, companies building applied AI can establish moats through user bases, proprietary data, and workflow integration. The analysts expect generative AI adoption to accelerate meaningfully starting in 2026, with positive implications for companies such as S&P Global and McGraw-Hill (rated "Buy"), which are actively transforming with AI. The key question remains whether adoption rates can justify today's record investment levels, or whether AI stock valuations face correction risk.


Original Article

Goldman analysts led by George Tong returned to Silicon Valley for their second AI field trip, meeting with AI startups, public companies, VCs, and professors from Stanford, UCSF, and UC Berkeley to assess whether corporate America is truly embracing generative AI. The visit comes as record AI capex fuels record hyperscale data center buildouts nationwide, while investors search for clues on whether the adoption phase will materialize: a shift beyond infrastructure into the application layer.

"Insights indicate AI labs are expanding from the infrastructure layer to the application layer and LLM costs are sharply declining though capex may continue to rise as Gen AI usage and adoption grows," Tong wrote in a note to clients on Friday. 

He continued: "Academic research on LLM technologies could further bring down costs. While software development costs are falling and increasing competitive and pricing risks, moats in application AI and SaaS companies include broader user distribution, engagement with power users to drive reinforcement learning from feedback loops, integration into workflows and leveraging proprietary data."

Tong's discussions with Silicon Valley business and academic leaders point to an acceleration in generative AI adoption starting in 2026.

Here's a summary of the findings:

  • Shift from infrastructure to applications: AI innovation is moving beyond chips and cloud (Nvidia, GPUs, etc.) toward actual end-user applications and vertical software solutions.

  • LLM costs are sliding: Training and using large language models is getting cheaper, though capex will still rise as usage expands. Academia is helping reduce costs: University research may accelerate efficiency gains in AI models.

  • Software development deflation: Building with AI is cheaper and faster, but that means higher competition and pricing pressure for software companies.

Tong said the conversations in Silicon Valley point to "positive implications" for S&P Global, Moody's, Iron Mountain, Verisk Analytics, and Thomson Reuters. He noted that his team has initiated coverage on McGraw-Hill with a "Buy" rating and a $27 12-month price target based on a "digital transformation" in the education space. 

The analyst provided clients with a "chart of the week" that showed how McGraw-Hill is leveraging AI to improve product efficacy and drive growth. 

Is the AI adoption rate enough to justify this record capex spending by hyperscalers? 

Let's hope so, or AI stocks face a hefty correction. 

More in the full Goldman note available to pro subs.

