Cryptopolitan
2024-12-18 09:10:18

Microsoft outbuys tech giants with massive Nvidia AI chip orders

Microsoft has secured its spot at the top of the AI hardware food chain, outspending every other tech giant and buying 485,000 Nvidia Hopper chips this year. That leaves competitors like Meta, Amazon, and Google gasping for air in the AI arms race. Meta came closest but still trailed far behind, with 224,000 chips. Amazon and Google? Not even close, with orders of 196,000 and 169,000 Hopper chips respectively. Nvidia’s GPUs have become Silicon Valley’s gold standard, driving a frenzy of data center investment to support cutting-edge AI models like ChatGPT. Microsoft, with its Azure cloud infrastructure, is going all in, betting big on AI as the future of tech.

Microsoft’s giant leap

Nvidia’s Hopper chips, hailed as the crown jewels of AI processing, have been the hottest tech commodity for two years. They’ve become critical tools for training and running next-gen AI models. Analysts at Omdia estimate that Microsoft’s purchases more than tripled compared to last year, a direct response to skyrocketing demand following ChatGPT’s breakout success. “Good data center infrastructure… takes multi-years of planning,” said Alistair Speirs, senior director of Azure Global Infrastructure at Microsoft. The company is also positioning itself to dominate the cloud market, renting out its hardware to customers through Azure.

Omdia’s analysis shows that Microsoft’s Hopper haul dwarfs the 230,000 chips bought by ByteDance and Tencent combined. However, the chips shipped to China were modified H20 models, downgraded to comply with US export restrictions. Nvidia tweaked these chips to limit their capabilities, but even with reduced power, Chinese companies are snapping them up.

The AI spending spree

This year alone, tech companies have shelled out a jaw-dropping $229 billion on servers, with Microsoft leading the charge. The company’s $31 billion capital expenditure makes it the biggest spender in the global AI infrastructure boom, overshadowing Amazon’s $26 billion.
According to Omdia, Nvidia GPUs accounted for 43% of server spending in 2024, cementing the company’s dominance in AI hardware. Nvidia’s Hopper chips may be the star, but Microsoft isn’t putting all its eggs in one basket. It has begun rolling out its own Maia chips, installing 200,000 units this year. Meanwhile, Amazon has been ramping up production of its Trainium and Inferentia chips, deploying 1.3 million units to fuel its AI ambitions. It’s even building a new AI cluster featuring hundreds of thousands of Trainium chips for Anthropic, an OpenAI competitor it has backed with an $8 billion investment. Meta and Google are also doubling down on their in-house chips, each deploying around 1.5 million units of their custom designs.
