Microsoft acquires twice as many Nvidia AI chips as technology rivals

Microsoft bought twice as many of Nvidia's flagship chips as any of its biggest rivals in the US and China this year, as OpenAI's biggest investor accelerated its investment in artificial intelligence infrastructure.

Analysts at technology consulting firm Omdia estimate that Microsoft has purchased 485,000 of Nvidia's "Hopper" chips this year. This put Microsoft far ahead of Nvidia's second-largest US customer, Meta, which bought 224,000 Hopper chips, as well as its cloud computing rivals Amazon and Google.

With demand outstripping supply of Nvidia's most advanced graphics processors for much of the past two years, Microsoft's chip stockpile has given it a head start in the race to build the next generation of AI systems.

This year, major technology firms have spent tens of billions of dollars on data centers running Nvidia's latest chips, which have become the hottest commodity in Silicon Valley since ChatGPT's debut two years ago sparked an unprecedented surge in AI investment.

Microsoft's Azure cloud infrastructure has been used to train OpenAI's latest o1 model as the pair compete against a resurgent Google, startups such as Anthropic and Elon Musk's xAI, and rivals in China for supremacy in the next generation of computing.

Omdia estimates that ByteDance and Tencent have each ordered about 230,000 chips from Nvidia this year, including the H20 model, a less powerful version of Hopper modified to comply with U.S. export controls for Chinese customers.

Amazon and Google, which along with Meta are increasing the use of their own custom AI chips as an alternative to Nvidia's, bought 196,000 and 169,000 Hopper chips, respectively, the analysts said.

Omdia bases its estimates on firms' publicly disclosed capital expenditures, server shipments and supply chain information.

Nvidia, which is now beginning to roll out Hopper's successor, Blackwell, has seen its market value rise to more than $3 trillion this year as major tech firms rush to assemble ever-larger clusters of its GPUs.

However, the stock's extraordinary rise has faded in recent months amid concerns about slower growth, competition from Big Tech firms' own AI chips and potential disruption to its business in China under Donald Trump's incoming US administration.

ByteDance and Tencent have emerged as two of Nvidia's biggest customers this year, even as the U.S. government limits the capabilities of American AI chips that can be sold in China.

Microsoft, which has invested $13 billion in OpenAI, has been the most aggressive of the big U.S. tech firms in building out data center infrastructure, both to power its own AI services, such as its Copilot assistant, and to rent out to customers through its Azure division.

Microsoft's Nvidia chip orders are more than three times the number of same-generation AI processors the company bought in 2023, when Nvidia was struggling to expand Hopper production after the breakthrough success of ChatGPT.

"Good data center infrastructure, they're very complex, capital intensive projects," Alistair Speirs, senior director of Azure Global Infrastructure at Microsoft, told the Financial Times. "They require planning over several years. So it's essential to predict where our growth will be with some buffer."

According to Omdia, tech firms around the world will spend an estimated $229 billion on servers in 2024, led by Microsoft at $31 billion and Amazon at $26 billion. The top 10 buyers of data center infrastructure, which now include relative newcomers xAI and CoreWeave, account for 60 percent of global investment in computing power.

Vlad Galabov, director of cloud and data center research at Omdia, said that in 2024, around 43 percent of server spending went to Nvidia.

"Nvidia GPUs accounted for a vast share of server investment," he said. "We are close to the peak."

[Bar chart: server capital spending in 2024 ($ billion), showing Big Tech's biggest spenders in the AI data center boom]

While Nvidia still dominates the AI chip market, its Silicon Valley rival AMD is gaining ground. According to Omdia, Meta bought 173,000 of AMD's MI300 chips this year, while Microsoft bought 96,000.

Big Tech firms have also stepped up the use of their own AI chips this year to reduce their dependence on Nvidia. Google, which has been developing its "tensor processing units," or TPUs, for a decade, and Meta, which introduced the first generation of its Meta Training and Inference Accelerator chip last year, each deployed about 1.5 million of their own chips.

Amazon, which is investing heavily in its Trainium and Inferentia chips for cloud computing customers, has deployed about 1.3 million of those chips this year. Amazon said this month that it plans to build a new cluster with hundreds of thousands of its latest Trainium chips for Anthropic, an OpenAI rival in which Amazon has invested $8 billion, to train the next generation of its AI models.

Microsoft, however, is much earlier in its effort to build an AI accelerator that can compete with Nvidia's, with only about 200,000 of its Maia chips installed this year.

Speirs said that, even while using Nvidia's chips, Microsoft still needed to invest significantly in its own technology to offer a "unique" service to customers.

"In our experience, building the AI infrastructure is not just about having the best chip, but also about having the right storage components, the right infrastructure, the right software layer, the right host management layer, error correction and all the other components to build that system," he said.
