
Microsoft Azure First to Get NVIDIA’s GB200 Blackwell Chips

Microsoft gains access to NVIDIA's new GB200 chips, becoming the first cloud provider to deploy the powerful hardware for improved performance and efficiency.


Microsoft has announced a major coup in the AI race, becoming the first to install NVIDIA's much-anticipated GB200 Blackwell chips in its AI infrastructure. The setup will allow the cloud giant to cut both energy use and costs, thanks to the improvements NVIDIA made in its latest hardware.

Announced earlier this year, NVIDIA’s Blackwell platform promises a massive improvement over previous chips, claiming it can reduce expenses and energy demands by up to 25 times. While tech companies like Google, AWS, and Meta had shown interest, Microsoft Azure ended up being the first to roll out the new technology.

A New Era for AI Hardware

First presented at NVIDIA's GTC conference, the Blackwell architecture is set to significantly outperform its H100-class predecessor, offering 2.5 to 5 times the performance and double the memory capacity and bandwidth. While a mid-2024 release was initially planned, the schedule has since shifted to Q4 2024.

Manufacturing issues that necessitated a mask change led to production delays, as disclosed during NVIDIA's Q2 earnings call. CEO Jensen Huang acknowledged these setbacks and their effects on stakeholders, asserting that NVIDIA has ramped up production to satisfy strong demand.

Despite the setback, Microsoft didn't waste any time once the chips arrived. Its AI servers are now configured to take full advantage of the GB200's capabilities, with enhancements such as InfiniBand networking and liquid cooling already in place.

Azure's team shared the development on the social media platform X, explaining that these optimizations help power advanced AI models more efficiently. The use of closed-loop liquid cooling is seen as a crucial component in keeping the systems running smoothly as AI workloads grow increasingly intense.

NVIDIA’s Partnership with Microsoft

Satya Nadella, Microsoft's CEO, commented on the long-standing partnership with NVIDIA, stating that the collaboration continues to push AI forward. The relationship places Microsoft in a strong position, particularly because it relies solely on NVIDIA's hardware for training AI models, while other cloud providers, such as Google, have developed their own chips.

Microsoft's full commitment to NVIDIA sets it apart from competitors like Google and AWS, which have both developed proprietary hardware for AI training. Google, for instance, uses its Tensor Processing Units (TPUs) for most of its workloads, while AWS has created its own Trainium and Inferentia chips.

With Microsoft now leading the charge on the GB200 rollout, the next few months could see NVIDIA’s dominance in the AI hardware market grow even further, especially as companies look to adopt the platform for more efficient AI model training. Further details about this hardware integration are expected at Microsoft’s upcoming Ignite conference, where the company will likely reveal more about how it’s using NVIDIA’s technology to push its AI ambitions.

Last Updated on November 7, 2024 2:35 pm CET

Source: Microsoft
Luke Jones
Luke has been writing about Microsoft and the wider tech industry for over 10 years. With a degree in creative and professional writing, Luke looks for the interesting spin when covering AI, Windows, Xbox, and more.
