
Oracle Cloud Helps Microsoft With Increasing OpenAI GPU Workloads

The deal between the companies suggests that Microsoft Azure is struggling to meet OpenAI's surging demand for computing capacity.


Oracle has announced a partnership under which Microsoft will use Oracle Cloud Infrastructure (OCI) to run OpenAI's workloads. The collaboration extends Microsoft's Azure AI capabilities with Oracle's cloud resources, meeting OpenAI's growing demand for GPU capacity. According to OpenAI CEO Sam Altman, integrating with OCI is essential for scaling the company's AI models, a demand Azure currently struggles to meet on its own.

OpenAI to use Oracle Cloud Infrastructure

"The race to build the world's greatest large language model is on, and it is fueling unlimited demand for Oracle's Gen2 AI infrastructure," said Larry Ellison, Oracle Chairman and CTO.

OpenAI, which serves over 100 million monthly users with its generative AI products, will benefit from OCI's AI infrastructure. The partnership places OpenAI among a range of AI innovators across sectors that use OCI for their AI needs. Companies including Adept, Modal, MosaicML, NVIDIA, Reka, Suno, Together AI, Twelve Labs, and xAI already use OCI Supercluster to train and run inference on AI models.

The agreement among Microsoft, Oracle, and OpenAI indicates that Microsoft Azure, while still ramping up its own infrastructure, is struggling to accommodate OpenAI's growing computing demands without compromising service quality for its other cloud customers.

Microsoft and Oracle already partnered in November 2023 to enhance Bing's search and Bing Chat capabilities with Oracle's cloud infrastructure. The ties between the two long-standing cloud rivals date back to 2019, when Microsoft made the surprising decision to collaborate with Oracle Cloud to interconnect Azure services with Oracle's solutions.

OCI’s advanced AI capabilities cater to both startups and large enterprises, facilitating model building and training across Oracle’s distributed cloud. For LLM training, OCI Supercluster can scale up to 64,000 NVIDIA Blackwell GPUs or GB200 Grace Blackwell Superchips, interconnected by low-latency RDMA networking and HPC storage options. OCI’s Compute virtual machines and NVIDIA GPU bare metal instances enable applications for generative AI, computer vision, natural language processing, and more.

Oracle's Cloud Strategy

In Oracle’s Q4 2024 earnings call, Larry Ellison revealed plans for a new data center, half of which will be allocated to Microsoft. The facility will include advanced Nvidia chips, interconnects, and liquid cooling systems, mainly designed for training large AI models rather than inferencing.

On the same day, Oracle announced a partnership with Google to bring its Cross-Cloud Interconnect service to 11 OCI regions. Oracle Database@Google Cloud is also slated for a 2024 launch, further expanding Oracle's cloud offerings.

Financial Performance and Market Trends

Oracle reported Q4 2024 revenues at $14.3 billion, a three percent increase from the previous year. The cloud services and license support segment alone contributed $10.2 billion, showing a nine percent rise. CEO Safra Catz highlighted the quarter saw the largest sales contracts in Oracle’s history, largely driven by the demand for training large language models and record sales for OCI, Autonomous, Fusion, and NetSuite.

Catz observed a notable customer shift toward multi-year cloud commitments instead of traditional one-time licenses. Oracle's remaining performance obligations (RPO) now total $98 billion. Annual revenue for fiscal 2024 was $53 billion, up six percent from the previous year. Catz also projected that infrastructure services would grow faster than the 50 percent seen this year, with Q1 expected to show six to eight percent year-over-year revenue growth.

Ellison underscored Oracle’s growth prospects in the earnings call, highlighting the need for increasingly larger data centers to keep up with the AI competition. These data centers are expected to support continuously updating AI models, essential for remaining competitive in the AI space.

Source: Oracle
Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master's degree in International Economics and is the founder and managing editor of Winbuzzer.com.
