Microsoft has introduced the Azure Cobalt 100 processor, now available in a preview program for Azure Virtual Machines (VMs). The new chip targets general-purpose cloud computing workloads and promises improved performance over Microsoft's previous Arm-based VMs.
Performance Enhancements and Technical Specifications
The Azure Cobalt 100 VMs are built on Arm's Neoverse CSS (Compute Subsystems) platform, codenamed Genesis, with 128 Armv9 Neoverse N2 cores and 12 channels of DDR5 memory. Compared with the previous generation of Azure Arm-based VMs, they provide up to 1.4 times the CPU performance, up to 1.5 times the performance for Java-based workloads, and up to double the performance for web servers, .NET applications, and in-memory cache applications. They also support up to four times the local storage input/output operations per second (IOPS) with NVMe Direct and up to 1.5 times the network bandwidth.
First announced in November 2023, the Azure Cobalt 100 chips were initially used to power internal Microsoft products such as Azure SQL and Microsoft Teams. Scott Guthrie, EVP of Microsoft's Cloud and AI group, said the chips deliver 40 percent better performance than other Arm-based chips on the market. Early adopters of the Azure Cobalt 100 include Snowflake and Adobe.
During the preview period, the Azure Cobalt 100 VMs are free to use, although charges apply for associated services such as disk storage. Microsoft has yet to disclose pricing for the VMs once they become generally available. The VMs can be accessed in select Azure regions, though Microsoft has not specified which ones.
Storage and Operating System Compatibility
The Azure Cobalt 100 VMs support a range of storage options, including Standard SSD, Standard HDD, Premium SSD, and Ultra Disk storage. Users can run Insider Preview builds of Windows 11 Pro and Enterprise on these VMs, as well as several Linux distributions, including Canonical Ubuntu, CentOS, Debian, and Red Hat Enterprise Linux.
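For readers who want to try the preview, provisioning a Cobalt 100 VM follows the standard Azure CLI workflow; only the VM size and an Arm64-compatible image differ from an x64 deployment. The sketch below is illustrative, not from Microsoft's documentation: the size name `Standard_D4ps_v6`, the region, and the image URN are assumptions that may not match what the preview program enables for a given subscription, so verify them with `az vm list-sizes` and `az vm image list` before running.

```shell
# Minimal sketch: create an Ubuntu VM on an assumed Cobalt 100 (Arm64)
# preview size. The size name, region, and image URN below are assumptions;
# check which sizes and Arm64 images your subscription can actually use.
az group create --name cobalt-demo-rg --location eastus

az vm create \
  --resource-group cobalt-demo-rg \
  --name cobalt-demo-vm \
  --image Canonical:0001-com-ubuntu-server-jammy:22_04-lts-arm64:latest \
  --size Standard_D4ps_v6 \
  --admin-username azureuser \
  --generate-ssh-keys
```

Because the preview sizes are Arm64, the key detail is pairing the size with an Arm64 image; a default x64 image alias would fail validation against an Arm-based VM size.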
Microsoft has also introduced the Azure Maia 100 chip, designed specifically for AI-based services, though it has not said when the chip will be available to general customers. In addition, Microsoft plans to offer AMD's Instinct MI300X accelerators through the Azure cloud. Announced in December 2023, the MI300X offers 1.5 times more memory capacity and 1.7 times more peak theoretical memory bandwidth than its predecessor. According to AMD, the MI300X outpaces Nvidia's H100, delivering 1.3 petaflops of FP16 and 2.6 petaflops of FP8 performance. Guthrie described the MI300X as the “most cost-effective GPU out there right now for Azure OpenAI.”