Nvidia, led by CEO Jensen Huang, has announced a strategic shift in its AI chip development cycle, moving to an annual release schedule. The new approach was disclosed during the company's fiscal Q1 2025 earnings call, marking a departure from its previous biennial release pattern.
Transition to Annual Releases
Following the Blackwell architecture, Nvidia will introduce a new chip architecture every year, breaking from the two-year cadence of Ampere in 2020, Hopper in 2022, and Blackwell in 2024. Huang's announcement aligns with analyst Ming-Chi Kuo's earlier report that the next architecture, Rubin, would arrive in 2025, suggesting an R100 AI GPU could be available next year.
Huang emphasized that the new AI GPUs will remain electrically and mechanically backward compatible and will run the same software as their predecessors. That compatibility lets customers move from H100 to H200 to B100 GPUs within their existing data centers without significant disruption. Huang also highlighted the financial incentives driving demand for Nvidia's AI GPUs, noting that companies are eager to save money and generate revenue by getting upgraded infrastructure online quickly.
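To illustrate the software-compatibility point, here is a minimal sketch assuming a PyTorch/CUDA stack (a choice not specified in Nvidia's announcement): because the application is written against the framework rather than a particular chip, the same script runs unchanged whether the node holds H100-, H200-, or B100-class GPUs.

```python
# Minimal sketch (assumption: a PyTorch/CUDA stack; Nvidia's announcement
# does not name a specific framework). The workload targets whatever GPU
# is present rather than a particular model, so swapping hardware
# generations requires no code changes here.
import torch


def describe_and_run() -> None:
    if torch.cuda.is_available():
        device = torch.device("cuda:0")
        name = torch.cuda.get_device_name(device)  # e.g. "NVIDIA H100 80GB HBM3"
        major, minor = torch.cuda.get_device_capability(device)
        print(f"Running on {name} (compute capability {major}.{minor})")
    else:
        device = torch.device("cpu")
        print("No CUDA GPU detected; falling back to CPU.")

    # Placeholder workload: nothing below references a specific GPU model.
    a = torch.randn(2048, 2048, device=device)
    b = torch.randn(2048, 2048, device=device)
    c = a @ b
    print("Checksum:", c.sum().item())


if __name__ == "__main__":
    describe_and_run()
```

Which driver, CUDA, and framework versions support a given new architecture is a separate question; the sketch only illustrates that application code itself need not change when the underlying boards are upgraded.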
Expanding Product Line
In addition to AI GPUs, Nvidia plans to accelerate development of its other chips, including CPUs, network interface cards (NICs), and switches, to match the new annual cadence. Huang stated, “We're going to take them all forward at a very fast clip,” signaling a broad acceleration across Nvidia's chip portfolio.
Nvidia CFO Colette Kress said the automotive sector would become the largest enterprise vertical within the data center segment this year, citing Tesla's purchase of 35,000 H100 GPUs for its Full Self-Driving system. Meanwhile, consumer internet companies such as Meta are expected to remain strong growth drivers, with Meta planning to operate more than 350,000 H100 GPUs by year's end.
Financial Performance and Market Demand
Nvidia reported a profit of $14 billion in a single quarter, driven largely by its AI chips. The H100 AI chip is built on the Hopper architecture and the B200 on Blackwell; Nvidia's architectures typically span its data center, gaming, and creator GPU lines. Huang said demand for Nvidia's AI GPUs is expected to outstrip supply for some time, as companies race to bring their infrastructure online. He also argued that being first matters: the first company to reach the next major AI milestone gets to announce groundbreaking AI, while the one that follows can announce only a marginal improvement. Some customers have purchased, or plan to purchase, over 100,000 H100 GPUs.