The semiconductor industry is experiencing a surge in demand for AI-related technology, as reported by Counterpoint Research. This trend is expected to continue throughout the year, highlighting the growing importance of AI in driving sector growth.
Major cloud service providers, including AWS, Google, Meta, and Microsoft, are planning to expand their AI infrastructure, with total capital expenditure projected to reach $170 billion in 2024. This surge in investment is driving up demand for AI processors and putting pressure on production capacities, particularly at TSMC, which is struggling to keep up with the demand for its CoWoS technology. CoWoS, or Chip-on-Wafer-on-Substrate, is an advanced semiconductor packaging technology designed to improve the performance and efficiency of high-performance computing (HPC) and AI processors.
The larger size of interposers required for the latest AI and HPC processors from Nvidia and AMD means fewer interposers can be obtained from each 300-mm wafer, straining CoWoS production capacity. Additionally, the number of HBM stacks integrated around GPUs is increasing, adding to the production challenges.
HBM stacks, or High Bandwidth Memory stacks, are a type of advanced memory technology designed to provide significantly higher bandwidth compared to traditional memory solutions. HBM technology is used in high-performance computing (HPC), artificial intelligence (AI), graphics processing units (GPUs), and other data-intensive applications.
As the interposer area grows, the capacity to meet GPU demand diminishes, leading to a persistent shortage in TSMC’s CoWoS production capacity.
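The relationship between interposer size and wafer output can be sketched with the standard first-order dies-per-wafer approximation. The interposer areas below are illustrative assumptions (one photomask reticle is roughly 26 mm × 33 mm ≈ 858 mm²; "3.3×-reticle" stands in for a large multi-reticle interposer), not vendor figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Estimate gross dies per wafer with a common first-order formula
    (ignores scribe lines and defect yield): wafer area divided by die
    area, minus an edge-loss term for partial dies at the rim."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Illustrative sizes: a single-reticle interposer vs. a 3.3x-reticle one.
print(dies_per_wafer(858))        # ~59 interposers per 300-mm wafer
print(dies_per_wafer(858 * 3.3))  # only ~12 per wafer
```

Tripling the interposer area cuts output per wafer by far more than a factor of three once edge losses are counted, which is why larger AI-class interposers strain CoWoS capacity disproportionately.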
AI Spurs Semiconductor Growth
Foundry companies reported a 12 percent increase in revenue year over year in the first quarter of 2024. However, this growth was tempered by a 5 percent decline from the previous quarter. Counterpoint Research attributes this mixed performance to a sluggish recovery in non-AI semiconductor demand, which impacts sectors such as smartphones, consumer electronics, IoT, automotive, and industrial applications.
TSMC’s Strategic Adjustments
TSMC has adjusted its growth forecast for the logic semiconductor industry to 10 percent for the remainder of the year. Despite this conservative outlook, TSMC anticipates its revenue from datacenter AI products, particularly GPUs, to more than double. The company is grappling with high demand and does not expect to meet production needs, even with plans to double the capacity of its CoWoS multi-chip packaging process.
Compared to the same period last year, TSMC’s market share increased from 61 percent to 62 percent. The second-largest player, Samsung, saw its share rise from 11 percent to 13 percent. China’s Semiconductor Manufacturing International Corporation (SMIC) also experienced growth, moving from 5 percent to 6 percent. United Microelectronics Corporation (UMC) maintained a stable 6 percent share, while GlobalFoundries’ share declined from 7 percent to 5 percent.
Samsung and SMIC: Performance and Projections
Samsung’s foundry revenue declined in Q1 due to seasonal factors affecting smartphone sales. Despite this, the Galaxy S24 performed well, although demand for mid- and low-end devices remained weak. At the group level, Samsung Electronics reported a 68 percent year-on-year increase in Q1 2024, driven largely by memory sales fueled by AI demand. The company expects a rebound in foundry revenue in Q2, forecasting double-digit growth.
SMIC has exceeded market expectations, securing the third spot among top foundries for the first time. This growth is attributed to a recovery in the Chinese market, with expectations for continued growth in Q2 as inventory restocking expands. Counterpoint projects mid-teen growth for SMIC throughout the year.
Nvidia’s Impact on Production
Nvidia’s latest Blackwell-series processors (GB200, B100, B200) are expected to exacerbate the CoWoS capacity shortage by consuming even more packaging capacity. TSMC aims to increase its monthly CoWoS production to 40,000 units by the end of 2024, a significant rise from the previous year.
Producing HBM stacks is becoming increasingly complex, as more EUV layers are required to build faster HBM memory.
SK Hynix, the leading HBM maker, used a single EUV layer for its 1α process technology but is moving to three to four times more EUV layers with its 1β fabrication process. Because a single EUV exposure can replace several multi-patterned DUV steps, this could cut cycle times, but it will clearly increase the cost of new HBM3E memory.
Future Prospects and Innovations
Each new generation of HBM brings an increase in the number of DRAM devices. While HBM2 stacks four to eight DRAMs, HBM3/3E increases this to eight or even 12 devices, and HBM4 will push it further to 16, which will again increase the complexity of HBM memory modules.
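The capacity impact of taller stacks can be sketched as follows. The 24 Gb per-die density used here matches commonly reported HBM3E parts but is an illustrative assumption; actual die densities vary by generation and vendor:

```python
def stack_capacity_gb(dies_per_stack: int, gbit_per_die: int = 24) -> float:
    """Capacity of one HBM stack in gigabytes (8 bits per byte),
    given the number of stacked DRAM dies and per-die density in Gbit."""
    return dies_per_stack * gbit_per_die / 8

# Illustrative stack heights: 8-high, 12-high, and a future 16-high stack.
for dies in (8, 12, 16):
    print(f"{dies}-high stack: {stack_capacity_gb(dies):.0f} GB")
```

Capacity scales linearly with stack height, but each added die also adds through-silicon vias, bonding steps, and thermal constraints, which is where the manufacturing complexity grows.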
In response to these challenges, industry players are exploring alternative solutions. Intel, for instance, is developing rectangular glass substrates to replace traditional 300-mm wafer interposers. However, this approach will require significant research, development, and time before it can become a viable alternative, underscoring the ongoing struggle to meet the current rising demands for AI processor production.