NVIDIA CEO Jensen Huang has expressed skepticism about the reliability of optical chip technology, signaling that copper wiring remains the preferred choice for now.
Discussing the potential use of optical interconnects, Huang emphasized reliability concerns as a key barrier, saying copper connections were “orders of magnitude” more reliable than today’s co-packaged optical connections, Reuters reports.
Despite this caution, NVIDIA has not entirely ruled out optical technology. The company has invested in optical chip startup Ayar Labs, highlighting its strategic interest in the technology’s long-term potential.
Based in San Jose, California, Ayar Labs has developed what it describes as the industry’s first optical I/O technology to replace traditional electrical interconnects, addressing critical challenges in AI infrastructure such as data transfer bottlenecks, power consumption, and latency.
Ayar Labs’ approach uses silicon photonics to transmit data as light, which the company says enables a 1,000x improvement in interconnect bandwidth density at one-tenth the power consumption of conventional electrical links.
NVIDIA intends to selectively incorporate optical connectivity into upcoming networking chips designed for data center switches, anticipated to deliver three times greater energy efficiency. These products are slated for launch beginning in late 2025, with additional deployments planned through 2026.
IBM Accelerates Optical Technology for AI Workloads
In contrast to NVIDIA’s careful approach, IBM is moving swiftly with its own optical connectivity solutions for AI infrastructure.
IBM recently unveiled its new co-packaged optics module, designed to replace traditional copper-based electrical connections within data centers. The new technology integrates polymer optical waveguides (PWG), dramatically boosting internal data transfer speeds and delivering bandwidth improvements up to 80 times higher than conventional wiring.
IBM says its optical solution promises substantial benefits for AI model training efficiency. The company claims the module could significantly reduce GPU idle time, a major bottleneck during AI training, shortening training cycles from three months to approximately three weeks.
Moreover, IBM estimates that energy consumption could decrease by a factor of five, which translates into notable operational and environmental savings. According to IBM, the energy conserved during just one AI training cycle could power roughly 5,000 U.S. households for an entire year.
Rigorous testing under harsh conditions has reinforced IBM’s confidence in its optical technology. The module underwent extensive stress-testing, including exposure to extreme temperatures ranging from -40°C to 125°C, high humidity, and mechanical stresses.
According to IBM Research, the tests demonstrated that the module maintained reliable performance under these rigorous conditions. IBM’s Mukesh Khare, General Manager of the Semiconductors division, underscored the significance of this breakthrough: “Co-packaged optics enables high-speed optical connectivity through fibers assembled in close proximity to accelerators, reducing the communications gap between AI models.”
The optical technology was developed collaboratively at global facilities, notably at the Albany NanoTech Complex in New York and IBM’s Bromont facility in Quebec. These sites were instrumental in prototyping the optical modules, preparing the technology for eventual commercial deployment and licensing.
Challenges Remain for Broader Optical Adoption
Despite IBM’s optimism and technological advancements, optical chip technology still faces substantial hurdles before widespread industry adoption can occur.
Huang is not alone in raising reliability concerns; other industry experts have also pointed to significant manufacturing and cost barriers.
Mark Wade, CEO of Ayar Labs, anticipates optical chips will only become the dominant interconnect technology around 2028 or later, given current limitations and associated costs.
To accelerate adoption, IBM is taking proactive steps by licensing its co-packaged optics technology to external chip foundries. This strategy is intended to establish a wider ecosystem of industry partners and applications, extending from generative AI workloads to future telecommunications infrastructure, such as 5G and 6G networks.
Diverging Strategies Shape the Future of AI Hardware
The clear divergence between NVIDIA’s cautious approach and IBM’s assertive strategy highlights a crossroads for AI hardware development.
NVIDIA continues to emphasize reliability and gradual performance improvements using copper interconnects, while IBM’s ambitious adoption of optical technology may redefine industry benchmarks for efficiency, speed, and power consumption.
If co-packaged optics technology from IBM and others can deliver consistently reliable and cost-effective results in large-scale commercial deployments, NVIDIA and other hardware manufacturers may soon reconsider their cautious stance.
For now, though, copper wiring maintains its dominance in NVIDIA’s GPU designs, even as optical solutions increasingly represent a viable alternative.