IBM has announced a detailed roadmap to build a large-scale, fault-tolerant quantum computer by 2029, a move the company claims is based on a fundamental breakthrough that clears the primary scientific obstacles to practical quantum computing. The tech giant asserts it has found the “only realistic path” forward, with vice president of quantum Jay Gambetta framing the development as a pivotal moment: “With this news, we’re confident that large-scale quantum computing is no longer a question of science but an engineering challenge.”
At the heart of the plan is the IBM Quantum Starling, a machine projected to have 200 logical qubits—built from approximately 20,000 physical qubits—and capable of running 100 million quantum operations. According to a report by The Wall Street Journal, this new system will be housed in IBM’s Poughkeepsie, New York data center. The company’s confidence stems from recent advances in a novel error-correction technique and a massive financial commitment to its US-based technology initiatives.
This ambitious timeline aims to finally deliver on the promise of a quantum computer that can solve problems beyond the reach of even the most powerful classical supercomputers. However, the announcement comes amid fierce competition and analyst skepticism about the tangible business value of quantum technology in the near term.
The Error Correction Breakthrough
The core of IBM’s claimed breakthrough is a strategic pivot in how it handles quantum errors, the persistent noise that has plagued the development of stable quantum systems. IBM has abandoned the long-standard surface code approach in favor of what it calls “quantum low-density parity check” (qLDPC) codes. Its specific variant, dubbed the “Gross code,” reportedly delivers a tenfold reduction in the number of physical qubits needed per logical qubit.
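Those headline figures can be sanity-checked directly. The short Python sketch below uses only the numbers in IBM’s announcement and, as a simplifying assumption, treats the claimed tenfold saving as a uniform factor to estimate what a comparable surface-code machine would require:

```python
# Back-of-the-envelope check using the figures from IBM's announcement.
# The surface-code comparison simply inverts the claimed tenfold saving;
# real overheads depend on code distance and physical error rates.
logical_qubits = 200
physical_qubits = 20_000

qldpc_overhead = physical_qubits / logical_qubits       # 100 physical per logical
surface_overhead = 10 * qldpc_overhead                   # invert the tenfold-reduction claim
surface_total = int(surface_overhead * logical_qubits)   # ~200,000 physical qubits

print(f"qLDPC overhead: {qldpc_overhead:.0f} physical qubits per logical qubit")
print(f"Comparable surface-code machine: ~{surface_total:,} physical qubits")
```

At Starling’s scale, that factor is the difference between twenty thousand physical qubits and roughly two hundred thousand, which is the crux of IBM’s efficiency argument.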
The new method is not just more efficient but also more practical for hardware design. According to IBM, the architecture requires each qubit to connect to only six others, a simplicity that allows signal routing on just two layers within the chip. This development follows the 2023 launch of the 133-qubit Heron processor, which first signaled a strategic focus on error reduction over raw qubit counts.
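To make that constraint concrete, here is a minimal sketch of how one might audit a device coupling map against a degree-6 budget. The coupling map in the example is invented for illustration; IBM has not published its chip layout in this form:

```python
# Illustrative only: check that no qubit in a (hypothetical) coupling
# map exceeds the six-neighbor limit IBM describes for its architecture.
from collections import defaultdict

def max_degree(couplings):
    """Return the largest number of neighbors any qubit has."""
    neighbors = defaultdict(set)
    for a, b in couplings:
        neighbors[a].add(b)
        neighbors[b].add(a)
    return max(len(n) for n in neighbors.values())

# Made-up coupling map: qubit 0 is coupled to six others, the maximum.
couplings = [(0, 1), (0, 2), (0, 3), (0, 4), (0, 5), (0, 6),
             (1, 2), (3, 4), (5, 6)]

assert max_degree(couplings) <= 6  # the stated hardware budget
print("max qubit degree:", max_degree(couplings))
```

Keeping the degree low is what makes two-layer routing feasible; a denser connection graph would generally demand more wiring layers.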
To manage the correction process, IBM has also developed a new heuristic decoder named “Relay-BP,” which is designed to interpret data from the physical qubits and correct errors in real time using conventional computing resources. The company’s head of quantum process technology, Matthias Steffen, explained that the approach has been validated in two recent research papers published on arXiv.
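The division of labor here is worth spelling out: the quantum chip emits a stream of syndrome measurements, and a classical decoder must translate each syndrome into a correction before errors compound. The toy sketch below illustrates that loop with a lookup decoder for the 3-bit repetition code; it is a stand-in for the concept only, not an implementation of Relay-BP, whose belief-propagation details are in the arXiv papers:

```python
# Toy illustration of syndrome decoding, NOT IBM's Relay-BP algorithm.
# A decoder maps each measured syndrome to a likely correction, fast
# enough to keep up with the hardware, on conventional computers.
import numpy as np

# Parity-check matrix of the 3-bit repetition code (detects bit flips).
H = np.array([[1, 1, 0],
              [0, 1, 1]], dtype=np.uint8)

def syndrome(error):
    """Return which parity checks the error pattern violates."""
    return tuple((H @ error) % 2)

# Precompute the lowest-weight error consistent with each syndrome.
DECODE_TABLE = {}
for bits in range(8):
    e = np.array([(bits >> i) & 1 for i in range(3)], dtype=np.uint8)
    s = syndrome(e)
    if s not in DECODE_TABLE or e.sum() < DECODE_TABLE[s].sum():
        DECODE_TABLE[s] = e

def decode(s):
    """The real-time step: measured syndrome in, correction out."""
    return DECODE_TABLE[s]

# A flip on the middle qubit trips both checks; decoding undoes it.
error = np.array([0, 1, 0], dtype=np.uint8)
correction = decode(syndrome(error))
assert np.array_equal((error + correction) % 2, np.zeros(3, dtype=np.uint8))
print("syndrome:", syndrome(error), "correction:", correction)
```

A lookup table works for three qubits but not for thousands, which is why qLDPC codes need fast heuristic decoders like Relay-BP that can keep pace with the hardware on conventional computing resources.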
A Crowded and Divergent Field
IBM’s declaration of a “realistic path” is a bold claim in a field where rivals are pursuing fundamentally different strategies. Microsoft, for instance, is developing its Majorana 1 quantum processor, which uses experimental “topological qubits” designed to be inherently stable. This high-risk, high-reward strategy aims to solve the error problem at the hardware level, though a 2025 report from the Virginia Division of Legislative Services notes it has been met with some skepticism in the academic community.
Meanwhile, Amazon’s AWS has entered the hardware race with its Ocelot chip, which employs “bosonic qubits” to naturally suppress errors. AWS Quantum Hardware Director Oskar Painter highlighted their distinct philosophy: “We didn’t take an existing architecture and then try to incorporate error correction afterwards. We selected our qubit and architecture with quantum error correction as the top requirement.”
Google is also heavily invested, having demonstrated progress on error correction with its Willow quantum chip, and its own roadmap targets commercial quantum computers by 2030, placing it on a nearly identical timeline to IBM’s.
The Long Road to 2029
IBM’s path to Starling is paved with several interim systems as the company firms up its quantum roadmap. The plan includes the IBM Quantum Loon later this year to test the new architecture, followed by Kookaburra in 2026 and another system, Cockatoo, in 2027. This represents an evolution of its public roadmap; a 2022 plan, which followed the announcement of its Osprey system, had originally slated Kookaburra for a 2025 release.
Looking further ahead, IBM has already planned a successor to Starling: a 2,000-logical-qubit machine named Blue Jay, targeted for 2033. Backing this long-term vision is a substantial financial commitment, with over $30 billion of a larger $150 billion US investment plan earmarked for quantum and mainframe R&D. In a press release, IBM Chairman and CEO Arvind Krishna stated that the company’s expertise is “paving the way for a large-scale, fault-tolerant quantum computer — one that will solve real-world challenges.”
From Scientific Challenge to Commercial Viability
Despite the technical confidence, significant headwinds remain on the path to commercial value. Estimates for when quantum computing will provide a return on investment vary widely, from three to twenty years. Gartner analyst Chirag Dekate noted that the industry has not yet reached a transformative inflection point: “The reality in quantum is that we are not yet at the ChatGPT-like moment where the technology, algorithms and impact become visceral and undeniable.”
A major hurdle is the industry-wide talent gap in quantum technology. A 2022 analysis by McKinsey highlighted that for every three quantum job openings there is only one qualified candidate, a shortfall that could stifle the creation of the ecosystem needed to translate quantum advantage into business applications. A senior researcher at IBM acknowledged this, stating, “The challenge isn’t just building these systems but creating an ecosystem around them that can translate quantum advantage into business value.”
Ultimately, IBM’s announcement reframes the quantum race from a contest of scientific discovery to one of methodical engineering and system integration. While the company has unveiled its path, the journey to a truly practical and commercially viable quantum computer will depend as much on building a skilled workforce and a robust software ecosystem as it will on the performance of the hardware itself. The next five years will determine if this “realistic path” can navigate the very real-world challenges that lie ahead.