
Thirty Years of Quantum Computing Milestones from Shor's Paper to Willow

Peter Shor proved in 1994 that a quantum machine could crack RSA encryption. Three decades of hardware breakthroughs — cloud-accessible chips, a disputed supremacy claim, a 1,121-qubit processor, and a chip that corrects its own errors in real time — have turned that theoretical result into an engineering countdown. The quantum computing timeline now reads less like a research calendar and more like a deployment schedule.

Key Takeaways

  • Peter Shor published his quantum factoring algorithm at AT&T Bell Labs in 1994, proving that a quantum computer could break RSA encryption in polynomial time — a result that turned quantum computing from a physics curiosity into an applied engineering race.
  • Google's Sycamore processor completed a calculation in 200 seconds that the company said would take the fastest classical supercomputer 10,000 years, though IBM contested the claim within weeks, arguing that classical simulation could match Sycamore in 2.5 days.
  • The global quantum computing industry generated $1.2 billion in revenue in 2025, according to McKinsey, with IBM, Google, IonQ, Rigetti, and PsiQuantum operating the largest commercial programs.

Shor's 1994 Paper Turned Quantum Computing from Physics into Applied Math

The field existed before Peter Shor, but it lacked a killer application. Richard Feynman had proposed quantum simulation in 1982. David Deutsch formalized the quantum Turing machine in 1985. A handful of physicists and computer scientists saw potential in quantum parallelism, yet no one had identified a problem where a quantum computer would dramatically outperform a classical one on a task people actually cared about.

Shor, a thirty-five-year-old mathematician at AT&T Bell Labs, changed that calculus with a single paper. His algorithm, presented at the 35th Annual Symposium on Foundations of Computer Science in November 1994, demonstrated that a quantum computer could factor large integers in polynomial time — exponentially faster than the best classical algorithms. RSA encryption, the backbone of internet security, relied entirely on the assumption that factoring was computationally infeasible. Shor's result destroyed that assumption in theory and set every intelligence agency on the planet into motion.
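The structure of the algorithm is worth seeing: the reduction from factoring to order finding is entirely classical, and only the order-finding subroutine requires a quantum computer. A minimal sketch, with a brute-force classical loop standing in for the quantum step (function names here are illustrative, not from any particular library):

```python
from math import gcd

def find_order(a, n):
    """Classical brute-force stand-in for the quantum order-finding step:
    the smallest r with a**r = 1 (mod n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Try to split n using base a; returns (p, q) or None if a was a bad choice."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky: a already shares a factor with n
    r = find_order(a, n)          # the only step a quantum computer speeds up
    if r % 2 == 1:
        return None               # odd order: retry with a different base
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry with a different base
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical_part(15, 7))   # → (3, 5)
```

The brute-force loop takes exponential time in the bit length of n, which is exactly where Shor's quantum Fourier transform delivers the polynomial-time speedup.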

"I did not set out to break cryptography," Shor, now a professor of applied mathematics at MIT, said in a 2024 retrospective lecture at the Simons Institute. "I was trying to find a problem where quantum computing had a clear advantage. Factoring happened to be there, and the cryptographic implications were an almost accidental consequence."

The impact was immediate and paradoxical. Shor's algorithm demanded hardware that did not exist — not even close. The largest quantum system at the time consisted of a few trapped ions in a physics lab. Building the millions of stable qubits needed to run the algorithm against real-world key sizes seemed decades away. Yet the mathematical proof was irrefutable, and governments began funding quantum research at levels previously reserved for nuclear physics. DARPA, the NSA, and their counterparts in Europe and China all accelerated timelines. The race, still running in 2026, started here.

IBM and Rigetti Launched the First Cloud-Accessible Quantum Processors

For two decades after Shor's paper, quantum computers remained locked inside university labs and government facilities. Access was a bottleneck. Researchers who wanted to test quantum algorithms had to build their own hardware or negotiate time on someone else's machine — a slow, expensive process that limited the field to a few hundred specialists worldwide.

IBM broke that barrier in May 2016 by launching the IBM Quantum Experience, a five-qubit superconducting processor accessible through a web browser. The machine was noisy, slow, and limited to circuits shallow enough to execute before decoherence scrambled the results. It was also, in the context of the quantum computing timeline, a turning point. Within six months, more than 40,000 users had run experiments on the platform. Academic papers citing IBM's cloud hardware began appearing at a rate that outpaced the combined output of the previous five years.

Rigetti Computing followed in 2017 with Forest, its own cloud-based quantum development environment backed by a 19-qubit superconducting chip. The company, founded by Chad Rigetti after a stint in IBM's quantum lab, positioned itself as the startup alternative to Big Blue's institutional approach. Where IBM emphasized open access and education, Rigetti targeted commercial developers building hybrid quantum-classical applications.

Jay Gambetta, vice president of IBM Quantum, framed the strategic logic in a 2018 interview. "Putting quantum hardware on the cloud was not a research decision," Gambetta said. "It was an ecosystem decision. Algorithms improve faster when thousands of people are testing them, not dozens." That bet paid off. IBM's quantum network grew to over 200 institutional partners by 2020, and the feedback loop between cloud users and hardware engineers accelerated the pace of processor development in ways that isolated lab work could not replicate.

Amazon (Braket), Microsoft (Azure Quantum), and Google (Cirq with processor access) all launched competing cloud quantum services between 2019 and 2021. The democratization of access — imperfect, noisy, and limited as it was — expanded the talent pool from a few hundred PhD specialists to tens of thousands of software engineers experimenting with quantum circuits for the first time.

Google Claimed Quantum Supremacy with Sycamore in October 2019

The claim arrived in a Nature paper on October 23, 2019. Google's 53-qubit Sycamore processor, the company said, had completed a specific random circuit sampling task in 200 seconds that would take Summit — then the world's fastest classical supercomputer, operated by Oak Ridge National Laboratory — approximately 10,000 years to simulate. The term "quantum supremacy," coined by physicist John Preskill in 2012, described the moment a quantum device performs a calculation beyond the practical reach of any classical machine. Google declared that moment had arrived.

Sundar Pichai, CEO of Alphabet, compared the achievement to the Wright brothers' first flight. "The first plane flew for only 12 seconds, and there was no practical application at that point," Pichai said during the announcement. "But it showed the world that powered flight was possible. That is what Sycamore did for quantum computing."

IBM disagreed. Within weeks of Google's announcement, an IBM research team published a rebuttal arguing that Summit could perform the same calculation in approximately 2.5 days — not 10,000 years — by using disk storage to compensate for limited RAM and exploiting tensor-network simulation techniques. The debate, still unresolved in any definitive sense, exposed a fundamental difficulty in defining supremacy: the classical baseline is a moving target. As classical algorithms and hardware improve, the bar for quantum advantage shifts upward.

The controversy did not diminish the result's practical significance. Sycamore demonstrated that a superconducting quantum processor could maintain coherence and execute gates with sufficient fidelity to produce results that were, at minimum, extremely difficult for classical computers to reproduce. The processor operated with single-qubit gate errors of approximately 0.15 percent and two-qubit gate errors of about 0.6 percent — numbers that, while too high for error-corrected computation, represented a substantial improvement over the hardware available even two years earlier.

China's Photonic Processors Challenged Superconducting Dominance in 2020–2021

Superconducting qubits, the technology behind IBM's and Google's processors, were not the only path. In December 2020, a team led by Jian-Wei Pan at the University of Science and Technology of China (USTC) reported that Jiuzhang, a photonic quantum computer using squeezed-state boson sampling, had solved a Gaussian boson sampling problem in 200 seconds that would take the Fugaku supercomputer an estimated 600 million years. The claim of quantum advantage — using photons rather than superconducting circuits — opened a second front in the hardware race.

Jiuzhang operated on a fundamentally different principle. Instead of manipulating individual qubits through gate operations, the photonic approach encoded information in the quantum states of light and processed it through interferometric networks. The system used 76 detected photons in its initial demonstration and scaled to 113 photons in Jiuzhang 2.0, published in October 2021. A parallel effort at USTC produced Zuchongzhi 2.1, a 66-qubit superconducting processor that the team said outperformed Sycamore on the same random circuit sampling benchmark by a factor of one million.

The geopolitical dimension was hard to ignore. China's quantum program, backed by an estimated $15 billion in government investment according to a 2022 report by the Center for Data Innovation, had produced two architectures that each claimed quantum advantage independently. The United States, despite leading in private-sector quantum investment through companies like Google, IBM, and IonQ, did not have a government-funded program of comparable scale. Europe launched the Quantum Flagship initiative in 2018 with a one-billion-euro budget over ten years — substantial, but an order of magnitude below China's spending.

Photonic quantum computing carried a structural advantage: photons do not require cooling to millikelvin temperatures. The dilution refrigerators that superconducting processors demand — machines the size of a room that cost millions of dollars and consume enormous power — are a scaling bottleneck that photonic systems sidestep entirely. PsiQuantum, a California-based startup that raised $665 million by 2024, bet its entire roadmap on photonic qubits manufactured in GlobalFoundries semiconductor fabs, arguing that only photonics could scale to the million-qubit systems needed for fault tolerance.

IBM Hit 1,121 Qubits with Condor but Logical Qubit Counts Lagged

IBM unveiled the Condor processor in December 2023. The chip contained 1,121 superconducting transmon qubits arranged in a heavy-hexagonal lattice — the first quantum processor to cross the 1,000-qubit threshold. The announcement dominated headlines. It also illustrated a growing disconnect between raw qubit counts and practical computational power.

Condor's qubits were physical qubits, each subject to gate errors, decoherence, and crosstalk with neighboring qubits. Running Shor's algorithm or any other fault-tolerant computation requires logical qubits — abstract units of information encoded across many physical qubits using error correction codes. At Condor's reported error rates, the processor could not produce a single reliable logical qubit. The 1,121 physical qubits were, for fault-tolerant purposes, equivalent to zero.
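The overhead arithmetic makes the gap concrete. In the standard rotated surface code, a distance-d patch uses d² data qubits plus d² − 1 measurement qubits, so even if Condor's error rates had been below threshold, its 1,121 physical qubits could have hosted only a single patch of modest distance — a rough sketch, not IBM's own accounting:

```python
def surface_code_patch_qubits(d):
    """Physical qubits in one distance-d rotated surface code patch:
    d*d data qubits plus d*d - 1 measurement (ancilla) qubits."""
    return 2 * d * d - 1

# Largest odd code distance whose single patch fits in Condor's 1,121 qubits:
d = 3
while surface_code_patch_qubits(d + 2) <= 1121:
    d += 2
print(d, surface_code_patch_qubits(d))   # → 23 1057
```

One logical qubit per thousand physical qubits, and only when the hardware sits below the error threshold — which Condor's did not.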

Gambetta, who oversaw the Condor project, acknowledged the gap in an interview with MIT Technology Review. "Condor was a hardware milestone, not a computational one," Gambetta said. "It proved that superconducting processors can scale beyond a thousand qubits without losing fabrication yield. The next step — and the harder step — is making those qubits good enough to encode logical information."

IBM pivoted its strategy accordingly. Rather than continuing to race toward higher physical qubit counts, the company shifted focus to its Heron processor, a 133-qubit chip with significantly improved error rates and a modular architecture designed for multi-chip scaling. The roadmap IBM published in 2023 targeted 100,000 physical qubits by 2033 through interconnected modules of Heron-class chips, with the goal of producing enough logical qubits for commercially relevant computations. The pivot reflected a broader realization across the industry: qubit count without qubit quality is a vanity metric.

Google's Willow Chip Demonstrated Real-Time Error Correction in Late 2024

Google's Quantum AI lab announced the Willow chip in December 2024, and the result shifted the conversation from qubit quantity to error-correction quality. Willow, a 105-qubit superconducting processor, achieved something no previous hardware had demonstrated: increasing the number of physical qubits in a surface code actually reduced the logical error rate. The chip operated below the critical error threshold, a boundary that quantum error correction theory had defined decades earlier but that no experimental system had crossed.

Hartmut Neven, founder of Google Quantum AI, described the result as the field's most important hardware milestone since Sycamore. "Error correction on paper is just math," Neven said in a blog post accompanying the announcement. "Error correction on a chip means the machine is improving itself as it grows. That is the difference between a science experiment and an engineering program."

The technical details mattered. Willow achieved a physical error rate of approximately 3.5 × 10⁻⁴ — roughly three times better than Sycamore's 2019 performance. At that error rate, each doubling of the physical qubits in a surface code patch cut the logical error rate in half, matching the exponential suppression predicted by theory. Prior attempts at surface code demonstrations, including experiments by IBM and academic groups, had failed to reach this regime; adding more physical qubits had either left the logical error rate flat or, in some cases, made it worse.
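The below-threshold behavior follows from the standard surface-code scaling heuristic, p_L ≈ A · (p / p_th)^((d+1)/2): when the physical error rate p sits below the threshold p_th, raising the code distance d drives the logical error rate down exponentially; above threshold, the same growth makes things worse. A quick numerical illustration — the prefactor A and threshold value here are illustrative placeholders, not Willow's measured parameters:

```python
def logical_error_rate(p, p_th, d, A=0.1):
    """Surface-code scaling heuristic: p_L ~ A * (p / p_th) ** ((d + 1) / 2)."""
    return A * (p / p_th) ** ((d + 1) / 2)

# Below threshold: growing the code distance suppresses the logical error rate.
below = [logical_error_rate(3.5e-4, 1e-2, d) for d in (3, 5, 7)]
# Above threshold: the same growth makes the logical qubit worse.
above = [logical_error_rate(2e-2, 1e-2, d) for d in (3, 5, 7)]
print(below[0] > below[1] > below[2], above[0] < above[1] < above[2])  # → True True
```

This is why below-threshold operation is a binary achievement: the sign of the exponent's effect flips at p = p_th, and no amount of scaling rescues hardware on the wrong side of it.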

Willow also completed a random circuit sampling benchmark in under five minutes that Google estimated would take the world's fastest supercomputer 10 septillion (10²⁵) years. The classical estimate, like the Sycamore claim before it, was contested by researchers who argued that improved tensor-network methods could narrow the gap. But the error correction result stood independent of any supremacy argument. Below-threshold operation is a binary achievement: either the logical error rate drops as the code distance grows, or it does not. Willow demonstrated that it does.

Commercial Quantum Computing Revenue Crossed $1.2 Billion in 2025

The money followed the milestones. McKinsey's 2025 Technology Trends report estimated that global quantum computing revenue — encompassing hardware sales, cloud access fees, consulting, and quantum software licenses — reached $1.2 billion, up from $720 million in 2023. The growth was concentrated among five companies: IBM, Google, IonQ, Rigetti, and PsiQuantum.

IBM accounted for the largest single share. Its Quantum Network, a consortium of over 250 organizations paying for priority access to IBM's latest processors and software tools, generated an estimated $340 million in annual revenue according to Bernstein Research. Clients included JPMorgan Chase (portfolio optimization), Boeing (materials simulation), and the Cleveland Clinic (drug discovery screening). None of these applications had achieved quantum advantage — classical computers still outperformed the quantum hardware on every commercial workload. The value proposition, for now, rested on experimentation and readiness: organizations were paying to learn the technology before it became operationally critical.

IonQ, the largest pure-play quantum computing company by market capitalization, generated $43 million in revenue for fiscal year 2024 and projected $75 million to $95 million for 2025. The company's trapped-ion architecture offered lower gate error rates than superconducting competitors, and its #AQ (algorithmic qubits) benchmark — a metric IonQ developed to measure useful computational capacity rather than raw qubit count — reached 36 on the Forte Enterprise system. Rigetti, which went public via SPAC in 2022, generated $15 million in 2024 revenue while burning through cash at a rate that prompted two rounds of restructuring.

Venture capital remained aggressive. PsiQuantum closed a $450 million Series D in early 2024, and the Australian government committed an additional $620 million to host PsiQuantum's first commercial photonic quantum computer in Brisbane. Total private investment in quantum computing companies exceeded $4.2 billion between 2020 and 2025, according to PitchBook data. The capital inflow reflected a bet that fault-tolerant quantum computing was no longer a question of "if" but "when" — and that early movers in hardware, software, and talent acquisition would capture disproportionate value once the machines crossed the utility threshold.

The 2026 Landscape: Five Architectures Competing for Fault Tolerance

As of early 2026, five distinct hardware architectures are contending for dominance in the race toward fault-tolerant quantum computing. Superconducting qubits, the technology behind IBM, Google, and Rigetti processors, hold the largest installed base and the deepest software ecosystem. Trapped-ion systems, championed by IonQ and Quantinuum (a Honeywell subsidiary), offer the highest gate fidelities but face scaling challenges related to ion transport and trap complexity. Photonic quantum computing, led by PsiQuantum and Xanadu, promises room-temperature operation and semiconductor-fab manufacturing but has not yet demonstrated a fault-tolerant logical qubit.

Neutral-atom processors represent a fourth path. Atom Computing, acquired by Infleqtion (formerly ColdQuanta) in 2025, demonstrated a 1,225-qubit neutral-atom array — the highest qubit count of any architecture at the time of its announcement. Neutral atoms can be rearranged dynamically, enabling long-range entanglement without the fixed connectivity constraints of superconducting chips. A fifth architecture, topological qubits based on Majorana fermions, is Microsoft's long-term bet; the company published evidence of a topological phase in a nanowire device in 2023, but a functioning topological qubit suitable for computation has not been publicly demonstrated.

No single architecture has established a clear lead toward fault tolerance. Superconducting systems have the most mature engineering pipeline and the only below-threshold error correction demonstration (Willow). Trapped ions have the best raw gate fidelities. Photonics has the most plausible manufacturing story for million-qubit scale. Neutral atoms have the highest qubit counts and the most flexible connectivity. Topological qubits, if they work, could require orders of magnitude fewer physical qubits per logical qubit, but the underlying physics remains unproven at scale.

The timeline from here depends on which architecture solves the error correction problem first at scale. IBM targets 100,000 qubits by 2033 through modular interconnects. Google's internal roadmap, per its 2025 quantum strategy document, aims for a million physical qubits before 2035. PsiQuantum has said it expects its first fault-tolerant machine to be operational in the late 2020s, a timeline many outside observers consider aggressive. Quantinuum projects 10 logical qubits by 2028 and 100 by 2032.

The last three decades of the quantum computing timeline follow a pattern that resists straight-line extrapolation. Progress has come in bursts — Shor's algorithm, then years of slow hardware progress, then cloud access, then a disputed supremacy claim, then a 1,000-qubit chip that could not compute, then a 105-qubit chip that could correct its own errors. Each milestone redefined what the next milestone needed to be. If the pattern holds, the defining breakthrough of the next decade may not be the one any current roadmap anticipates — and the timeline for a cryptographically relevant quantum computer could compress or stall depending on engineering problems that have not yet been fully characterized.


This article is part of QuantumShield's quantum computing wiki.

This is not financial advice. Data as of May 3, 2026.
