Quantum Bits Operate on Rules Classical Computers Cannot Replicate
Richard Feynman proposed in 1981 that only a quantum system could efficiently simulate another quantum system. Four decades later, IBM, Google, and a growing field of trapped-ion startups are building hardware that proves him right, while exposing how far the technology remains from the fault-tolerant machines its most celebrated algorithms require.
Key Takeaways
- IBM's 1,121-qubit Condor processor, announced in December 2023, holds the record for the highest qubit count on a single superconducting chip, though logical qubit counts—the metric that determines real computational power—remain under 100 across all platforms.
- Google and Caltech researchers said in late March 2026 that breaking Bitcoin's ECDSA encryption requires fewer than 500,000 physical qubits, a figure that fell from 20 million in 2019 estimates.
- The global quantum computing market generated $1.2 billion in revenue in 2025, according to McKinsey, with projections reaching $80 billion by 2040 if error correction milestones are met on schedule.
John Preskill, the Richard Feynman Professor of Theoretical Physics at Caltech, coined the term "quantum supremacy" in 2012 to describe the moment a quantum processor solves a problem that no classical machine can solve in any reasonable time. That moment arrived, at least by one narrow definition, when Google's Sycamore chip completed a specific sampling task in 200 seconds in October 2019. IBM disputed the claim within 48 hours, arguing that a classical supercomputer could finish the same task in 2.5 days with better programming.
The dispute revealed something important about the field. Quantum computing does not operate as a simple upgrade to classical hardware. It relies on entirely different physical principles, solves a different class of problems, and fails in ways that silicon transistors do not. Understanding those differences is a prerequisite for evaluating the technology's implications for cryptography, finance, and blockchain security.
Transistors store one state while qubits hold two at once
A classical transistor—the fundamental unit of every laptop, server, and smartphone processor—stores information as either a 0 or a 1. Eight transistors form a byte. Billions of them, switching on and off billions of times per second, produce the computational power behind modern software. The model is binary, deterministic, and well understood after nearly eight decades of engineering refinement.
A qubit operates differently. Through a quantum mechanical property called superposition, a single qubit exists in a combination of 0 and 1 simultaneously until it is measured. Measurement collapses the superposition into one definite state. Two qubits in superposition can represent four states at once; three qubits can represent eight. The relationship is exponential: 300 qubits in full superposition could, in principle, represent more states than there are atoms in the observable universe.
"Superposition is not parallelism in the classical sense," Preskill said during a 2024 lecture at the Kavli Institute. "A quantum computer does not try every answer at the same time. It manipulates amplitudes—the probabilities of outcomes—so that correct answers become more likely and incorrect answers cancel out." That distinction matters because it limits the class of problems where quantum machines hold an advantage. Searching a database, running a spreadsheet, or training most machine learning models does not benefit from superposition. Factoring large numbers and simulating molecular interactions does.
Physical qubit implementations vary widely. IBM and Google use superconducting circuits cooled to 15 millikelvin, roughly 180 times colder than deep space. IonQ traps individual ytterbium atoms with electromagnetic fields and manipulates them with lasers. PsiQuantum is betting on photonic qubits encoded in particles of light. Each approach carries trade-offs in coherence time (how long a qubit maintains its quantum state), gate fidelity (how accurately operations execute), and scalability (how many qubits can be linked on a single device).
Entanglement links particle behavior across physical distance
Albert Einstein called it "spooky action at a distance." Entanglement occurs when two or more qubits become correlated so that measuring one instantly determines the state of the other, regardless of the physical separation between them. The phenomenon has been confirmed in experiments spanning hundreds of kilometers using fiber-optic links and satellite relays.
Entanglement is not a communication channel. No information travels faster than light. The correlation, however, enables quantum algorithms to coordinate computations across multiple qubits in ways that have no classical equivalent. Peter Shor's 1994 factoring algorithm—the discovery that put quantum computing on the cryptography community's radar—depends on entanglement to perform a quantum Fourier transform across a register of entangled qubits. Without entanglement, the algorithm does not work.
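The correlation itself can be reproduced in the same NumPy statevector style—a textbook sketch, not production code. Preparing a Bell pair with a Hadamard and a CNOT, then sampling measurements, yields outcomes that are individually random yet always agree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-qubit state |00> as a 4-amplitude vector over |00>, |01>, |10>, |11>.
state = np.array([1, 0, 0, 0], dtype=complex)

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
H0 = np.kron(H, np.eye(2))          # Hadamard on qubit 0, identity on qubit 1
CNOT = np.array([[1, 0, 0, 0],      # flips qubit 1 when qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

bell = CNOT @ H0 @ state            # (|00> + |11>) / sqrt(2)

# Each measured bit is a fair coin flip, yet the two bits always agree.
probs = np.abs(bell) ** 2
labels = ["00", "01", "10", "11"]
print([labels[i] for i in rng.choice(4, size=10, p=probs)])  # only "00"/"11"
```

Neither party can use the agreement to send a message, because each side sees only its own random bit; the correlation is visible only when the two records are compared.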
Maintaining entanglement at scale is one of the hardest engineering challenges in the field. Environmental noise (thermal fluctuations, stray electromagnetic fields, even cosmic rays) can destroy entanglement in microseconds, a process called decoherence. Superconducting qubits at IBM maintain coherence for roughly 100 to 300 microseconds. Trapped-ion systems at IonQ and Quantinuum have demonstrated coherence times exceeding 10 seconds, but their gate operation speeds are correspondingly slower. The tension between coherence time and gate speed defines much of the architectural competition in the industry.
Gate-model quantum processors run circuit-based algorithms
Two dominant paradigms compete in quantum hardware. The gate model, used by IBM, Google, IonQ, Rigetti, and Quantinuum, treats a quantum computation as a sequence of logical gates applied to qubits—analogous to the logic gates in classical processors but operating on quantum states. A Hadamard gate places a qubit into superposition. A CNOT gate entangles two qubits. Combining these and other gates into circuits allows programmers to implement algorithms like Shor's factoring method and Grover's search acceleration.
Jay Gambetta, vice president of IBM Quantum, described the gate model as "the general-purpose architecture for quantum computing" in a February 2026 briefing. "Gate-model machines can run any quantum algorithm. The constraint is not the model itself but the noise in current hardware." IBM's roadmap targets 100,000 qubits by 2033, a trajectory that requires modular chip-to-chip connections because no single cryostat can house that many superconducting qubits.
Gate fidelity—the probability that a gate operation produces the correct result—is the bottleneck. IBM's Eagle processor (127 qubits, launched in 2021) achieved two-qubit gate fidelities around 99.5%. Google's Willow chip, unveiled in December 2024 with 105 qubits, pushed two-qubit fidelities above 99.7%. Those numbers sound high. They are not high enough. A computation requiring 1,000 gate operations at 99.7% fidelity per gate would produce a correct result only about 5% of the time. Error correction exists to close that gap, but it comes at an enormous cost in physical qubits per logical qubit.
Quantum annealers solve optimization but not cryptographic problems
D-Wave Systems, the Canadian company that has shipped quantum hardware since 2011, does not build gate-model processors. Its machines use quantum annealing, a process inspired by the metallurgical technique of slowly cooling metal to remove structural defects. A quantum annealer encodes a problem into the energy landscape of a system of qubits, then allows the system to settle into its lowest-energy state—which corresponds to the optimal or near-optimal solution.
D-Wave's Advantage2 system, expected to ship in late 2026, will feature over 4,000 qubits with higher connectivity than its predecessor. The company has reported results in logistics optimization, materials science, and financial portfolio balancing. However, quantum annealing cannot run Shor's algorithm. It cannot break RSA or ECDSA encryption. The architecture is not universal; it handles optimization problems well but lacks the gate-by-gate control needed for arbitrary quantum circuits.
The distinction matters for evaluating quantum risk to cryptocurrency. Headlines about "quantum computers with thousands of qubits" often reference D-Wave systems, which pose no direct threat to cryptographic security. The gate-model machines from IBM, Google, and the trapped-ion labs—with far fewer qubits but the ability to run Shor's algorithm—are the ones that the cryptography community monitors.
Error correction remains the barrier between theory and production
Every physical qubit is noisy. Thermal interference, control signal imprecision, and crosstalk between adjacent qubits introduce errors at rates that would be unacceptable in classical computing. Quantum error correction (QEC) addresses this by encoding one logical qubit across many physical qubits, using redundancy to detect and fix errors during computation. The overhead is steep: current estimates suggest that 1,000 to 10,000 physical qubits may be needed to produce a single reliable logical qubit, depending on the error rate of the underlying hardware.
Google's Willow chip marked a milestone in December 2024 by demonstrating that adding more qubits to an error-correcting code reduced the overall error rate rather than increasing it—a result called "below threshold" performance. Hartmut Neven, founder and lead of Google Quantum AI, called it "the most important experimental result in quantum error correction so far" in a December 2024 statement. "Willow showed that the surface code works as theory predicts. Scaling it is an engineering problem, not a physics problem."
IBM took a different path. Its Heron processor (133 qubits, introduced in late 2023 and refined since) emphasized improved gate fidelities and faster repetition rates over raw qubit counts, aiming to make each physical qubit more useful before layering error correction on top. The approach reflects a philosophical split: Google prioritizes demonstrating QEC at scale, while IBM focuses on reducing the physical-to-logical qubit ratio through better hardware.
Running Shor's algorithm against 256-bit elliptic curve keys—the cryptography securing Bitcoin and Ethereum wallets—requires an estimated 2,000 to 4,000 logical qubits. At a ratio of 1,000 physical qubits per logical qubit, that translates to 2 to 4 million physical qubits. Google and Caltech's March 2026 paper revised the figure downward to fewer than 500,000 physical qubits by proposing more efficient error correction schemes, but no laboratory has demonstrated those schemes at the required scale.
IBM, Google, and trapped-ion labs measure progress differently
Qubit count is the metric that dominates headlines. It is also the least informative. IBM's 1,121-qubit Condor chip, the industry's largest superconducting processor as of early 2026, cannot outperform a classical computer on any commercially relevant task. Raw qubit numbers do not account for connectivity (how many other qubits each qubit can interact with), coherence time, or gate fidelity—all of which determine whether those qubits can do useful work.
IBM introduced a metric called Circuit Layer Operations Per Second (CLOPS) to capture throughput in a way that raw qubit counts cannot. Gambetta has argued that CLOPS, combined with a measure of circuit depth (how many sequential gate operations a processor can execute before errors overwhelm the result), gives a more honest picture of capability. Google prefers to benchmark against specific computational tasks, such as random circuit sampling, where its Willow chip completed in under five minutes a calculation that Google estimates would take the fastest classical supercomputer 10 septillion years.
Trapped-ion companies measure success on yet another axis. Quantinuum's H2 processor, with just 56 qubits, achieved a quantum volume of 65,536 in 2024—a composite metric developed by IBM that accounts for qubit count, connectivity, and gate fidelity together. IonQ's Forte Enterprise system (36 algorithmic qubits) has reported gate fidelities above 99.9% for single-qubit operations, a figure that superconducting systems have not matched at scale. Fewer qubits, fewer errors: that is the trapped-ion value proposition in a sentence.
PsiQuantum, the photonic startup backed by $700 million in funding, has not yet shipped a processor but claims its architecture can scale to 1 million qubits using existing semiconductor fabrication plants. The company expects its first fault-tolerant system by 2029. If that timeline holds, it would place photonic quantum computing on a collision course with the cryptographic thresholds that Google and Caltech identified in their March 2026 paper.
Classical simulations still outperform most quantum hardware in 2026
IBM researchers published a result in June 2023 that complicated the supremacy narrative. Using a 127-qubit Eagle processor, the team simulated the dynamics of a magnetic material and produced results that matched—and in some configurations outperformed—the best classical simulations. The achievement was real. But a team at the Flatiron Institute responded within months with a classical algorithm that matched the quantum result on a conventional supercomputer, raising questions about whether the quantum advantage was durable.
The pattern has repeated. Google's 2019 supremacy claim (Sycamore, 53 qubits) was challenged by IBM and later by Chinese researchers who simulated the same task classically. Google's 2024 Willow result used a more complex sampling task that has so far resisted classical simulation, but the gap between quantum hardware and classical algorithms is a moving target. Classical simulation techniques—tensor network methods, in particular—have improved dramatically in the same period that quantum hardware has scaled up.
Preskill addressed the tension in a January 2026 essay. "The era of quantum advantage is not a single event," he wrote. "It is a gradual process in which quantum processors pull ahead on increasingly practical problems, while classical competitors keep closing the gap on the problems already claimed." No quantum processor has yet demonstrated an advantage on a problem with direct commercial value. Drug discovery, materials simulation, and cryptanalysis—the three applications most often cited as quantum computing's killer use cases—all require logical qubit counts and gate depths that exceed current hardware by orders of magnitude.
McKinsey's 2025 quantum technology report estimated $1.2 billion in industry revenue for the year, the majority from cloud-based quantum access services and government research contracts rather than end-user applications. The firm projected revenue reaching $80 billion by 2040, but conditioned that figure on achieving fault-tolerant quantum computing by the early 2030s—a timeline that depends on error correction milestones no laboratory has yet demonstrated at the required scale.
The implications for cryptocurrency security follow directly from this gap. Shor's algorithm, the quantum method that threatens ECDSA and RSA encryption, requires thousands of logical qubits running for hours without interruption. Current hardware supports tens of logical qubits running for milliseconds. The distance between those two numbers is not a matter of incremental improvement; it requires breakthroughs in error correction, interconnect technology, and qubit stability that have been demonstrated in principle but not at production scale.
Whether that distance closes in five years, ten years, or twenty depends on variables that no single research group controls—government funding levels, talent availability, manufacturing yields for quantum chips, and the possibility of algorithmic shortcuts that reduce qubit requirements in ways that have not yet been discovered. The only certainty, as Preskill has noted repeatedly, is that the physics works. The engineering is what remains.
Related Wiki Entries
Shor’s Algorithm and Public-Key Cryptography
How Peter Shor’s 1994 breakthrough makes ECDSA and RSA vulnerable to quantum factoring attacks.
Quantum Computing Timeline: 1994–2026
Thirty years of milestones from Shor’s paper through Google’s Willow chip and beyond.
The Quantum Threat to Cryptocurrency
Named sources from Coinbase, Bernstein, and Ark Invest weigh in on quantum risk timelines.
This article is part of QuantumShield's quantum computing wiki.
This is not financial advice. Data as of May 3, 2026.