Quantum Computing Fundamentals for Security Professionals
Learning Objectives
- Explain the physical principles behind qubits (superposition, entanglement) without requiring advanced physics
- Compare the major qubit technologies (superconducting, trapped ion, photonic, neutral atom) and their trade-offs
- Analyze why decoherence is the fundamental challenge and how error correction attempts to solve it
- Interpret quantum hardware specifications (qubit count, gate fidelity, coherence time) and what they mean for cryptographic threats
- Evaluate vendor claims by understanding what metrics actually matter for running Shor's algorithm
When Google announced Willow's "below-threshold error correction," did you know what that meant? When IBM touts "1,121 qubits," do you understand why that doesn't threaten your XRP holdings?
Most quantum computing coverage assumes you either have a physics PhD or accept claims at face value. Neither serves investors well. This lesson provides the technical foundation you need—enough to evaluate claims critically, not enough to design quantum hardware yourself.
Think of this as learning to read a car's specifications without becoming a mechanical engineer. You don't need to understand combustion chemistry to know that a claimed 500 horsepower in a compact sedan warrants skepticism. Similarly, you don't need to understand Hilbert spaces to evaluate whether a quantum computing announcement threatens XRPL.
- Understand how qubits differ from classical bits
- Grasp why quantum computers are so fragile
- Learn what error correction actually accomplishes
- Know which specifications matter for cryptographic attacks
Classical Bits:
A classical bit is definitively 0 or 1. It's like a light switch—on or off, nothing in between. All modern computing operates on this principle: billions of transistors, each holding a 0 or 1, manipulated through logical operations.
Classical Bit States:
├── State 0: Switch OFF, no current
├── State 1: Switch ON, current flowing
└── Measurement: Non-destructive, repeatable
└── You can check the state without changing it
Quantum Bits (Qubits):
A qubit exploits quantum mechanics to exist in superposition—a probabilistic combination of 0 and 1 simultaneously. It's not "half 0 and half 1" in a classical sense. Rather, the qubit exists in both states until measured, at which point it "collapses" to one definite state.
Qubit in Superposition:
├── State: α|0⟩ + β|1⟩
│ ├── α and β are probability amplitudes (complex numbers)
│ └── |α|² + |β|² = 1 (probabilities must sum to 1)
├── Measurement: Destructive, probabilistic
│ ├── Collapses to 0 with probability |α|²
│ └── Collapses to 1 with probability |β|²
└── Cannot know state without destroying superposition
Analogy for Non-Physicists:
Imagine a coin spinning in the air. While spinning, it's neither heads nor tails—it's in a "superposition" of both. Only when you catch it (measure) does it become definitively one or the other. But unlike a real coin (which was always going to land one way based on physics), a qubit genuinely has no definite value until measured—the act of measurement determines the outcome.
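The Born rule behind this analogy can be sketched in a few lines. This is an illustrative simulation (the amplitudes are example values, not from real hardware), showing that each individual measurement is random while the statistics follow |α|² and |β|²:

```python
import random

alpha, beta = 0.6, 0.8  # example amplitudes for |0> and |1>; 0.36 + 0.64 = 1

def measure(alpha: float, beta: float) -> int:
    """Collapse the superposition: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Repeated measurements of identically prepared qubits approach the
# Born-rule probabilities, even though each single outcome is random.
samples = [measure(alpha, beta) for _ in range(100_000)]
print(f"P(0) ~ {samples.count(0) / len(samples):.3f}")  # close to 0.36
print(f"P(1) ~ {samples.count(1) / len(samples):.3f}")  # close to 0.64
```

Note the key limitation the analogy captures: you only ever see the classical outcome, never the amplitudes themselves.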
Why This Matters for Computing:
Superposition allows a quantum computer to process multiple possibilities simultaneously:
Scaling Comparison:
├── 2 classical bits: Represent ONE of 4 states (00, 01, 10, 11)
├── 2 qubits in superposition: Represent ALL 4 states simultaneously
├── 10 classical bits: 1 state out of 1,024
├── 10 qubits: All 1,024 states simultaneously
├── 50 classical bits: 1 state out of ~10^15 (quadrillion)
├── 50 qubits: All ~10^15 states simultaneously
└── This exponential advantage enables Shor's algorithm
Entanglement creates correlations between qubits that have no classical analog. When two qubits are entangled, measuring one instantly determines the state of the other, regardless of distance.
Entangled Qubit Pair (Bell State):
├── State: (1/√2)|00⟩ + (1/√2)|11⟩
├── Meaning: 50% chance both are 0, 50% chance both are 1
├── Measurement of first qubit:
│ ├── If result is 0 → Second qubit is definitely 0
│ └── If result is 1 → Second qubit is definitely 1
└── Correlation is instantaneous (but can't transmit information faster than light)
For Cryptographic Attacks:
Shor's algorithm requires maintaining entanglement across thousands of qubits while performing billions of operations. Any loss of entanglement (decoherence) ruins the computation. This is why building a cryptographically relevant quantum computer (CRQC) is so challenging.
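The Bell-state statistics above can be sketched with a tiny sampler. This is a classical simulation of the measurement outcomes only (it does not model quantum dynamics): outcomes "01" and "10" never occur, which is the entanglement correlation:

```python
import random

# Bell state (1/sqrt(2))|00> + (1/sqrt(2))|11>: measuring the pair yields
# "00" or "11" with equal probability, never "01" or "10".
def measure_bell_pair() -> tuple[int, int]:
    outcome = random.choice([0, 1])  # 50/50 between the two branches
    return (outcome, outcome)        # both qubits always agree

pairs = [measure_bell_pair() for _ in range(10_000)]
assert all(a == b for a, b in pairs)  # perfectly correlated, every time
both_zero = sum(1 for a, b in pairs if a == 0) / len(pairs)
print(f"P(00) ~ {both_zero:.2f}, P(11) ~ {1 - both_zero:.2f}")
```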
Here's the fundamental challenge: you can't "peek" at a qubit's state without destroying it. Measurement collapses the superposition, giving you a classical answer (0 or 1) but losing all the quantum information.
- You can't copy quantum states (no-cloning theorem)
- You can't check intermediate results without destroying them
- You must design algorithms to extract useful answers through interference
Shor's Algorithm Cleverness:
Shor's algorithm is designed so that the exponentially many possibilities interfere with each other, amplifying the "correct" answer (the private key) and suppressing wrong answers. This interference pattern survives measurement, allowing you to extract the key. But it requires maintaining coherent superposition throughout—any decoherence destroys the interference pattern and ruins the computation.
Just as classical computers use logic gates (AND, OR, NOT), quantum computers use quantum gates to manipulate qubits:
Key Quantum Gates:
├── X Gate (NOT): Flips |0⟩ to |1⟩ and vice versa
├── H Gate (Hadamard): Creates superposition from |0⟩ or |1⟩
├── CNOT Gate: Entangles two qubits (if first is 1, flip second)
├── T Gate: Adds phase rotation (crucial for universal computation)
└── Measurement: Collapses qubit to classical 0 or 1
Gate Fidelity:
├── Ideal gate: Performs operation perfectly
├── Real gate: Has some error probability
├── Current best: ~99.9% fidelity (0.1% error per gate)
└── Challenge: After millions of gates, errors accumulate
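The way per-gate errors compound can be verified directly, assuming (as a simplification) that each gate fails independently with probability p:

```python
# Probability of at least one error after n gates, each succeeding
# independently with probability 1 - p.
def p_any_error(p: float, n_gates: int) -> float:
    return 1.0 - (1.0 - p) ** n_gates

for n in (1_000, 10_000, 1_000_000):
    print(f"{n:>9,} gates at 99.9% fidelity: "
          f"{p_any_error(1e-3, n):.4%} chance of at least one error")
```

At 0.1% error per gate, 1,000 gates already give a ~63% chance of at least one error, which is why uncorrected hardware cannot run long algorithms.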
Different companies use different physical systems to create qubits. Each has trade-offs that affect the path to cryptographically relevant quantum computers.
How They Work:
Superconducting qubits use electrical circuits made from materials (aluminum, niobium) that conduct electricity with zero resistance at extremely low temperatures (~15 millikelvin, colder than outer space). Quantum states are encoded in the charge, flux, or phase of the circuit using structures called Josephson junctions.
Superconducting Qubit Profile:
├── Operating Temperature: ~15 mK (requires dilution refrigerator)
├── Gate Speed: ~20-100 nanoseconds (very fast)
├── Coherence Time: ~100-500 microseconds
├── Two-Qubit Gate Fidelity: ~99.5-99.9%
├── Connectivity: Fixed, nearest-neighbor (qubits can't move)
├── Current Record: IBM Condor 1,121 qubits (2023)
└── Key Players: Google, IBM, Rigetti, IQM
Advantages:
- Fastest gate operations (crucial for completing algorithms before decoherence)
- Mature fabrication using semiconductor industry techniques
- Scalable manufacturing processes (similar to chip fabrication)
- Extensive tooling and software ecosystem
Limitations:
- Extremely sensitive to noise (electromagnetic, thermal, cosmic rays)
- Short coherence times relative to algorithm requirements
- Cooling infrastructure is massive and expensive (~$1-5 million per system)
- Fixed qubit positions limit connectivity (not every qubit can talk to every other)
Recent Milestone (December 2024):
Google's Willow demonstrated below-threshold error correction with superconducting qubits—meaning that adding more qubits actually reduced logical error rates. This was the first demonstration on any platform that the fundamental scaling principle works.
How They Work:
Individual atoms (typically ytterbium-171 or barium-133) are ionized (given a charge) and trapped in electromagnetic fields within a vacuum chamber. Quantum states are encoded in the ion's internal energy levels (electron configurations), manipulated by precisely tuned laser pulses.
Trapped Ion Profile:
├── Operating Temperature: Room temperature chamber, ions laser-cooled
├── Gate Speed: ~1-100 microseconds (slower than superconducting)
├── Coherence Time: Minutes to hours (much longer than superconducting)
├── Two-Qubit Gate Fidelity: ~99.9%+ (highest demonstrated)
├── Connectivity: All-to-all (any ion can interact with any other)
├── Current Systems: IonQ Forte ~36 qubits, Quantinuum H2 ~32 qubits
└── Key Players: IonQ, Quantinuum, Alpine Quantum Technologies
Advantages:
- Longest coherence times (minutes vs. microseconds)—more time for computation
- Highest gate fidelities demonstrated (~99.9%+)
- Natural all-to-all connectivity (any qubit can talk to any other directly)
- Identical qubits (all ytterbium-171 atoms are exactly the same)
Limitations:
- Slower gate operations mean algorithms take longer
- Difficult to scale beyond ~50-100 ions in a single trap (ion chain becomes unwieldy)
- Complex laser systems required (dozens of laser beams for control)
- Vacuum and trap engineering challenges
Recent Milestone (2024):
Quantinuum demonstrated 12 fully error-corrected logical qubits on their H2 system, achieving what they claim is the first universal, fully fault-tolerant gate set with repeatable error correction. This represents the highest-quality logical qubits demonstrated to date.
How They Work:
Quantum information is encoded in properties of photons (particles of light)—polarization (horizontal vs. vertical), path (which waveguide), or timing (which pulse). Photons are manipulated using beam splitters, phase shifters, and waveguides on silicon chips.
Photonic Profile:
├── Operating Temperature: Room temperature (major advantage)
├── Gate Speed: Picoseconds (extremely fast)
├── Coherence Time: Effectively infinite (photons don't decohere)
├── Two-Qubit Gate Fidelity: ~95-99% (improving)
├── Connectivity: Programmable via optical routing
├── Challenge: Probabilistic photon generation and gates
└── Key Players: PsiQuantum, Xanadu, ORCA Computing
Advantages:
- Room temperature operation (no cryogenics needed—massive cost savings)
- Photons naturally resist decoherence (light travels unaffected)
- Excellent for quantum networking and communication
- Manufacturing leverages existing silicon photonics industry
Limitations:
- Photon generation is probabilistic (can't create single photons on demand reliably)
- Two-photon interactions are very weak (hard to entangle photons directly)
- Requires massive overhead to achieve deterministic operations
- Detection efficiency challenges (single-photon detectors are imperfect)
Strategy:
PsiQuantum is betting on silicon photonics manufacturing at scale, targeting 1 million+ physical qubits. Their approach uses error correction to overcome probabilistic gates—accepting that most individual operations fail, but performing enough of them that, statistically, the computation succeeds.
How They Work:
Individual neutral atoms (not ionized, unlike trapped ions) are held in place using focused laser beams called optical tweezers. Quantum states are encoded in atomic energy levels. To create entanglement, atoms are briefly excited to high-energy "Rydberg" states where they interact strongly.
Neutral Atom Profile:
├── Operating Temperature: Laser-cooled in vacuum
├── Gate Speed: ~1 microsecond
├── Coherence Time: Seconds (good)
├── Two-Qubit Gate Fidelity: ~99.5%+ (improving rapidly)
├── Connectivity: Reconfigurable (atoms can be physically moved)
├── Current Record: 1,000+ atom arrays demonstrated
└── Key Players: Atom Computing, QuEra, Pasqal, Infleqtion
Advantages:
- Highly scalable (arrays of 1,000+ atoms demonstrated—more than any other platform)
- Atoms can be physically rearranged mid-computation using tweezers
- Long coherence times (seconds)
- Identical qubits (like trapped ions, all atoms of same isotope are identical)
Limitations:
- Relatively new technology (less mature than superconducting/ions)
- Gate fidelities still catching up to trapped ions
- Complex optical systems required
- Rydberg gate interactions require precise timing
Recent Progress:
Harvard/MIT groups have demonstrated 48 error-corrected logical qubits on neutral atom systems (2023-2024), showing rapid progress. Neutral atoms are considered a potential "dark horse" in the CRQC race.
How They Work:
Quantum information is encoded in the spin of individual electrons or atomic nuclei embedded in silicon. Similar to how a spinning top can point up or down, electron spin can be "up" or "down"—these become the 0 and 1 states.
Silicon Spin Profile:
├── Operating Temperature: ~100 mK (cold, but warmer than superconducting requires)
├── Gate Speed: ~1-100 nanoseconds (very fast)
├── Coherence Time: Seconds (in isotopically purified silicon)
├── Two-Qubit Gate Fidelity: ~99%+ (improving)
├── Connectivity: Fixed, nearest-neighbor
└── Key Players: Intel, Diraq, Silicon Quantum Computing
Advantages:
- Leverages existing semiconductor manufacturing (potentially cheapest to scale)
- Very small qubit size (highest density possible)
- Fast gates
- Industry expertise from classical chip manufacturing
Limitations:
- Challenging to achieve high fidelities at scale
- Controlling individual electrons in silicon is difficult
- Less mature than superconducting/ion approaches
Which Technology Might Break Crypto First?
| Technology | Current Strength | Path to CRQC | Key Challenge |
|---|---|---|---|
| Superconducting | Most qubits (1,121); below-threshold QEC | Massive scale + error correction | Coherence, noise, error overhead |
| Trapped Ion | Best quality (~99.9%); 12 logical qubits | Modular ion traps connected by photons | Scaling beyond ~50-100 ions/trap |
| Photonic | Room temp, scalable manufacturing | Million+ qubits with error correction | Probabilistic gates, detection efficiency |
| Neutral Atom | 1,000+ atoms, movable; 48 logical qubits | Large arrays with Rydberg gates | Gate fidelity, error correction |
| Silicon Spin | Semiconductor fab compatible | Billions of qubits using existing fabs | Control precision, fidelity at scale |
Near-term Leaders: Superconducting (qubit count), Trapped Ion (quality)
Dark Horse: Neutral Atoms (rapid scaling + quality)
Long-term Potential: Photonic/Silicon (manufacturing advantages)
Honest Assessment: No technology has demonstrated a clear path to the millions of physical qubits needed for Shor's algorithm on 256-bit keys. Each faces unique scaling challenges. The "winner" may be a hybrid approach or a technology not yet mature.
Quantum states are extraordinarily sensitive to their environment. Any interaction with the outside world—thermal vibrations, electromagnetic noise, cosmic rays, even the measurement apparatus—can destroy the delicate superposition. This process is called decoherence.
Analogy: Imagine trying to balance a pencil on its tip. Any tiny vibration knocks it over. Qubits are like that pencil, except the "vibrations" are thermal fluctuations and electromagnetic noise, and "falling over" means losing the quantum state.
Environmental Noise Sources:
├── Thermal fluctuations
│ └── Even at 15 mK, residual heat disturbs qubits
├── Electromagnetic interference
│ └── Stray fields from wiring, nearby electronics, other qubits
├── Material defects
│ └── Impurities in superconducting films create noise
├── Cosmic rays
│ └── High-energy particles disrupt entire qubit arrays
├── Control errors
│ └── Imperfect laser/microwave pulses don't do exactly what's intended
├── Crosstalk
│ └── Operations on one qubit inadvertently disturb neighbors
└── Measurement backaction
└── The act of measuring auxiliary qubits affects nearby ones
The critical constraint: your algorithm must complete before qubits decohere.
Current State (Approximate):
├── Superconducting coherence: ~100-500 microseconds
├── Trapped ion coherence: ~1-60 seconds
├── Neutral atom coherence: ~1-10 seconds
│
├── Shor's algorithm for 256-bit ECC (secp256k1):
│ ├── Required logical qubits: ~2,330
│ ├── Required operations: ~10^11-10^12 (100 billion to 1 trillion)
│ ├── At 100 ns per gate: ~10,000-100,000 seconds
│ └── That's 3-30 hours of coherent quantum computation
│
└── Gap: the algorithm runs many orders of magnitude longer than any platform's coherence time
Solution Required: Quantum Error Correction
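The gap arithmetic above can be reproduced as a back-of-envelope sketch, using the rough figures from this section (~10^11-10^12 gates at ~100 ns per gate, versus platform coherence times):

```python
# All numbers are rough estimates quoted earlier in this lesson.
GATE_TIME_S = 100e-9  # ~100 nanoseconds per gate

for ops in (1e11, 1e12):
    runtime_s = ops * GATE_TIME_S
    print(f"{ops:.0e} ops -> {runtime_s:,.0f} s (~{runtime_s / 3600:.0f} h)")

coherence_s = {"superconducting": 500e-6, "trapped ion": 60.0}
for name, t in coherence_s.items():
    gap = (1e11 * GATE_TIME_S) / t
    print(f"{name}: runtime exceeds coherence by ~{gap:.0e}x")
```

Even the most optimistic case leaves the uncorrected runtime far beyond any qubit's lifetime, which is why error correction is mandatory rather than optional.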
Every quantum operation introduces errors. Gate fidelity measures how accurately a quantum gate performs its intended operation.
Understanding Error Rates:
├── 99.9% fidelity = 0.1% error per gate
│ ├── After 1,000 gates: ~63% chance of at least one error
│ ├── After 10,000 gates: ~99.995% chance of at least one error
│ └── After 1,000,000 gates: Errors are guaranteed many times over
│
├── 99.99% fidelity = 0.01% error per gate
│ └── After 10,000 gates: ~63% chance of at least one error
│
└── Shor's algorithm needs 10^11+ gates
└── Even 99.9999% fidelity would produce billions of errors
└── Error correction is absolutely essential
The Error Correction Threshold:
There exists a critical error rate (the "threshold") below which error correction can suppress errors faster than they accumulate. Above the threshold, more qubits just means more errors—you can never build a working computer.
Surface Code Threshold: ~1% error per gate
Current Status (2024-2025):
├── Best superconducting: ~0.1-0.5% error (below threshold ✓)
├── Best trapped ion: ~0.03-0.1% error (below threshold ✓)
├── Best neutral atom: ~0.3-0.5% error (below threshold ✓)
├── Google Willow demonstrated: 0.143% per error correction cycle
└── Implication: Error correction is becoming viable
Classical error correction is conceptually simple: copy data multiple times and use majority voting. If you store "1" three times and one copy gets corrupted to "0", you still know the original was "1" because 2 out of 3 copies agree.
Why This Doesn't Work for Qubits:
- No-cloning theorem: You literally cannot copy a qubit's quantum state
- Measurement destroys: Checking the state collapses the superposition
- Continuous errors: Quantum errors aren't just bit flips; they're rotations in continuous space
Quantum Error Correction Solution:
Encode one "logical" qubit across many "physical" qubits in such a way that errors can be detected and corrected without measuring the logical state. This is done by measuring correlations (called "syndromes") between physical qubits, which reveal errors without revealing the actual data.
Surface Code Architecture (Most Common):
├── Arranges physical qubits in a 2D grid
├── Data qubits hold the quantum information
├── Ancilla qubits measure error syndromes
├── Errors show up as patterns in syndrome measurements
├── Classical computer interprets syndromes and applies corrections
│
├── Code Distance (d): Size of the grid
│ ├── Distance-3: 17 physical qubits → 1 logical qubit
│ ├── Distance-5: 49 physical qubits → 1 logical qubit
│ ├── Distance-7: 97 physical qubits → 1 logical qubit
│ └── Distance-d: ~2d² physical qubits → 1 logical qubit
│
└── Higher distance = can correct more errors, but more overhead
Google's Willow announcement emphasized "below threshold" performance. Here's why this matters:
Above Threshold Behavior:
├── Physical error rate is too high
├── Error correction can't keep up with error generation
├── Adding more qubits (higher distance) increases total errors
├── Scaling is counterproductive—you can never build a large computer
└── No path to fault-tolerant computing
Below Threshold Behavior:
├── Physical error rate is low enough
├── Error correction outpaces error generation
├── Adding more qubits (higher distance) DECREASES logical error rate
├── Each distance increase provides exponential improvement
└── Path to arbitrarily low error rates exists (given enough qubits)
Willow demonstrated: λ = 2.14 ± 0.02
├── λ is the "error suppression factor"
├── Each increase in code distance by 2 reduces logical errors by ~2.14×
├── Distance-3 → Distance-5: Errors reduced by ~2.14×
├── Distance-5 → Distance-7: Errors reduced by another ~2.14×
└── This is exponential suppression—the key requirement for CRQC
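Exponential suppression is easy to extrapolate in code. The starting point (0.143% per cycle at distance 7) and λ = 2.14 are Willow's reported figures; the extrapolation to higher distances is illustrative, not a hardware claim:

```python
LAM = 2.14  # Willow's reported error suppression factor

def logical_error(base: float, base_d: int, d: int, lam: float = LAM) -> float:
    """Each increase of code distance by 2 divides the logical error by lam."""
    steps = (d - base_d) // 2
    return base / lam ** steps

for d in (7, 9, 11, 15, 21):
    e = logical_error(1.43e-3, 7, d)
    # Surface code needs roughly 2*d^2 physical qubits per logical qubit.
    print(f"distance {d:>2}: ~{e:.2e} logical error/cycle, "
          f"~{2 * d * d} physical qubits")
```

The exponential decay in the error column, against the merely quadratic growth in qubit count, is what "below threshold" buys you.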
Even below threshold, the overhead is enormous:
Physical Qubit Overhead for Cryptographic Attacks:
Target: ~2,330 logical qubits (for secp256k1)
Required logical error rate: < 10^-10 per operation (for reliable computation)
At λ = 2.14 (Willow's demonstrated rate):
├── Current per-cycle error: 0.143% = 1.43 × 10^-3
├── Need to reduce to: < 10^-10
├── Gap: ~10^7 improvement needed
├── Requires: Distance ~25-30 code
│ └── ~1,500-2,000 physical qubits per logical qubit
└── Total: 2,330 × 1,500 = ~3.5-5 million physical qubits
At λ = 10 (aspirational target with improved hardware):
├── Faster error suppression per distance increment
├── Lower distance required (~15-20)
├── ~500-800 physical qubits per logical qubit
└── Total: ~1-2 million physical qubits
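The totals above follow from simple multiplication. A minimal sketch, using the ~2,330 logical-qubit target from the resource-estimate literature and the per-logical overhead ranges quoted in this section:

```python
LOGICAL_NEEDED = 2_330  # logical qubits for Shor on secp256k1 (literature estimate)

# (low, high) physical qubits per logical qubit under each scenario
scenarios = {
    "lambda ~ 2.14 (Willow-like)": (1_500, 2_000),
    "lambda ~ 10 (aspirational)":  (500, 800),
}
for name, (lo, hi) in scenarios.items():
    print(f"{name}: {LOGICAL_NEEDED * lo:,} - {LOGICAL_NEEDED * hi:,} "
          f"physical qubits total")
```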
Current Reality (December 2024):
├── Willow: 105 physical qubits, demonstrated ~1 logical qubit
├── Gap to CRQC: ~10,000-50,000× current capability
├── IBM Condor: 1,121 physical qubits (but not error-corrected)
└── Honest assessment: Significant progress, but vast gap remains
Breaking cryptography isn't just about qubits—you need time:
Time to Break 256-bit ECC (secp256k1):
Estimates from literature:
├── 1 hour attack: ~317 million physical qubits needed
├── 1 day attack: ~13 million physical qubits needed
├── 1 week attack: ~4 million physical qubits needed
│
├── These assume:
│ ├── Surface code error correction
│ ├── 1 microsecond code cycle time
│ ├── 10^-3 physical error rate
│ └── Optimized circuit implementations
│
└── For XRPL:
├── Transaction finality: ~4 seconds
├── To attack in-flight transaction: Would need sub-second attack
├── Implication: Even with CRQC, attacking active transactions is harder
└── Main threat: Long-term "harvest now, decrypt later" on exposed keys
When evaluating quantum computing announcements, focus on these specifications:
Key Metrics for CRQC Progress Assessment:
1. Logical Qubit Count (not just physical!)
2. Logical Error Rate
3. Error Suppression Factor (λ)
4. Two-Qubit Gate Fidelity
5. Coherence Time
6. Connectivity
7. Gate Speed
Signs of Genuine Progress (Green Flags):
✓ Published in peer-reviewed journals (Nature, Science, Physical Review Letters)
✓ Demonstrates logical qubit improvements (not just physical qubit count)
✓ Shows below-threshold error correction scaling (λ > 1)
✓ Independent verification of results by other groups
✓ Clear metrics with confidence intervals and methodology
✓ Acknowledges limitations and remaining challenges
✓ Comparison to prior work with honest assessment
Signs of Hype (Red Flags):
✗ Emphasizes physical qubit count without mentioning logical qubits
✗ Compares to classical computers on artificial/irrelevant benchmarks
✗ Uses "quantum supremacy/advantage" on problems with no practical use
✗ Announces "breakthrough" without peer review
✗ Claims imminent relevance to cryptography without technical justification
✗ Funding announcement timed with technical claims
✗ No mention of error rates, coherence times, or gate fidelities
✗ Comparisons to "age of universe" or other irrelevant timescales
Let's apply our framework to Google's Willow announcement:
Claim 1: "Computation that would take 10 septillion years"
├── Red Flag: Irrelevant benchmark (random circuit sampling)
├── This is NOT about cryptography
├── Classical computers are bad at simulating quantum randomness
├── No practical application—purely a demonstration
└── Assessment: Overhyped for marketing, not CRQC-relevant
Claim 2: "Below-threshold error correction with λ = 2.14"
├── Green Flag: Peer-reviewed in Nature
├── Green Flag: Demonstrates fundamental scaling principle works
├── Green Flag: Clear metrics with uncertainty (±0.02)
├── Green Flag: First demonstration of this for any platform
├── Honest context: Still only ~1 logical qubit demonstrated
└── Assessment: Genuine milestone, but early stage
Claim 3: "Logical qubit lifetime exceeds physical qubit by 2.4×"
├── Green Flag: Error correction extending coherence
├── Green Flag: Quantified improvement
├── Honest context: 2.4× is modest; need orders of magnitude more
└── Assessment: Real progress, long way to go
Overall: Significant scientific achievement, not an imminent cryptographic threat.
Google's own statement: "Willow is incapable of breaking modern cryptography."
✅ Multiple qubit technologies have demonstrated below-threshold error correction. Both superconducting (Google Willow) and trapped ion (Quantinuum) have shown that adding qubits can reduce logical errors—the fundamental requirement for scaling.
✅ Gate fidelities are improving consistently. Two-qubit gate fidelities have improved from ~90% a decade ago to ~99.9% today, following a trajectory similar to early semiconductor development.
✅ The physics works. Quantum computers have demonstrated computational tasks impossible for classical computers. The theoretical foundation for Shor's algorithm is mathematically proven.
✅ Error correction overhead is understood. Researchers know what's needed: millions of physical qubits for cryptographic attacks. The challenge is engineering, not unknown physics.
⚠️ Whether current approaches can scale to millions of qubits. No technology has demonstrated more than ~1,100 physical qubits with acceptable quality. The path to millions remains unclear.
⚠️ Whether error correction overhead can be substantially reduced. Current estimates require 1,000-2,000+ physical qubits per logical qubit. Novel error correcting codes or much higher fidelities could reduce this.
⚠️ Which technology will "win." Superconducting leads in qubit count, trapped ions in quality, neutral atoms in recent scaling, photonics in manufacturing potential. The winner may be a hybrid or something new.
⚠️ Timeline for CRQC. Expert estimates range from "never" to "2030" with most probability mass in 2035-2045. Breakthroughs could accelerate; obstacles could delay.
🔴 Equating physical qubit count with cryptographic threat. IBM's 1,121-qubit Condor is far less concerning than a hypothetical 100 high-fidelity logical qubits would be.
🔴 Assuming linear extrapolation. Quantum computing progress is not guaranteed to follow predictable curves. Breakthroughs or fundamental limits could dramatically change timelines.
🔴 Ignoring the overhead problem. Even with below-threshold error correction demonstrated, the overhead to reach cryptographic relevance is enormous. Progress is real but the gap remains vast.
🔴 Dismissing the threat entirely. The physics is proven, progress is real, and "harvest now, decrypt later" is a current threat for long-term secrets. Complacency is as dangerous as panic.
Quantum computing has reached genuine milestones: below-threshold error correction, high-fidelity gates, and scaling to over a thousand physical qubits. These are necessary steps toward CRQC. But they're also early steps.
Analogy: We're at roughly the "first transistor" or "Wright Brothers" stage—proof of concept achieved, but commercial airliners (or cryptographic attacks) require orders of magnitude more development.
The gap from ~1 demonstrated logical qubit to ~2,330 required logical qubits at 10^-10 error rates is enormous. Current systems are like having built a 10-foot prototype airplane and claiming transatlantic flight is imminent. The principles are proven; the engineering remains formidable.
Assignment: Create a comprehensive tracking system for evaluating quantum computing progress toward cryptographic relevance.
Requirements:
- All five qubit technologies (superconducting, trapped ion, photonic, neutral atom, silicon spin)
- Current leaders for each technology (company, system name)
- Key metrics: physical qubits, logical qubits, coherence time, gate fidelity, gate speed, connectivity
- Advantages and disadvantages for Shor's algorithm specifically
- Recent major milestones (2023-2024)
Include citations for all data points.
- Physical qubit counts by technology (trend chart)
- Best demonstrated logical qubit counts
- Two-qubit gate fidelity improvements
- Error suppression factors achieved
- Key milestone timeline
Format as either a spreadsheet or visual infographic.
Calculate the current gap to cryptographic relevance for the two leading technologies:
- Current physical qubit count
- Estimated physical qubits per logical qubit (given current error rates)
- Implied current logical qubit capacity
- Required logical qubits: ~2,330
- Gap factor (required ÷ current)
- Historical scaling rate (qubits per year)
- Naive extrapolation to CRQC (years to close gap at current rate)
- Critical assumptions and limitations of this extrapolation
- Technical accuracy: 30%
- Completeness of comparison: 20%
- Clear visualization: 20%
- Honest gap assessment: 20%
- Quality of sources: 10%
Time Investment: 4-5 hours
Value: This tracker becomes your reference for evaluating any future quantum announcement's relevance to XRPL security.
1. Superposition Understanding:
A qubit in superposition state |ψ⟩ = 0.6|0⟩ + 0.8|1⟩ is measured. What is the probability of measuring the state |0⟩?
A) 60%
B) 36%
C) 64%
D) 80%
Correct Answer: B
Explanation: The probability of measuring each state is the square of its amplitude (|amplitude|²). For |0⟩: P(0) = |0.6|² = 0.36 = 36%. For |1⟩: P(1) = |0.8|² = 0.64 = 64%. Note that 0.36 + 0.64 = 1, as probabilities must sum to 1. Option A confuses the amplitude (0.6) with the probability (0.36). The amplitude is the square root of probability, not the probability itself.
2. Technology Trade-offs:
Which statement BEST describes the fundamental trade-off between superconducting and trapped ion qubits?
A) Superconducting has more qubits and higher fidelity; trapped ions are slower and less accurate
B) Superconducting has faster gates but shorter coherence times; trapped ions have longer coherence but slower gates
C) Trapped ions require cryogenic cooling; superconducting qubits operate at room temperature
D) Both technologies have essentially identical characteristics; only the manufacturing process differs
Correct Answer: B
Explanation: The fundamental trade-off is speed versus coherence. Superconducting qubits perform gate operations in 20-100 nanoseconds (very fast) but decohere in ~100-500 microseconds (relatively short). Trapped ions have gate times of ~1-100 microseconds (slower) but maintain coherence for seconds to minutes (much longer). Option A is incorrect because trapped ions actually have higher fidelity (99.9%+). Option C has the temperature requirements completely backwards—superconducting qubits require ~15 mK (cryogenic), while trapped ions operate with room-temperature chambers.
3. Error Correction Threshold:
Google's Willow demonstrated "below-threshold" error correction with λ = 2.14. What does this mean?
A) The logical error rate has been reduced to exactly zero
B) Adding more physical qubits to the error correction code increases the total error rate
C) Adding more physical qubits to the error correction code decreases the logical error rate exponentially
D) The system no longer requires any error correction
Correct Answer: C
Explanation: "Below threshold" means physical error rates are low enough that quantum error correction works as intended—specifically, that increasing the code distance (adding more physical qubits) exponentially suppresses logical errors rather than making things worse. The λ = 2.14 means each increase in code distance by 2 reduces logical errors by approximately 2.14×. This is exponential suppression—the essential requirement for scalable, fault-tolerant quantum computing. Options A and D are incorrect (errors still exist and error correction is still required). Option B describes above-threshold behavior, which is the opposite of what was demonstrated.
4. Physical vs. Logical Qubits:
IBM's Condor processor has 1,121 physical qubits. Approximately how many error-corrected logical qubits suitable for running Shor's algorithm does this represent?
A) 1,121 logical qubits (same as physical)
B) About 100-200 logical qubits
C) About 10-50 logical qubits
D) Approximately 0-1 logical qubits for Shor's algorithm
Correct Answer: D
Explanation: Current error correction schemes require roughly 1,000-2,000+ physical qubits per logical qubit to achieve the error rates needed for Shor's algorithm (~10^-10 per operation). With 1,121 physical qubits, this yields approximately 0-1 logical qubits suitable for cryptographic attacks. IBM's Condor is designed for near-term quantum applications and research, not fault-tolerant operation of Shor's algorithm. This is precisely why headlines about "1,000+ qubits" don't indicate imminent cryptographic threats—the relevant metric is logical qubits, not physical qubits.
5. Evaluating Quantum Announcements:
A startup announces "revolutionary quantum breakthrough with 99.99% gate fidelity on 50 qubits." An investor should be:
A) Highly concerned—this likely enables breaking XRPL's cryptography within months
B) Skeptical—50 physical qubits at any fidelity can create at most ~1-2 logical qubits, far short of the ~2,330 needed
C) Dismissive—gate fidelity is irrelevant to cryptographic security
D) Confident this is fraudulent—99.99% fidelity is physically impossible
Correct Answer: B
Explanation: Even with perfect 99.99% two-qubit gate fidelity (which is excellent and would be a legitimate achievement), 50 physical qubits would support at most 1-2 logical qubits using current error correction codes. Shor's algorithm for secp256k1 requires ~2,330 logical qubits. The gap between 1-2 and 2,330 means this represents no near-term threat to cryptocurrency security. However, it would represent genuine scientific progress worth monitoring. Option A dramatically overestimates the threat. Option C is false—fidelity is crucial for error correction. Option D is incorrect—99.99% fidelity has been demonstrated in trapped ion systems.
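As a sanity check, the gap in the explanation can be computed directly. The best-case estimate of 2 logical qubits from 50 physical and the ~2,330 requirement are the lesson's figures; the function is just the ratio:

```python
# Gap between a claimed system and Shor-scale requirements, using the
# lesson's figure of ~2,330 logical qubits for secp256k1.
NEEDED_FOR_SECP256K1 = 2330

def shortfall(logical_qubits):
    """How many times more logical qubits a CRQC would need than the system has."""
    return NEEDED_FOR_SECP256K1 / logical_qubits

# Best case for the announcement: 2 logical qubits out of 50 physical.
print(f"shortfall: ~{shortfall(2):.0f}x")  # -> ~1165x
```

Even granting the startup every benefit of the doubt, the system is more than a thousand-fold short—genuine progress, not a cryptographic threat.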
Technical Foundations:
- Google Quantum AI, "Quantum error correction below the surface code threshold" (Nature, December 2024) — The Willow technical paper
- Roetteler et al., "Quantum Resource Estimates for Computing Elliptic Curve Discrete Logarithms" (IACR 2017) — Qubit requirements for ECC
- Gidney & Ekerå, "How to Factor 2048-bit RSA Integers in 8 Hours Using 20 Million Noisy Qubits" (2021) — Physical resource estimates
Technology Comparisons:
- The Quantum Insider, "Harnessing the Power of Neutrality: Comparing Neutral-Atom Quantum Computing With Other Modalities" (2024)
- DARPA QBI Program Announcements (2024-2025) — U.S. government assessment of competing technologies
- IEEE Quantum Week 2024 Workshop Proceedings
Accessible Introductions:
- Preskill, "Quantum Computing in the NISQ Era and Beyond" (2018) — Excellent overview of near-term quantum computing limitations
- Nielsen & Chuang, "Quantum Computation and Quantum Information" — The standard textbook (dense but comprehensive)
For Next Lesson:
Lesson 3 examines Shor's algorithm in depth—not the full mathematical derivation, but how it actually attacks elliptic curve cryptography and exactly what resources it requires. We'll build a concrete model of CRQC requirements for breaking XRPL's secp256k1 signatures.
Instructor Notes:
This lesson builds quantum literacy without requiring physics background. Key pedagogical goals:
- Demystify specifications — Students should leave able to read a quantum computing announcement and identify what matters
- Establish the gap — The difference between physical and logical qubits is crucial; reinforce repeatedly
- Balance optimism and skepticism — Progress is real; threat is distant but not dismissible
- Connect to XRPL — Every technical concept should tie back to cryptocurrency security implications
Common misconceptions to address:
- "More qubits = more dangerous" (physical vs. logical distinction)
- "Quantum computers are just faster classical computers" (fundamentally different)
- "Error correction solves everything" (massive overhead remains)
- "One technology will clearly win" (race is genuinely open)
End of Lesson 2
Total words: ~6,200
Estimated reading time: 55 minutes
Deliverable time: 4-5 hours
Key Takeaways
Qubits exploit superposition and entanglement for exponential parallelism.
Unlike classical bits (definitively 0 or 1), qubits exist in both states simultaneously until measured. This enables algorithms like Shor's to search exponentially large spaces efficiently.
Five main qubit technologies compete:
Superconducting (fast, noisy, most qubits), trapped ion (precise, slow, best quality), photonic (room-temp, probabilistic), neutral atom (scalable, rapidly improving), and silicon spin (semiconductor-compatible). None has demonstrated a clear path to CRQC.
Decoherence is the fundamental challenge.
Qubits lose their quantum properties within microseconds to seconds due to environmental noise. Algorithms need hours. Error correction bridges this gap, but only if physical error rates are below threshold.
Below-threshold error correction is the critical milestone.
Google's Willow demonstrated that adding qubits reduces logical errors—the essential requirement for scaling. But the overhead (thousands of physical qubits per logical qubit) remains enormous.
Physical qubits ≠ cryptographic threat.
The relevant metric is logical qubits with sufficient fidelity to run Shor's algorithm. Current systems: ~1-50 logical qubits. Required for CRQC: ~2,330 logical qubits. Gap: ~50-2,000×.