The Technology of Privacy - What's Actually Possible | Privacy vs. Control in CBDCs | XRP Academy
Beginner · 60 min

The Technology of Privacy - What's Actually Possible

Learning Objectives

Explain the core mechanism of zero-knowledge proofs, blind signatures, and secure multi-party computation in accessible terms

Apply the Technical Privacy Options Matrix to compare different privacy-preserving approaches across privacy level, compliance capability, maturity, and scalability

Assess Technology Readiness Levels for privacy-preserving CBDC features, distinguishing between production-ready, deployment-ready, and theoretical

Evaluate CBDC privacy claims by identifying whether proposed privacy technologies are specified, mature, and scalable

Identify the gap between what privacy-preserving technology enables and what political/institutional constraints will likely permit

Privacy advocates and surveillance advocates both assume a binary choice: either transactions are visible or they aren't. But cryptographic research over the past 40 years has developed techniques that potentially offer a third path—proving things about transactions without revealing the transactions themselves.

Can you prove you're not money laundering without showing your transactions?
Can you prove you paid taxes without revealing your income?
Can you verify compliance without enabling surveillance?

The theoretical answer is: Yes, mostly.
The practical answer is: Sort of, sometimes, at a cost.

This lesson distinguishes:

  • What works today at scale
  • What works in limited deployment
  • What's demonstrated in research
  • What remains theoretical
  • What the cost/performance trade-offs are
  • Why these technologies are rarely implemented despite availability

Understanding these technologies protects you from two traps: dismissing privacy-preserving CBDCs as impossible, and accepting privacy claims without understanding their technical basis.


Zero-Knowledge Proofs

A zero-knowledge proof (ZKP) allows one party to prove to another that a statement is true without revealing any information beyond the truth of that statement.

The Classic Illustration: The Cave Example

THE ALI BABA CAVE PROBLEM:

- A cave with two paths (A and B) that meet at a door
- The door can only be opened with a secret password
- Peggy (prover) claims she knows the password
- Victor (verifier) wants proof without learning the password

1. Victor waits outside the cave entrance
2. Peggy enters, randomly chooses path A or B
3. Victor enters, calls out "Come out via path A" or "Come out via path B"
4. If Peggy knows the password, she can always comply
5. If Peggy doesn't know the password, she can only comply 50% of the time

Repeating the protocol 20 times:

- If Peggy doesn't know the password: probability of passing all rounds = (1/2)^20 ≈ 0.0001%
- If Peggy always succeeds: Victor is convinced she knows the password
- Victor learned NOTHING about the password itself

KEY INSIGHT:
Victor is convinced Peggy knows the password
But Victor cannot prove this to anyone else
And Victor has not learned the password
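The soundness claim can be checked with a quick Monte Carlo sketch in Python (the round and trial counts are illustrative):

```python
import random

# Monte Carlo check of the cave protocol's soundness. A Peggy who does
# NOT know the password commits to a path at random and passes a round
# only if Victor happens to call out that same path (probability 1/2).
def impostor_passes(rounds: int = 20) -> bool:
    return all(random.choice("AB") == random.choice("AB") for _ in range(rounds))

trials = 100_000
cheats = sum(impostor_passes() for _ in range(trials))
# (1/2)**20 is roughly 1 in a million, so cheats is almost surely 0.
print(cheats, "successful cheats in", trials, "trials")
```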

In the CBDC context, ZKPs could prove compliance without revealing transaction details:

AML COMPLIANCE:
Prove:
  • Transaction amounts below reporting threshold, OR
  • Transaction patterns don't match known laundering typologies, OR
  • Counterparties are not on sanctions lists
Without revealing:
  • Actual transaction amounts
  • Specific counterparties
  • Transaction history

TAX COMPLIANCE:
Prove:
  • Actual income does not exceed reported income
  • Deductions are mathematically valid
  • Reported taxes are consistent with transactions
Without revealing:
  • Actual income amount
  • Specific transactions
  • Financial details

SOURCE OF FUNDS:
Prove:
  • Funds originated from legitimate sources
  • Chain of custody is clean
  • No mixing with suspicious funds
Without revealing:
  • Specific source
  • Transaction path
  • Amounts at each step

How ZKPs Actually Work (Simplified)

THE MATHEMATICS (conceptual):

1. STATEMENT TO PROVE: a public claim, e.g. "this transaction is below the reporting threshold"

2. WITNESS (secret): the private data that makes the claim true, e.g. the actual transaction amount

3. COMPUTATION: the prover combines statement and witness to generate a compact proof

4. VERIFICATION: the verifier checks the proof against the public statement alone, without ever seeing the witness

REQUIRED PROPERTIES:

- Completeness: Valid proofs always verify
- Soundness: False proofs (almost) never verify
- Zero-knowledge: Verifier learns nothing except truth of statement
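These steps map directly onto the classic Schnorr identification protocol, an interactive zero-knowledge proof of knowledge of a discrete logarithm. A toy sketch with deliberately tiny parameters (real deployments use roughly 256-bit groups and non-interactive variants):

```python
import random

# Toy Schnorr identification protocol: Peggy proves knowledge of x with
# y = g^x (mod p) without revealing x.
p, q, g = 23, 11, 2        # g generates a subgroup of prime order q mod p

x = 7                      # Peggy's secret witness
y = pow(g, x, p)           # public statement: "I know x such that y = g^x"

def prove_once() -> bool:
    r = random.randrange(q)         # Peggy: random nonce
    t = pow(g, r, p)                # Peggy -> Victor: commitment
    c = random.randrange(q)         # Victor -> Peggy: random challenge
    s = (r + c * x) % q             # Peggy -> Victor: response
    # Victor's check: g^s == t * y^c (mod p). Passing consistently is
    # only feasible if the response was built from the witness x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# As with the cave, confidence comes from repetition; Victor still
# learns nothing about x itself.
assert all(prove_once() for _ in range(20))
print("proof accepted")
```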

Current ZKP Implementations

PRODUCTION SYSTEMS:

Zcash:
- Full transaction privacy via ZKPs (zk-SNARKs)
- Deployed since 2016
- Proves: Valid transaction without revealing amounts/parties
- Trade-off: High computational cost

ZK-rollups (layer-2 scaling):
- Uses ZKPs for scalability
- Proves: Batch of transactions correctly executed
- Trade-off: Privacy is secondary to scalability

Proof-of-personhood identity systems:
- ZKPs for identity verification
- Proves: Person is unique human
- Trade-off: Controversial biometric collection

LESSONS:
- ZKPs work at scale for specific use cases
- Computational cost is significant but manageable
- Specialized expertise required for implementation
- Not "drop-in" solutions—require purpose-built systems

Honest Assessment of ZKP Limitations

COMPUTATIONAL COST:

Proof generation:
- zk-SNARKs: Seconds to minutes per proof
- zk-STARKs: Seconds to minutes per proof
- Modern optimizations improving rapidly
- Still significant vs. simple signature

Verification:
- Generally fast (milliseconds)
- But scales with number of verifications
- At CBDC scale: Billions of verifications daily

Proof size:
- zk-SNARKs: 100-300 bytes (compact)
- zk-STARKs: 10-100 KB (larger)
- Blockchain storage implications

---

TRUSTED SETUP PROBLEM:

  • Initial parameters generated by trusted party
  • If setup compromised, false proofs possible
  • "Powers of tau" ceremonies attempt to distribute trust
  • Newer systems (STARKs) avoid this but with trade-offs

COMPLEXITY LIMITATIONS:

ZKPs handle well:
  • Mathematical statements (sums, thresholds, set membership)
  • Deterministic computations

ZKPs handle poorly:
  • Subjective judgments ("suspicious activity")
  • Dynamic rule changes
  • Human-interpretable compliance


REGULATORY ACCEPTANCE:

Question: Will regulators accept cryptographic compliance?

  • Some regulators exploring
  • Most prefer visible compliance
  • "Trust the math" is hard political sell
  • Liability questions unresolved

Blind Signatures

A blind signature allows an authority to sign a message without seeing its content—like signing a document through carbon paper in a sealed envelope.

How Blind Signatures Work

THE PROTOCOL (conceptual):

1. PREPARATION: the user multiplies the token with a random blinding factor, hiding its value from the signer

2. SIGNING: the authority signs the blinded token without seeing the original

3. UNBLINDING: the user divides out the blinding factor, leaving a valid signature on the original token

4. SPENDING: anyone can verify the signature with the authority's public key, but the authority cannot link it to the signing event

KEY PROPERTY:
Central bank signs valid currency
Central bank cannot trace how it's spent
Currency is anonymous like cash but digital
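The carbon-paper analogy can be made concrete with Chaum's original RSA construction. A toy sketch with textbook-sized numbers for readability only; real systems use large keys plus hashing and padding:

```python
import random
from math import gcd

# Toy RSA blind signature (Chaum, 1982).
p_, q_ = 61, 53
n = p_ * q_                              # public modulus (3233)
e = 17                                   # public exponent
d = pow(e, -1, (p_ - 1) * (q_ - 1))      # bank's private exponent

token = 1234                             # user's coin serial number

# 1. PREPARATION: blind the token with random r coprime to n.
while True:
    r = random.randrange(2, n)
    if gcd(r, n) == 1:
        break
blinded = (token * pow(r, e, n)) % n

# 2. SIGNING: the bank signs without ever seeing `token`.
blind_sig = pow(blinded, d, n)

# 3. UNBLINDING: dividing out r leaves token^d mod n, a real signature.
sig = (blind_sig * pow(r, -1, n)) % n

# 4. SPENDING: anyone verifies with the bank's public key (e, n).
assert pow(sig, e, n) == token
print("signature valid, bank never saw the token")
```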

eCash: The Original CBDC Privacy Proposal

David Chaum invented blind signatures in 1982 specifically for electronic cash:

CHAUM'S ECASH SYSTEM:

1. User withdraws $100 from bank
2. Bank signs blinded tokens (100 × $1 tokens)
3. User unblinds tokens
4. User spends tokens at merchants
5. Merchants deposit tokens at bank
6. Bank verifies signatures, credits merchants
7. Bank cannot link withdrawal to spending

PRIVACY PROPERTY:
- Bank knows user withdrew $100
- Bank knows merchant received $100
- Bank CANNOT link these events
- Transaction is truly anonymous

WHY ECASH FAILED COMMERCIALLY:
- Required special software
- Users didn't value privacy enough
- Merchant adoption difficult
- Bank had no incentive to offer it

THE LESSON:
- Technical solution existed 40 years ago
- Privacy was possible all along
- Political/economic barriers, not technical ones

GNU Taler: CBDC-Ready Privacy Technology

SYSTEM DESIGN:

- Exchange (could be central bank) issues coins
- Blind signature withdrawal
- Merchant deposits reveal amount but not payer
- Buyer privacy, merchant transparency

PRIVACY MODEL:
- Buyers: Fully anonymous (like cash)
- Merchants: Fully identified (for tax compliance)
- Exchange: Sees merchant revenue, not buyer spending

COMPLIANCE CAPABILITIES:
- Double-spend prevention
- Income tracking (merchant deposits visible)
- Tax enforcement on businesses
- No transaction surveillance

STATUS:
- Open source, available now
- Limited pilot deployments
- No major central bank adoption
- Technically mature, politically ignored

WHY CENTRAL BANKS HAVEN'T ADOPTED IT:
- Reduces their visibility
- Shifts compliance to merchants only
- "Cash-like" scares regulators
- Political preference for visibility

Honest Assessment of Blind Signatures

WHAT BLIND SIGNATURES SOLVE:
✓ Transaction unlinkability (central bank can't trace)
✓ Buyer privacy (like cash)
✓ Double-spend prevention (cryptographic)
✓ Merchant compliance (deposits visible)

WHAT BLIND SIGNATURES DON'T SOLVE:
✗ Large cash transaction reporting (no amount visibility)
✗ Pattern analysis (can't see patterns you can't link)
✗ Source of funds (withdrawal visible but not prior history)
✗ AML on buyer side (only merchants visible)

Political challenges:
  • "We can't see anything" is politically difficult
  • Relies entirely on merchant-side compliance
  • Buyers could be terrorists, tax evaders, criminals
  • How do you investigate without visibility?

Practical challenges:
  • Offline payments more complex
  • Token denomination creates usage patterns
  • Storage requirements for large values
  • User responsibility for token security

Tiered Anonymity Systems

Instead of binary privacy (all or nothing), tiered systems offer different privacy levels for different transaction sizes or types.

The Tier Structure

TIERED PRIVACY MODEL:

TIER 1: Cash-Like (small transactions)
├─ Amount threshold: $0-200 per transaction
├─ Identity requirement: None
├─ Visibility: None (even to central bank)
├─ Mechanism: Token-based or blind signature

TIER 2: Pseudonymous (medium transactions)
├─ Amount threshold: $200-5,000 per transaction
├─ Identity requirement: Basic verification
├─ Visibility: Transaction data without full identity
└─ Mechanism: Account-based with limited disclosure

TIER 3: Identified (large transactions)
├─ Amount threshold: $5,000+ per transaction
├─ Identity requirement: Full KYC/AML
├─ Visibility: Complete transaction and identity data
└─ Mechanism: Standard banking visibility

DESIGN VARIATIONS:
- Thresholds may be per-transaction or cumulative
- Time windows (e.g., monthly limits per tier)
- Activity-based escalation (unusual patterns → higher tier)
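The tier logic above can be sketched as a classification function. The thresholds mirror the example figures in the model; the cumulative rule is an illustrative assumption, not any central bank's actual parameters:

```python
from dataclasses import dataclass

# Illustrative tier classifier for the three-tier model above.
@dataclass
class Wallet:
    monthly_total: float = 0.0    # rolling cumulative spend in the window

def required_tier(amount: float, wallet: Wallet) -> int:
    """Return the minimum tier (1-3) this transaction must clear."""
    cumulative = wallet.monthly_total + amount
    if amount >= 5_000 or cumulative >= 5_000:
        return 3                  # full KYC/AML identification
    if amount >= 200:
        return 2                  # pseudonymous, limited disclosure
    return 1                      # cash-like, no identity requirement

w = Wallet()
print(required_tier(150, w))      # 1: cash-like
print(required_tier(800, w))      # 2: pseudonymous
print(required_tier(9_000, w))    # 3: fully identified
```

Note how the cumulative check implements the "per-transaction or cumulative" variation: a wallet that has already spent heavily is escalated even for a small payment.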

Digital Euro Approach

ECB PROPOSED TIERS:

Low-value tier:
- Amount: €0-150 per transaction
- Daily limit: €300
- Identity: Device-linked but not identified
- Visibility: None at transaction time
- Later audit: Possible but not automatic

Medium-value tier:
- Amount: €150-1,000
- Identity: Basic account verification
- Visibility: Payment service provider sees
- ECB visibility: Aggregate statistics only

Large-value tier:
- Amount: €1,000+
- Identity: Full verification
- Visibility: Standard AML/KYC applies
- ECB visibility: As needed for policy

Holding limits:
- Maximum holding: €3,000 (proposed)
- Prevents bank disintermediation
- Limits privacy for wealthy users

ASSESSMENT:
More privacy than current banking
Less privacy than cash
Political compromise position

eCNY Approach (China)

CHINA'S "CONTROLLABLE ANONYMITY":

Wallet Tiers Based on Verification:

Lowest-verification wallet:
  • Registration: Phone number only
  • Daily limit: ¥5,000 (~$700)
  • Balance limit: ¥10,000 (~$1,400)
  • Anonymity: "To counterparty only"
  • Central bank: FULL VISIBILITY

Second tier:
  • Registration: Name + ID number
  • Higher limits
  • Central bank: FULL VISIBILITY

Third tier:
  • Registration: Bank account linkage
  • Even higher limits
  • Central bank: FULL VISIBILITY

Highest-verification wallet:
  • Registration: In-person bank verification
  • Highest limits
  • Central bank: FULL VISIBILITY

KEY INSIGHT:
"Anonymity" means counterparties don't see each other
NOT that government doesn't see everything
Tiers are about LIMITS, not PRIVACY
Central bank has full visibility at all tiers

Problems with Tiered Approaches

STRUCTURING/SMURFING:
Problem: Users split transactions to stay in low-visibility tiers
Example: $10,000 payment as 50 × $200 transactions
Detection: Requires pattern analysis (which requires visibility)
Tension: Preventing gaming requires the very surveillance tiers are meant to avoid

Enforcement options each carry costs:
  • Device tracking? Privacy concern
  • Connectivity requirement? Not truly offline
  • Honor system? Will be gamed

THRESHOLD GAMING:
$4,999 transactions to avoid $5,000 reporting
Pattern visible to system but "compliant"
Creates enforcement dilemmas
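A minimal detection heuristic for the structuring pattern above, which also illustrates the core tension: running the check at all requires linkable visibility into the low-privacy tier. The window size and threshold are illustrative:

```python
from collections import defaultdict

THRESHOLD = 5_000   # reporting threshold; illustrative figure

def flag_structuring(txs, window_days=7):
    """txs: iterable of (day, payer, payee, amount); returns flagged pairs."""
    flagged = set()
    recent = defaultdict(list)   # (payer, payee) -> [(day, amount)] in window
    for day, payer, payee, amount in txs:
        if amount >= THRESHOLD:
            continue             # above-threshold payments are reported normally
        key = (payer, payee)
        recent[key] = [(d, a) for d, a in recent[key] if day - d < window_days]
        recent[key].append((day, amount))
        # Flag when sub-threshold payments to one payee sum past the threshold.
        if sum(a for _, a in recent[key]) >= THRESHOLD:
            flagged.add(key)
    return flagged

# The lesson's example: a $10,000 payment split into 50 x $200 transfers.
txs = [(1, "alice", "bob", 200)] * 50
print(flag_structuring(txs))     # {('alice', 'bob')}
```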

TIER SHOPPING:
Multiple identities to access multiple tier limits
Identity verification required to prevent
But verification reduces tier 1 privacy

POLITICAL PRESSURE ON THRESHOLDS:
Initial thresholds: "€3,000 holding limit"
Over time: "We need to lower it for [reason]"
Tiered privacy erodes toward surveillance


Secure Multi-Party Computation

Secure multi-party computation (MPC) allows multiple parties to jointly compute a function over their inputs while keeping inputs private.

How MPC Works

SIMPLE EXAMPLE: Millionaires' Problem

Setup:
  • Alice and Bob want to know who is richer
  • Neither wants to reveal their wealth

Without MPC:
  • One reveals to the other (privacy lost)
  • Trusted third party (privacy to third party)
  • No solution without some disclosure

With MPC:
  • Alice and Bob engage in cryptographic protocol
  • Protocol outputs only: "Alice" or "Bob" (who is richer)
  • Neither learns the other's actual wealth
  • No third party needed

KEY PROPERTY:
Compute useful function
Without revealing individual inputs
Result is correct and private

Potential MPC Applications

AGGREGATE STATISTICS:
Compute total spending in a sector
Without revealing individual transactions
Central bank sees: "Consumer spending up 3%"
Not who spent what

THRESHOLD DETECTION:
Multiple entities jointly determine if threshold crossed
No single entity sees complete picture
Example: AML alert if pattern spans multiple banks

PRIVACY-PRESERVING AUDIT:
Verify compliance across institutions
Without centralizing transaction data
Audit result: "Compliant" or "Issue detected"
Not transaction-level visibility

WATCHLIST MATCHING:
Check if counterparty is on sanctions list
Without revealing complete sanctions list
Or complete customer list
Result: Match or no match only
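The aggregate-statistics application can be sketched with additive secret sharing, one of the simplest MPC building blocks. The party names and spending figures here are made up for illustration:

```python
import random

# Additive secret sharing: each bank splits its private figure into
# random shares that sum to the value mod a large modulus. No single
# share reveals anything about an input, yet combining the per-party
# partial sums yields the correct aggregate.
MOD = 2**61 - 1

def share(value, n=3):
    parts = [random.randrange(MOD) for _ in range(n - 1)]
    parts.append((value - sum(parts)) % MOD)
    return parts

# Hypothetical per-institution consumer-spending figures.
spending = {"bank_a": 1_200, "bank_b": 3_400, "bank_c": 900}
shares = {bank: share(v) for bank, v in spending.items()}

# Party i sums the i-th share it received from every institution...
partials = [sum(s[i] for s in shares.values()) % MOD for i in range(3)]

# ...and publishing only the partials reveals just the total.
total = sum(partials) % MOD
print(total)    # 5500
```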

Honest Assessment

MATURITY LEVEL:

Research: Mature (decades of development)
Protocols: Many designs, some standardized
Libraries: Available (MP-SPDZ, SCALE-MAMBA, etc.)
Production: Very limited deployment
CBDC scale: Unproven

PERFORMANCE:
- Orders of magnitude slower than direct computation
- Communication overhead between parties
- Setup costs for each computation
- Not suitable for every transaction

GOOD FIT:
- Aggregate statistics (compute once, serve many)
- Threshold alerts (check periodically, not per-transaction)
- Audits (infrequent, high-value computations)

POOR FIT:
- Per-transaction privacy (too slow)
- Real-time pattern detection
- High-frequency operations

REGULATORY ACCEPTANCE:
- Nascent (most regulators don't understand it)
- "I can't see the data" concern
- Liability for incorrect computations unclear

---

We need a structured way to compare privacy-preserving approaches:

TECHNICAL PRIVACY OPTIONS MATRIX

CRITERIA:

  1. PRIVACY LEVEL (for users)

  2. COMPLIANCE CAPABILITY (for regulators)

  3. MATURITY (implementation readiness)

  4. SCALABILITY (transactions per second, cost)

Matrix Application

APPROACH             | PRIVACY | COMPLIANCE | MATURITY   | SCALABILITY
---------------------|---------|------------|------------|------------
Full Transparency    | None    | Full       | Production | Excellent
Account-Based+ACL    | Low     | High       | Production | Good
Tiered Anonymity     | Medium  | Medium     | Pilot      | Adequate
Blind Signatures     | V.High  | Low        | Deployment | Adequate
ZKP Compliance       | High    | Medium     | Research   | Limited
MPC Aggregates       | Medium  | Medium     | Research   | Poor

OBSERVATIONS:

  1. Privacy and Compliance trade off

  2. Maturity inversely correlates with privacy

  3. Scalability challenges for privacy

Detailed Readiness Assessment

PRODUCTION READY (TRL 9) - Deployable now:
┌─────────────────────────────────────────────────────┐
│ • Full transaction visibility                        │
│ • Account-based with access controls                │
│ • Standard encryption for data at rest/transit      │
│ • Role-based access controls                        │
│ • Audit logging                                     │
└─────────────────────────────────────────────────────┘

DEPLOYMENT READY (TRL 7-8) - Near-term feasible:
┌─────────────────────────────────────────────────────┐
│ • Offline token payments (limited) │
│ • Hardware security modules for key storage │
│ • Tiered identity verification │
│ • Selective disclosure for specific use cases │
│ • Pseudonymous accounts with breakable privacy │
└─────────────────────────────────────────────────────┘

PILOT READY (TRL 5-6) - Requires more development:
┌─────────────────────────────────────────────────────┐
│ • ZK proofs for simple compliance (threshold checks)│
│ • Blind signature variants (GNU Taler-style) │
│ • MPC for aggregate statistics │
│ • Privacy-preserving watchlist matching │
└─────────────────────────────────────────────────────┘

RESEARCH PHASE (TRL 3-4) - Significant work needed:
┌─────────────────────────────────────────────────────┐
│ • ZK proofs for complex compliance (AML patterns) │
│ • Scalable MPC for transaction-level privacy │
│ • Post-quantum privacy-preserving protocols │
│ • Revocable anonymity with strong guarantees │
└─────────────────────────────────────────────────────┘

THEORETICAL (TRL 1-2) - May never be practical:
┌─────────────────────────────────────────────────────┐
│ • Perfect privacy with full compliance │
│ • Untraceable CBDC with AML effectiveness │
│ • Decentralized CBDC architecture │
│ • Privacy preserved against quantum adversaries │
└─────────────────────────────────────────────────────┘


The technology exists. Why isn't it used?

Political Economy of Privacy Technology

WHO WOULD DEPLOY PRIVACY TECHNOLOGY?
Central banks:
  • Benefit from visibility (policy, stability)
  • Accountable for AML compliance
  • Risk-averse, prefer proven approaches
  • No competitive pressure for privacy

Law enforcement:
  • Benefit from visibility (enforcement)
  • Career risk from undetected crimes
  • Don't understand cryptographic compliance

Governments:
  • Benefit from visibility (tax revenue)
  • Fear "soft on crime" attacks
  • Voters don't demand privacy (revealed preference)

Commercial payment providers:
  • Benefit from customer data
  • Compliance costs for novel technology
  • Regulatory uncertainty

WHO WOULD DEMAND PRIVACY TECHNOLOGY?

Citizens:
  • State a preference for privacy
  • Reveal a preference for convenience
  • Don't understand technical options
  • Diffuse interest, poor organizing

Privacy advocates and technologists:
  • Strong preference for privacy
  • Limited political influence
  • Technical expertise not matched by political access

RESULT:
All actors with implementation power prefer visibility
Actors preferring privacy lack implementation power
Gap between technical possibility and political reality

Conditions for Privacy Technology Adoption

POTENTIAL CATALYSTS:

Privacy scandal:
- High-profile CBDC surveillance abuse
- Public outrage driving political demand
- "We need to fix this" moment

Competitive pressure:
- Privacy-preserving alternatives gain adoption
- CBDCs lose market share to crypto
- Central banks forced to compete on privacy

Court rulings:
- Constitutional challenges to CBDC surveillance
- Privacy ruled fundamental right
- Forced architectural changes

International standards:
- Privacy-preserving CBDC becomes standard
- Countries must comply to interoperate
- Race to the top instead of bottom

Political leadership:
- Leader makes privacy priority
- Public mandate for privacy-preserving CBDC
- Resources dedicated to implementation

HONEST ASSESSMENT:
No strong catalyst currently visible
Path of least resistance remains surveillance
Privacy technology remains available but unused

Due Diligence Questions

WHEN A CBDC CLAIMS "PRIVACY":

1. WHAT TECHNOLOGY?
- Is specific privacy technology named?
- Is it mature enough for deployment?
- Have they demonstrated it working?
- If vague: Likely marketing, not architecture

2. WHAT SCOPE?
- Privacy for which transactions?
- Privacy from whom?
- What exceptions exist?
- "Privacy" often means "privacy from merchants" not "privacy from government"

3. WHAT VERIFICATION?
- How can claims be verified?
- Is code open source?
- Independent security audit?
- Without verification: Trust-based, not technical

4. WHAT TIMELINE?
- Privacy features available at launch?
- Or "planned for future versions"?
- Roadmap specificity?
- Delayed privacy often means no privacy

5. WHAT POLITICAL COMMITMENT?
- Legislative backing for privacy?
- Constitutional protection?
- Just central bank policy (easily changed)?
- Political commitment predicts durability

RED FLAGS:
- "Privacy by design" without technical specification
- "Balanced approach" without defining balance
- "User choice" without default privacy
- "Similar to cash" without blind signatures or equivalent

---

Privacy-preserving technology exists and works. Zero-knowledge proofs, blind signatures, and related technologies are mathematically proven and have been implemented in production systems (Zcash, GNU Taler, etc.). The technology is real.

Trade-offs exist between privacy and compliance. No approach delivers both maximal privacy and full compliance capability. Every design choice involves trade-offs. This is fundamental, not a solvable engineering problem.

Maturity varies significantly across approaches. Full transparency is production-ready; privacy-preserving alternatives range from deployment-ready (blind signatures) to theoretical (perfect privacy + full compliance). The maturity gap is real.

Political economy favors surveillance over privacy. All actors with deployment power benefit from visibility; those preferring privacy lack power. This explains the gap between technical possibility and actual implementation.

⚠️ Whether scalability challenges will be solved. ZKPs and MPC have performance limitations. Whether engineering improvements will make them CBDC-scale practical in 5-10 years is uncertain.

⚠️ Whether regulators will accept cryptographic compliance. The technical capability to prove compliance without revealing data exists, but regulatory acceptance is unproven. Cultural and legal barriers may be insurmountable.

⚠️ Whether any catalyst will shift the political economy. Privacy scandals, competitive pressure, or court decisions could change the dynamic, but none is currently visible on the horizon.

⚠️ Whether tiered approaches will actually protect privacy. Theoretical tier designs look privacy-preserving; actual implementations (eCNY) show "tiers" can mean limits without privacy. Implementation matters more than design documents.

🔴 The gap between possibility and implementation is growing. Privacy technology has advanced significantly; CBDC designs have not incorporated it. The gap suggests political choice, not technical limitation.

🔴 "Privacy" claims are often misleading. Many CBDC proposals claim privacy without specifying mechanisms. When examined closely, "privacy" often means privacy from other users, not from the state.

🔴 Default settings drive outcomes. Even when privacy options exist, defaults determine usage. If surveillance is default and privacy requires action, most users will be surveilled.

🔴 First-mover CBDCs set expectations. eCNY's "controllable anonymity" (surveillance with user-facing opacity) may become the model, foreclosing privacy-preserving alternatives.

The technology to build privacy-preserving CBDCs exists. Blind signatures are 40 years old; ZKPs are mature for many use cases; tiered approaches are technically straightforward. The barrier is not technical—it's political and institutional.

Understanding this is crucial: when evaluating CBDC privacy claims, the question is not "Is privacy possible?" (it is) but "Is privacy chosen?" (usually not). The Technical Privacy Options Matrix and Technology Readiness Levels help distinguish genuine privacy architectures from marketing language.

The most likely outcome is CBDCs with surveillance architecture and privacy as marketing claim, not technical reality. Privacy-preserving CBDCs are possible but would require deliberate political choice against institutional interests. Betting on this choice is optimistic given current trajectory.


Assignment: Evaluate one privacy-preserving technology's applicability to CBDCs, including honest assessment of current limitations and realistic deployment timeline.

Choose one of the following technologies:

  1. Zero-knowledge proofs (zk-SNARKs or zk-STARKs)
  2. Blind signatures (Chaumian eCash model)
  3. Secure multi-party computation
  4. Tiered anonymity systems

Requirements:

Explain the technology:
  • Explain how the technology works (accessible but accurate)
  • Describe what privacy properties it provides
  • Identify what it proves/enables without revealing

Apply it to CBDCs:
  • How would this technology apply to CBDC design?
  • What specific privacy features would it enable?
  • What compliance capabilities would remain?
  • Provide a concrete use case scenario

Assess limitations honestly:
  • Current scalability constraints (quantified if possible)
  • Maturity level (using TRL framework)
  • Regulatory acceptance challenges
  • Implementation complexity

Assess deployment realism:
  • Realistic timeline for CBDC-scale deployment
  • What would need to happen for adoption?
  • Probability assessment of actual implementation
  • Comparison to likely actual CBDC trajectory

Grading:
  • Technical accuracy (25%)
  • Honest limitation assessment (25%)
  • Realistic deployment analysis (25%)
  • Quality of CBDC application analysis (15%)
  • Clear communication of complex concepts (10%)

Time investment: 4-5 hours
Value: This analysis methodology applies to any privacy technology claim you'll encounter. Building the skill to assess technical privacy claims protects against marketing language.

Submission format: Document of 2,000-2,500 words


Further Reading

  • Ben-Sasson et al., "SNARKs for C: Verifying Program Executions Succinctly" - Technical foundation
  • Zcash Technical Documentation - Production ZKP implementation
  • ZK Podcast - Accessible explanations
  • Chaum, David, "Blind Signatures for Untraceable Payments" (1982) - Original paper
  • GNU Taler Documentation - Modern implementation
  • Chaum, Grothoff, Moser, "How to Issue a Central Bank Digital Currency" (2021)
  • BIS, "Privacy in CBDC Systems" Working Paper
  • ECB, "Digital Euro Privacy Analysis"
  • Bank of England, "CBDC Technology Considerations"
  • Electric Coin Company, "Zcash Privacy Technology"
  • Signal Protocol Documentation - Related privacy technology
  • Academic surveys of privacy-preserving computation
  • EFF, "Privacy Concerns in CBDC Design"
  • Various central bank working papers on privacy trade-offs
  • Academic criticism of CBDC privacy claims

For Next Lesson:
In Lesson 5, we examine the legal and constitutional frameworks for financial privacy across jurisdictions. Can legal protections succeed where technology alone hasn't? We'll assess whether rights-based approaches offer durable protection or merely reflect current political preferences.


End of Lesson 4

Total words: ~6,600
Estimated completion time: 60 minutes reading + 4-5 hours for deliverable

Key Takeaways

1. Privacy-preserving technology exists and is proven.

Zero-knowledge proofs can prove compliance without revealing transactions. Blind signatures enable cash-like digital currency. These aren't theoretical—they've been implemented. The barrier to private CBDCs is political, not technical.

2. Every approach involves trade-offs between privacy and compliance.

The Technical Privacy Options Matrix shows no approach delivers both maximal privacy and full compliance. Understanding this trade-off is essential for evaluating any CBDC design.

3. Maturity varies dramatically across approaches.

Full surveillance is production-ready; privacy-preserving alternatives range from deployment-ready (blind signatures) to research-phase (scalable ZKP compliance). The path of least resistance leads to surveillance.

4. Political economy explains the implementation gap.

All actors with deployment power benefit from visibility; those who prefer privacy lack power. This is why 40-year-old privacy technology remains undeployed in mainstream payment systems.

5. Evaluating privacy claims requires technical specificity.

When CBDCs claim "privacy," ask: what technology, what scope, what verification, what timeline, what political commitment? Vague claims usually mean surveillance with marketing language.