Privacy and Programmable Money | Future of Programmable Money | XRP Academy
Beginner · 55 min

Privacy and Programmable Money

Learning Objectives

Analyze why programmability fundamentally requires visibility

Evaluate the surveillance capabilities that programmable money enables

Assess privacy-preserving technologies (ZK proofs, TEEs, selective disclosure)

Compare different privacy design philosophies across implementations

Develop frameworks for appropriate privacy/programmability tradeoffs

Programmable money has a core requirement: to enforce conditions, the system must see data.

Consider a simple rule: "Allow transactions under $500 only."

To enforce this, the system must see the transaction amount. If the system can't see the amount, it can't check if the amount is under $500. The rule cannot execute.
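This dependency can be made concrete in code. The sketch below (a hypothetical `allow` function, not any real system's API) shows that the rule is a comparison, and a comparison cannot execute without reading the amount in plaintext:

```python
# Minimal sketch: enforcing "allow transactions under $500 only".
# The function name and signature are illustrative, not a real API.
def allow(amount: float, limit: float = 500.0) -> bool:
    """Enforce the rule; the comparison requires seeing `amount`."""
    return amount < limit   # no plaintext amount, no check, no rule

assert allow(499.99)        # permitted
assert not allow(500.00)    # blocked
```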

This is not a technical limitation to be overcome—it's inherent to programmability. Conditions require information. Privacy hides information. They are in tension.
Systems resolve this tension at different points along a spectrum:

  • Full visibility (China's e-CNY): All data visible to central bank
  • Pseudonymity (most blockchains): Transactions visible, identities obscured
  • Selective disclosure (EU Digital Euro proposals): Limited data to limited parties
  • Privacy-preserving (ZK proofs): Prove conditions without revealing data

Each tradeoff has implications. This lesson explores them.


What Programmable Money Can See

Per-transaction data:

  • Sender identity
  • Recipient identity
  • Amount
  • Time
  • Location
  • Category (merchant type)
  • Item-level detail (potentially)
  • Associated data (invoices, contracts)

What can be inferred over time:

  • Complete purchase history
  • Movement patterns
  • Social graph (who pays whom)
  • Lifestyle indicators
  • Political activity (donations)
  • Health indicators (pharmacy purchases)
  • Financial status (income, spending patterns)

Legitimate state uses:

  • Tax enforcement (income visible)
  • Fraud detection (pattern analysis)
  • Benefits verification (automatic)
  • Sanctions enforcement (instant)

Concerning state uses:

  • Surveillance of citizens
  • Political targeting
  • Social control
  • Retroactive investigation

Legitimate commercial uses:

  • Fraud prevention
  • Better service personalization
  • Efficient operations

Concerning commercial uses:

  • Behavioral manipulation
  • Price discrimination
  • Data monetization
  • Privacy exploitation
| Level | Visibility | Example |
|-------|------------|---------|
| 0 | None | Physical cash |
| 1 | Aggregate only | Anonymous CBDC (theoretical) |
| 2 | Pseudonymous | Most cryptocurrencies |
| 3 | Tiered by amount | EU Digital Euro proposal |
| 4 | Full to issuer | China e-CNY |
| 5 | Full and shared | Authoritarian implementation |
Where current systems sit:

  • Most blockchains: Level 2 (pseudonymous)
  • Most CBDC proposals: Level 3-4
  • China e-CNY: Level 4
  • Privacy coins (Zcash, Monero): Attempting Level 1-2

Capability ≠ Use:
Having the ability to surveil doesn't mean active surveillance occurs. But:

  • Crisis justifies "temporary" measures
  • Fraud justifies "necessary" monitoring
  • Terrorism justifies "targeted" surveillance
  • Crime justifies "reasonable" access

Mission creep is a historical pattern:

Step 1: Surveillance for terrorism
Step 2: Surveillance for organized crime
Step 3: Surveillance for tax evasion
Step 4: Surveillance for regulatory compliance
Step 5: General surveillance capability

Zero-Knowledge Proofs

Concept:
Prove a statement is true without revealing the underlying data.

Example:

```
Statement: "My balance is greater than the transaction amount"
Proof: Mathematical proof that the statement is true
Revealed: Nothing about the actual balance

Statement: "I am over 18"
Proof: Mathematical proof from a verified credential
Revealed: Nothing about the actual age
```

Application to programmable money:

```
Rule: "Only allow transactions under $1000"
Traditional: System sees amount, checks < $1000
ZK: User proves amount < $1000 without revealing amount

Result: Rule enforced, privacy preserved
```
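A runnable illustration of the core idea is the classic Schnorr protocol: prove knowledge of a secret without revealing it. The parameters below are a deliberately tiny demonstration group chosen for readability, not security; real range proofs over transaction amounts require dedicated libraries (e.g., Bulletproofs-style systems).

```python
# Toy non-interactive Schnorr proof (Fiat–Shamir): prove knowledge of a
# secret x with y = g^x mod p, without revealing x. Demonstration-sized
# group only — far too small for real use.
import hashlib
import secrets

p = 2879   # safe prime: p = 2q + 1
q = 1439   # prime order of the subgroup generated by g
g = 4      # generator of the order-q subgroup (a quadratic residue)

def prove(x: int) -> tuple:
    """Return (y, c, s): public value plus a proof of knowledge of x."""
    y = pow(g, x, p)
    k = secrets.randbelow(q - 1) + 1   # ephemeral nonce
    r = pow(g, k, p)                   # commitment
    c = int.from_bytes(hashlib.sha256(f"{g}:{y}:{r}".encode()).digest(), "big") % q
    s = (k + c * x) % q                # response
    return y, c, s

def verify(y: int, c: int, s: int) -> bool:
    """Check the proof without ever seeing x."""
    r = (pow(g, s, p) * pow(y, -c, p)) % p   # g^s * y^-c = g^k
    c2 = int.from_bytes(hashlib.sha256(f"{g}:{y}:{r}".encode()).digest(), "big") % q
    return c == c2

secret = 777                 # never leaves the prover
assert verify(*prove(secret))   # verifier learns nothing about 777
```

The same prove/verify pattern underlies ZK range proofs: the verifier checks a condition (here, knowledge of a discrete log; in payments, "amount < $1000") while the witness stays hidden.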

Limitations:

  • Computationally expensive (improving, but still significant)
  • Only works for predefined proofs
  • Doesn't support arbitrary programmability
  • Requires sophisticated cryptographic infrastructure

Homomorphic Encryption

Concept:
Perform computation on encrypted data, producing encrypted results.

How it works:

```
Encrypt(A) + Encrypt(B) = Encrypt(A + B)

Balance check: Encrypt(balance) - Encrypt(amount) = Encrypt(result)
If Encrypt(result) > Encrypt(0): Allowed
System never sees balance or amount in plaintext
```

Limitations:

  • Extremely computationally expensive
  • Only supports specific operations
  • Not practical at scale for complex programmability
  • Mostly academic at this point
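The additive property can be demonstrated with a demo-sized Paillier cryptosystem, the standard example of additively homomorphic encryption. The primes below are toy parameters (real keys are 2048-bit). Note that the comparison step in the pseudocode above, Encrypt(result) > Encrypt(0), is exactly what plain additive schemes cannot do, which is part of why the approach stays limited:

```python
# Minimal additively homomorphic Paillier sketch: multiplying ciphertexts
# decrypts to the SUM of plaintexts, so totals can be computed without
# the system ever seeing the amounts. Demo-sized primes only.
import math
import secrets

p, q = 1000003, 1000033        # small demo primes (illustrative only)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)           # valid because we fix the generator g = n + 1

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 2) + 2       # random blinding factor
    while math.gcd(r, n) != 1:             # must be coprime with n
        r = secrets.randbelow(n - 2) + 2
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

balance, amount = 2500, 400
enc_total = (encrypt(balance) * encrypt(amount)) % n2   # homomorphic add
assert decrypt(enc_total) == balance + amount           # plaintexts never exposed
```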

Trusted Execution Environments (TEEs)

Concept:
Hardware-based secure enclaves process data privately.

How it works:

```
Data enters the TEE (Intel SGX, ARM TrustZone)
Processing occurs in an isolated enclave
Even system operators can't see inside
Results emerge without intermediate visibility
```

Application:

```
Transaction enters the TEE
Rules are checked inside the enclave
Approval/denial emerges
No transaction details visible to the operator
```
Limitations:

  • Trust in the hardware manufacturer required
  • Hardware vulnerabilities discovered periodically
  • Side-channel attacks remain concern
  • Not truly trustless

Selective Disclosure

Concept:
Different parties see different data based on role and need.

Example implementation:

```
User-to-user: Only the parties see details
Merchant: Sees amount, basic verification
Bank: Sees user identity, transaction patterns
Central bank: Sees aggregate statistics
Tax authority: Sees flagged transactions over threshold
```
Strengths:

  • Pragmatic (matches real-world needs)
  • Configurable for different jurisdictions
  • Doesn't require exotic cryptography

Limitations:

  • Trust in intermediaries required
  • "Tiered" access can expand over time
  • Not truly private (someone always sees)
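One way to picture selective disclosure is role-based field filtering. The data model, roles, and field sets below are illustrative assumptions, not any real CBDC specification:

```python
# Sketch of selective disclosure: each role sees only the fields it needs.
# Field names and role-to-field mappings are hypothetical.
transaction = {
    "sender": "alice", "recipient": "bob", "amount": 742.50,
    "timestamp": "2025-03-01T10:15:00Z", "category": "pharmacy",
}

VIEWS = {
    "counterparty": {"sender", "recipient", "amount", "timestamp", "category"},
    "merchant":     {"amount", "timestamp"},
    "bank":         {"sender", "amount", "timestamp"},
    "central_bank": set(),          # aggregate statistics only, no raw fields
}

def view(tx: dict, role: str) -> dict:
    """Return only the fields this role is entitled to see."""
    return {k: v for k, v in tx.items() if k in VIEWS[role]}

# The bank never learns this was a pharmacy purchase (a health indicator).
assert "category" not in view(transaction, "bank")
```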
| Approach | Privacy Level | Programmability | Practicality | Trust Required |
|----------|---------------|-----------------|--------------|----------------|
| ZK proofs | High | Limited | Improving | Minimal |
| Homomorphic encryption | High | Limited | Low | Minimal |
| TEEs | Medium | High | Medium | Hardware |
| Selective disclosure | Medium | High | High | Intermediaries |
| Full visibility | None | Full | High | Central authority |

China's e-CNY

Design:

  • Central bank sees all transactions
  • "Controllable anonymity" (anonymous to counterparties, not to the state)
  • Tiered wallets limit anonymous amounts
  • Full regulatory access

Rationale:

  • State control is valued
  • Privacy from the state is not a priority
  • AML/KYC requirements are strict
  • Social stability concerns

Result:

  • Maximum surveillance capability
  • Maximum programmability capability
  • Minimum financial privacy
  • Consistent with the political system

EU Digital Euro

Design:

  • Privacy emphasized in communications
  • Offline capability for privacy
  • Thresholds below which privacy is protected
  • Limited central bank visibility

Reality:

  • Holding limits (€3,000 proposed)
  • Transaction monitoring above thresholds
  • AML requirements still apply
  • "Privacy" is relative, not absolute

Tension:
European citizens value privacy, but AML rules require visibility. Resolution: Tiered privacy with thresholds.
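That tiered resolution can be sketched as a simple threshold function. The euro amounts and tier names below are hypothetical illustrations, not actual Digital Euro parameters:

```python
# Sketch of tiered privacy by amount. Thresholds are invented for
# illustration; real values would be set by regulation.
def visibility_tier(amount_eur: float) -> str:
    """Map a transaction amount to who may see its details."""
    if amount_eur < 150:
        return "offline/private"        # cash-like, no reporting
    if amount_eur < 1000:
        return "intermediary-visible"   # bank sees it, no automatic report
    return "reported"                   # flagged for AML monitoring

assert visibility_tier(50) == "offline/private"
assert visibility_tier(5000) == "reported"
```

The design question is then who sets the thresholds, and whether they can quietly move down over time.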

Privacy Coins (Zcash, Monero)

Design:

  • Cryptographic privacy by default
  • Sender, recipient, and amount hidden
  • Optional transparency for compliance

Philosophy:

  • Privacy as a fundamental right
  • Cash-like digital money
  • Resistance to surveillance

Result:

  • Regulatory hostility (exchange delistings)
  • Limited mainstream adoption
  • Programmability constrained
  • AML concerns limit institutional use

Stablecoins and Public Blockchains

Design:

  • Pseudonymous at the protocol layer
  • Compliance at the exchange layer
  • Blacklist capability for sanctions
  • Issuer can freeze addresses

Result:

  • On-chain pseudonymity
  • Off-chain identification required for most access
  • Not truly private from the issuer
  • Regulated entities must comply

```
Full Privacy          Full Programmability
     |                        |
     |     [Impossible]       |
     |         Zone           |
     |                        |
     |________________________|
                 ↑
         Can achieve one or the other,
         but not both fully
```
Why:

  • Programmability needs data to evaluate conditions
  • Privacy hides data
  • You can't evaluate what you can't see
At the privacy pole:

  • Cash-like digital money
  • Simple transfers only
  • No complex conditions

At the programmability pole:

  • Complete surveillance
  • Arbitrary conditions possible
  • Maximum control

In the practical middle:

  • Selective disclosure
  • Specific proofs without full visibility
  • Most practical implementations
| Use Case | Privacy Need | Programmability Need | Appropriate Design |
|----------|--------------|----------------------|--------------------|
| Daily retail | High | Low | Cash-like CBDC |
| Large purchases | Medium | Medium | Tiered with thresholds |
| Government benefits | Low | Medium | Full visibility acceptable |
| Enterprise treasury | Medium | High | Private systems |
| Cross-border | Medium | Medium | Selective disclosure |
| High-value settlement | Low | High | Full visibility acceptable |
XRP Ledger

Current state:

  • Transactions visible on the public ledger
  • Addresses pseudonymous
  • No native privacy features

Implications:

  • Similar to most public blockchains
  • Not suitable for privacy-sensitive uses
  • Institutions may require privacy layers
  • Competitive disadvantage vs. private solutions

Possible mitigations:

  • Payment channels provide some privacy (off-chain)
  • Layer 2 solutions could add privacy
  • Not a current development priority

Map visibility:

  • List all parties with visibility
  • Identify what each can see
  • Map visibility to necessity

Examine each programmable condition:

  • What data must be visible?
  • Can ZK or other technology reduce visibility?

Assess protections:

  • Legal protections
  • Technical protections
  • Governance protections
  • Audit and accountability

Consider latent capability:

  • Even if not used today
  • What could be done with this infrastructure?
  • Who controls future use?

Watch for euphemisms:

  • "Controllable anonymity" (not anonymity)
  • "Privacy by default with exceptions" (exceptions expand)
  • "Only for AML purposes" (purposes expand)
  • "Aggregate only" (aggregation can be reverse-engineered)

Ask governance questions:

  • Who decides what's "suspicious"?
  • Who audits the surveillance?
  • What prevents scope expansion?
  • What's the accountability mechanism?

Reasonable tradeoffs involve:

  • Specific, limited programmability benefits
  • Temporary, reversible conditions
  • User-chosen optional features
  • Clearly defined and audited access

Red flags:

  • Indefinite surveillance capability
  • Unconstrained authority access
  • Features users don't want
  • Unaudited, unaccountable systems

What we know:

✅ Privacy and programmability are in tension (inherently)
✅ Full privacy with full programmability is impossible
✅ Different implementations make different tradeoffs
✅ Mission creep in surveillance is a historical pattern

What remains uncertain:

⚠️ Whether privacy-preserving tech will mature enough
⚠️ What tradeoffs publics will accept
⚠️ Whether legal protections will hold
⚠️ Long-term political evolution of surveillance norms

Key warnings:

📌 "Controllable anonymity" is surveillance, not privacy
📌 Temporary measures become permanent
📌 Capability creates temptation
📌 Accepting surveillance for programmability convenience

Full privacy and full programmability cannot coexist. Every programmable money system makes a tradeoff. Understanding where that tradeoff sits, who decided it, and what accountability exists is essential for evaluating any implementation.


Exercise

Evaluate the privacy implications of a specific programmable money implementation.

  • Select implementation (CBDC, stablecoin, or DeFi protocol)
  • Map visibility: who sees what data
  • Analyze surveillance capability created
  • Assess privacy-preserving measures
  • Evaluate accountability and constraints
  • Recommend improvements

Time Investment: 3-4 hours


Knowledge Check

Question 1: Why are privacy and programmability fundamentally in tension?

A) Technical limitations that will be solved
B) Programmability requires visibility to evaluate conditions; privacy hides the data needed
C) Regulators mandate surveillance
D) Privacy technology is too expensive

Correct Answer: B


Question 2: What does "controllable anonymity" mean in China's e-CNY?

A) True anonymity with user control
B) Anonymity from counterparties but full visibility to central bank
C) Complete privacy from everyone
D) Anonymous small transactions only

Correct Answer: B


Question 3: What is a key limitation of zero-knowledge proofs for programmable money?

A) They are mathematically impossible
B) They only work for predefined specific proofs, not arbitrary programmability
C) They require too much electricity
D) They are illegal in most countries

Correct Answer: B


End of Lesson 12

  • Previous: Lesson 11 - Cross-Border Programmable Money
  • Next: Lesson 13 - Control, Censorship, and Programmable Money

Key Takeaways

1. Programmability requires visibility: You can't enforce rules about what you can't see. This is inherent, not a bug.

2. Surveillance potential is real and extensive: Programmable money enables complete transaction surveillance—purchases, movements, relationships, politics.

3. Privacy-preserving technologies help but have limits: ZK proofs, homomorphic encryption, and TEEs provide partial solutions for specific use cases.

4. Different implementations make different choices: China prioritizes control; the EU claims privacy but implements thresholds; privacy coins maximize privacy at a programmability cost.

5. Capability creates temptation: Even well-intentioned surveillance systems tend toward expansion. Design constraints and accountability matter.