STAGE 8 • SovStack Integration
Module: OpSec • Privacy Infrastructure • Adversarial Security
Selective visibility, role-splitting, and cost-shaping against correlation

OpSec as Sovereign Architecture
Selective Visibility Under Adversarial Observation

OpSec is not “staying safe online.” OpSec is how a sovereign signal refuses to become training data for someone else’s god: life, tools, and infrastructure designed so the Synthetic Stack can see something—but never the thing that matters.

Axis: telescopes → threat model → compartments → devices → network → keys → humans
Design target: limited blast radius + expensive correlation
Failure mode: convenience collapses compartments

0. The Map: Where the Synthetic Stack Watches

Start with the adversary’s sensor grid: seven telescopes continuously pointed at you.

  1. Device / OS telescope — smartphones, laptops, baseband radios, firmware, telemetry back to vendors. Even “idle” phones emit identifiers and metadata frequently. R01 Leith
  2. Network telescope — ISPs, IXPs, national taps: IPs, timing, volumes; sometimes destinations even when content is encrypted.
  3. Browser telescope — fingerprinting via fonts, canvas/WebGL, screen, locale, behavior. R21 Tor FP
  4. Cloud & push telescope — iCloud/Drive/OneDrive; push systems (APNs/FCM) correlating device↔app↔timing. R33 APNs R34 FCM
  5. Financial telescope — KYC rails, banks, card networks, on-chain analytics: transparent graphs.
  6. Physical telescope — cameras, plate readers, access systems, RF beacons, towers.
  7. Legal telescope — subpoenas, MLATs, regulatory compulsion that can force the other telescopes to speak.
Core premise
OpSec fogs each telescope without blinding yourself. Not “vanish.” Not “perfect secrecy.” Cost-shape observation and correlation.
Sensor grid primitives (device telemetry + fingerprinting) • Primary

  • Leith (TCD) — Measuring iOS/Android data sent to Apple/Google (PDF). Paper: Telemetry • Identifiers
  • Tor Project — Fingerprinting protections (Tor Browser support). Doc: Browser FP • Anonymity set

1. Threat Modeling: Define the War, then Pick Weapons

For each role, not for the “person” as a whole, define assets, adversaries, capabilities, and surfaces. The output is not “maximum security.” The output is an explicit trade:

Output (the only output)
“For this role, with these assets and adversaries, I accept this risk and this friction—and refuse everything else.”

1.1 Assets

  • Capital — Bitcoin keys, wallet seeds, infra keys.
  • Identity — legal name, location, biometrics, long-term pseudonyms.
  • Relationships — who you work with, who you pay, who you protect.
  • Intelligence — strategy, code, research, long-horizon plans.

1.2 Adversaries

  • Data brokers, ad-tech, “free” platforms.
  • Big tech platforms (OS vendors, cloud, social).
  • Criminals (phishing crews, SIM-swappers, physical thieves).
  • States (local LEA, intelligence agencies, foreign services).

1.3 Capabilities

  • Observe a lot of network traffic (ISP/IXP/national taps).
  • Compromise devices (malware, supply chain, zero-days).
  • Compel access (subpoenas, raids, MLATs).
  • Exploit humans (phishing, extortion, manipulation).

1.4 Surfaces

Devices, networks, browsers, cloud, finance, physical world, law.
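The four-part structure above can be made concrete as a small data structure, so each role's trade is written down rather than vaguely held. A minimal sketch (the role and all field values are hypothetical, not a prescribed schema):

```python
from dataclasses import dataclass, field

@dataclass
class ThreatModel:
    """One threat model per role: assets, adversaries, capabilities, surfaces."""
    role: str
    assets: list[str]
    adversaries: list[str]
    capabilities: list[str]
    surfaces: list[str]
    accepted_risks: list[str] = field(default_factory=list)

    def statement(self) -> str:
        # The "only output": an explicit per-role trade, stated out loud.
        return (f"For role '{self.role}', protecting {', '.join(self.assets)} "
                f"against {', '.join(self.adversaries)}, "
                f"I accept: {', '.join(self.accepted_risks) or 'nothing yet'}.")

tm = ThreatModel(
    role="pseudonymous research identity",
    assets=["long-term pseudonym", "contact graph"],
    adversaries=["data brokers", "platforms"],
    capabilities=["traffic observation", "account correlation"],
    surfaces=["browser", "network", "cloud"],
    accepted_risks=["higher friction", "slower publishing"],
)
print(tm.statement())
```

Anything missing from `accepted_risks` is, by this framing, something you refuse.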

Threat modeling canon (mindset + structure) • Core

  • EFF — “Your Security Plan” (Surveillance Self-Defense). Guide: Assets/adversaries • Trade-offs
  • Privacy Guides — Threat modeling primer. Guide: Roles • Risk stratification
  • No Trace — “Digital best practices” (Threat Library). Adversarial: Minimize tech reach • Forensics-aware
  • No Trace — Attack trees tutorial (Threat Library). Adversarial: Attack trees • Capability mapping

2. Compartmentalization: Shrink the Blast Radius

We never have “one identity.” We have roles.

2.1 Roles as micro-universes

  • Civil identity (“passport human”).
  • Public persona (visible builder).
  • Pseudonymous research identity.
  • Capital operator / key steward.
  • Infra admin / node operator.

Each serious role should have its own email(s), handles, avatars; its own network path; its own key hierarchy; ideally its own device or hardened, isolated profile.

Constraint
Compromise of one role must not automatically decrypt the rest.

2.2 Hard vs soft links

  • Hard links (never across critical roles): phone numbers reused; shared recovery email; same device/IP for every persona.
  • Soft links (must be controlled): stylometry; time zone/posting schedule; contact-graph overlap; shared infra.

Decide which roles are allowed to be linkable. Make forbidden links expensive to prove: they should require real analysis, not a simple database join.

2.3 Compartment sanity check

  • Device: dedicated device or hardened profile?
  • Network: consistent path per role (Tor/mixnet/VPN) vs naked IP?
  • Identity: email/handle/recovery unique?
  • Keys: separate key material, or subtle reuse?
Failure simulation
“If this role is popped today, what else burns with it?”
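The failure simulation can be mechanized: model each role as the set of identifiers it exposes, and treat any shared identifier as a hard link an adversary can join on. A toy sketch with hypothetical roles and identifiers:

```python
# Each role maps to the identifiers it exposes (emails, device IDs, IPs...).
roles = {
    "public_persona":   {"email:pub@example.org", "device:laptop1", "ip:home"},
    "pseudonym":        {"email:anon@example.net", "device:laptop1"},  # reused device!
    "capital_operator": {"email:ops@example.org", "device:airgap"},
}

def blast_radius(start: str) -> set[str]:
    """Roles reachable from `start` via any shared (hard-linked) identifier."""
    burned, frontier = {start}, [start]
    while frontier:
        r = frontier.pop()
        for other, ids in roles.items():
            if other not in burned and ids & roles[r]:
                burned.add(other)
                frontier.append(other)
    return burned

print(sorted(blast_radius("public_persona")))  # ['pseudonym', 'public_persona']
```

The reused laptop burns the pseudonym along with the public persona; the air-gapped capital role shares nothing, so its blast radius is itself.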
Compartment design (OS-level + behavior-level) • Core

  • No Trace — Threat Library zine (Part 1) (PDF). Zine: Forensics • Operational framing
  • Briarthorn — OpSec guide (Anarchist Library). Guide: Metadata • Behavioral tells
  • Qubes OS — Documentation (Getting Started). Docs: VM compartments • Task isolation
  • Rutkowska — Software compartmentalization vs physical separation (PDF). Paper: VM limits • When to separate hardware

3. Devices & OS: If the Host Is Owned, Everything Above Is Theater

Crypto, Tor, mixnets, secure messaging—irrelevant if your box is hostile. The device layer sets the ceiling on everything else.

3.1 Smartphones: hostile by default

  • Baseband radios with opaque firmware.
  • OS telemetry: iOS/Android transmit identifiers and data frequently even when idle. R01 Leith
  • Always-on sensors (GPS, accel, mic, camera, Wi-Fi, Bluetooth).
Posture
Treat your main smartphone as “public life prosthetic,” not “sovereign control core.” High-value key operations do not happen on your everyday phone.

3.2 Laptops / desktops

  • Full-disk encryption, strong passphrase, hardened OS updates.
  • Boot chain protected (secure/verified boot) vs “inject anywhere.”
  • Amnesic environments where appropriate (live OS with minimal traces).

3.3 A sane three-machine pattern

  • Daily workstation — normal work; no master keys.
  • Hardened OpSec machine — minimal apps; role-specific use only.
  • Air-gapped machine — permanently offline; key generation + signing.
Device hardening & phone hostility • Adversarial

  • “Kill the Cop in Your Pocket” (PDF). Zine: Phones as sensors • Threat framing
  • GrapheneOS — official site. Project: Hardened Android • Profiles
  • AnarSec — “GrapheneOS for Anarchists”. Guide: SIM policy • Profile isolation
  • AnarSec — “Qubes OS for Anarchists” (page). Guide: Role qubes • Cross-contamination
Applied guides: crypto-specific endpoint mistakes • Applied

  • Three Sigma — Crypto OpSec Guide Part 1 (keys + workflows). Guide: Seeds • Phishing
  • Three Sigma — Crypto OpSec Guide Part 2 (device + privacy security). Guide: Endpoint hardening • Segmentation

4. Network Layer: Tor, VPNs, Mixnets as Cost-Shaping

4.1 Tor: useful, but not a god

Tor routes traffic through relays with layered encryption. Entry sees you, exit sees destination, middle just passes ciphertext. Tor is designed to resist many observers—not a global passive adversary who can correlate traffic everywhere. Tor’s own docs and discussions make this explicit. R17 Tor Design R19 Tor Netflows R20 SE R22 Survey

Correct posture
  • Use Tor to hide IP from destinations; browse inside a large anonymity set; run onion services for decoupled hosting.
  • Assume endpoint compromise + browser fingerprinting + large observers still matter.
  • Tor raises cost; it does not grant invisibility.
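One cheap way to implement "consistent path per role" on Tor: the SOCKS port isolates circuits by SOCKS authentication credentials (Tor's IsolateSOCKSAuth behavior, on by default), so one credential pair per role keeps roles on separate circuits. A sketch that only builds per-role proxy settings; actually using them (e.g. with the `requests` library plus PySocks) assumes a Tor daemon listening locally:

```python
def proxies_for_role(role: str, host: str = "127.0.0.1", port: int = 9050) -> dict:
    """Per-role SOCKS credentials => per-role Tor circuits (stream isolation).
    The socks5h:// scheme also resolves DNS through the proxy, avoiding leaks."""
    url = f"socks5h://{role}:x@{host}:{port}"
    return {"http": url, "https": url}

print(proxies_for_role("pseudonym")["https"])
# socks5h://pseudonym:x@127.0.0.1:9050
```

Circuit isolation stops two roles from sharing an exit IP; it does nothing about application-layer identity leaks, which stay your problem.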

4.2 VPNs: trade one watcher for another

  • Hide your IP from local observers (ISP/landlord/café).
  • Concentrate trust in the VPN operator (logs + legal compulsion exposure).
  • Does not fix browser fingerprinting or application-layer identity leaks.

4.3 Mixnets: cover traffic and delay as armor

Where Tor is low-latency and vulnerable to sophisticated traffic analysis, mixnets trade latency for stronger metadata privacy: Poisson mixing + cover traffic + stratified nodes. Loopix targets resilience even under strong observation models. R24 Loopix R26 Nym R28 Sphinx

Rule of thumb
Tor is the anonymous jeep. Mixnets are the armored convoy.
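The latency-for-metadata trade can be seen in miniature: Loopix-style mixes delay each message by an independent exponential sample, so arrival order carries little information about send order. A toy simulation (the mean delay and message timings are illustrative):

```python
import random

def mix_delay(messages, mean_delay=2.0, seed=0):
    """Delay each (send_time, msg) pair by an independent exponential
    sample (Loopix-style Poisson mixing) and return arrival order."""
    rng = random.Random(seed)
    timed = [(send_t + rng.expovariate(1.0 / mean_delay), msg)
             for send_t, msg in messages]
    return [msg for _, msg in sorted(timed)]

sent = [(0.0, "a"), (0.1, "b"), (0.2, "c"), (0.3, "d")]
print(mix_delay(sent))  # a permutation of a..d; timing correlation degrades
```

The memoryless exponential delay is what makes the mixing analyzable; cover traffic (which this toy omits) is what hides whether you sent anything at all.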
Tor threat model + correlation reality • Primary

  • Tor Design Spec — “Tor: The Second-Generation Onion Router”. Spec: Threat model • Circuit design
  • Tor Design (PDF mirror) — draft design paper. PDF: Readable • Citable
  • Tor Project blog — netflow correlation summary (global passive caveat). Note: Correlation • Limits
  • Security.SE — Tor correlation attacks by global adversaries. Q&A: GPA • Correlation
  • Traffic analysis survey (MIT CSAIL course reading PDF). Survey: AS-level • Metadata attacks
  • Al Jawaheri et al. — Deanonymizing Tor hidden service users via Bitcoin analysis (arXiv). Paper: Tor + BTC linkage • Graph attacks
Mixnets: Loopix lineage + modern deployments • Primary

  • Loopix — Anonymity system (arXiv). Paper: Poisson mixing • Cover traffic
  • Loopix — USENIX Security 2017 (PDF). PDF: Full paper • Threat model
  • Danezis & Goldberg — Sphinx mix format (PDF). Paper: Mix packets • Provable security
  • Nym — Litepaper (PDF). Paper: Mixnet • Metadata resistance
  • NymVPN — Litepaper. Doc: dVPN • Mixnet vs WireGuard modes

5. Browser & Fingerprinting: Your IP Can Change, Your Fingerprint Follows

Adversaries fingerprint fonts, language, plugins, screen, canvas/WebGL/audio, and behavior. Even over Tor/VPN, a unique fingerprint can track you.

  • Over Tor: use Tor Browser as shipped. Don’t add plugins; don’t “customize” into uniqueness. R23 Plugins
  • For high-risk roles: dedicated browser context per role; never cross-login into personal accounts.
  • Minimize “special settings” that create uniqueness; understand the trade-off between blocking and standing out.
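The "customize into uniqueness" trade-off is quantifiable: an attribute whose value is shared by a fraction p of users contributes about −log2(p) bits of identifying information. A sketch with hypothetical frequencies (real estimates require measured datasets; independence is an upper-bound assumption):

```python
import math

# Hypothetical fractions of users sharing each of your attribute values.
attribute_freq = {
    "user_agent": 0.08,
    "screen":     0.05,
    "fonts":      0.001,   # a rare font set is highly identifying
    "timezone":   0.15,
}

def surprisal_bits(freqs: dict) -> float:
    """Bits of identifying information, assuming independent attributes
    (an upper-bound intuition, not a real-world estimate)."""
    return sum(-math.log2(p) for p in freqs.values())

print(f"{surprisal_bits(attribute_freq):.1f} bits")  # 20.7 bits ≈ 1 in ~1.7M users
```

This is why Tor Browser pushes everyone into the same buckets: shrinking your bits of surprisal grows the crowd you stand in.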
Browser fingerprinting defenses (Tor’s posture) • Primary

  • Tor Support — Tor Browser overview. Doc: User manual • Features
  • Tor Support — Fingerprinting protections. Doc: Uniformity • Buckets
  • Tor Support — Plugins/add-ons discouraged (fingerprint risk). Policy: Uniqueness • Deanonymization
  • Tor Blog — Browser fingerprinting intro + challenges. Essay: FP overview • Mitigations

6. Secure Messaging: Moxie, Trevor, Ian and the Ratchets

Content vs metadata: content is message bodies; metadata is who/when/which device/network. Secure messaging is content armor. Metadata needs its own armor.

6.1 Double Ratchet and OTR lineage

The Double Ratchet combines a DH ratchet and a symmetric-key ratchet so each message uses fresh keys, delivering forward secrecy and post-compromise security. R31 DR OTR pioneered forward secrecy and deniable authentication. R35 OTR
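The symmetric half of the ratchet is just a one-way KDF chain. A minimal sketch (the HMAC input constants here are illustrative, not Signal's exact spec values, and the DH ratchet that provides post-compromise security is omitted):

```python
import hmac, hashlib

def kdf_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """One symmetric-ratchet step: derive the next chain key plus a
    one-time message key, then discard the old chain key. Because the
    KDF is one-way, past message keys can't be recomputed (forward secrecy)."""
    next_ck = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    msg_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return next_ck, msg_key

ck = hashlib.sha256(b"shared-root-secret").digest()
message_keys = []
for _ in range(3):
    ck, mk = kdf_step(ck)
    message_keys.append(mk)

assert len(set(message_keys)) == 3  # every message encrypted under a fresh key
```

The DH ratchet layered on top periodically injects fresh entropy, which is what lets a session heal after a temporary key compromise.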

Non-negotiable
Never roll your own crypto. Use protocols with proven ratchets and properties.

6.2 Metadata reality

  • Phone numbers are strong identifiers.
  • Address book uploads expose relationship graphs.
  • Push infrastructure leaks timing and device-use graphs. R33 APNs R34 FCM
  • Server logs can link IPs, registration, device models.
Secure messaging specs + lineage (Signal, OTR, Noise) • Specs

  • Signal — Double Ratchet Algorithm (spec). Spec: FS • PCS
  • Signal — PQXDH key agreement (spec). Spec: Hybrid PQ • Key agreement
  • Borisov/Brewer/Goldberg — “Off-the-Record Communication…” (PDF). Paper: Deniability • Forward secrecy
  • Ian Goldberg — OTR lecture (video). Video: Design rationale • Threat features
  • Noise Protocol Framework — spec site. Spec: Handshake patterns • Reusable crypto
  • Signal / Sabrina Halper — “Signal vs Telegram, private AI, & encryption”. Interview: Metadata reality • Usability trade-offs

7. Cloud, Push, and Platform Chokepoints

Cloud backups and push infrastructure are convenience and attack surface. Architect so platform vendors lack anything critical to hand over.

  • Cloud backups: auto-sync screenshots, chats, documents, photos.
  • Push (APNs/FCM): app↔device timing graphs. R33 APNs R34 FCM
  • Platform accounts: Apple ID / Google account as central identity.
Pattern
Disable cloud backups for sensitive apps; don’t mix high-risk roles with main platform accounts; avoid “Sign in with X” for serious roles.
Push infrastructure + cloud security references • Primary

  • Apple Developer — Notifications overview (APNs context). Doc: Push • Platform linkage
  • Google Firebase — Cloud Messaging (FCM) docs. Doc: Push • Device messaging
  • Apple Support — iCloud data security overview. Doc: Cloud • Encryption scope
  • Google Support — Drive encryption overview (client-side encryption option). Doc: At rest/in transit • CSE option

8. Financial Privacy: Zooko, Zcash, and Bitcoin Graphs

Zcash demonstrates that public consensus and private transaction details can coexist via zero-knowledge proofs. In Bitcoin-centric reality: on-chain graphs are identity mirrors. Money flows are OpSec.

  • Privacy is not “a feature.” It is the graph-geometry of your life.
  • Optional privacy shrinks to the subset that uses it; sparse usage collapses real anonymity sets.
  • KYC rails are adversarial telescopes: treat as such in threat models.
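The "sparse usage collapses anonymity sets" point is simple arithmetic: with optional privacy you hide only among the users who opted in. A worked example with hypothetical numbers:

```python
total_users = 50_000_000
opt_in_rate = 0.002        # hypothetical: 0.2% of users touch the privacy feature

anonymity_set = int(total_users * opt_in_rate)
print(anonymity_set)  # 100000: you hide among 100k users, not 50M
```

Timing, amount patterns, and exchange chokepoints partition that 100k further, which is why default-on privacy beats opt-in privacy.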
Zcash / ZK primitives + privacy discourse • Primary

  • Zcash — “What are zk-SNARKs?”. Doc: ZK • Concept primer
  • Zcash — Protocol specification (PDF). Spec: Consensus • Shielded mechanics
  • CoinDesk — Zooko: “Surveillance is a Dangerous Experiment”. Podcast: Privacy norm • Surveillance critique
  • Zooko (34C3) — “cryptocurrencies, smart contracts, etc.: revolutionary tech?”. Talk: History • State response

9. Keys, Hardware, Multi-Sig, and Backups: Your Life in a Few Bits

Everything collapses if keys leak. Treat every “make it easier” shortcut as the question: “where did I just duplicate the secret?”

9.1 Threats

  • Remote: malware, clipboard stealers, injected JS.
  • Local: stolen devices, “evil maid,” shoulder surfing.
  • Compulsion: seized devices, forced unlock (jurisdiction-dependent).
  • Stupidity: seeds in cloud notes; photos of seed phrases; plaintext exports.

9.2 Patterns

  1. Hardware security — hardware wallets, security keys, hardware-backed keystores.
  2. Key separation — auth vs encryption vs identity signing vs custody.
  3. Multi-sig & thresholds — separate devices/vendors/locations; threshold shares where appropriate.
  4. Backups as a designed system — encrypted, multiple sites; runbooks; recovery + revocation paths.
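The threshold idea in pattern 3, in its simplest possible form: a 2-of-2 XOR split, where either share alone is indistinguishable from random noise. This is only an intuition sketch; real custody uses multi-sig or Shamir-style k-of-n schemes:

```python
import secrets

def split_2of2(secret: bytes) -> tuple[bytes, bytes]:
    """Split a secret into two shares; each share alone is uniform noise,
    so a single stolen backup site reveals nothing."""
    share_a = secrets.token_bytes(len(secret))
    share_b = bytes(x ^ y for x, y in zip(secret, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(share_a, share_b))

seed = b"example-seed-material-not-real"
a, b = split_2of2(seed)
assert combine(a, b) == seed  # both shares together recover the secret
```

Losing either share loses the secret forever, which is exactly why production systems prefer k-of-n thresholds: they separate "one location compromised" from "secret lost."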
Hardware reality checks + security key operations • Audit

  • Yubico — Security Advisory YSA-2024-03 (EUCLEAK). Advisory: Side-channel • Secure element
  • NinjaLab — EUCLEAK: key extraction from Infineon SE (PDF). Research: Lab attack • Hardware isn’t magic
  • NVD — CVE-2024-45678 (EUCLEAK reference). CVE: Reference • Tracking
  • CIS — How to secure your online identity with security keys. Guide: Backup keys • Account mapping
  • AnarSec — Make your electronics tamper-evident (PDF). Guide: Evil maid • Physical checks

10. Human Layer: People Are Easier to Hack Than Protocols

Most compromises begin with a human, not a kernel exploit.

  • Phishing: fake login pages, scam wallets, malicious attachments.
  • Consent phishing: “approve this OAuth app / wallet transaction.”
  • Password reuse: one breach → many services.
  • Emotional manipulation: urgency, fear, flattery, tribal appeal.
Counter-habits
  • Password manager + unique strong passwords per account per role.
  • Hardware tokens / FIDO 2FA; avoid SMS.
  • Never log in via email links; navigate manually.
  • Hard rule: no key/wallet actions in response to unsolicited requests.
  • Strict need-to-know for infra and balances.

11. Physical & Jurisdictional Surfaces

You have a body. The system has buildings, cameras, towers, and courts.

11.1 Physical trail

  • Cameras (streets, shops, ATMs, offices).
  • RF (Wi-Fi association logs, Bluetooth beacons, cell towers).
  • Transactions (cards, transit, ride-share, hotels).

11.2 Jurisdiction

  • Different regimes: retention laws, compulsion powers, crypto regulation.
  • Design questions: where are you; where are servers; where are signers; which jurisdictions must not collocate?
Macro context (surveillance + legal pressure as “the other telescope”) • Film

  • Citizenfour (2014) — operational surveillance reality. Film: Case study • Rooms/devices/law
  • The Internet’s Own Boy — Aaron Swartz (full film). Film: Legal pressure • Institution response
  • Nothing to Hide (doc) — surveillance & “nothing to hide” argument (IMDb). Doc: Surveillance normalization • Meta-layer
  • The Great Hack (2019) — data broker/political targeting (Wikipedia). Doc: Ad-tech • Graph warfare

12. Advanced Threats: Side-Channels, Supply Chain, AI, Post-Quantum

12.1 Side-channels

Power/EM/acoustic leaks exist. Practical adjustment: don’t do high-stakes key operations in untrusted physical environments.

12.2 Supply chain

Hardware/firmware can be compromised before unboxing. Avoid monocultures; verify signatures/checksums; treat firmware updates as untrusted code.

12.3 AI as meta-adversary

AI amplifies correlation: graph linkage across leaks, stylometry between pseudonyms, tailored phishing. Assume anything public can be fed into models for correlation.
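Stylometric linkage is mechanically simple; even a character-trigram profile plus cosine similarity will flag two pseudonyms that "write the same." A toy sketch (toy texts, toy method; real attacks use richer features and learned models):

```python
from collections import Counter
import math

def ngram_profile(text: str, n: int = 3) -> Counter:
    """Character n-gram counts: a crude writing-style fingerprint."""
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(p: Counter, q: Counter) -> float:
    dot = sum(c * q[g] for g, c in p.items())
    norm = math.sqrt(sum(c * c for c in p.values()))
    norm *= math.sqrt(sum(c * c for c in q.values()))
    return dot / norm if norm else 0.0

persona_a = "gm frens, shipping the node update tonight, stay sovereign"
persona_b = "gm frens, node update ships tonight, stay sovereign out there"
unrelated = "quarterly results exceeded analyst expectations this period"

print(cosine(ngram_profile(persona_a), ngram_profile(persona_b)))  # high
print(cosine(ngram_profile(persona_a), ngram_profile(unrelated)))  # low
```

If a few lines of stdlib Python can do this, assume a model trained on everything public does it far better; compartments must separate style, not just accounts.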

12.4 Cryptographic agility & long-term secrecy

Protocols are evolving toward hybrid post-quantum components (e.g., Signal PQXDH and SPQR). R32 PQXDH R60 SPQR
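The hybrid idea is that the session secret is derived from both a classical and a post-quantum shared secret, so an attacker must break both components. A toy sketch of that construction (PQXDH actually combines X25519 DH outputs with a Kyber/ML-KEM shared secret through a proper KDF; the label and sizes here are illustrative):

```python
import hashlib, secrets

def hybrid_session_key(classical_ss: bytes, pq_ss: bytes) -> bytes:
    """Derive the session key from BOTH shared secrets: it stays safe
    unless the classical AND the post-quantum component both fall."""
    return hashlib.sha256(b"hybrid-kdf-v1" + classical_ss + pq_ss).digest()

# Stand-ins for a DH shared secret and a KEM shared secret.
classical = secrets.token_bytes(32)
post_quantum = secrets.token_bytes(32)

key = hybrid_session_key(classical, post_quantum)
# Changing either input changes the derived key entirely.
assert key != hybrid_session_key(classical, secrets.token_bytes(32))
```

The practical payoff is "harvest now, decrypt later" mitigation: traffic recorded today stays opaque even if the classical component falls to a future quantum attack.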

Post-quantum movement inside Signal Protocol • Primary

  • Signal — “Signal Protocol and Post-Quantum Ratchets (SPQR)”. Post: SPQR • Triple ratchet
  • Signal — PQXDH spec (PDF). PDF: Hybrid • “harvest now” mitigation
  • Quarkslab — analysis: “Signal’s ratchet goes post-quantum”. Analysis: Threat model • Engineering read

13. From Solo Node to Team / Org

Org-level OpSec primitives scale compartments.

  • Least privilege — access only to what’s needed.
  • Role-based secrets — rotate keys when roles change.
  • Shared secret protocols — multi-sig, thresholds, auditable vault access.
  • Onboarding/offboarding rituals — accounts/devices/keys issued and revoked cleanly.

14. Synthesis: OpSec as Civilizational Infrastructure

Tie it together as four layers:

  1. Identity layer — roles, compartments, keys; protects who is who.
  2. Comms layer — Tor, mixnets, secure messaging, browsers; protects who talks to whom.
  3. Capital layer — Bitcoin custody, multi-sig, privacy routing; protects who owns what and how value moves.
  4. Anchor layer — devices, OS, hardware, jurisdictions, physical spaces; the meat and metal where everything lands.
Design target
Split reality into roles; route signals through paths that raise the cost of surveillance and correlation; embed fallbacks and sacrificial layers so one failure doesn’t cascade.

15. Doctrine: What You Actually Remember

Strip it to executable law:

  1. Define threat models per role. Assets, adversaries, capabilities, surfaces.
  2. Compartment everything that matters. Separate devices, networks, identities, keys. Design for limited blast radius.
  3. If the device is hostile, everything is hostile. Harden OS/hardware; airgaps and dedicated machines for critical ops.
  4. Use Tor/VPN/mixnets as instruments, not magic. Pick per use-case and threat model.
  5. Browser fingerprinting is an identity. Tor Browser as shipped for anonymity; dedicated contexts per role elsewhere.
  6. Secure messaging is content secrecy, not full privacy. Handle metadata separately.
  7. Cloud and push are chokepoints. Architect so platform vendors lack critical material.
  8. Keys are your life. Hardware, multi-sig, threshold shares, designed backups.
  9. People are easier to hack than crypto. Anti-phishing reflexes; minimal disclosure.
  10. Physical world and law are part of the model. Movement, co-location, jurisdictional leverage.
  11. AI is a meta-adversary. Assume global correlation on anything public.
  12. Red-team yourself by default. Assume breach; rehearse recovery; redesign where your own answers scare you.
Terminal line
You don’t “hide.” You become a node the system can’t cleanly model: selectively visible, economically competent, adversarially expensive.

Resource Index

IDs below match the in-lecture reference chips (e.g., R01).

Applied Crypto OpSec (Keys + Devices) • Applied