Fiat Lux
UC Berkeley - where it all started
Let There Be Light

Most people assume the privacy battle
is already lost.

It's not.

Your email provider holds the keys to your inbox.

Your carrier tracks your location to the yard and stores it for years.

Your data is sold by companies you have never heard of.

The encryption that could stop all of it is under legislative attack.

Privacy is not a product you buy.
It is an architecture you build.

Research & Analysis

Privacy Research

Privacy research, investigative reporting, and field-level analysis of encryption protocols and jurisdiction. Written by David for Orion Private LLC.

8 reports published · Actively maintained · New research in progress
Featured
Investigative Report

The FBI director used Gmail. Iranian hackers read his email for years.

On March 27, 2026, an Iranian state-linked group published over 300 emails from FBI Director Kash Patel's personal Gmail account. The FBI confirmed the breach. The emails were not obtained by breaking encryption. They were obtained because a consumer email account's credentials had been exposed in prior data breaches.

Gmail · Credential Stuffing · Iran · OPSEC
March 28, 2026 · 10 min · NEW
Analysis

The FTC was never enough: How America's last privacy enforcer was designed to fail, then gutted on purpose

The FTC's consent decrees were the closest thing Americans had to enforceable privacy law. In 2025, commissioners were fired, guidance was deleted, and consent orders were reopened. Chris Jay Hoofnagle's academic work predicted every failure point.

FTC · Section 5 · Enforcement · Hoofnagle
February 2026 · 20 min
Email Privacy

Proton Mail vs Tuta Mail - What a privacy advocate looks at

Both encrypt your inbox. But jurisdiction, metadata exposure, encryption scope, and real world court orders tell a more complicated story.

Email · Encryption · Jurisdiction
January 2026 · 22 min
Privacy Research

What your notes app knows about you, even when your notes are encrypted

End-to-end encryption protects your note content. It does not protect the metadata your notes generate.

Notes · Metadata · E2EE
January 2026 · 15 min
All Research
Analysis

The government is buying your data because it cannot legally collect it

The FBI confirmed to Congress this month that it purchases location data from commercial brokers. No warrant. No judge. No notification. This has been happening for years, across multiple agencies and administrations. Here is the full story.

Data Brokers · Fourth Amendment · Surveillance · FISA
March 2026 · 15 min
Privacy Research

SMS vs iMessage vs WhatsApp vs Telegram vs Signal: What the FBI's own document tells us

In 2021 the FBI produced an internal guide detailing exactly what data it can legally obtain from nine messaging apps. The document was obtained via FOIA. Here is a field-by-field analysis of five of the most common services.

Messaging · E2EE · FBI · Metadata
December 2025 · 25 min
Privacy Research

What porn sites know about you, and who they tell

Private browsing does not make you private. A UC Berkeley study found Google on almost every adult site, search terms leaked in plaintext, and sexual preferences encoded in cookies.

Tracking · Metadata · Browser Privacy
November 2025 · 10 min
Privacy Research

Your photo library knows more about you than your journal: Ente vs iCloud vs Google Photos vs OneDrive

Every photo records your GPS coordinates, device serial number, and exact timestamp. When you upload to the cloud, the question is who can see it, who can be compelled to hand it over, and whether AI is training on it.

Photos · Encryption · AI · EXIF
November 2025 · 25 min
Privacy Research

How encryption actually works, why it matters, and where it stops protecting you

Symmetric vs asymmetric, AES-256 vs quantum computers, what HTTPS actually protects, and why none of it matters if your phone is compromised. The foundational reference for everything else on this site.

Encryption · AES-256 · Quantum · OPSEC
October 2025 · 25 min
Coming Soon
COMING SOON · April 2026

The VPN trust problem - What "No logs" actually means under subpoena

Every VPN claims no logs. Court records tell a different story.

COMING SOON · May 2026

Signal vs Session vs SimpleX - A metadata comparison

Which messenger leaks the least?

Field Assessment

Proton Mail vs Tuta Mail - What a privacy advocate looks at

Both encrypt your inbox. Both market themselves as the antidote to Gmail. But jurisdiction, metadata exposure, encryption scope, and real-world court orders tell a more complicated story. Here is a field-by-field comparison.

David
January 2026
Last updated: March 24, 2026
Privacy Research
~22 min read
Both providers promise the same thing: end-to-end encryption, open source code, European jurisdiction, and a guarantee that not even the company itself can read your email. Proton Mail (established in Switzerland, 2014) and Tuta Mail (established in Germany, 2011) are the two most recommended encrypted email providers on virtually every privacy list. But recommendation lists rarely examine the differences that matter under legal pressure, specifically what metadata each provider can see, what a court can compel them to hand over, how their encryption protocols actually differ at a technical level, and what has happened in the real cases where governments came knocking. This analysis covers all of that. Sources include official privacy policies, published transparency reports, court records, open source code repositories, and the companies' own security documentation.
PROTON MAIL (Switzerland)
TUTA MAIL (Germany)
ADVANTAGE / ENCRYPTED
PARTIAL / CONDITIONAL
EXPOSED / RISK

Company & Jurisdiction

Jurisdiction determines what a government can legally force a provider to do. It is arguably more important than the encryption itself, because encryption protects content, but jurisdiction determines what happens to everything around it.

Legal Entity
Proton Mail: Proton AG, Geneva, Switzerland
Tuta Mail: Tutao GmbH, Hanover, Germany

Governing Privacy Law - the primary legislation that determines what data can be compelled
Proton Mail: Swiss Federal Act on Data Protection (FADP) + BÜPF (telecom surveillance law) [1]
Tuta Mail: German BDSG + EU GDPR + StPO (criminal procedure code) [2]

Intelligence Alliance Membership - whether the country participates in multinational intelligence sharing. ⚠ MED - shared intelligence increases the pool of agencies that may request your data.
Proton Mail: NOT A MEMBER of Five, Nine, or Fourteen Eyes
Tuta Mail: FOURTEEN EYES member

Foreign Data Request Process - how a foreign government (e.g. the FBI or Europol) obtains your data
Proton Mail: Must go through a Mutual Legal Assistance Treaty (MLAT), then obtain Swiss court approval [3]
Tuta Mail: Must obtain a German court order. German companies cannot hand data directly to foreign agencies [2]

Data Retention Law for Email - ⚠ HIGH - mandatory retention creates data that exists solely for law enforcement access.
Proton Mail: NO - a Swiss court ruled (Oct 2021) that email providers are not telecoms and not subject to retention [4]
Tuta Mail: NO - no data retention law applies to email in Germany. Two courts affirmed Tuta is not subject to ISP retention rules [5]

Gag Order Risk - whether authorities can prevent the provider from telling you about a data request
Proton Mail: POSSIBLE - a temporary delay is allowed, but Swiss law requires eventual notification [3]
Tuta Mail: NO - German law does not permit forcing companies to submit to a gag order (per Tuta's transparency report) [6]

Encryption Backdoor Risk - whether a law exists or is proposed that could compel weakening of encryption
Proton Mail: UNCLEAR - no current Swiss law compels backdoors, and the reach of the 2024 ECHR encryption ruling in Switzerland is untested
Tuta Mail: PROTECTED - the European Court of Human Rights ruled (2024) that mandating the weakening of E2EE violates the Convention; Germany is bound by this [2]

Legal Orders Received (latest available) - ⚠ MED - volume indicates how often the provider's data is targeted.
Proton Mail: ~11,000+ in 2024, with a compliance rate of ~94% [7]
Tuta Mail: Significantly lower volume (exact figures in its semi-annual transparency report) [6]

User Base
Proton Mail: 100+ million accounts
Tuta Mail: ~10 million accounts
NOTE Everyone points to Switzerland and the Five Eyes when comparing Proton to US-based providers. And yes, Switzerland is not in the alliance. But that does not mean Swiss data is untouchable. Foreign agencies request Swiss data through mutual legal assistance treaties all the time. Proton received over 11,000 legal orders in 2024 and complied with roughly 94% of them. What Switzerland gives you is friction, not immunity. A warrant has to clear more hurdles to reach your inbox. But the hurdles are not walls. And lately, they are getting shorter. In early 2025, the Swiss Federal Council proposed changes to surveillance law that would require services with over 5,000 users to identify their customers, retain data for six months, and decrypt communications on request when the provider holds the keys. Proton's CEO called the proposal "extreme" and said it would make Swiss services less private than Google. Proton responded by beginning to move physical infrastructure out of Switzerland, starting with its AI assistant Lumo, which launched on German servers in July 2025. When the company that built its entire brand on Swiss jurisdiction starts relocating away from Swiss jurisdiction, that tells you everything about how much weight to put on location alone.

Encryption Protocols - The Technical Layer

Both providers use end-to-end encryption. The difference is in what standard they chose, what that standard can and cannot encrypt, and how far ahead each has moved on post-quantum readiness. These are not cosmetic differences, they determine the scope of what the server can never see.
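The hybrid idea behind a protocol like TutaCrypt can be sketched in a few lines. This is an illustrative sketch, not Tuta's implementation: real X25519 and Kyber are not in Python's standard library, so random byte strings stand in for the two shared secrets, and a single HMAC call stands in for the key-derivation combiner.

```python
import hashlib
import hmac
import os

# Stand-ins for the outputs of the two key agreements (hypothetical values;
# the real protocol derives these from X25519 ECDH and the Kyber-1024 KEM):
classical_secret = os.urandom(32)      # stand-in for the X25519 shared secret
post_quantum_secret = os.urandom(32)   # stand-in for the Kyber shared secret

def combine(s1: bytes, s2: bytes) -> bytes:
    # The message key is a PRF over BOTH secrets, so an attacker must break
    # both algorithms to recover it. If quantum computers ever break X25519,
    # Kyber still protects the key - and vice versa.
    return hmac.new(s1, s2, hashlib.sha256).digest()

message_key = combine(classical_secret, post_quantum_secret)
assert len(message_key) == 32
# Deterministic: both parties holding the same two secrets derive the same key.
assert message_key == combine(classical_secret, post_quantum_secret)
```

The design point is the combiner, not the primitives: the final key depends on both inputs, which is what makes "hybrid" stronger than either algorithm alone.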

Encryption Standard - the foundational protocol governing how messages are encrypted
Proton Mail: OpenPGP (public standard) [8]
Tuta Mail: TutaCrypt (proprietary hybrid protocol) [9]

Symmetric Encryption
Proton Mail: AES-256
Tuta Mail: AES-256 (CBC mode + HMAC-SHA-256) [9]

Asymmetric Encryption (classical)
Proton Mail: RSA-2048 / RSA-4096 (user selectable)
Tuta Mail: X25519 (ECDH), which replaced RSA-2048 in 2024 [9]

Post-Quantum Encryption - protection against "harvest now, decrypt later" attacks by future quantum computers. ⚠ CRITICAL - data encrypted today with only classical algorithms may be decryptable within 10 to 15 years.
Proton Mail: IN DEVELOPMENT - no production deployment for email yet
Tuta Mail: LIVE since March 2024 - CRYSTALS-Kyber (Kyber-1024) + X25519 hybrid [10]

Key Derivation Function - how your password is converted into encryption keys
Proton Mail: bcrypt
Tuta Mail: Argon2 (more resistant to GPU/ASIC attacks) [9]

PGP Interoperability - whether you can exchange encrypted mail with any PGP user worldwide
Proton Mail: YES - full PGP/MIME and PGP/Inline support [8]
Tuta Mail: NO - proprietary protocol, no PGP support [11]

Open Source
Proton Mail: YES - all client apps and encryption libraries
Tuta Mail: YES - all client apps and encryption libraries
NOTE Proton chose OpenPGP for interoperability, meaning any PGP user worldwide can exchange encrypted mail with a Proton user. Tuta chose a proprietary protocol for encryption scope, meaning it can encrypt subject lines, sender names, and more metadata that PGP structurally cannot touch. This is a genuine trade-off with no objectively correct answer. It depends on your threat model.
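The key-derivation row above is worth making concrete. Neither bcrypt nor Argon2 ships with Python, so this sketch uses the standard library's PBKDF2 and scrypt to contrast a CPU-cost-only KDF with a memory-hard one - the same axis on which bcrypt and Argon2 differ. The password and salt are illustrative values, not anything either provider uses.

```python
import hashlib

password = b"correct horse battery staple"
salt = b"per-user-random-salt"  # in practice: 16+ random bytes stored with the account

# CPU-cost-only stretching (bcrypt sits in this family): attackers can
# parallelize guesses cheaply on GPUs because each guess needs little memory.
cpu_key = hashlib.pbkdf2_hmac("sha256", password, salt,
                              iterations=600_000, dklen=32)

# Memory-hard stretching (Argon2 sits in this family): n=2**14, r=8 forces
# roughly 16 MiB of RAM per guess, which cripples GPU/ASIC farms.
mem_key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

assert len(cpu_key) == len(mem_key) == 32
# Determinism is the point: the client re-derives the same decryption key
# from the password at every login, so the server never needs to hold it.
assert cpu_key == hashlib.pbkdf2_hmac("sha256", password, salt,
                                      iterations=600_000, dklen=32)
```

Either family keeps the server out of the loop; the memory-hard family simply makes offline guessing far more expensive per attempt.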

What Gets Encrypted - Field by Field

This is where the choice of encryption protocol creates real, measurable differences. Proton's adherence to OpenPGP means certain email headers are structurally excluded from end-to-end encryption. Tuta's proprietary protocol lets it encrypt fields that PGP cannot. Every field listed here as "not E2EE" is a field the provider can theoretically read or hand over under court order.

Email Body + Attachments - ✓ core content, encrypted by both
Proton Mail: E2EE
Tuta Mail: E2EE

Subject Line - ⚠ HIGH - subject lines reveal what you're writing about without reading the body. "Meeting with attorney re: whistleblower complaint" is devastating metadata.
Proton Mail: ZERO-ACCESS - encrypted at rest, NOT end-to-end encrypted. Proton can technically access it under court order. [12]
Tuta Mail: E2EE - encrypted on the device before transmission. Tuta cannot read it. [11]

Sender & Recipient Names - the display name associated with the sender/recipient, not the email address itself
Proton Mail: ZERO-ACCESS only, not E2EE
Tuta Mail: E2EE [11]

Sender & Recipient Email Addresses - structurally required by SMTP; no provider can E2EE this. ⚠ HIGH - reveals who is communicating with whom.
Proton Mail: PLAINTEXT - required by the email protocol
Tuta Mail: PLAINTEXT - required by the email protocol

Timestamps (sent/received) - ⚠ MED - reveals when you communicate and how often
Proton Mail: PLAINTEXT
Tuta Mail: PLAINTEXT

Contacts / Address Book
Proton Mail: PARTIAL - some fields are E2EE; email addresses and names are NOT encrypted [13]
Tuta Mail: FULLY E2EE - all fields, including name, phone, address, and birthday [11]

Attachment Names - ⚠ MED - "whistleblower_evidence_final.pdf" is informative even without reading it
Proton Mail: ACCESSIBLE to Proton per its privacy policy [14]
Tuta Mail: E2EE
KEY DIFFERENCE Proton Mail's privacy policy explicitly states it has access to: sender and recipient email addresses, the IP address incoming messages originated from, attachment names, message subjects, and message sent/received times. [14] Tuta encrypts subject lines, sender/recipient names, and attachment data end-to-end. The only metadata Tuta cannot encrypt is email addresses and timestamps, because the email protocol itself requires them to route messages. [11]
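The structural limitation is easy to see if you build a message by hand. A minimal sketch using Python's standard email module - the addresses are invented and the PGP block is a placeholder, not real ciphertext:

```python
from email.message import EmailMessage

# A PGP-encrypted email is still an ordinary RFC 5322 message: the encryption
# envelope covers only the body. Every header below travels in cleartext
# through every server that routes the message.
msg = EmailMessage()
msg["From"] = "alice@example.com"                 # plaintext: SMTP routes on it
msg["To"] = "bob@example.org"                     # plaintext: SMTP routes on it
msg["Date"] = "Mon, 05 Jan 2026 09:14:00 +0000"   # plaintext timestamp
msg["Subject"] = "re: SEC complaint draft"        # plaintext under OpenPGP
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n"
    "...ciphertext placeholder...\n"
    "-----END PGP MESSAGE-----\n"
)

wire = msg.as_string()
# The subject and routing metadata survive in cleartext on the wire,
# while the body is an opaque blob:
assert "re: SEC complaint draft" in wire
assert "BEGIN PGP MESSAGE" in wire
```

This is why Proton can only zero-access-encrypt the subject after receipt, while Tuta's proprietary format can move the subject inside the encrypted payload and send a neutral placeholder in the header.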
UNDER A COURT ORDER
What Can Actually Be Handed Over
This is the section that matters most. Encryption protects content. But when a court order arrives, the question becomes: what does the provider actually have? What exists on their servers in a form they can read?
Email Content (body + attachments)
Proton: CANNOT HAND OVER - E2EE, no keys
Tuta: CANNOT HAND OVER - E2EE, no keys

Email Subject Lines - ⚠ HIGH - the single biggest practical difference between these two providers
Proton: CAN HAND OVER - not E2EE. Proton confirms this explicitly. [12]
Tuta: CANNOT HAND OVER - E2EE

Sender/Recipient Email Addresses + Timestamps
Proton: CAN HAND OVER
Tuta: CAN HAND OVER

IP Address - ⚠ HIGH - an IP address reveals physical location and ISP
Proton: NOT logged by default. CAN be compelled to start logging a specific account going forward. [4]
Tuta: NOT logged by default. CAN be compelled via a TKÜ order. Session IPs are encrypted and auto-deleted after one week. [6]

Recovery Email Address - ⚠ HIGH - if you used a personal email as recovery, this single field can deanonymize you entirely
Proton: CAN HAND OVER (if the user provided one; it is optional) [15]
Tuta: NOT APPLICABLE - Tuta does not use recovery emails; it uses a recovery code instead.

Payment Data - ⚠ HIGH - credit card info links directly to real identity
Proton: CAN HAND OVER if paid by credit card. Accepts crypto and cash as alternatives. [16]
Tuta: CAN HAND OVER if paid by credit card. Crypto is available via a gift card partner (Proxystore); no direct crypto payments.

Unencrypted Incoming Emails (future, real-time surveillance) - ⚠ CRITICAL - this is a live surveillance capability, not historical data
Proton: UNCLEAR - Swiss law allows interception orders, but Proton's architecture encrypts incoming mail immediately with zero-access.
Tuta: YES - the German BGH ruled (2021) that Tuta must provide unencrypted copies of future non-E2EE mail for specific accounts under TKÜ orders. E2EE mail remains protected. [17]

Attachment Names
Proton: CAN HAND OVER [14]
Tuta: CANNOT HAND OVER - E2EE
CRITICAL Tuta's 2021 German Federal Court ruling is the most significant jurisdiction test either provider has faced. Tuta was ordered to build a function that copies unencrypted incoming and outgoing emails from a specific account before they are encrypted. This only affects non-E2EE mail (mail to/from external providers). Emails between Tuta users remain fully E2EE and were explicitly excluded from the court's reach. Tuta fought this to the highest German court and lost on the monitoring question, but won on the principle that E2EE data cannot be compromised. [17]

Jurisdiction Under Pressure - Real Cases

Marketing copy is written for good days. Court orders arrive on bad ones. These are the documented cases where each provider's jurisdiction was tested by law enforcement, listed chronologically. Every case here is drawn from court records, official company statements, or investigative journalism.

2021 PROTON French Climate Activist (Youth for Climate)

French authorities, working through Europol and Swiss MLAT channels, obtained a Swiss court order compelling Proton to begin logging the IP address of a specific Proton Mail account associated with the Youth for Climate movement in Paris. The activist was involved in occupying buildings in the Place Sainte Marthe area. Proton complied, as it had no legal mechanism to refuse. [4]

DATA HANDED OVER: IP address (prospective logging) + recovery email address + device type information.
RESULT: Activist was identified and arrested by French police.
AFTERMATH: Proton removed "we do not keep any IP logs" from its website. Updated its privacy policy to state that IP logging can be compelled by Swiss court order. Won a Swiss court ruling (Oct 2021) that email providers are not telecoms, limiting future data retention obligations.
2020–2021 TUTA Cologne Blackmail Case (Bundesgerichtshof ruling)

A blackmail email was sent from a Tuta account to an auto supplier. The Cologne Regional Court ordered Tuta to develop a monitoring function for the specific account, copying unencrypted incoming and outgoing emails before they were encrypted. Tuta argued it was not a telecommunications provider and should not be subject to telecom interception laws. The Hanover Regional Court had previously agreed with this position. But the German Federal Court of Justice (BGH) ruled against Tuta, finding that "over the top" email services qualify as telecoms under German criminal procedure law. [17]

DATA HANDED OVER: Tuta was compelled to build a forward looking monitoring capability for non-E2EE mail on specific accounts.
PROTECTED: All previously stored encrypted email remained inaccessible. E2EE emails between Tuta users were explicitly excluded from the court order. The court could not compel decryption.
AFTERMATH: Tuta publicly stated this proves why E2EE matters, as only non-encrypted mail was exposed. Tuta continues to fight the classification of email providers as telecoms.
2024 PROTON Catalan Independence Activist (Democratic Tsunami)

Spanish police (Guardia Civil) sought to identify a pseudonymous member of the Catalan pro-independence movement known as "Xuxo Rondinaire," suspected of planning protest actions related to King Felipe VI's visit. The request went through Europol to Swiss authorities, who issued a binding order to Proton. Proton handed over the only user-identifiable information it had: the recovery email address on the account, which was an Apple iCloud address. Apple then provided Spanish authorities with the individual's full name, two home addresses, and a linked Gmail account. [15]

DATA HANDED OVER: Recovery email address (iCloud).
RESULT: Full identification and arrest via Apple's data linked to the recovery email.
KEY LESSON: Proton's encryption held, no email content was exposed. The deanonymization came entirely from a user-provided recovery email. Proton's CEO noted: "Proton provides privacy by default, not anonymity by default."
2025–2026 PROTON Stop Cop City / Defend the Atlanta Forest (FBI)

The FBI, investigating arson, vandalism, and doxxing linked to the Stop Cop City movement in Atlanta, used the MLAT process to compel Proton through Swiss authorities to hand over payment data associated with a specific Proton Mail account affiliated with Defend the Atlanta Forest. The payment data, a credit card transaction, was sufficient to identify the account holder. The individual does not appear to have been charged with a crime at the time of the disclosure. [16]

DATA HANDED OVER: Payment/credit card transaction data linked to the account.
RESULT: FBI identified the account holder. Search warrant was prepared for execution at Atlanta's airport.
KEY LESSON: Payment metadata is not protected by E2EE. Proton accepts cryptocurrency and cash payments as alternatives. If this user had paid with Monero or cash, the FBI would have had nothing.

The pattern across all four cases is consistent: end-to-end encryption held every time. No government obtained the contents of a single encrypted email from either provider. What was obtained in every case was metadata, including IP addresses, recovery emails, payment data, and non-E2EE message copies. The lesson is not that these providers failed. The lesson is that encryption protects content, and only content. Everything around it, every data point you provide at signup, every payment method you choose, every field that falls outside the encryption envelope, is fair game under a court order.

Features, Ecosystem & Pricing

Security architecture is the priority. But people also need to use these products daily. This section covers the practical differences that affect everyday use.

Free Plan Storage
Proton Mail: 1 GB (shared with Drive)
Tuta Mail: 1 GB

Paid Plan Starting Price
Proton Mail: €3.99/month (Mail Plus), 15 GB
Tuta Mail: €3/month (Revolutionary), 20 GB

Bundled Ecosystem
Proton Mail: EXTENSIVE - VPN + Drive + Calendar + Pass (password manager) + Wallet, all under one account
Tuta Mail: LIMITED - Calendar only. Tuta Drive is planned (€1.5M German government grant), with no release date.

Desktop Clients
Proton Mail: Desktop app (paid only) + Bridge for IMAP/SMTP (paid only)
Tuta Mail: Native desktop clients for Linux, Windows, and macOS (free)

IMAP / SMTP Support - needed for third-party email client integration (Thunderbird, Outlook, Apple Mail)
Proton Mail: YES - via Proton Bridge (paid plans only)
Tuta Mail: NO - proprietary protocol only

Anonymous Signup
Proton Mail: PARTIAL - may require phone/email verification in some cases
Tuta Mail: YES - no personal info required, and no recovery email concept [6]

F-Droid Availability (no Google dependencies)
Proton Mail: Available on F-Droid
Tuta Mail: Available since 2018 - the first email provider on F-Droid. Zero Google dependencies, no Google Push. [18]

Encrypted Full-Text Search
Proton Mail: Subject/metadata only on web; content search on desktop/mobile.
Tuta Mail: YES - encrypted full-text search on all platforms
NOTE Proton's ecosystem is substantially broader, with VPN, Drive, Calendar, Pass, and Wallet all under one account. If you want a single provider replacing Google's entire suite, Proton is the only realistic option. Tuta's focus is narrower but deeper on the email specific encryption front.

Per-Provider Summary

🟣 Proton Mail

BEST: Ecosystem breadth. VPN, Drive, Calendar, Pass, and Wallet under one account - the closest thing to a privacy-respecting Google replacement that exists.

BEST: PGP interoperability. You can exchange encrypted email with any PGP user on Earth, not just other Proton users. This matters for journalists and researchers who correspond with varied sources.

PRO: Swiss jurisdiction requires MLAT for foreign requests, creating meaningful procedural friction. Switzerland is not in any intelligence-sharing alliance.

PRO: Tor onion site for IP-anonymous access. Combined with Proton VPN, the IP logging risk is fully mitigable by the user.

CON: Subject lines are NOT end-to-end encrypted. Under a valid Swiss court order, Proton can hand over the subject lines of every email in your inbox. This is a PGP limitation, not a Proton decision, but it is a real exposure.

CON: Recovery email is a deanonymization vector. The 2024 Catalan case proved this. Recovery email is optional, but Proton prompts users to add one.

CON: Payment data exposure. The 2026 Stop Cop City case showed that credit card payment metadata alone was sufficient for the FBI to identify a user.

WARN: ~11,000 legal orders in 2024 with a ~94% compliance rate. Partly a function of scale, but volume matters.

🔴 Tuta Mail

BEST: Encryption scope. Subject lines, sender names, recipient names, attachment data, and the entire address book are E2EE. Tuta encrypts more fields than any other email provider on the market.

BEST: Post-quantum encryption, live since March 2024. TutaCrypt's hybrid protocol (CRYSTALS-Kyber + X25519) is the first production deployment of quantum-resistant email encryption.

BEST: No recovery email concept. Tuta uses a recovery code instead, so the deanonymization vector from the Catalan case does not exist in Tuta's architecture.

PRO: Argon2 key derivation, more resistant to GPU/ASIC brute-force attacks than Proton's bcrypt.

PRO: ECHR encryption protection. As a German provider, Tuta is covered by the 2024 European Court of Human Rights ruling against laws that weaken E2EE.

CON: Germany is a Fourteen Eyes member. Intelligence sharing between allied nations means German agencies could theoretically receive and act on foreign intelligence.

CON: The BGH ruling created a real-time monitoring precedent. Tuta can be ordered to copy future non-E2EE mail before encryption for specific accounts.

CON: No PGP support. You cannot exchange encrypted email with PGP users outside the Tuta ecosystem.

WARN: Narrower ecosystem. Email + Calendar only. No VPN, no file storage (yet), no password manager.

Recommendations by Threat Model

Choose Tuta Mail if:
Your primary threat is the content of your communications being subpoenaed or surveilled, meaning you are a journalist protecting sources, a lawyer handling sensitive cases, a whistleblower, or an activist operating under government scrutiny. Tuta's encryption envelope is objectively wider. Subject line encryption alone could be the difference between a subpoena that reveals "re: SEC complaint draft, final version" and one that reveals nothing. The absence of a recovery email field eliminates the single most damaging deanonymization vector demonstrated in real cases. Post-quantum encryption means data intercepted today cannot be decrypted by future quantum computers. If maximum encryption scope is your priority, Tuta is the stronger choice.
Choose Proton Mail if:
You need a comprehensive privacy ecosystem that replaces Google Workspace, you correspond with PGP users, or your threat model prioritizes jurisdictional friction over encryption scope. Proton's VPN, Drive, Calendar, Pass, and Wallet under a single account is unmatched. PGP interoperability means you can receive encrypted mail from any security conscious sender, not just Proton users. Switzerland's non-membership in intelligence alliances and its MLAT process create real procedural barriers, not immunity, but friction that matters. If you need a fully private daily driver ecosystem and communicate with diverse contacts, Proton is the more practical choice.
Regardless of which you choose:
Never add a personal recovery email. The Catalan case proved this is the single most dangerous field on any account. Use a recovery code or, if you must add a recovery email, create a separate anonymous email for that sole purpose.

Never pay with a credit card if anonymity matters. The Stop Cop City case proved payment metadata alone can identify you. Use cryptocurrency (Proton accepts it directly; Tuta accepts it via gift cards) or cash.

Access via Tor or a trustworthy VPN. Both providers can be compelled to log IP addresses for specific accounts under court order. The only way to neutralize this is to never connect from your real IP in the first place.

Understand that E2EE only protects E2EE traffic. Emails to and from Gmail, Outlook, or Yahoo are not end-to-end encrypted. Tuta's BGH ruling demonstrated that non-E2EE mail can be intercepted before encryption. If your correspondent is on Gmail, the email content is visible to Google regardless of what you use.

Both providers are excellent and light years ahead of Gmail, Outlook, or Yahoo. The differences between them matter for high threat users. For the vast majority of people, switching from Gmail to either Proton or Tuta is the single biggest privacy improvement available in email today.

References

[1] Proton, "Information for Law Enforcement" - proton.me/legal/law-enforcement
[2] Tuta, "EU Data Privacy Protections: Why Tuta Is Based in Germany" - tuta.com/blog/data-privacy-germany
[3] Proton, "Transparency Report" - proton.me/legal/transparency
[4] Proton, "Important Clarifications Regarding Arrest of Climate Activist" - proton.me/blog/climate-activist-arrest
[5] CyberScoop, "Court Rules Encrypted Email Provider Tutanota Must Monitor Messages" (2021) - cyberscoop.com
[6] Tuta, "Transparency Report & Warrant Canary" - tuta.com/blog/transparency-report
[7] Proton, Transparency Report, aggregate legal order statistics (2017–2024) - proton.me/legal/transparency
[8] Proton, "How Proton Mail Messages Are Encrypted" - proton.me/support/proton-mail-encryption-explained
[9] Tuta, "Everything You Need to Know About Tuta's Encryption" - tuta.com/encryption
[10] Tuta, "Tuta Launches Post Quantum Cryptography For Email" (March 2024) - tuta.com/blog/post-quantum-cryptography
[11] Tuta, "Security at Tuta" - tuta.com/security
[12] Proton, "Does Proton Mail Encrypt Email Subjects?" - proton.me/support/does-protonmail-encrypt-email-subjects
[13] Proton, "Proton's End-to-End Encryption" - proton.me/security/end-to-end-encryption
[14] Proton, "Proton Mail Privacy Policy" - proton.me/mail/privacy-policy
[15] TechCrunch, "Encrypted Services Apple, Proton and Wire Helped Spanish Police Identify Activist" (May 2024) - techcrunch.com
[16] 404 Media, "Proton Mail Helped FBI Unmask Anonymous 'Stop Cop City' Protester" (March 2026) - 404media.co
[17] TechCrunch, "German Secure Email Provider Tutanota Forced to Monitor an Account" (Dec 2020) + CyberScoop BGH ruling follow-up (May 2021) - techcrunch.com
[18] Wikipedia, "Tuta (email)" - en.wikipedia.org
Field Assessment

What your notes app knows about you, even when your notes are encrypted

End-to-end encryption protects your note content. It does not protect the metadata your notes generate. Here is an exhaustive breakdown of every metadata field collected by Standard Notes, Notesnook, Cryptee, and Joplin - and what it reveals.

January 2026
Privacy Research
~15 min read
The common assumption: if a notes app uses end-to-end encryption, the service cannot see what you write. That is true. What is also true is that every note you create, modify, or delete leaves a trail of metadata that your notes app can see clearly - timestamps, file sizes, sync patterns, structural relationships, and in Joplin's case, GPS coordinates. This analysis covers that trail field by field, across four of the most privacy-focused notes apps available. Sources include official documentation, published privacy policies, sync API specifications, and open-source code repositories.
STORED PLAINTEXT on server
PARTIALLY EXPOSED or conditional
ENCRYPTED - server sees ciphertext only
NOT STORED / NOT APPLICABLE
CLIENT-SIDE ONLY - never synced

Timestamps

Timestamps are the most operationally significant metadata a notes app collects. They do not reveal what you wrote - but they reveal when, how often, and for how long. In an investigative context, that is often enough.

Field · Standard Notes · Notesnook · Cryptee · Joplin

created_at - exact timestamp when a note was first created. ⚠ HIGH - reveals when you began writing about a topic.
Standard Notes: STORED · Notesnook: STORED · Cryptee: NOT STORED · Joplin: STORED

updated_at / user_updated_time - every modification timestamp, used for sync conflict resolution. ⚠ HIGH - reveals editing cadence and co-editing patterns.
Standard Notes: STORED · Notesnook: STORED · Cryptee: VERSION IDs only · Joplin: STORED

user_created_time vs. server_created_time - SN and Joplin store both client-reported AND server-recorded timestamps as two separate records. ⚠ MED - a discrepancy between the two can reveal offline editing periods.
Standard Notes: BOTH STORED · Notesnook: BOTH STORED · Cryptee: N/A · Joplin: BOTH STORED

deleted_at - when a note was trashed; the item is marked deleted, not immediately purged. ⚠ MED - reveals when you removed sensitive material.
Standard Notes: STORED · Notesnook: STORED · Cryptee: NOT CONFIRMED · Joplin: STORED

sync_token / cursor_token - SN's internal sync clock; it increments on every create or update event, acting as a proxy for edit frequency even without readable timestamps. ⚠ MED - editing frequency is inferable from token progression.
Standard Notes: STORED (SN only) · Notesnook: N/A · Cryptee: N/A · Joplin: N/A
NOTE Cryptee uses sequential version IDs for conflict resolution rather than human-readable ISO timestamps. This is meaningfully less revealing - the server cannot determine when a note was written, only that one version followed another.
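To make the split between content and metadata concrete, here is a minimal sketch of what a single sync record can look like from the server's side. The field names are modeled loosely on Standard Notes' published sync API, but the exact shape here is an assumption for illustration, not a spec:

```python
import json

# Hypothetical sync record as stored server-side. Only "content" is
# ciphertext; every other field is readable. Field names are illustrative,
# modeled loosely on Standard Notes' public sync API -- treat the exact
# shape as an assumption.
sync_item = {
    "uuid": "c0ffee00-0000-4000-8000-000000000000",  # plaintext, correlatable
    "content_type": "Note",        # plaintext on SN: server knows item kind
    "content": "004:bf31c1a2...",  # encrypted blob: title and body unreadable
    "created_at": "2026-01-04T09:12:33Z",  # plaintext creation timestamp
    "updated_at": "2026-01-04T23:47:10Z",  # plaintext: editing cadence
    "deleted": False,              # tombstone flag, kept until hard purge
}

# Everything the server can still answer without touching the ciphertext:
readable = {k: v for k, v in sync_item.items() if k != "content"}
print(json.dumps(readable, indent=2))
```

The table above in one snippet: encrypting `content` does nothing to hide when the note was created, when it was last touched, or whether it has been deleted.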

Item Identifiers & Structure

Identifiers are low-risk in isolation. Their value to an adversary comes from correlation - mapping UUIDs across requests, sessions, and time to build a behavioral profile without reading a single word.

Field Standard Notes Notesnook Cryptee Joplin
UUID / Item ID Unique identifier per note, tag, and notebook. Stored in plaintext as a database key. ✓ LOW alone - meaningless without content, but correlatable across requests STORED STORED STORED STORED
content_type Whether an item is a Note, Tag, Component, or Preference. Stored in plaintext for server-side routing. ⚠ MED - server knows you have X notes, Y tags, Z preferences without reading any of them STORED ENCRYPTED PARTIAL STORED
items_key_id (SN only) Associates each encrypted note payload with the Items Key UUID that encrypted it ✓ LOW - only reveals which key encrypted which note. Useful for key rotation, not revealing. STORED N/A N/A N/A
parent_id / notebook relationship The structural link between a note UUID and its notebook UUID ⚠ MED - reveals organizational grouping even when folder names are encrypted ENCRYPTED ENCRYPTED FOLDER COUNT VISIBLE STORED (E2EE off) / ENCRYPTED (E2EE on)
tag relationships UUID-to-UUID associations between notes and tags ✓ LOW if tag names are encrypted - count is inferable but meaning is not ENCRYPTED ENCRYPTED ENCRYPTED STORED (E2EE off) / ENCRYPTED (E2EE on)
is_deleted flag Soft-delete marker. The item remains in the database as a tombstone until a hard purge runs. ⚠ MED - deleted notes persist server-side. SN: 14 days. Others: unstated. 14-day tombstone Window unstated NOT CONFIRMED Window unstated

Note Content Fields

This is where all four products perform equally well. Title, body, tags, and folder names are encrypted in every product when E2EE is properly active. The server holds ciphertext and cannot read any of it.

Field Standard Notes Notesnook Cryptee Joplin
Note title The visible title of the note ✓ Encrypted in all four when E2EE is active ENCRYPTED ENCRYPTED ENCRYPTED ENCRYPTED (E2EE on)
Note body / content Full note text ✓ Encrypted in all four when E2EE is active ENCRYPTED ENCRYPTED ENCRYPTED ENCRYPTED (E2EE on)
Tag names Human-readable labels applied to notes ✓ Server sees UUIDs only - names are encrypted ENCRYPTED ENCRYPTED ENCRYPTED ENCRYPTED (E2EE on)
Notebook / folder names Names of organizational containers ✓ Server sees UUIDs only - names are encrypted ENCRYPTED ENCRYPTED ENCRYPTED ENCRYPTED (E2EE on)
NOTE Content protection is consistent across all four products. The meaningful privacy differences are entirely in the metadata layers above and below this table - not in the content encryption itself.

Structural & Size Metadata

Encrypted blob sizes correlate with content length even after encryption. A 200KB blob versus a 2KB blob reveals relative note length. Attachment types reveal what kind of material you work with. None of this requires decrypting anything.
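The size leak is easy to demonstrate. Typical AEAD constructions (AES-GCM, XChaCha20-Poly1305) do not pad: the output is a fixed-size nonce, a ciphertext exactly as long as the plaintext, and a fixed-size tag. The toy cipher below (NOT real cryptography, just a stand-in with AES-GCM-style framing) shows why blob size tracks note size byte for byte:

```python
import hashlib
import secrets

def toy_stream_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy stream cipher (NOT real cryptography) with AES-GCM-like framing:
    12-byte nonce + ciphertext (same length as plaintext) + 16-byte tag."""
    nonce = secrets.token_bytes(12)
    # Derive a keystream from key + nonce + counter, then XOR it in.
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    tag = hashlib.sha256(key + nonce + ciphertext).digest()[:16]
    return nonce + ciphertext + tag

key = secrets.token_bytes(32)
for note in (b"Buy milk", b"x" * 2_048, b"y" * 200_000):
    blob = toy_stream_encrypt(key, note)
    print(f"{len(note):>7} B note -> {len(blob):>7} B blob")  # always +28 bytes
```

The server never decrypts anything, yet it can still sort your notes from shortest to longest just by listing blob sizes.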

Field Standard Notes Notesnook Cryptee Joplin
Encrypted payload byte size Size of the encrypted blob on the server - correlates with note length even after encryption ⚠ MED - short vs. long writing is inferable from blob size VISIBLE VISIBLE EXPLICITLY STORED VISIBLE
Total note count Number of items stored under the account - inferable from item listings ✓ LOW - reveals productivity habits but not topics INFERABLE INFERABLE STORED PER FOLDER INFERABLE
Attachment MIME types File type of attachments - PDF, image, audio, etc. - known before encryption wrapping ⚠ MED - "this note has an audio attachment" or "a scanned document" is contextually revealing NOT CONFIRMED NOT CONFIRMED EXPLICITLY STORED STORED
Attachment count per note How many files are attached to a given note ✓ LOW alone - correlatable with other behavioral patterns INFERABLE INFERABLE VISIBLE STORED
Folder color / archive status (Cryptee) Whether a folder is archived and its color - structural decoration metadata ✓ LOW - cosmetic only N/A N/A STORED (stated in policy) N/A

Revision History Metadata

Revision history is perhaps the most underappreciated metadata risk. Every saved version carries its own timestamp, creating a granular editing timeline that persists on the server independently of the note's content.

Field Standard Notes Notesnook Cryptee Joplin
Revision history stored server-side Whether previous versions of a note are held on the server ⚠ HIGH - each revision carries a timestamp, creating a full editing timeline server-side 1yr (Productivity) / Unlimited (Professional) Limited history NOT STATED LOCAL ONLY
Timestamp per revision Every stored version carries the modification time of that specific edit ⚠ HIGH - for a journalist, this shows exactly when a draft was touched relative to external events YES - per revision YES - per revision N/A LOCAL ONLY
Revision content encrypted Whether stored past versions are encrypted on the server YES - encrypted blobs YES - encrypted blobs N/A N/A - local only
Nightly email backup (SN paid, opt-in) SN paid plans can send an encrypted nightly backup to your email - this creates a second metadata trail at the email provider ⚠ MED - timing, recipient address, and volume are visible to your email provider even if content is encrypted OPT-IN - paid only NO NO NO
NOTE Joplin stores revision history locally only - no server-side edit history exists. Standard Notes Professional stores an unlimited, timestamped edit history of every note indefinitely on Proton's servers. The content is encrypted. The timeline is not.

Joplin-Specific Note Fields

Joplin's data model was inherited partly from Evernote's ENEX format and carries more metadata fields per note than any other product here. Several of these fields are unique to Joplin and have no equivalent in the other three apps.

Field Standard Notes Notesnook Cryptee Joplin
latitude / longitude / altitude GPS coordinates embedded directly into note properties. On by default in the mobile app. ⚠ CRITICAL - exact location at note creation. Embedded in the note body itself, not just a server log. Travels with exports. Must be disabled manually in Settings → Note → Geolocation. NOT COLLECTED NOT COLLECTED NOT COLLECTED DEFAULT ON (mobile)
source / source_application Which Joplin client created the note - e.g. "joplin-desktop" or "net.cozic.joplin-mobile". Stored in note properties. ⚠ MED - device-type fingerprinting at the note level. Syncs plaintext if E2EE is off. NOT COLLECTED NOT COLLECTED NOT COLLECTED EMBEDDED IN NOTE
source_url When a note is created from a web clip, the source URL is embedded in note properties ⚠ HIGH - if E2EE is off, syncing a web-clipped note exposes the exact URL in plaintext. Directly reveals research activity. NOT COLLECTED NOT COLLECTED NOT COLLECTED EMBEDDED IN NOTE
author field Optional field that can be populated from your OS account username automatically ⚠ HIGH - if auto-populated from system username, your OS account name may be embedded in every note you create NOT COLLECTED NOT COLLECTED NOT COLLECTED CHECK IF AUTO-POPULATED
is_todo / todo_due / todo_completed Todo status, due date, and completion flag ✓ LOW if encrypted. Reveals task patterns if not. N/A ENCRYPTED N/A STORED (E2EE off) / ENCRYPTED (E2EE on)
markup_language Whether a note uses Markdown or HTML. Stored in note JSON properties. ✓ LOW - cosmetic formatting preference only N/A N/A N/A STORED in note JSON
CRITICAL With E2EE properly enabled, Joplin's dangerous fields - GPS, source_url, source_application - are encrypted in sync and at rest on the server. However, they remain present in the local SQLite database in plaintext regardless of E2EE state. Anyone with physical access to the device can read them.
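If you share Joplin exports, the dangerous fields travel with them. Below is a hedged sketch of a scrubber for exported notes. The `key: value` footer layout matches my reading of Joplin's RAW (.md) export format, but both the layout and the field list are assumptions - verify against your own export before trusting it, and note that it cleans exports only, not the live SQLite database:

```python
import re
from pathlib import Path

# Fields this section flags as risky. ASSUMPTION: Joplin's RAW export
# appends metadata as plain "key: value" lines to each .md file.
SENSITIVE = {"latitude", "longitude", "altitude",
             "source", "source_url", "source_application", "author"}

_FIELD = re.compile(r"^([a-z_]+):")

def scrub_note(text: str) -> str:
    """Drop metadata lines whose key is in SENSITIVE; keep everything else.
    Caveat: a body line that happens to start with e.g. "author:" is
    dropped too -- this is a blunt instrument."""
    kept = []
    for line in text.splitlines():
        m = _FIELD.match(line)
        if m and m.group(1) in SENSITIVE:
            continue
        kept.append(line)
    return "\n".join(kept)

def scrub_export(folder: str) -> None:
    """Rewrite every exported .md note in place with sensitive fields removed."""
    for path in Path(folder).glob("*.md"):
        path.write_text(scrub_note(path.read_text(encoding="utf-8")),
                        encoding="utf-8")
```

Run `scrub_export("/path/to/export")` on a copy of the export, never the original, and diff a few files to confirm nothing you care about was stripped.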

Sync & Operational Metadata

Sync events are visible at the API level regardless of content encryption. Active editing bursts, multi-device conflicts, and session counts paint a behavioral picture even when nothing is readable.

Field Standard Notes Notesnook Cryptee Joplin
Sync frequency / timing How often notes are pushed to the server - inferable from API request logs even when content is encrypted ⚠ MED - active editing bursts are visible as sync events with no content required INFERABLE INFERABLE INFERABLE INFERABLE
Conflict records Created when two devices edit the same note simultaneously - links two item versions server-side ✓ LOW - reveals multi-device use pattern, not content STORED STORED NOT STATED STORED
Active session count How many concurrent device sessions are authenticated to the account ✓ LOW - reveals device count, not content STORED STORED STORED STORED

Per-Product Summary

🔵 Standard Notes

BEST Most transparent documentation. The sync API spec is publicly available and explicit about every stored field.

BEST Private username mode orphans all timestamps from a real identity - the data exists but is linked to a meaningless hash, not a person.

WARN Revision history timestamps are the biggest note-level risk. The Professional plan stores an unlimited timestamped edit history indefinitely on Proton's servers.

WARN Sync token progression is unique to SN - a proxy clock that reveals editing frequency independently of timestamps.

WARN Nightly email backup (opt-in, paid) creates a second metadata trail at your email provider - timing, volume, and recipient are visible to them even if content is encrypted.

🟢 Notesnook

BEST content_type fully encrypted - the server cannot distinguish a note from a tag from a preference. More thorough than SN and Joplin on this specific field.

PRO Per-note sync toggle - notes kept local-only generate zero server-side metadata whatsoever.

WARN Timestamps are tied to your email address at registration. No private username equivalent means modification times are always attributable to an identifiable account.

CON Device fingerprint metadata captured at the network level makes sync events more attributable than any other product here. Notesnook collects device IDs, OS type, and IP address in its own systems - the only app here that does all three.

Cryptee

BEST No ISO timestamps on notes. Sequential version IDs are used for conflict resolution - readable as "this came after that," not as a datetime.

BEST No server-side revision history. Previous versions of notes are not retained on the server at all.

WARN Folder count and file sizes explicitly stored. The number of documents per folder and attachment MIME types are visible to the server, per the published privacy policy.

WARN Ghost Folders partially mitigates the folder count exposure - hidden folders are excluded from visible metadata - but the overall structure is still known to the server.

Final Ranking - Most to Least Minimal Note Metadata

1
💗 Cryptee
No real timestamps. No server-side revision history. No GPS or source fields. Folder count and MIME types are visible, but the per-note metadata footprint is the lightest of the four. The one caveat: the backend is closed source, so these claims cannot be independently verified through server-side code review.
2
🟠 Joplin (E2EE on + geolocation disabled)
No server-side revision history is a meaningful advantage. With E2EE enabled and geolocation disabled, the most dangerous Joplin-specific fields are neutralized. What remains is basic timestamps and UUIDs. If either condition is not met, Joplin becomes the worst of the four - the gap between best and worst configuration is larger here than in any other product.
3
🔵 Standard Notes (private username mode)
Timestamps and revision history are more extensive than Joplin's, but private username mode orphans all of it from a real identity. Timestamps linked to a hash with no associated PII are far less actionable than timestamps linked to an email address. Best transparency of the four on what is actually stored. The sync token mechanism is a unique risk not present in the other products.
4
🟢 Notesnook
The content_type encryption is genuinely better than SN and Joplin, and the per-note sync toggle is a useful mitigation. But timestamps are tied to a required email address with no anonymization equivalent, IP addresses are stored in Notesnook's own systems, and device identifiers make sync events more attributable than any other product here. The richest note-level metadata profile of the four when combined with account-level data.
Analysis

The government is buying your data because it cannot legally collect it

On March 18, 2026, FBI Director Kash Patel told the United States Senate that the FBI purchases commercially available location data from data brokers. He declined to commit to stopping. This is not new. It has been happening for years, across multiple administrations, involving the FBI, the Department of Defense, ICE, the IRS, and the Secret Service. But for the first time, the current FBI director said it out loud, on the record, under oath. Here is what is happening, how long it has been happening, who it affects, and what you can do about it.

David
March 2026
Analysis
~15 min read
The Fourth Amendment is supposed to protect you from warrantless government surveillance. In 2018, the Supreme Court affirmed this in Carpenter v. United States, ruling that law enforcement needs a warrant to obtain cell phone location data from carriers. That was supposed to settle it. But the government found a workaround almost immediately: instead of demanding the data from your carrier, it buys the same data (and often more detailed data) from commercial data brokers. No warrant. No judge. No notification. Just a purchase order. The apps on your phone collect your location every few seconds through advertising SDKs embedded in their code. That data flows to aggregators who package and sell it. And one of their biggest customers is the United States government. This is the story of how a $200 billion advertising data industry became a backdoor into the private lives of millions of Americans, and why privacy-respecting tools are not a luxury. They are a necessity.

This Has Been Happening for Years

The NPR report published March 25, 2026, is not a revelation. It is the latest chapter in a pattern that stretches back at least six years. Here is a condensed timeline of what has been publicly documented.

2018: Carpenter v. United States
The Supreme Court rules 5-4 that the government must obtain a warrant to access historical cell site location information (CSLI) from carriers. Chief Justice Roberts writes that location data provides "an intimate window into a person's life, revealing not only his particular movements, but through them his familial, political, professional, religious, and sexual associations." Government agencies immediately begin arguing that the ruling applies only to data obtained from carriers, not data purchased on the open market from brokers. [1]
2020: The Muslim Prayer App Scandal ⚠ This is where it became impossible to ignore
Vice's Motherboard publishes an investigation revealing that the U.S. military purchased granular location data from Muslim Pro, a prayer and Quran app with over 98 million downloads. The data flowed from the app to a broker called X-Mode, which sold it to defense contractors, who sold it to U.S. Special Operations Command. The app sent GPS coordinates, Wi-Fi network names, and timestamps every time it was used. Muslim Mingle, a Muslim dating app, was also found sending precise geolocation to X-Mode. The Council on American-Islamic Relations called for a congressional inquiry into "the government's use of personal data to target the Muslim community." Senator Wyden's office confirmed that X-Mode admitted to selling location data harvested from U.S. phones to military customers. Apple and Google subsequently banned X-Mode from their app stores. [2]
2020: BLM Protesters Tracked by Data Broker
BuzzFeed News reports that data broker Mobilewalla tracked approximately 17,000 Black Lives Matter protesters by harvesting location data from their phones during demonstrations. Mobilewalla used the data to create demographic profiles of protesters, including breakdowns by race, age, and gender. The Brennan Center for Justice later noted that the FBI renegotiated its purchase contract with a data broker around the same time as the BLM protests, though the specific use of the data remains unclear. [3]
2021: DIA Admits Warrantless Searches of Americans
A memo obtained by Senator Wyden's office reveals that the Defense Intelligence Agency purchased access to a commercial database of smartphone location data and that DIA analysts had searched for Americans' movements without a warrant at least five times in the preceding two and a half years. The memo explicitly stated: "D.I.A. does not construe the Carpenter decision to require a judicial warrant endorsing purchase or use of commercially available data for intelligence purposes." [4]
2023: FBI Director Wray Steps Back (Temporarily)
Then-FBI Director Christopher Wray indicates to Congress that the Bureau had backed away from using "commercial database information that includes location data derived from internet advertising." This appeared to be a concession to growing bipartisan concern. It lasted less than three years.
January 2025: FTC Takes Action Against Gravy Analytics
The Federal Trade Commission takes enforcement action against Gravy Analytics and its subsidiary Venntel, alleging the company collected and sold location data tied to more than one billion mobile devices daily. The data included visits to health clinics, places of worship, domestic violence shelters, and military installations. A subsequent breach at Gravy Analytics (reported by Wired) compromised location records tied to apps including Candy Crush, Tinder, and MyFitnessPal. The FTC's order would ban the company from selling sensitive location data, but the broader market remains largely unregulated. [5]
March 18, 2026: FBI Director Patel Confirms Purchases ⚠ THIS HAPPENED THIS MONTH
At the Senate Intelligence Committee's annual threats hearing, Senator Ron Wyden asks FBI Director Kash Patel if he will commit to not buying Americans' location data without a warrant. Patel declines, stating: "We do purchase commercially available information that's consistent with the Constitution and the laws under the Electronic Communications Privacy Act, and it has led to some valuable intelligence for us." This reverses the position Wray had taken in 2023 and confirms that the FBI is actively purchasing location data from commercial brokers. [6]

You might assume that if the Supreme Court said warrants are required for location data, the government cannot just buy it instead. The legal reality is more disturbing than that.

The Fourth Amendment Only Restricts Government Action
A 2024 analysis published in the Yale Law and Policy Review ("End-Running Warrants: Purchasing Data Under the Fourth Amendment and the State Action Problem") identified the core legal issue: the Fourth Amendment prohibits unreasonable searches by the government, but a purchase on the open market is not a "search" in the constitutional sense. When a data broker sells your location data to the FBI, the violation of your privacy was committed by the broker (a private company), not by the government. And the Fourth Amendment does not regulate private actors. As the Yale analysis concluded: even though users retain a reasonable expectation of privacy in this data, the Fourth Amendment simply does not apply to a market transaction. The result is a legal gap wide enough to drive the entire surveillance state through. [7]
The Columbia Law Review's "Laundering Data" Analysis
In 2022, the Columbia Law Review published "Laundering Data: How the Government's Purchase of Commercial Location Data Violates Carpenter and Evades the Fourth Amendment." The paper argues that purchasing location data is functionally identical to the warrantless surveillance that Carpenter was supposed to prevent. The data is often more detailed than what carriers hold (GPS coordinates vs. cell tower approximations), covers longer time periods, and can be queried retroactively. The government is obtaining the same "intimate window into a person's life" that Chief Justice Roberts described, through a mechanism that the court did not anticipate. Courts have not yet ruled on this specific question. No federal court has addressed whether Carpenter's warrant requirement extends to data broker purchases. The legal gray area persists. [8]
The Landlord's Key Analogy
Jake Laperruque, deputy director of the Center for Democracy and Technology's Security and Surveillance Project, offered what may be the clearest analogy in an interview with FedScoop on March 20, 2026: "We certainly wouldn't imagine a scenario where the police said, 'We're going to search your house. We don't have a warrant, but we paid your landlord $100 to give us a spare key. So now we're searching your house without a warrant.'" That is exactly what is happening with data broker purchases. The FBI cannot compel your carrier to hand over your location data without a warrant. But it can pay a data broker for the same data (often more detailed data) without any judicial oversight at all. [9]

AI Makes This Exponentially Worse

Data broker purchases are not new. What is new is the ability to process them. Artificial intelligence transforms bulk location data from a filing cabinet into a searchable intelligence platform.

From Data to Dossiers
Representative Warren Davidson (R-Ohio) stated it plainly in the March 2026 hearing: artificial intelligence "can harvest and collect the data in a way that humans never could and do it amazingly fast." The CDT's Laperruque expanded on this: "What kind of new Pandora's box do we open when we not only have these huge quantities of data, but we have tools that can start to scan and analyze patterns in unprecedented ways and at an unprecedented scale that you can never do from human analysts." [6]

The 130 civil society organizations that signed a letter to Congress ahead of the FISA 702 reauthorization specifically cited the potential for the data broker loophole to be used to "supercharge AI-powered surveillance." The concern is not speculative. Laperruque noted that the recent tensions between Anthropic and the Department of Defense highlighted the potency of combining AI with government-purchased consumer data. When you pair AI's pattern recognition capabilities with location histories covering billions of device-days, you get something that no human analyst team could replicate: the ability to identify behavioral patterns, predict movements, map social networks, and flag anomalies across an entire population, automatically, continuously, and in real time. [6]

The Legislative Response (and Why It Might Actually Happen This Time)

The Government Surveillance Reform Act
On March 13, 2026, a bipartisan, bicameral group of lawmakers introduced the Government Surveillance Reform Act. Senator Ron Wyden (D-OR) and Senator Mike Lee (R-UT) co-authored it in the Senate; Representative Zoe Lofgren (D-CA) and Representative Warren Davidson (R-OH) lead the House version. The bill would require federal agencies to obtain a warrant before purchasing Americans' personal data from brokers. It would also close the "backdoor search" loophole that allows agencies to search Americans' communications swept up during foreign intelligence collection without a warrant. [6]

The timing is not coincidental. Section 702 of the Foreign Intelligence Surveillance Act expires on April 20, 2026. Advocates say the reauthorization debate is the best vehicle for attaching data broker reform. Sean Vitka, executive director of Demand Progress, called it "very likely the only chance that Congress has this year to vote for meaningful privacy protections." [6]

But the bill faces opposition. The White House and House Speaker Mike Johnson are pushing for a clean FISA reauthorization with no changes. Some Democrats have also indicated support for a clean extension to avoid letting the law lapse. The window is narrow and closing.

Why This Matters for You (and What You Can Do Right Now)

You may be reading this and thinking: "I have nothing to hide." That is not the point. The point is that a system exists where the government can purchase a detailed record of your physical movements, your associations, your habits, and your patterns of life, without ever asking a judge for permission, without ever notifying you, and without any meaningful oversight. Whether or not you have something to hide, you have a Fourth Amendment right not to be subjected to this. Here is what you can do today.

The Data Broker Pipeline Starts on Your Phone
Disable location permissions for every app that does not require them. On iOS: Settings → Privacy and Security → Location Services. Review every app. Set anything that does not need constant location access to "Never" or "While Using the App." On Android: Settings → Location → App location permissions. Be ruthless. A weather app does not need your GPS coordinates. A game does not need to know where you are. Every app with location access is a potential data source for brokers.

Delete your advertising ID. This is the identifier that ties your location data to your device across apps and brokers. On iOS: Settings → Privacy and Security → Tracking → toggle off "Allow Apps to Request to Track." On Android: Settings → Privacy → Ads → Delete advertising ID. This is the single most impactful technical step you can take to reduce your data broker exposure.

Uninstall free, ad-supported apps you do not actively use. Each one is a potential data source. The Gravy Analytics breach proved that even games like Candy Crush fed data into the broker ecosystem.

Use privacy-respecting alternatives. This is the entire reason this site exists. Every technical comparison we publish (messaging apps, email providers, notes apps, photo storage, encryption) comes back to this: the tools you use determine who has access to your data. Signal does not generate the metadata that WhatsApp hands to the FBI every 15 minutes. Ente Photos does not give anyone access to your photo library because the server cannot see it. Tuta Mail does not have the ability to read your email. These are not theoretical differences. They are architectural decisions that determine whether your data exists in a form that can be purchased, subpoenaed, or exploited.

Use a VPN. A reputable VPN masks your IP address and prevents your ISP from logging the sites you visit. It does not stop app-level location tracking (that requires disabling location permissions), but it adds a meaningful layer against network-level surveillance.
THE REAL MESSAGE The government does not need to hack your phone. It does not need to break your encryption. It does not need a warrant, a subpoena, or a court order. It just needs a credit card and a data broker. The only way to prevent this is to stop the data from being generated in the first place. Audit your phone. Revoke location permissions. Delete your advertising ID. Switch to tools that do not collect the data to begin with. The math behind AES-256 is unbreakable. The law behind the Fourth Amendment is apparently not. Your tools are your last line of defense.

References

[1] Carpenter v. United States, 585 U.S. ___ (2018). Supreme Court of the United States.
[2] Cox, J., "How the U.S. Military Buys Location Data from Ordinary Apps." Vice Motherboard, Nov. 16, 2020. | vice.com
[3] Haskins, C., "Almost 17,000 Protesters Had No Idea a Tech Company Was Tracing Their Location." BuzzFeed News, June 25, 2020. Mobilewalla data broker tracking of BLM protesters.
[4] Savage, C. and Edmondson, C., "Defense Intelligence Agency Admits to Buying Americans' Location Data." The New York Times, Jan. 22, 2021. DIA memo obtained by Sen. Wyden.
[5] FTC, Enforcement action against Gravy Analytics / Venntel. January 2025. 1 billion+ devices tracked daily.
[6] Joffe-Block, J., "Your data is everywhere. The government is buying it without a warrant." NPR, March 25, 2026. | npr.org
[7] "End-Running Warrants: Purchasing Data Under the Fourth Amendment and the State Action Problem." Yale Law and Policy Review, 2024. | yalelawandpolicy.org
[8] "Laundering Data: How the Government's Purchase of Commercial Location Data Violates Carpenter and Evades the Fourth Amendment." Columbia Law Review, 2022. | columbialawreview.org
[9] FedScoop, "Privacy advocates sound alarm on 'data broker loophole' used by FBI, other federal agencies." March 20, 2026. Laperruque landlord analogy. | fedscoop.com
[10] Brennan Center for Justice, "Closing the Data Broker Loophole." | brennancenter.org
[11] Brennan Center for Justice, "Federal Agencies Are Secretly Buying Consumer Data." | brennancenter.org
[12] POGO (Project on Government Oversight), "Fact Sheet: Closing the Data Broker Loophole." Jan. 2026. | pogo.org
[13] Lawfare, "Data Broker Sales and the Fourth Amendment." March 2024. | lawfaremedia.org
[14] Government Surveillance Reform Act, introduced March 13, 2026, by Sen. Wyden (D-OR), Sen. Lee (R-UT), Rep. Lofgren (D-CA), Rep. Davidson (R-OH).
Investigative Report

What porn sites know about you, and who they tell

Private browsing does not make you private. A UC Berkeley study cataloged the tracking infrastructure on the most popular adult websites in the US and found Google on almost all of them, search terms leaked in plaintext, and sexual preferences encoded in cookies. Here is what the research actually found, what has changed since, and what has not.

David
November 2026
Privacy Research
~10 min read
Most people think private browsing covers their tracks. Open an incognito window, visit what you visit, close it, done. That is not how it works. In 2016, researchers at UC Berkeley and the University of Münster, including Chris Jay Hoofnagle, analyzed tracking on every adult website ranked in the Alexa US Top 500. What they found was about as bad as you would expect, and worse in the details. Google's tracking scripts showed up on nearly every site. Search terms were sent in plaintext to third parties. Sexual preferences were encoded directly into cookies. And most of these sites were not even using HTTPS, which means every URL, including the specific content being viewed, was visible to your ISP and anyone else watching the connection. That was a decade ago. Some things have improved. The ones that have not are worth understanding.

The Study

The researchers examined all eleven adult sites in the Alexa US Top 500, both manually in Firefox using mitmproxy to capture every connection, and through Mezzobit, a cloud-based tool that maps third-party communications. They supplemented the work with Netograph and Palantir Contour for link and statistical analysis. The paper was submitted to the FTC's PrivacyCon 2017.

SOURCE Altaweel, I., Hils, M., & Hoofnagle, C.J. (2016). "Privacy on Adult Websites." Submitted to FTC PrivacyCon 2017. SSRN ID 2851997.

Key Findings - What The Researchers Found

Google Trackers Present on Nearly Every Site ⚠ HIGH - Google can trivially cross-reference this data with your Gmail, YouTube, Search, and Android activity
Finding: 9 of 11 sites had Google Analytics and/or DoubleClick scripts. Google was the dominant third-party tracker across the entire sample. No other mainstream ad network had comparable presence.
Why it matters: If you are logged into a Google account in your normal browser and visit an adult site in the same browser, Google can associate the visit with your identity. Even without a login, Google's fingerprinting and cookie infrastructure can link sessions across sites.
Search Terms Leaked in Plaintext to Third Parties ⚠ HIGH - your exact search query on an adult site was transmitted, readable, to companies including Google and Yandex
Finding: 7 of 11 sites leaked search terms "in the clear." When a user searched for content on the site, the query string was transmitted in plaintext to third parties including Google Analytics, DoubleClick, and Russia-based Yandex.
Why it matters: These are not abstract identifiers. These are literal search strings describing sexual preferences, transmitted to advertising infrastructure operated by companies that also know your name, email, phone number, and home address from other services.
Sexual Preferences Encoded in Cookies in Plaintext ⚠ HIGH - category tags like specific sexual interests were stored as readable text, not anonymized codes
Finding: Category tags - the labels describing content types a user clicked on - were often encoded in cookies in plaintext rather than as opaque identifiers. The researchers noted specific examples of human-readable preference labels stored in cookies.
Why it matters: A cookie reading "category=blonde" or "category=trans" is not an anonymous data point. It is a plaintext record of sexual interest stored on your device and transmitted to third-party servers. Anyone with access to the cookie store - malware, a shared computer, forensic analysis - can read it.
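The gap between a plaintext label and an opaque identifier is easy to see in code. A minimal sketch with hypothetical cookie values (no real site's cookie format is implied):

```python
import secrets

# What the researchers observed: preferences stored as readable text.
plaintext_cookie = "category=blonde; category=trans"

# What a privacy-preserving design would store instead: a random opaque
# token, with any category mapping kept server-side (hypothetical sketch).
token = secrets.token_hex(16)
server_side_map = {token: ["blonde", "trans"]}  # never written to the device

# Anyone reading the device's cookie store - malware, a shared computer,
# a forensic tool - learns everything from the first, nothing from the second.
assert "trans" in plaintext_cookie
assert "trans" not in token
```

The fix costs one random token and a server-side lookup table; the sites in the sample did not bother.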
Most Sites Did Not Use HTTPS ⚠ HIGH - every URL string, including specific video titles and search queries, was visible to ISPs and network operators
Finding: Only 2 of 11 sites used HTTPS by default. 8 sites either would not load over HTTPS or redirected to HTTP. Ironically, the researchers found that third-party tracking scripts on these sites were more likely to use HTTPS than the adult content itself.
Why it matters: Without HTTPS, the full URL of every page visited is visible to your ISP, Wi-Fi operator, employer (on a work network), and any intelligence agency with access to the connection. The trackers protected their own data collection with encryption while leaving the user's browsing completely exposed.
Fewer Trackers Than Mainstream Sites - But Not For Privacy Reasons
Finding: Adult sites had a median of 4 third-party connections, compared to 20–33 on comparably popular mainstream and medical sites. Facebook was present on only 1 of 11 adult sites, compared to over half of the top 1,000 sites generally.
Why it matters: The researchers concluded this was not because adult sites prioritize privacy. It is because mainstream advertisers do not want their brands next to pornographic content, and pornographic preferences have limited value for targeting non-pornographic ads. The privacy benefit is a side effect of marketability, not policy.
Flash Used to Read HTTP Cookie Values
Finding: Flash was detected on 5 of 11 sites. In most cases it was being used to read HTTP cookie values from the same domain. The researchers did not find evidence of Flash cookies respawning deleted HTTP cookies.
2026 status: Flash was officially discontinued in December 2020 and is no longer supported by any major browser. This specific vector is dead.

The Nuance Most People Miss

Google Can Re-Identify You Trivially ⚠ CRITICAL - the researchers stated this explicitly
The paper states: "Some of these parties, particularly Google, could trivially and secretly re-identify these users by relying on data collection from other sites." This is not speculation. If Google Analytics is on an adult site and Google has your identity from Gmail, Search, or Android, the re-identification is a database join away. Private browsing does not prevent this if you are logged into any Google service in any tab.
The Trackers Encrypted Their Traffic - Users Did Not Get The Same Protection ⚠ MED - the companies collecting your data protected their collection pipeline better than the site protected your browsing
On sites that served content over HTTP, the researchers found that third-party tracking scripts and ad delivery were often transmitted over HTTPS. The advertising infrastructure encrypted its own data collection while the user's actual browsing - including specific video URLs and search queries - traveled unencrypted. The trackers considered their data worth protecting. The user's privacy was not extended the same courtesy.
Medical Sites Were Worse
The researchers compared adult sites to a top-500 medical website and found the medical site had 33 third-party vendors and over 30 cookies - dramatically more tracking than any adult site in the sample. The paper notes this undermines the hypothesis that "creepy" subject matter naturally limits tracking. If sensitivity alone drove privacy protections, medical sites would be the most private on the web. They are among the least.
The Real-World Consequences Are Documented ⚠ HIGH
The paper cites Professor Andrew Gilden's research documenting cases where online sexual activity data was used in custody battles, divorce proceedings, and as propensity evidence in criminal trials. It also references the Ashley Madison breach, where tens of millions of users were exposed and subsequently targeted for extortion. The researchers explicitly note that even private, non-public data leakage creates a logical chain to extortion and blackmail: every third party holding the data is a party that could exploit it.

What Has Changed Since 2016

HTTPS Adoption
2016: Only 2 of 11 adult sites used HTTPS by default. Full URL strings were visible to ISPs and network operators.
2026: HTTPS is now near-universal. Major browsers flag HTTP sites as insecure. Most adult sites now serve over HTTPS. ISPs can see the domain you visit but not the specific page or video URL.
Flash
2016: Flash detected on 5 of 11 sites, used to read HTTP cookie values.
2026: Flash is dead. Discontinued December 2020. No longer a tracking vector.
Third-Party Cookies
2016: Third-party cookies were the primary cross-site tracking mechanism. Adult sites set a median of 8 third-party cookies per visit.
2026: Chrome has not fully deprecated third-party cookies despite years of announcements. Firefox and Safari block them by default. Tracking has shifted toward fingerprinting, first-party data collection, and server-side tracking - harder to detect, harder to block.
Google's Tracking Presence
2016: Google Analytics and/or DoubleClick on 9 of 11 sites.
2026: Google's advertising and analytics infrastructure remains the most pervasive tracking system on the web. The specific scripts have evolved (GA4 replaced Universal Analytics in 2023), but Google's ability to correlate activity across sites has only increased.

What Has Not Changed

Private Browsing Still Does Not Protect You From Third-Party Tracking ⚠ HIGH - this is the single most common misconception
Private browsing (Incognito, Private Window) prevents local history from being saved. It does not prevent the site from sending your data to Google Analytics, ad networks, or any other third party. It does not prevent your ISP from seeing which domain you visited. It does not prevent browser fingerprinting. If you visit an adult site in a private window on the same browser where you are logged into Google in a normal tab, the sessions can be correlated.
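Fingerprinting works because the browser environment itself is a stable identifier. A minimal sketch of the idea (the attribute set here is hypothetical; real fingerprinting scripts collect dozens of signals such as canvas hashes, installed fonts, and WebGL renderer strings):

```python
import hashlib

# Hypothetical attribute set; illustrative only.
attributes = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101",
    "screen": "1920x1080x24",
    "timezone": "America/Los_Angeles",
    "language": "en-US",
    "platform": "Win32",
}

# A stable identifier derived purely from the environment: no cookies needed.
# Private browsing clears cookies and history but leaves these attributes
# unchanged, so the same hash reappears in every "private" session.
fingerprint = hashlib.sha256(
    "|".join(f"{k}={v}" for k, v in sorted(attributes.items())).encode()
).hexdigest()

print(fingerprint[:16])
```

Nothing here requires stored state on your device, which is exactly why clearing local state does not help.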
Search Terms and Preferences Can Still Be Leaked ⚠ MED - the mechanism has shifted from URL parameters to first-party analytics pipelines, but the data still moves
HTTPS prevents ISPs from seeing search terms in URLs. But the site itself still sends those queries to its own analytics stack and to any third-party scripts it loads. Google Analytics, if present, receives page view data that can include search parameters. The data no longer travels in the clear across the network - but it still arrives at Google's servers.
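The mechanics are mundane: an analytics hit is just an HTTP request whose URL parameters carry the page address, on-site search query included. A sketch of how that data is recoverable from a single GA-style "collect" hit (all values here are hypothetical, not taken from any real site):

```python
from urllib.parse import urlparse, parse_qs, unquote

# A GA-style collect hit. The dl parameter carries the full URL of the
# page that fired the tracker, including the search query string.
hit = ("https://www.google-analytics.com/collect?v=1&tid=UA-XXXX-1"
       "&dl=https%3A%2F%2Fexample-adult-site.com%2Fsearch%3Fq%3Dsensitive%2Bterm")

params = parse_qs(urlparse(hit).query)
page_url = unquote(params["dl"][0])   # parse_qs already percent-decodes once
search_terms = parse_qs(urlparse(page_url).query)["q"]

print(page_url)       # the analytics server receives the full search URL
print(search_terms)
```

HTTPS encrypts this request in transit, but the recipient at the other end still gets the whole thing.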
DNS Queries Expose The Domain ⚠ MED - your ISP knows which adult site you visited even over HTTPS, unless you use encrypted DNS
HTTPS encrypts the content of your connection. It does not encrypt the DNS query that resolves the domain name. Unless you are using DNS over HTTPS (DoH) or DNS over TLS (DoT), your ISP sees every domain you resolve - including adult sites. Most users are still on their ISP's default DNS resolver.
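You can see the exposure directly in the wire format. A classic (unencrypted) DNS query is a small binary packet in which the domain labels appear as readable ASCII; this sketch builds one by hand (the domain is hypothetical):

```python
import struct

def build_dns_query(domain: str) -> bytes:
    """Build a minimal classic DNS query (A record, recursion desired)."""
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    qname = b"".join(
        bytes([len(label)]) + label.encode() for label in domain.split(".")
    ) + b"\x00"
    return header + qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN

packet = build_dns_query("example-adult-site.com")

# Without DoH/DoT, this packet crosses the network as-is: the domain
# labels are readable to the ISP, the Wi-Fi operator, or anyone else on
# the path, even when the subsequent page load is fully HTTPS.
assert b"example-adult-site" in packet
```

DoH and DoT wrap this same query inside an encrypted channel, which is all it takes to hide the domain from the path.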
The Consequences Have Only Gotten Worse ⚠ HIGH
Since 2016, data breaches have accelerated, sextortion scams have industrialized, and state-level legislation targeting pornography access has expanded (age verification laws, ISP-level filtering mandates). In jurisdictions where pornography is criminalized, the tracking infrastructure documented in this paper is not an advertising nuisance - it is an evidence trail.

What Actually Protects You

If you take away one thing from this analysis, it is that private browsing is a local-only protection. It hides your history from someone who picks up your device. It does not hide your activity from the network, the site, or the third parties the site sends your data to. Here is what does.

Use a VPN or Tor
A trustworthy VPN encrypts all traffic and hides the destination domain from your ISP. Tor goes further by routing traffic through multiple relays so no single entity sees both your IP and your destination. Either one prevents your ISP from knowing which sites you visit. Tor is stronger but slower.
Use a Hardened Browser With Tracking Protection
Firefox with Enhanced Tracking Protection (strict mode), or Brave, blocks third-party trackers including Google Analytics and DoubleClick by default. This directly addresses the primary finding from the Berkeley study - that Google was present on nearly every adult site.
Use Encrypted DNS (DoH or DoT)
DNS over HTTPS or DNS over TLS prevents your ISP from seeing which domains you resolve. Firefox supports DoH natively. Configure it to use a privacy-respecting resolver like Quad9 (9.9.9.9) or Mullvad DNS. Without this, HTTPS only protects the page content - the domain name is still visible.
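In Firefox, DoH can also be enabled via `user.js` preferences rather than the settings UI. A configuration sketch (the pref names are Firefox's Trusted Recursive Resolver settings; verify the resolver URL for your chosen provider):

```js
// user.js - enable DNS over HTTPS in Firefox (configuration sketch)
user_pref("network.trr.mode", 3);  // 3 = DoH only; 2 = DoH with plaintext fallback
user_pref("network.trr.uri", "https://dns.quad9.net/dns-query");  // Quad9 DoH endpoint
```

Mode 3 is the strict setting: if the DoH resolver is unreachable, name resolution fails rather than silently falling back to the ISP's plaintext resolver.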
Use a Separate Browser Profile or Container
Firefox Multi-Account Containers or a completely separate browser profile prevents session correlation between your logged-in Google/social media activity and any other browsing. This is the cheapest, most effective mitigation against the re-identification risk the researchers described.
Do Not Log Into Anything
Creating an account on an adult site ties your activity to an email address, and potentially to payment data. The Berkeley researchers found that even without accounts, tracking infrastructure could correlate sessions. Adding a login makes it trivial. If you must create an account, use a disposable email and pay with cryptocurrency or a prepaid card.

References

  [1] Altaweel, I., Hils, M., & Hoofnagle, C.J. (2016). "Privacy on Adult Websites." Submitted to FTC PrivacyCon 2017. SSRN ID 2851997.
  [2] Englehardt, S. & Narayanan, A. (2016). "Online Tracking: A 1-million-site Measurement and Analysis." 23rd ACM Conference on Computer and Communications Security.
  [3] Marotta-Wurgler, F. (2016). "Understanding Privacy Policies: Content, Self-Regulation, and Markets." NYU Law and Economics Research Paper No. 16-18.
  [4] Gilden, A. (2016). "Punishing Sexual Fantasy." 58 William & Mary Law Review.
  [5] Darling, K. (2014). "IP Without IP? A Study of the Online Adult Entertainment Industry." 17 Stanford Technology Law Review 655.
  [6] Krishnamurthy, B., Naryshkin, K. & Wills, C. (2011). "Privacy leakage vs. Protection measures: The growing disconnect." 11(3) IEEE Security & Privacy 14.
Field Assessment

SMS vs iMessage vs WhatsApp vs Telegram vs Signal: What the FBI's own document tells us

In January 2021, the FBI produced an internal guide titled "Lawful Access" detailing exactly what data it can legally obtain from nine messaging apps. The document was obtained via FOIA by the nonprofit Property of the People and published by Rolling Stone. It is the single most useful primary source available for evaluating messaging privacy. Here is a field-by-field analysis of five of the most common services, based on that document, official privacy policies, court records, and published law enforcement capabilities.

David
December 2026
Privacy Research
~25 min read
The FBI does not need to break your encryption.

In November 2021, the nonprofit Property of the People obtained an internal FBI document through a FOIA request. Dated January 7, 2021, prepared by the Bureau's Science and Technology Branch and Operational Technology Division, and marked "For Official Use Only" and "Law Enforcement Sensitive," it answered a question most people never think to ask: what can the FBI actually get from each messaging app with a court order?

The answer depends entirely on which app you use. Encryption protects message content on most platforms now. But metadata (who you talk to, when, how often, from where, and on what device) is a different story. WhatsApp hands it over in near real time. Signal gives up almost nothing. SMS exposes everything by design. And as of March 2026, the FBI has confirmed to Congress that it also buys location data from commercial data brokers, skipping the warrant process altogether.

This analysis goes through all five services field by field, then looks at the threats that exist outside the app itself. Every claim is sourced.
EXPOSED: plaintext or accessible to provider/carrier
CONDITIONAL: depends on configuration or legal process type
PROTECTED: E2EE or not collected
NOT APPLICABLE
PRIMARY SOURCE FBI "Lawful Access" document, dated Jan. 7, 2021, obtained via FOIA by Property of the People. Published by Rolling Stone, Nov. 29, 2021. Analysis supplemented by Just Security (Riana Pfefferkorn, Stanford Internet Observatory), the ACLU, the Center for Democracy and Technology, and official privacy policies from each service.

Encryption Architecture

The foundation. Whether your messages can be read by anyone other than you and your recipient depends entirely on the encryption model. These five services use fundamentally different approaches, and the differences determine everything that follows.

End-to-End Encryption (E2EE) by Default - whether message content is encrypted on your device and only decryptable by the recipient, with no access by the provider ⚠ CRITICAL: this is the single most important privacy property of any messaging service
SMS: NO ENCRYPTION. Plaintext. Your carrier can read every message. Content is visible to anyone who intercepts the transmission, including stingray devices, ISPs, and intelligence agencies.
iMessage: YES. E2EE between Apple devices (blue bubbles). Falls back to unencrypted SMS (green bubbles) when communicating with non-Apple devices.
WhatsApp: YES. Signal Protocol. All messages, calls, photos, and videos are E2EE by default. However, metadata is extensively collected and available to law enforcement.
Telegram: NO, NOT BY DEFAULT. Only "Secret Chats" are E2EE (one-on-one only, no groups, no desktop). All other chats use client-server encryption. Telegram holds the keys and can read them.
Signal: YES. Signal Protocol (the gold standard). All messages, calls, group chats, voice notes, and file transfers are E2EE by default. No exceptions. No fallback to unencrypted modes.
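The core idea behind every E2EE system above is key agreement: the two endpoints derive a shared secret that the server relaying their traffic never learns. A toy Diffie-Hellman sketch (illustrative parameters only, not secure; real protocols like the Signal Protocol use X25519 key agreement plus a double ratchet):

```python
# Toy Diffie-Hellman to illustrate why the provider cannot read E2EE traffic.
p, g = 4294967291, 5          # small prime and generator (toy parameters)

alice_secret = 123456789      # private keys never leave each device
bob_secret = 987654321

alice_public = pow(g, alice_secret, p)   # these two values are all the
bob_public = pow(g, bob_secret, p)       # server (or a wiretap) ever sees

# Each side combines its own secret with the other's public value.
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)

assert alice_key == bob_key   # same key derived, never transmitted
```

This is why a subpoena to the provider can return metadata but not content: the content key exists only on the two endpoint devices.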
Encryption Protocol
SMS: None. Transmitted via SS7, a protocol designed in the 1970s with no encryption layer.
iMessage: Apple's proprietary protocol. RSA-1280 + AES-128 (older), transitioning to Elliptic Curve + AES-256. Closed source.
WhatsApp: Signal Protocol (licensed from Signal/Open Whisper Systems). Open-source protocol, closed-source app.
Telegram: MTProto 2.0 (proprietary, home-grown). Criticized by cryptographers including Matthew Green (Johns Hopkins) for non-standard design choices and lack of independent audit.
Signal: Signal Protocol. Open source. The most widely peer-reviewed E2EE protocol in existence. Also used by WhatsApp, Facebook Messenger (opt-in), and Google Messages (RCS).
Open Source
SMS: N/A. Carrier infrastructure, not an app.
iMessage: NO. Fully closed source. Apple publishes a security whitepaper, but the code is not auditable.
WhatsApp: PROTOCOL ONLY. Uses the open-source Signal Protocol, but the WhatsApp app itself is closed source.
Telegram: CLIENT ONLY. Client apps are open source. Server code is closed source, meaning server-side behavior cannot be independently verified.
Signal: FULL. Client apps AND server code are open source. The only service on this list where both sides are independently auditable.
KEY POINT Telegram's marketing as a "secure messenger" has been directly challenged by cryptographer Matthew Green (Johns Hopkins University), who wrote in August 2024: "Telegram really has no legs to stand on in this particular discussion" regarding encryption. He noted that the vast majority of Telegram conversations, and every single group chat, are visible on Telegram's servers. Telegram CEO Pavel Durov was arrested in France in August 2024 on charges related to the platform's role in facilitating criminal activity.
FBI LAWFUL ACCESS: WHAT EACH SERVICE HANDS OVER
Under a Court Order or Search Warrant
This section is derived directly from the FBI's January 2021 "Lawful Access" document. The legal instruments referenced are subpoenas, court orders (18 U.S.C. §2703(d)), search warrants, and pen register/trap-and-trace orders. What follows is what the FBI says it can obtain from each service with the appropriate legal process.
Message Content ⚠ CRITICAL: this is what you actually wrote
SMS: FULLY ACCESSIBLE. Carriers can read and store message content. Most US carriers delete content after delivery (days), but metadata is retained for months to years. Forensic extraction from the device recovers "deleted" messages.
iMessage: LIMITED. E2EE protects content in transit. But if iCloud backup is enabled, Apple stores iMessage content with the encryption key and will hand both over under a search warrant.
WhatsApp: LIMITED. E2EE protects content. But if the target uses an iPhone with iCloud backup enabled, iCloud returns may contain WhatsApp data including message content. WhatsApp introduced optional encrypted backups after the FBI document was prepared.
Telegram: CLOUD CHATS ACCESSIBLE. Telegram holds decryption keys for all cloud chats. Secret Chats are E2EE and inaccessible. But most users never enable Secret Chats.
Signal: NONE. Signal does not store message content. E2EE with no server-side copies. No cloud backup integration. The FBI document confirms: no message content available.
Who You Talk To (Contact/Address Book Data) ⚠ HIGH: reveals your social graph, which is often more valuable than content
SMS: FULLY EXPOSED. Every sender/recipient pair is logged by the carrier with timestamps. Retained for months to years.
iMessage: EXPOSED. Subpoena: basic subscriber info. Court order: 25 days of iMessage lookup data showing who searched for your number or email in iMessage.
WhatsApp: EXPOSED. Search warrant returns: address book contacts, WhatsApp users who have the target in their contacts, blocked users. Pen register: source and destination of every message.
Telegram: MINIMAL. FBI document: "No contact information provided." Telegram states it may disclose IP and phone number only for confirmed terrorist investigations.
Signal: NONE. Signal does not store contact lists on its servers. The FBI document confirms: no contact information available. Signal responded to a 2021 grand jury subpoena by providing only registration date and last connection date. Nothing else, because nothing else exists.
Real-Time Surveillance (Pen Register / Trap-and-Trace) ⚠ CRITICAL: this is live, forward-looking surveillance, not historical records
SMS: FULLY CAPABLE. Carriers provide real-time metadata. Stingray devices can intercept SMS content without carrier involvement. NSA collected 200+ million texts per day globally (Snowden documents, 2014).
iMessage: NO PEN REGISTER CAPABILITY, per the FBI document. Apple does not provide real-time metadata feeds.
WhatsApp: NEAR REAL-TIME. WhatsApp is the only service in the FBI document that provides pen register data: source and destination of every message, delivered every 15 minutes. No content, but a full communication graph in near real-time. The ACLU called this "devastating to a reporter communicating with a confidential source."
Telegram: NO CAPABILITY, per the FBI document.
Signal: NO CAPABILITY. Signal does not have the data to provide. No pen register capability. No real-time metadata feed. The FBI document confirms this.
IP Address
SMS: N/A. Cell tower location data serves the same purpose. Your carrier knows your physical location.
iMessage: LIKELY. Apple collects IP addresses for iCloud services. Available under warrant.
WhatsApp: COLLECTED. WhatsApp/Meta collects and retains IP addresses.
Telegram: CONDITIONAL. Telegram states it may disclose IP address and phone number for confirmed terrorist investigations only.
Signal: NOT RETAINED. Signal does not log IP addresses. Confirmed by their response to the 2021 grand jury subpoena: no IP data provided because none exists.
Registration / Account Data
SMS: FULL IDENTITY. Your phone number is tied to your legal identity, billing address, SSN (in most cases), and payment method at the carrier.
iMessage: FULL IDENTITY. Tied to your Apple ID: name, email, phone, payment info, device serial numbers.
WhatsApp: PHONE NUMBER. Required at registration. Subpoena returns basic subscriber records.
Telegram: PHONE NUMBER. Required at registration. Telegram retains this.
Signal: PHONE NUMBER. Required at registration. This is Signal's one identifiable data point. But Signal retains nothing else: no name, no email, no profile data on servers.
THE NATALIE EDWARDS CASE In 2020, former senior FinCEN adviser Natalie Edwards pled guilty to leaking Suspicious Activity Reports to a BuzzFeed reporter. Edwards and the reporter communicated via WhatsApp, believing it to be secure. The FBI used WhatsApp metadata (who messaged whom, when, and how often) to build the case against Edwards. She was sentenced to six months in prison. As Daniel Kahn Gillmor of the ACLU stated: "WhatsApp offering all of this information is devastating to a reporter communicating with a confidential source." The message content was encrypted. The metadata convicted her.

Metadata Collection: What Each Service Knows About You

Even when message content is encrypted, every service collects operational metadata to function. The critical question is: how much, and is it retained?

Metadata Retained by Provider - what the service knows about your usage patterns, independent of message content
SMS: EVERYTHING. Sender, recipient, timestamp, cell tower location, message length. Retained by carriers for 1 to 7 years depending on carrier and field. Content retained for days (varies).
iMessage: MODERATE. Apple retains 30 days of iMessage metadata including who you attempted to message and your IP address at the time. 25 days of iMessage lookup queries are available to law enforcement.
WhatsApp: EXTENSIVE. Meta/WhatsApp collects: profile info, contacts, who you message, when, how often, for how long, device type, OS version, battery level, signal strength, IP address, phone number, and location. ProPublica (2021): "Facebook Inc. has downplayed how much data it collects from WhatsApp users."
Telegram: MODERATE (cloud chats). IP address, phone number, device info, username, last seen. For cloud chats: full message content on their servers. Telegram claims a distributed key infrastructure requiring multiple jurisdictions, but server code is closed source and this cannot be independently verified.
Signal: NEAR ZERO. Phone number (registration). Date of account creation. Date of last connection. That is the complete list. Confirmed by Signal's response to a 2021 grand jury subpoena from the US Attorney's Office, Central District of California: "It's impossible to turn over data that we never had access to in the first place."
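To see why "metadata only" is still devastating, consider what pen-register-shaped records reveal on their own. A sketch with hypothetical records in the shape the FBI document describes for WhatsApp: no content, just source, destination, and timestamp:

```python
from collections import Counter
from datetime import datetime

# Hypothetical pen-register-style records (source, destination, timestamp).
records = [
    ("target", "reporter", "2020-01-03T22:14"),
    ("target", "reporter", "2020-01-03T23:02"),
    ("target", "colleague", "2020-01-04T09:10"),
    ("target", "reporter", "2020-01-05T21:48"),
]

# Frequency and timing alone reconstruct the social graph: who the target's
# most frequent contact is, and that the conversations happen late at night.
contacts = Counter(dst for _, dst, _ in records)
late_night = [r for r in records if datetime.fromisoformat(r[2]).hour >= 21]

print(contacts.most_common(1))  # → [('reporter', 3)]
print(len(late_night))          # → 3
```

This is the analytical shape of the evidence in the Natalie Edwards case: no decrypted content, just a communication graph with timestamps.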
Cloud Backups: The Encryption Bypass - cloud backups are the most common way E2EE is defeated in practice ⚠ CRITICAL: if your encrypted messages are backed up unencrypted to iCloud or Google Drive, a warrant gets the content
iMessage: DEFAULT UNENCRYPTED. iCloud backup stores iMessage content with the encryption key. Apple can and does hand this over under warrant. Apple introduced Advanced Data Protection (opt-in E2EE for iCloud) in late 2022, but it is not on by default.
WhatsApp: DEFAULT UNENCRYPTED. WhatsApp backups to iCloud/Google Drive were unencrypted until late 2021, when WhatsApp introduced optional encrypted backups. If the target has unencrypted backups enabled, the FBI can obtain message content via Apple or Google, bypassing WhatsApp's E2EE entirely.
Signal: NO CLOUD BACKUP. Signal does not integrate with iCloud or Google Drive backup systems. Messages exist only on the device. If the device is lost or wiped, the messages are gone. This is a deliberate architectural decision: there is no cloud-side copy to subpoena.

The Uncomfortable Truth: Encryption Is Not the Ceiling

Everything above analyzes what happens when law enforcement works within the system: subpoenas, warrants, pen registers, data requests to providers. That is the normal case. But if you are individually targeted by a government, the analysis changes fundamentally. Encryption protects data in transit and at rest. It does not protect data on a compromised device. And governments do not need to break your encryption when they can simply hack your phone.

Pegasus: The Encryption Bypass That Costs $0 in Warrants ⚠ CRITICAL: once Pegasus is on your device, every messaging app is compromised, including Signal
NSO Group's Pegasus spyware can be installed on iOS and Android devices using zero-click exploits, meaning the target does not need to click anything, open a link, or interact with the phone at all. In 2019, Pegasus was deployed via a WhatsApp voice call vulnerability; the phone was compromised even if the call was never answered. By 2020, Pegasus shifted to iMessage zero-click exploits that bypassed Apple's BlastDoor security sandbox. Google's Project Zero called one of these exploits "the most technically sophisticated exploit we've ever seen." Once installed, Pegasus reads every message in every app (Signal, WhatsApp, iMessage, Telegram) because it reads the screen and memory after decryption. It can also activate the microphone and camera, extract contacts, passwords, photos, location data, and browsing history. E2EE is irrelevant once the device is compromised. The encryption protects the transmission. Pegasus reads the message after it arrives.
Who Has Access to Pegasus
NSO Group licenses Pegasus exclusively to government agencies. As of 2023, operators in at least 45 countries have been identified by Citizen Lab (University of Toronto). Documented targets include journalists (Jamal Khashoggi's associates, Ben Hubbard of the NYT), human rights activists (Bahrain, UAE, Mexico), political dissidents, lawyers, and heads of state. In the 2021 Pegasus Project investigation by Forbidden Stories and Amnesty International, a leaked list contained over 50,000 phone numbers of potential surveillance targets. In May 2025, a US federal jury ordered NSO Group to pay $167 million in damages to Meta/WhatsApp for hacking approximately 1,400 devices. NSO Group claims to have refocused sales on NATO-aligned countries, but independent verification is not possible.
Cellebrite, GrayKey, and Physical Extraction ⚠ HIGH: if law enforcement has physical access to your device, encryption may not matter
Cellebrite and GrayKey are forensic extraction tools used by law enforcement agencies worldwide to unlock and extract data from seized phones. Cellebrite's UFED products are available to US federal agencies under streamlined procurement contracts (NASA SEWP, NIH CIO-CS). These tools can bypass screen locks, extract "deleted" messages, and recover data from encrypted applications, in many cases without the device's passcode. The EFF notes that even with E2EE messaging, "police have several tools to try to unlock your phone and read your encrypted messages, including Cellebrite and Greykey devices." The constraint is physical access. If law enforcement has your phone (at a border crossing, during an arrest, or via a search warrant for your residence) device-level encryption is the last barrier, not app-level E2EE.
What This Means for Your Threat Model
The messaging app comparison in this article matters for the normal case: protecting against network-level surveillance, provider-side data requests, and routine legal process. This is the threat model for 99% of users. But if you are a journalist working on a story that threatens a government, an activist under state surveillance, a lawyer handling politically sensitive cases, or a dissident in an authoritarian country, your threat model includes device compromise. In that scenario, the messaging app is not the weak link. Your phone is. Mitigations include: keeping your OS updated at all times (most Pegasus exploits target unpatched vulnerabilities), enabling Apple's Lockdown Mode (iOS 16+) which significantly reduces the attack surface for zero-click exploits, using a dedicated device for sensitive communications that is not used for anything else, rebooting your phone regularly (some Pegasus implants are non-persistent and do not survive a reboot), and using disappearing messages aggressively so that even a compromised device holds less readable history.
THE JAMAL KHASHOGGI CONNECTION Pegasus was used to surveil associates of Saudi journalist Jamal Khashoggi before his assassination in the Saudi consulate in Istanbul in October 2018. His wife Hanan Elatr's device was infiltrated with Pegasus in just 72 seconds during her detention at Dubai International Airport in April 2018. The surveillance continued until the assassination. The encryption on their messaging apps was irrelevant because the device itself was the access point. Ben Hubbard, a New York Times Middle East correspondent who was writing about Saudi Crown Prince Mohammed bin Salman, was targeted with Pegasus repeatedly over three years (2018 to 2021). The lesson is not that encrypted messaging failed. The lesson is that device compromise bypasses encryption entirely.

The Data Broker Loophole: Buying What a Warrant Would Require

In 2018, the Supreme Court ruled in Carpenter v. United States that law enforcement must obtain a warrant to access cell phone location data from carriers. That ruling was supposed to be a privacy landmark. Eight years later, the FBI is buying the same data, and more, from commercial data brokers. No warrant required.

FBI Director Confirms Data Broker Purchases (March 2026) ⚠ CRITICAL: this happened this month
On March 18, 2026, FBI Director Kash Patel testified before the Senate Intelligence Committee that the FBI purchases commercially available data, including location data derived from cell phones. When Senator Ron Wyden asked Patel to commit to not buying Americans' location data without a warrant, Patel declined, stating: "We do purchase commercially available information that's consistent with the Constitution and the laws under the Electronic Communications Privacy Act, and it has led to some valuable intelligence for us." This reverses the position taken by former FBI Director Christopher Wray, who indicated in 2023 that the Bureau had stepped back from purchasing location data from commercial brokers.
How the Loophole Works
The apps on your phone (weather apps, games, fitness trackers, social media, ad-supported free apps) collect location data through advertising SDKs embedded in the app code. This data flows to advertising exchanges and data aggregators. Companies like Gravy Analytics and its subsidiary Venntel collect and sell this data, tied to mobile advertising IDs that can be mapped back to individual devices. The FBI and other federal agencies, including ICE and the Department of Defense, purchase this data from commercial brokers. Because the data is "commercially available," agencies argue it falls outside the Fourth Amendment's warrant requirement. The CDT's Jake Laperruque offered a more accurate analogy: this is like the FBI giving a landlord a hefty sum to get the key to someone's apartment rather than obtaining a search warrant. "We wouldn't say, 'Oh, well, they bought the access.'"
The Scale of the Data
In January 2025, the FTC took enforcement action against Gravy Analytics/Venntel, alleging the company collected and sold location data tied to more than one billion mobile devices daily, including visits to health clinics, places of worship, domestic violence shelters, and military installations. A subsequent data breach at Gravy Analytics (reported by Wired) compromised location records tied to popular apps including Candy Crush, Tinder, and MyFitnessPal. That data could identify and track specific individuals. The data broker market is vast, fragmented, and largely unregulated at the federal level. There is no comprehensive federal data privacy statute in the United States.
Why This Matters for Messaging Privacy ⚠ HIGH: your messaging app's encryption is irrelevant if the FBI can buy your location from an ad broker
You can use Signal with disappearing messages and a VPN. But if a weather app on your phone is feeding your GPS coordinates to an advertising exchange every 30 seconds, and the FBI is purchasing that data from Gravy Analytics or Venntel, they know where you are, where you have been, and by extension, who you met with physically. They do not need to know what you said. Location data at the resolution these brokers sell (often within meters) can reveal your home address, your workplace, your doctor's office, your attorney's office, your place of worship, and every meeting you attend. Senator Wyden called the practice "an outrageous end run around the Fourth Amendment." The bipartisan Government Surveillance Reform Act, introduced March 13, 2026 by Senators Wyden and Mike Lee, would close this loophole by requiring a warrant for data broker purchases. As of this writing, it has not passed.
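The re-identification step is mechanical. A toy sketch (hypothetical advertising IDs, timestamps, and coordinates, with the broker record simplified to four fields) shows how overnight pings alone point at a home address:

```python
from collections import Counter
from datetime import datetime

# Hypothetical broker rows: (advertising_id, ISO timestamp, lat, lon),
# coordinates rounded to roughly block-level precision.
pings = [
    ("ad-7f3a", "2026-03-01T02:10", 37.871, -122.273),
    ("ad-7f3a", "2026-03-02T03:45", 37.871, -122.273),
    ("ad-7f3a", "2026-03-02T13:05", 37.423, -122.084),  # daytime: workplace
    ("ad-7f3a", "2026-03-03T01:30", 37.871, -122.273),
]

def likely_home(rows, ad_id):
    """Most frequent overnight (midnight-6am) location for one advertising ID."""
    overnight = [
        (lat, lon)
        for pid, ts, lat, lon in rows
        if pid == ad_id and datetime.fromisoformat(ts).hour < 6
    ]
    return Counter(overnight).most_common(1)[0][0]

print(likely_home(pings, "ad-7f3a"))  # → (37.871, -122.273)
```

The same grouping, run across weeks of pings, yields workplace (daytime mode), routine, and co-location with other ad IDs. The pseudonymous ID does nothing once the coordinates repeat.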
What You Can Do About It
The data broker pipeline starts on your device. Audit every app's location permissions (Settings → Privacy → Location Services on iOS; Settings → Location on Android). Revoke location access for every app that does not absolutely require it. Disable advertising ID tracking (iOS: Settings → Privacy → Tracking → toggle off "Allow Apps to Request to Track"; Android: Settings → Privacy → Ads → Delete advertising ID). Uninstall free, ad-supported apps that you do not actively use, because each one is a potential data source for brokers. Use a VPN to mask your IP address. And understand that the messaging app is one layer of a much larger surveillance surface. Choosing Signal over WhatsApp is meaningful. But if you are carrying a phone full of ad-supported apps with location permissions enabled, the messaging app is not your biggest exposure.
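The permission audit can be partly automated on Android. A sketch, assuming you have captured the output of `adb shell dumpsys package` to a string (the exact dump format varies by Android version, and the excerpt below is abbreviated and hypothetical):

```python
import re

# Abbreviated, hypothetical excerpt of `adb shell dumpsys package` output.
DUMP = """\
Package [com.example.weather] (a1b2c3):
    android.permission.ACCESS_FINE_LOCATION: granted=true
Package [com.example.notes] (d4e5f6):
    android.permission.ACCESS_FINE_LOCATION: granted=false
"""

def packages_with_location(dump: str) -> list[str]:
    """List package names whose dump shows fine location granted."""
    found, current = [], None
    for line in dump.splitlines():
        m = re.match(r"Package \[([\w.]+)\]", line.strip())
        if m:
            current = m.group(1)
        elif "ACCESS_FINE_LOCATION: granted=true" in line and current:
            found.append(current)
    return found

print(packages_with_location(DUMP))  # → ['com.example.weather']
```

Any package in that list that is not a navigation or weather tool you deliberately rely on is a candidate for revocation or removal.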

Per-Service Summary

📵 SMS / MMS

CON: No encryption whatsoever. Carriers can read every message. Content is transmitted in plaintext over SS7.

CON: Full identity tied to every message. Your phone number, legal name, billing address, and location are attached to every text.

CON: Stingray interception. Law enforcement can intercept SMS content in real time, without carrier involvement, using cell-site simulators.

CON: Metadata retained for years. Sender, recipient, timestamp, and cell tower location are all retained by the carrier and available under routine subpoena.

CON: Forensic recovery. "Deleted" SMS messages are recoverable from the device with tools like Cellebrite until overwritten by new data.

WARN: RCS is replacing SMS and offers encryption in transit, but the ecosystem is still fragmented and cross-platform E2EE is not guaranteed.

🍎 iMessage

PRO: E2EE by default between Apple devices. Strong encryption in transit.

CON: iCloud backup defeats E2EE. If iCloud Backup is enabled (it is by default), Apple stores message content along with the encryption key and hands both over under warrant.

CON: Falls back to SMS when messaging non-Apple devices. A green bubble means no encryption.

CON: 25 days of iMessage lookup data available to law enforcement, showing lookup queries to and from your number in iMessage.

CON: Closed source. Apple's encryption implementation cannot be independently audited.

WARN: Advanced Data Protection (launched Dec. 2022) enables E2EE for iCloud backups, but it is opt-in, not on by default, and most users have never enabled it.

💬 WhatsApp

PRO: E2EE by default using the Signal Protocol. Message content is unreadable to WhatsApp, both in transit and on its servers.

CON: Near real-time metadata to the FBI. WhatsApp is the only service on this list that provides pen register data: the source and destination of every message, delivered every 15 minutes. This is live surveillance.

CON: Extensive metadata collection. Meta collects your contacts, usage patterns, device info, IP address, location, and behavioral data.

CON: Cloud backup bypass. Unencrypted iCloud/Google Drive backups can expose message content. Encrypted backup is opt-in.

CON: Owned by Meta. WhatsApp shares data with Facebook's advertising infrastructure. The privacy policy explicitly permits this.

WARN: The Natalie Edwards case proved that WhatsApp metadata alone, without any message content, was enough to identify and convict a federal employee who leaked confidential documents.

✈️ Telegram

PRO: Secret Chats are E2EE, with self-destructing messages and no server-side storage.

PRO: Minimal FBI access. The FBI document shows no message content and no contact info available under standard legal process.

CON: Default chats are NOT end-to-end encrypted. Telegram holds decryption keys for all cloud chats. Every group chat, every channel, and every default one-on-one conversation is readable by Telegram.

CON: Proprietary, unaudited server. Server code is closed source. Telegram's claims about distributed key storage cannot be verified.

CON: MTProto protocol criticized by Johns Hopkins cryptographer Matthew Green and others for its non-standard design, lack of independent audit, and implementation concerns.

WARN: Post-Durov-arrest policy change. After Pavel Durov's arrest in France (Aug. 2024), Telegram announced it would begin cooperating with law enforcement under court order, reversing its prior stance.

🟣 Signal: The Standard

BEST: The FBI's own document confirms it. Signal provides: date/time of registration and date of last use. That is the entire list. No message content. No contacts. No metadata. No IP addresses. No pen register capability.

BEST: E2EE on everything, no exceptions. All messages, group chats, calls, voice notes, and file transfers. No cloud backup integration. No fallback to unencrypted modes.

BEST: Fully open source. Client AND server. The only service on this list where both are auditable.

BEST: Operated by a nonprofit, the Signal Foundation. No advertising business model. No incentive to collect data.

PRO: Disappearing messages, sealed sender, screen security. Advanced privacy features enabled at the user's discretion.

CON: Requires a phone number to register. This is Signal's one identifiable data point, and a real limitation for users who need full anonymity.

CON: Smaller user base. Most of your contacts are probably on WhatsApp or iMessage. Network effects work against Signal's adoption.

WARN: Signal does not protect against device compromise. If Pegasus or Cellebrite gains access to your phone, Signal's encryption is irrelevant. Signal protects the pipe. It cannot protect the endpoints.

Recommendations by Threat Model

If your communications could be subpoenaed, surveilled, or used against you:
Use Signal. No equivocation. The FBI's own internal document confirms that Signal provides the least data of any messaging service they analyzed. If you are a journalist protecting sources, a lawyer communicating with clients about sensitive matters, a whistleblower, an activist, or anyone whose communications could be compelled by legal process, Signal is the only service on this list where the provider genuinely cannot comply with a data request, because the data does not exist. Enable disappearing messages. Verify safety numbers with your contacts. Use a PIN for registration lock.
If you need broad compatibility and your contacts will not switch:
WhatsApp with encrypted backups enabled is a reasonable compromise, but understand the metadata exposure. WhatsApp's E2EE is real and uses the Signal Protocol. Your message content is protected. But Meta collects extensive metadata, provides pen register data to the FBI every 15 minutes, and the Natalie Edwards case proved that metadata alone can be used to identify and convict. If you use WhatsApp: enable encrypted backups, disable iCloud/Google Drive unencrypted backups, and understand that who you talk to and when is visible to Meta and obtainable by law enforcement.
If you face state-level targeting or device compromise threats:
The messaging app is not your primary concern. Your phone is. Use Signal, yes. But also: keep iOS and Android fully updated at all times, because most zero-click exploits target unpatched vulnerabilities. Enable Apple Lockdown Mode (iOS 16+), which dramatically reduces the attack surface. Use a dedicated device for sensitive communications that has no other apps installed. Reboot your phone daily, because some spyware implants (including certain Pegasus variants) do not survive a reboot. Set disappearing messages to the shortest practical interval. Do not carry a phone full of ad-supported apps with location permissions enabled. Audit every app on your device and remove anything you do not actively need. The strongest encryption in the world protects nothing if the device reading the decrypted messages is already compromised.
Regardless of which service you use:
Stop using SMS for anything sensitive. SMS has no encryption, no privacy protections, and full carrier visibility into content and metadata. Every text you send is attached to your legal identity, your physical location, and a timestamp, all retained for months to years and available under routine subpoena.

Turn off iCloud backup if you use iMessage. Or enable Advanced Data Protection (Settings → Apple ID → iCloud → Advanced Data Protection). Without this, your "encrypted" iMessages are sitting on Apple's servers with the decryption key, waiting for a warrant.

Do not trust Telegram's default encryption. Telegram's default chats are client-server encrypted, meaning Telegram holds the keys. Only Secret Chats are E2EE, and they do not work on desktop, do not work for groups, and require manual activation. Cryptographer Matthew Green: "The vast majority of one-on-one Telegram conversations, and literally every single group chat, are probably visible on Telegram's servers."

Understand that encryption protects content, not metadata. The FBI does not need to break encryption. Metadata (who you talk to, when, how often, from where, on what device) is available from most services under routine legal process. The only services that protect both content and metadata are Signal (by architecture) and Telegram Secret Chats (by opt-in). Everything else leaks metadata to varying degrees.
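What "routine legal process" yields is easy to picture. A toy pen-register-style log (hypothetical numbers, no content at all) already exposes the contact graph and who matters most to the target:

```python
from collections import Counter

# Hypothetical pen-register rows: (source, destination, timestamp). No content.
log = [
    ("+15550100", "+15550199", "2026-03-01T09:00"),
    ("+15550100", "+15550199", "2026-03-01T21:14"),
    ("+15550100", "+15550142", "2026-03-02T08:30"),
    ("+15550199", "+15550100", "2026-03-02T23:55"),
]

# Who talks to whom, and how often — direction-insensitive pairing.
pairs = Counter(frozenset((src, dst)) for src, dst, _ in log)
top_pair, count = pairs.most_common(1)[0]
print(sorted(top_pair), count)  # → ['+15550100', '+15550199'] 3
```

Add timestamps and you get routine; add cell tower or IP data and you get location. None of it requires reading a single message.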

Audit your phone's data broker exposure. Disable location permissions for every app that does not require them. Delete your advertising ID. Uninstall free apps you do not use. The FBI confirmed to Congress this month that it purchases commercially available location data from brokers. Your encrypted Signal messages are meaningless if a weather app on the same phone is reporting your GPS coordinates to an advertising exchange every 30 seconds.

The best service is the one your contacts will actually use. Signal is objectively the most private messaging service available. But privacy is a two-way street. If your contact is backing up WhatsApp to unencrypted iCloud, your messages are exposed on their end regardless of what you use. Have the conversation. Move your sensitive contacts to Signal. Accept that not everyone will switch, and compartmentalize accordingly.

References

[1] FBI, "Lawful Access" document, dated Jan. 7, 2021. Obtained via FOIA by Property of the People. | propertyofthepeople.org
[2] Kroll, A., "FBI Document Says the Feds Can Get Your WhatsApp Data, in Real Time." Rolling Stone, Nov. 29, 2021. | rollingstone.com
[3] Pfefferkorn, R., "We Now Know What Information the FBI Can Obtain from Encrypted Messaging Apps." Just Security, Dec. 14, 2021. | justsecurity.org
[4] Green, M., "Is Telegram really an encrypted messaging app?" A Few Thoughts on Cryptographic Engineering, Aug. 25, 2024. | blog.cryptographyengineering.com
[5] Signal, Response to grand jury subpoena, Central District of California, 2021. | signal.org/bigbrother
[6] Signal, Terms of Service & Privacy Policy. | signal.org/legal
[7] Gillmor, D.K. (ACLU), quoted in Rolling Stone, Nov. 29, 2021: "WhatsApp offering all of this information is devastating to a reporter communicating with a confidential source."
[8] Knodel, M. (Center for Democracy & Technology), quoted in Rolling Stone, Nov. 29, 2021: "The most popular encrypted messaging apps iMessage and WhatsApp are also the most permissive."
[9] ProPublica, "How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users," Sept. 2021.
[10] Freedom of the Press Foundation, "Metadata 102: What is communications metadata and why do we care about it?" | freedom.press
[11] EFF / Digital Rights Bytes, "Can the Government Read My Text Messages?" | digitalrightsbytes.org
[12] Telegram FAQ on Secret Chats and encryption. | telegram.org/faq
[13] ESET, "Telegram Privacy Explained: What's Protected and What's Not," 2024. | eset.com
[14] Apple, iMessage security overview, Apple Platform Security Guide.
[15] Snowden, E., documents (2014): NSA collected 200+ million text messages per day globally. Reported by The Guardian.
[16] NPR / Joffe-Block, J., "Your data is everywhere. The government is buying it without a warrant." March 25, 2026. | npr.org
[17] FedScoop, "Privacy advocates sound alarm on 'data broker loophole' used by FBI, other federal agencies." March 20, 2026. | fedscoop.com
[18] Futurism, "The Head of the FBI Just Admitted Something Moderately Horrifying." March 22, 2026. | futurism.com
[19] FTC, Enforcement action against Gravy Analytics / Venntel, January 2025.
[20] Google Project Zero, "A deep dive into an NSO zero-click iMessage exploit." Dec. 2021. | securityweek.com
[21] Wikipedia, "Pegasus (spyware)." | en.wikipedia.org
[22] Citizen Lab (University of Toronto), Pegasus Project investigations, 2016 to 2024.
[23] Infosecurity Magazine, "NSO Group Hit with $168m Fine for WhatsApp Pegasus Spyware Abuse." May 2025. | infosecurity-magazine.com
[24] Government Surveillance Reform Act, introduced March 13, 2026, by Sen. Wyden (D-OR) and Sen. Lee (R-UT).
[25] Laperruque, J. (Center for Democracy & Technology), quoted in FedScoop, March 20, 2026, on the data broker warrant loophole.
Field Assessment

Your photo library knows more about you than your journal: Ente vs iCloud vs Google Photos vs OneDrive

Every photo you take records your exact GPS coordinates, the device you used, the time down to the second, and in many cases the direction you were facing. When you upload those photos to a cloud service, the question is not whether that data exists. It is who can see it, who can be compelled to hand it over, and whether AI is being trained on it without your meaningful consent. This is a field-by-field comparison of five configurations across four providers: Ente Photos, iCloud Photos with Advanced Data Protection, iCloud Photos without it, Google Photos, and Microsoft OneDrive.

David
November 2026
Privacy Research
~25 min read
A single photo taken with a modern smartphone can contain over 460 metadata fields. These include GPS coordinates accurate to within a few meters, a timestamp precise to the second, the make, model, and serial number of the device, camera settings like aperture, shutter speed, and focal length, the direction the camera was facing (compass bearing), altitude, and in some cases the name of the Wi-Fi network the phone was connected to at the time. This is called EXIF data (Exchangeable Image File Format), and it is embedded automatically every time you press the shutter button. You cannot see it by looking at the photo. But anyone with access to the file can read it in seconds. When you upload that photo to a cloud service, all of that metadata goes with it. The privacy question then becomes: does the service encrypt it? Can the service read it? Can law enforcement compel the service to hand it over? And is the service using your photos to train AI models? The answers vary dramatically depending on which service you use and how you have configured it.

Part 1: What a Single Photo Reveals About You

Before comparing providers, it is worth understanding exactly what is at stake. Photo metadata is not abstract. It has been used in criminal investigations, stalking cases, corporate leak investigations, and intelligence operations. The EFF documented a case where the FBI identified an Anonymous hacker ("w0rmer") solely through GPS coordinates embedded in a photo posted to Twitter. The photo was taken with an iPhone 4, and the EXIF data contained the exact latitude and longitude of the house where it was taken, which led directly to the suspect's arrest. [1]

EXIF Fields That Identify You ⚠ HIGH: these fields are embedded in every photo by default unless you disable them
Location: GPS latitude, longitude, altitude, compass bearing, and speed. Accurate to within meters. Reveals your home address (from photos taken at home), workplace, doctor's office, attorney's office, place of worship, and every location you visit. Can be used to build a complete movement profile over time.
Device: Camera make, model, serial number, firmware version, lens type. The serial number is a unique hardware identifier that links every photo taken with that device to a single physical object. Investigators have used serial numbers to connect photos across different platforms and timeframes to the same person.
EXIF Fields That Reveal Your Activity
Time: Exact date and time the photo was taken, often to the second. Separate fields record when the file was created, modified, and digitized. Establishes precise timelines of activity. Has been used in court to corroborate or destroy alibis.
Software and Editing: Software used to process or edit the image, editing timestamps, and embedded thumbnails of the original uncropped image. The thumbnail field is particularly dangerous: even if you crop sensitive content out of a photo, the original uncropped version may persist as an embedded thumbnail in the EXIF data.
REAL CASE: In a documented stalking case, law enforcement found that a stalker had extracted GPS coordinates from photos a victim shared on a dating app. The platform did not strip EXIF metadata on upload, effectively turning every shared photo into a location beacon. Platforms like Facebook and Instagram now strip EXIF data on upload, but many still do not. And regardless of what the platform does, the cloud service where the original photo is stored retains all of it. [2]
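Stripping EXIF before sharing is simple enough that there is little excuse for a platform skipping it. A minimal stdlib sketch that drops APP1 (EXIF) segments from a JPEG byte stream; real tools such as exiftool handle many more edge cases (XMP, thumbnails, malformed segments), so treat this as an illustration of the format, not a production sanitizer:

```python
def strip_exif(jpeg: bytes) -> bytes:
    """Drop APP1 (EXIF) segments from a JPEG, leaving pixel data untouched."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG"
    out, i = bytearray(b"\xff\xd8"), 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            out += jpeg[i:]          # unexpected byte: copy the rest verbatim
            break
        marker = jpeg[i + 1]
        if marker == 0xDA:           # Start of Scan: entropy-coded data follows
            out += jpeg[i:]
            break
        seg_len = int.from_bytes(jpeg[i + 2:i + 4], "big")
        segment = jpeg[i:i + 2 + seg_len]
        if not (marker == 0xE1 and segment[4:10] == b"Exif\x00\x00"):
            out += segment           # keep every segment that is not EXIF
        i += 2 + seg_len
    return bytes(out)

# Demo on a synthetic minimal JPEG: SOI + EXIF APP1 + JFIF APP0 + SOS + EOI.
SOI, EOI = b"\xff\xd8", b"\xff\xd9"
exif_seg = b"\xff\xe1" + (12).to_bytes(2, "big") + b"Exif\x00\x00" + b"\x00" * 4
jfif_seg = b"\xff\xe0" + (16).to_bytes(2, "big") + b"JFIF\x00" + b"\x00" * 9
fake = SOI + exif_seg + jfif_seg + b"\xff\xda\x00\x02" + EOI
print(b"Exif" in strip_exif(fake))  # → False
```

Because the pixel data is copied verbatim, this removal is lossless, unlike re-encoding the image.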

Part 2: Encryption Architecture, Provider by Provider

The core question: who can see your photos and their metadata? The answer depends entirely on the encryption model. Some providers encrypt your photos but hold the keys (meaning they can decrypt them under legal process). Some encrypt photos end-to-end (meaning only you can decrypt them). And some do not encrypt stored photos at all in any meaningful sense.

End-to-End Encryption of Photos ⚠ CRITICAL: this determines whether the provider can see your photos
Ente Photos: FULL E2EE. All photos, videos, and all metadata (including EXIF data, descriptions, tags, and album names) are encrypted on your device before upload. Ente cannot see any of it. Uses XSalsa20-Poly1305 (via libsodium) for symmetric encryption, X25519 for key exchange, and Argon2 for key derivation from your password. Audited twice by Cure53 (2023 crypto audit, 2025 CERN-sponsored infrastructure audit). [3]
iCloud (with ADP): E2EE. When Advanced Data Protection is enabled, Photos is one of the 23 data categories protected with end-to-end encryption. Apple does not hold the decryption keys. However, ADP is opt-in, not on by default, and most users have not enabled it. Some metadata (file checksums, timestamps) remains under standard encryption even with ADP on. [4]
iCloud (without ADP): STANDARD ENCRYPTION ONLY. Photos are encrypted in transit and at rest, but Apple holds the decryption keys. Apple can and does decrypt photos in response to valid search warrants. This is the default configuration for every iCloud account. [5]
Google Photos: NO E2EE. Photos are encrypted in transit (TLS) and at rest on Google's servers, but Google holds the keys. Google can access, scan, analyze, and provide your photos to law enforcement under a search warrant. Google actively scans photos for CSAM using automated detection. [6]
Microsoft OneDrive: NO E2EE FOR PHOTOS. OneDrive encrypts files in transit and at rest with BitLocker and per-file encryption, but Microsoft holds the keys. OneDrive Personal Vault adds an extra authentication layer but does not add end-to-end encryption. Microsoft can access stored photos under legal process. [7]
EXIF Metadata Encryption: whether the GPS coordinates, timestamps, and device identifiers embedded in your photos are encrypted ⚠ CRITICAL: if metadata is not encrypted, the provider knows where every photo was taken, when, and with what device
Ente: FULLY E2EE. All metadata, including EXIF, is encrypted on device before upload. Ente explicitly states: "all metadata (including exif creation time, location, description etc) is also end-to-end encrypted." [3]
iCloud (ADP): MOSTLY E2EE. Photo content is E2EE. Apple states that "some metadata and usage information," including file modification dates and checksums, remains under standard data protection even with ADP enabled. [4]
iCloud (no ADP): NOT E2EE. Apple holds the keys. All photo metadata is accessible to Apple and producible under warrant.
Google Photos: NOT E2EE. Google reads EXIF metadata to power features like location-based photo search, timeline view, and face grouping. This data is actively indexed and searchable by Google's systems.
OneDrive: NOT E2EE. Microsoft can access photo metadata. OneDrive uses EXIF data for its "On This Day" and location features, confirming server-side access to the metadata.
Open Source
Ente: FULL. Client apps (iOS, Android, desktop, web) AND server code are open source. Audited by Cure53 and Symbolic Software. CERN-sponsored audit in October 2025. [8]
iCloud: NO. Closed source. Apple publishes a security whitepaper, but the code is not independently auditable.
Google Photos: NO. Closed source. Proprietary infrastructure.
OneDrive: NO. Closed source. Proprietary infrastructure.

Part 3: AI, Your Photos, and Who Benefits

Every major cloud photo provider now offers AI-powered features: face recognition, object search, memory curation, image enhancement, and more. The privacy question is where that AI processing happens (on your device or on the provider's servers) and whether your photos are used to train the company's broader AI models.

Where AI Processing Happens ⚠ HIGH: server-side processing means the provider can see your photos in order to analyze them
Ente: ON-DEVICE ONLY. All machine learning (face detection, recognition, magic search) runs entirely on your device. Photos are downloaded locally, indexed locally, and the ML indexes are encrypted before syncing to your other devices. Ente's servers never see unencrypted photos or ML data. "Your photos and ML data are never used to train any AI models, neither by Ente nor by any third parties." [3]
iCloud: MOSTLY ON-DEVICE. Face recognition and object classification in Apple Photos run on-device. Apple states it does not use iCloud Photos data to train generative AI models. However, Apple's CSAM scanning proposals (announced in 2021 and later abandoned) raised concerns about server-side photo analysis capabilities.
Google Photos: SERVER-SIDE. Google processes photos on its servers to power features like face grouping, object search, location tagging, memory curation, and the AI-powered "Magic Editor." Google's help page states: "We don't train any generative AI models outside of Photos with your personal data." But this carefully worded statement leaves open the question of what happens within the Photos product itself. Google actively scans every photo for CSAM using automated detection systems. Ente founder Vishnu Mohandas created the site TheySeeYourPhotos.com specifically to demonstrate what Google's AI can infer from a single photo. [9]
OneDrive: SERVER-SIDE. Microsoft uses server-side processing for photo features. Microsoft's privacy statement permits the use of data to "improve and develop" its products. Microsoft Copilot integration with OneDrive raises additional questions about how photo data is processed.
What AI Can Infer From Your Photo Library (applies to any provider that can see your photos in unencrypted form)
A photo library is not just a collection of images. To a modern AI system with server-side access, it is a comprehensive behavioral and biographical dataset. From a typical photo library, AI can infer: your home address and workplace (from recurring GPS coordinates), your daily routine and travel patterns, every person you spend time with and how frequently (from face clustering), your physical appearance over time, the interior of your home, your financial status (from visible possessions, car, neighborhood), your children's faces and schools, your medical conditions (from hospital visits visible in location data), your political and religious affiliations (from events and locations), and your romantic relationships. The Ente team built TheySeeYourPhotos.com to make this visible: upload any photo and see what Google's publicly available AI models can extract from it. [9]
THE BROADER AI TRAINING QUESTION: In July 2023, Google updated its general privacy policy to explicitly state it may use publicly available information to train AI models. While Google's Photos-specific page says it does not train "generative AI models outside of Photos" with your personal data, the broader Google Terms of Service (updated May 2024) grant Google broad rights to use data across services. Proton publicly accused Google in December 2025 of training image generation AI on Google Photos libraries, though Google denied this and no direct evidence was provided. The fundamental issue remains: if the provider can see your photos (because they hold the encryption keys), the technical capability to use them for AI training exists regardless of stated policy. Policy can change. Encryption cannot. [10]
Part 4: Under a Court Order, What Gets Handed Over
Your Photo Library Under Legal Process
A search warrant compels the provider to hand over everything they have. The question is: what do they have? For providers that hold the encryption keys, the answer is everything. For providers with end-to-end encryption, the answer is essentially nothing.
Photo and Video Content ⚠ CRITICAL: your actual photos and videos
Ente: CANNOT HAND OVER. E2EE. Ente does not hold decryption keys. All content sits on its servers as encrypted blobs.
iCloud (ADP on): CANNOT HAND OVER. E2EE. Apple does not hold decryption keys for ADP-protected data. Apple's Law Enforcement Response Team has stated it will not confirm ADP status without a warrant. [11]
iCloud (no ADP): CAN HAND OVER. Apple holds the keys. This was the loophole law enforcement used for years: even when an iPhone itself was locked, the iCloud backup (including Photos) was accessible via warrant. [5]
Google Photos: CAN HAND OVER. Google holds the keys. A search warrant compels disclosure of "content stored in a Google Account, such as Gmail messages, documents, photos and YouTube videos." [6]
OneDrive: CAN HAND OVER. Microsoft holds the keys. Producible under valid legal process.
EXIF Metadata (GPS, timestamps, device info) ⚠ HIGH: reveals where every photo was taken, when, and with what device
Ente: CANNOT HAND OVER. E2EE. Metadata is encrypted with the file.
iCloud (ADP on): PARTIALLY PROTECTED. Photo content is E2EE, but Apple states that "some metadata and usage information" remains under standard protection, including modification dates and checksums.
iCloud (no ADP): CAN HAND OVER. All metadata accessible.
Google / OneDrive: CAN HAND OVER. All metadata accessible. Google actively indexes this metadata for search and timeline features.
Face Recognition Data / AI-Generated Indexes
Ente: CANNOT HAND OVER. ML indexes are generated on-device and encrypted before sync. Ente's servers never see them.
Google Photos: CAN HAND OVER. Face groupings, object tags, location clusters, and AI-generated labels are all server-side data that Google holds and can produce under warrant.
THE UK PRECEDENT: In February 2025, the UK Home Office secretly ordered Apple, under the Investigatory Powers Act, to provide blanket access to Advanced Data Protection-encrypted iCloud data worldwide. Rather than comply, Apple disabled ADP for UK users entirely. This means iCloud users in the UK cannot enable E2EE for their photos. Apple stated: "We have never built a backdoor or master key to any of our products or services and we never will." But the UK order demonstrates that government pressure on E2EE photo storage is not theoretical. It is happening now. [12]

Part 5: Ente's Encryption Architecture (and Its One Noted Limitation)

Since Ente is the only provider on this list with full end-to-end encryption of photos by default, its architecture deserves a closer look. It also has one documented limitation worth discussing.

How Ente's Key Hierarchy Works
When you create an Ente account, the app generates a masterKey on your device. This key never leaves your device unencrypted. Your password is used to derive a keyEncryptionKey via Argon2, which encrypts the masterKey. The encrypted masterKey is stored on Ente's servers so you can access your account from other devices, but only your password can unlock it.

Each album (collection) gets its own collectionKey. Each file gets its own fileKey. Files are encrypted with their fileKey, fileKeys are encrypted with the collectionKey, and collectionKeys are encrypted with your masterKey. This hierarchical key structure means that sharing an album with another Ente user requires only sending them the collectionKey (encrypted with their public key), not re-encrypting every file.

All cryptographic operations use libsodium: XSalsa20-Poly1305 for authenticated encryption, X25519 for key exchange, and Argon2id for password-based key derivation. The entire architecture has been audited by Cure53 (March 2023) and the infrastructure was audited again in a CERN-sponsored engagement (October 2025). [3] [8]
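The wrapping chain is easy to mis-picture, so here is a stdlib sketch of its structure. scrypt stands in for Argon2id, and an HMAC-authenticated XOR keystream stands in for XSalsa20-Poly1305; this is a toy illustration of the key hierarchy, not usable cryptography, and none of it is Ente's actual code:

```python
import hashlib
import hmac
import os

def derive_kek(password: bytes, salt: bytes) -> bytes:
    # Stand-in for Argon2id: scrypt is also a memory-hard password KDF.
    return hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

def wrap(key: bytes, wrapping_key: bytes) -> bytes:
    # Toy stand-in for XSalsa20-Poly1305: keystream XOR plus an HMAC tag.
    nonce = os.urandom(16)
    stream = hashlib.sha256(wrapping_key + nonce).digest()
    ct = bytes(a ^ b for a, b in zip(key, stream))
    tag = hmac.new(wrapping_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def unwrap(blob: bytes, wrapping_key: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:48], blob[48:]
    expect = hmac.new(wrapping_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("wrong key or tampered blob")
    stream = hashlib.sha256(wrapping_key + nonce).digest()
    return bytes(a ^ b for a, b in zip(ct, stream))

# masterKey -> collectionKey -> fileKey, each layer wrapped by the one above.
master_key = os.urandom(32)
kek = derive_kek(b"correct horse battery staple", os.urandom(16))
wrapped_master = wrap(master_key, kek)              # what the server stores
collection_key = os.urandom(32)
wrapped_collection = wrap(collection_key, master_key)
file_key = os.urandom(32)
wrapped_file = wrap(file_key, collection_key)

# Unlocking walks the chain back down from the password-derived key.
assert unwrap(wrapped_file, unwrap(wrapped_collection,
              unwrap(wrapped_master, kek))) == file_key
```

The design payoff is visible in the last lines: the server only ever stores wrapped keys and encrypted files, and sharing an album means re-wrapping one collectionKey, not re-encrypting every file.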
Shared Links and Forward Secrecy ⚠ NOTE: this is a known architectural property, not a vulnerability
Ente allows you to share albums via links that do not require the recipient to have an Ente account. These links are end-to-end encrypted (the decryption key is embedded in the URL fragment, which is never sent to the server). However, these shared links do not provide forward secrecy. Forward secrecy means that if a key is compromised in the future, past communications remain secure. In Ente's shared link model, the key embedded in the link is static for the lifetime of that link. If someone obtains the link (through interception, a compromised device, or a shared chat log), they can access the album for as long as the link is active.

This is a known property of Ente's sharing model, not a bug. The Cure53 audit did not flag it as a vulnerability because it is an inherent limitation of link-based sharing without requiring accounts. The mitigation is straightforward: set an expiration on shared links, protect them with a password, and revoke links when they are no longer needed. For truly sensitive sharing, use Ente's account-to-account sharing (which uses asymmetric key exchange and does not have this limitation) rather than link-based sharing.
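The fragment detail is worth seeing concretely. In this sketch (the URL shape is hypothetical, not Ente's actual link format), the album key rides in the URL fragment, which user agents never include in the HTTP request, so the server only ever sees the path:

```python
import base64
import os
from urllib.parse import urlsplit

def make_share_link(album_id: str, album_key: bytes) -> str:
    # Hypothetical link shape: the key goes in the fragment, after '#'.
    frag = base64.urlsafe_b64encode(album_key).rstrip(b"=").decode()
    return f"https://albums.example.com/{album_id}#{frag}"

def key_from_link(link: str) -> bytes:
    frag = urlsplit(link).fragment
    return base64.urlsafe_b64decode(frag + "=" * (-len(frag) % 4))

album_key = os.urandom(32)
link = make_share_link("a1b2c3", album_key)

# What an HTTP request line would carry: the path only, never the fragment.
parts = urlsplit(link)
print(parts.path)  # → /a1b2c3
```

This is also why the link lacks forward secrecy: the key in the fragment is a static secret for the link's lifetime, so anyone who ever captures the full URL can decrypt the album until the link is revoked.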

Per-Provider Summary

💚 Ente Photos

BEST: Full E2EE for everything. Photos, videos, all EXIF metadata, album names, descriptions, tags, and ML indexes. Ente cannot see any of your data.

BEST: On-device AI only. Face recognition and search run locally. ML indexes are encrypted before syncing. No server-side photo processing. No AI training on your data.

BEST: Fully open source, client and server. Audited twice by Cure53, including a CERN-sponsored infrastructure audit.

PRO: Self-hostable. You can run the entire Ente server on your own hardware.

PRO: 3x replication across providers in the EU, including an underground facility in Paris.

WARN: Shared links lack forward secrecy. Use password protection, expiration, and account-to-account sharing for sensitive content.

CON: Paid service. 10 GB free, then paid plans starting at $1.49/month (50 GB). No free unlimited tier.

🍎 iCloud Photos (with ADP)

PRO: E2EE when ADP is enabled. Apple does not hold the keys. Photos and most metadata are protected.

PRO: Deep Apple ecosystem integration. Seamless with iPhone, iPad, Mac.

CON: ADP is opt-in. Not enabled by default. Most users have never turned it on.

CON: Some metadata remains under standard protection even with ADP enabled (modification dates, checksums).

CON: Closed source. Encryption implementation cannot be independently audited.

CON: Shared Albums and collaborative features do not support ADP. Photos in Shared Albums use standard encryption only.

CON: Disabled in the UK after the Home Office ordered Apple to provide backdoor access (Feb 2025).

🍎 iCloud Photos (without ADP)

CON: Apple holds the encryption keys. All photos and metadata are accessible to Apple and producible under warrant.

CON: This is the default. Every iCloud account that has not explicitly enabled ADP operates in this mode.

CON: The iCloud backup loophole. For years, law enforcement obtained iPhone data not from the device itself, but from the unencrypted iCloud backup. Photos were a primary target.

WARN: Enabling ADP is the single most important step an iPhone user can take for photo privacy. Settings → [Your Name] → iCloud → Advanced Data Protection → Turn On.

📸 Google Photos

CON: No end-to-end encryption. Google holds the keys to every photo.

CON: Server-side AI processing. Google's systems analyze your photos: faces, locations, objects, and scenes. All of this data is indexed and searchable on Google's servers.

CON: Actively scans for CSAM. Automated scanning means Google's systems are looking at your photos.

CON: Fully producible under warrant. Photos, metadata, face groupings, AI-generated labels, location history derived from photos.

CON: Part of the Google advertising ecosystem. Google's broad privacy policy grants extensive data usage rights.

WARN: Google received nearly 40,000 law enforcement data requests in the first half of 2020 alone, complying with 83% of subpoenas.

☁️ Microsoft OneDrive

CON: No end-to-end encryption for photos. Microsoft holds the keys. Personal Vault adds an authentication layer but not E2EE.

CON: Server-side processing. Microsoft processes photos for organizational features and Copilot integration.

CON: Producible under warrant. All photos and metadata accessible to Microsoft and law enforcement.

WARN: OneDrive is not primarily a photo service. It is a general file storage platform. For users who store photos in OneDrive because it came bundled with Microsoft 365, the privacy implications are the same as Google Photos: the provider can see everything.

Recommendations

If privacy is a priority:
Use Ente Photos. It is the only service on this list that encrypts everything (photos, videos, all metadata, ML indexes) end-to-end by default, runs all AI processing on-device, is fully open source, and has been independently audited by Cure53 (twice, including a CERN-sponsored audit). The provider cannot see your photos under any circumstances. There is nothing to hand over under a court order because the data is encrypted and Ente does not hold the keys. For shared links, always set a password and an expiration. For sensitive sharing, use account-to-account sharing instead of links.
If you are staying in the Apple ecosystem:
Enable Advanced Data Protection immediately. This is the single most impactful privacy setting on your iPhone. Without it, every photo you take is stored on Apple's servers with keys Apple holds, producible under warrant. With it, your photos are end-to-end encrypted and Apple cannot decrypt them. Go to Settings → [Your Name] → iCloud → Advanced Data Protection → Turn On. You will need to set up a recovery key or recovery contact. Do it now. Be aware that Shared Albums do not support ADP, and that some metadata remains under standard protection even with ADP enabled.
Regardless of which service you use:
Disable location tagging on your camera. iOS: Settings → Privacy and Security → Location Services → Camera → set to "Never." Android: Open Camera app → Settings → disable "Save location." This prevents GPS coordinates from being embedded in your photos in the first place. It is the most effective single step you can take to reduce photo metadata exposure.

Strip EXIF data before sharing photos externally. If you share a photo via email, messaging, or any platform that does not automatically strip metadata, the full EXIF data travels with it. Use a metadata removal tool or configure your sharing workflow to strip it.

Understand that Google Photos and OneDrive are not encrypted in any meaningful privacy sense. The provider holds the keys. They can see your photos, analyze them with AI, and hand them over under legal process. If you use these services, you are trusting the provider and every government with jurisdiction over them with your complete photo history, including every location, face, and moment captured in it.

Review your photo library as a threat surface. Your photo library is a biography. It contains your home, your family's faces, your travel patterns, your friends, your possessions, your health-related visits, and years of location history. Treat it with the same seriousness you would treat your medical records or financial statements. The encryption (or lack thereof) on your photo storage is not a technical detail. It is a decision about who gets access to the most intimate record of your life.

References

[1] EFF, "A Picture is Worth a Thousand Words, Including Your Location." 2012. | eff.org
[2] ConvertKit Images, "4 Hidden Dangers of Image Metadata." Feb. 2026. | convertkitimages.com
[3] Ente, Architecture and Security FAQ. | ente.io/architecture
[4] Apple, "iCloud data security overview" and "Advanced Data Protection." | support.apple.com
[5] Journal of High Technology Law, "Closing the Loophole: Effects of Apple's New Advanced Data Protection for iCloud." March 2023. | suffolk.edu
[6] Google, "How Google handles government requests for user information." | policies.google.com
[7] Microsoft, OneDrive Personal Vault and data protection documentation.
[8] Ente, "Ente completes CERN sponsored audit." Nov. 2025. Cure53 audit report. | ente.io/blog/cern-audit
[9] PetaPixel, "This Website Reveals What Google's AI Can Learn From Your Photos." Dec. 2024. TheySeeyourPhotos.com by Ente. | petapixel.com
[10] CyberNews, "Proton claims Google trains AI on Google Photos albums: does it?" Dec. 2025. | cybernews.com
[11] WarrantBuilder, "iCloud Advanced Data Protection: A challenge for law enforcement." Feb. 2024. | warrantbuilder.com
[12] Malwarebytes, "Apple ordered to grant access to users' encrypted data." Feb. 2025. | malwarebytes.com
[13] Cure53, "Audit-Report ente Crypto Design and Code." March 2023. | ente.io
[14] Apple, Legal Process Guidelines (Oct 2025). | apple.com (PDF)
[15] Perspectives in Legal and Forensic Sciences, "Forensic Value of Exif Data." 2025. | sciepublish.com
Technical Brief

How encryption actually works, why it matters, and where it stops protecting you

This is the post I will keep pointing you to. If you have read anything else on this site and found yourself wondering what AES-256 actually means, what end-to-end encryption is doing under the hood, whether quantum computers can break all of it, or why encryption alone is not enough, this is where those questions get answered. No prerequisites. No dumbing down. Just the actual mechanics, explained clearly.

David
October 2025
Privacy Research
~25 min read
Encryption is mathematics applied to secrecy. That is all it is. It takes readable information, runs it through a mathematical function with a key, and produces output that is unreadable without the key. The math is public. The algorithms are published. The only secret is the key. Everything you read on this site about email privacy, messaging apps, notes metadata, and court orders comes back to this one question: what is encrypted, with what algorithm, and who holds the key? This post explains the core concepts that make all of that analysis possible. It is written for someone who has never studied cryptography, but it does not skip the parts that matter to someone who has.

Part 1: The Two Fundamental Types of Encryption

Every encryption system in use today falls into one of two categories, or combines both. Understanding the difference between them is the single most useful thing you can learn about cryptography.

Symmetric Encryption: One Key, Two People
Symmetric encryption uses the same key to encrypt and decrypt. Think of it like a combination lock on a shared locker. Both people need the same combination. If you have the key, you can lock the box. If you have the key, you can unlock the box. The same key does both jobs.

This is the oldest form of encryption and still the fastest. When your phone encrypts its storage, it uses symmetric encryption. When Signal encrypts a message after the handshake is complete, the actual message encryption is symmetric. When a VPN encrypts your traffic, symmetric encryption does the heavy lifting.

The problem with symmetric encryption is the key exchange: how do you get the same key to two people without anyone else seeing it? If you are standing in the same room, you can whisper it. If you are on opposite sides of the internet, you need a way to agree on a shared secret without ever transmitting it in the clear. That problem is what asymmetric encryption was invented to solve.
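A minimal round trip, assuming Python's `cryptography` package is available, makes the "same key does both jobs" property concrete (AES-256 in GCM mode):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # one shared 256-bit secret
nonce = os.urandom(12)                      # unique per message; never reuse with the same key
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
assert plaintext == b"meet at noon"         # the same key decrypts
```

Anyone holding `key` can both produce and read messages; anyone without it sees only authenticated ciphertext.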
The Major Symmetric Algorithms
AES (Advanced Encryption Standard): The most widely used symmetric cipher in the world. Standardized by NIST in 2001. Comes in three key sizes: AES-128, AES-192, and AES-256. Used by governments, banks, messaging apps, VPNs, disk encryption, and virtually everything else that needs to keep data secret. AES-256 is the variant recommended for long-term security and is considered quantum-resistant (more on this below).
ChaCha20: A modern stream cipher designed by Daniel Bernstein. Used by Signal, WireGuard VPN, and Google's TLS implementation. Faster than AES on devices without hardware AES acceleration (like older phones). Considered as secure as AES-256 at its full key length.
Asymmetric Encryption: Two Keys, One Direction Each
Asymmetric encryption (sometimes referred to as public-key cryptography) uses two mathematically linked keys: a public key and a private key. Anyone can encrypt a message using your public key. Only your private key can decrypt it. In other words, the public key locks; the private key unlocks. They are not the same key, and you cannot derive one from the other in any practical amount of time.

This solves the key exchange problem. You publish your public key to the world. Someone encrypts a message with it and sends it to you. Only you can read it, because only you have the private key. No shared secret needs to be transmitted.

The trade-off is speed. Asymmetric encryption is dramatically slower than symmetric encryption, often by a factor of 1,000x or more. You would never encrypt an entire video call with RSA. It would be unusably slow. This is why real-world systems use both types together.
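A public-key round trip, again assuming the `cryptography` package and using generic RSA-OAEP (not any particular provider's implementation), shows the asymmetry:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()   # safe to publish to the world

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone can encrypt with the public key...
ct = public_key.encrypt(b"for your eyes only", oaep)
# ...but only the holder of the private key can decrypt.
assert private_key.decrypt(ct, oaep) == b"for your eyes only"
```

Run it with a timer and you will also feel the speed trade-off: generating the RSA key alone takes longer than thousands of AES operations.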
The Major Asymmetric Algorithms
RSA: The original. Published in 1977. Security is based on the mathematical difficulty of factoring very large numbers into their prime components. RSA-2048 and RSA-4096 are common key sizes. Proton Mail uses RSA. RSA is vulnerable to quantum computers running Shor's algorithm, which can factor large numbers efficiently. RSA will need to be replaced in the post-quantum era.
Elliptic Curve (ECC): A newer approach using the mathematics of elliptic curves. Provides equivalent security to RSA with much smaller keys (a 256-bit ECC key is roughly equivalent to a 3,072-bit RSA key). X25519 (Curve25519) is the most widely used variant. Signal uses it. Tuta Mail uses it. Faster and more efficient than RSA. Also vulnerable to Shor's algorithm on a quantum computer.
Hybrid Encryption: How the Real World Actually Works ✓ This is what every modern secure system uses
In practice, no system uses purely symmetric or purely asymmetric encryption. Every secure messaging app, every HTTPS connection, every VPN tunnel uses both together in what is called hybrid encryption. Here is how it works:

Step 1: Your device and the other device use asymmetric encryption (like X25519) to agree on a shared secret key. This is called the key exchange or handshake. It happens once at the start of the session.

Step 2: Once both devices have the shared secret, they switch to symmetric encryption (like AES-256 or ChaCha20) for the actual data. Every message, every file, every voice packet is encrypted with the fast symmetric cipher using the key that was established in step 1.

The asymmetric part solves the "how do two strangers agree on a secret?" problem. The symmetric part handles the "now encrypt everything quickly" problem. Together, they give you both security and speed. When someone says "Signal uses the Signal Protocol with X25519 and AES-256," they are describing a hybrid system.
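The two steps can be sketched end to end, assuming the `cryptography` package. This is a toy handshake, not the Signal Protocol or TLS, which layer authentication, key derivation chains, and ratcheting on top:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Step 1: asymmetric handshake. Each side combines its own private key with
# the other side's public key and arrives at the same shared secret, without
# that secret ever crossing the wire.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()
alice_secret = alice.exchange(bob.public_key())
bob_secret = bob.exchange(alice.public_key())
assert alice_secret == bob_secret

# Derive a symmetric key from the raw shared secret (raw key-exchange output
# should never be used directly as a cipher key).
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"demo handshake").derive(alice_secret)

# Step 2: fast symmetric encryption for the actual data.
nonce = os.urandom(12)
box = ChaCha20Poly1305(key)
ct = box.encrypt(nonce, b"the actual message", None)
assert box.decrypt(nonce, ct, None) == b"the actual message"
```

Step 1 runs once per session; step 2 runs for every packet afterward, which is why the slow asymmetric math never becomes a bottleneck.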

Part 2: Can AES-256 Be Cracked? The Quantum Question, Answered With Physics

This is the section I wrote because I keep seeing the same fear repeated online: "quantum computers will break all encryption." That statement is half true and half dangerously misleading. Quantum computers pose a real, existential threat to asymmetric encryption (RSA, ECC). They pose essentially zero practical threat to AES-256. Here is why, quantified in terms that are hard to forget.

What Grover's Algorithm Actually Does
Grover's algorithm, published in 1996, is the best known quantum algorithm for attacking symmetric encryption. It provides a quadratic speedup for brute-force search, meaning it can search a keyspace in the square root of the time a classical computer would need.

For AES-256, the keyspace is 2^256 possible keys. Grover's algorithm reduces this to 2^128 operations. In plain language: a quantum computer running Grover's algorithm against AES-256 would face the equivalent difficulty of a classical computer brute-forcing AES-128.

AES-128 is still considered secure against classical computers today. 2^128 is approximately 3.4 × 10^38 operations. To put that in human terms: there are roughly 10^22 stars in the observable universe. 2^128 is tens of quadrillions of times the number of stars in the observable universe. Grover's algorithm makes AES-256 "only" as hard as that. [1] [2]
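The arithmetic behind these numbers is easy to verify:

```python
from math import log10

keyspace = 2 ** 256            # possible AES-256 keys
grover   = 2 ** 128            # Grover's speedup: the square root of the keyspace
stars    = 10 ** 22            # rough star count in the observable universe

assert grover * grover == keyspace
print(f"2^128 ≈ 10^{log10(grover):.1f}")              # ≈ 10^38.5, i.e. about 3.4 × 10^38
print(f"times the star count: {grover / stars:.1e}")  # ≈ 3.4e+16
```
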
What Shor's Algorithm Does (and Why It Matters for RSA, Not AES)
Shor's algorithm is the quantum threat that actually keeps cryptographers awake. Unlike Grover's quadratic speedup, Shor's algorithm provides an exponential speedup for factoring large numbers and computing discrete logarithms. These are the specific mathematical problems that RSA and Elliptic Curve cryptography depend on.

A sufficiently powerful quantum computer running Shor's algorithm could factor the large numbers behind RSA-2048 into their prime components in hours instead of billions of years. It could solve the elliptic curve discrete logarithm problem that protects X25519. When this happens (and most researchers believe it will within 10 to 20 years), every asymmetric algorithm currently in widespread use will be broken.

AES is not vulnerable to Shor's algorithm. AES does not depend on factoring or discrete logarithms. It is a substitution-permutation network. Shor's algorithm simply does not apply to it. The only quantum attack on AES is Grover's brute-force speedup, and as shown above, AES-256 survives that comfortably. [3]
The Thermodynamic Argument: Why Physics Itself Protects AES-256 ✓ This is the part where encryption stops being a math problem and becomes a physics problem
In 2000, MIT professor Seth Lloyd published a paper in Nature titled "Ultimate Physical Limits to Computation." Lloyd calculated the maximum computational capacity of a physical system based on fundamental physics: the speed of light, the quantum scale (Planck's constant), and the gravitational constant. His theoretical "ultimate laptop" was a 1-kilogram computer confined to 1 liter of volume, operating at the absolute thermodynamic limit, converting all of its mass into computational energy via E=mc². [4]

This ultimate laptop could perform 10^51 operations per second and store 10^31 bits. It would, in Lloyd's words, be operating as a "black hole" since confining that much energy in that volume would collapse it past the Schwarzschild radius.

Now apply that to AES-256. Even after Grover's algorithm reduces the problem to 2^128 operations (about 3.4 × 10^38), Lloyd's ultimate laptop running at the thermodynamic limit of physics would need approximately 10^13 seconds to complete the search. That is roughly 300,000 years.

To be clear about what this means: a computer that converts its entire mass into energy, operates at the absolute physical limit permitted by quantum mechanics, and exists as a literal black hole would still need three hundred thousand years to brute-force AES-256 after Grover's speedup.

And that is one key. For one message. Against one user. With a computer that cannot actually be built.

A more recent estimate (2019, Kryptera) calculated that a quantum computer with more than 6,600 logical, error-corrected qubits would be needed to meaningfully attack AES-256. As of 2026, the largest quantum computers have roughly 1,000 physical qubits, and each logical qubit requires thousands of physical qubits for error correction. The gap between where we are and where we would need to be is not years. It is architectural generations. [5]

The NSA's own CNSA (Commercial National Security Algorithm) suite explicitly recommends AES-256 as quantum-resistant for protecting classified information. [6]
BOTTOM LINE: Quantum computers are a real and urgent threat to RSA and Elliptic Curve cryptography. They are not a meaningful threat to AES-256. If your data is encrypted with AES-256 (or ChaCha20 at equivalent key length), the quantum apocalypse does not apply to you. The post-quantum transition is about replacing the key exchange algorithms (the asymmetric part), not the data encryption algorithms (the symmetric part). This is exactly what Tuta Mail did in March 2024 when it deployed TutaCrypt: a hybrid protocol that replaces the vulnerable asymmetric layer with CRYSTALS-Kyber (quantum-resistant) while keeping AES-256 for the symmetric layer.

Part 3: How HTTPS Encryption Works (and What It Does Not Do)

Every time you see the padlock icon in your browser or a URL starting with https://, you are using encryption. Since this site is a progressive web app served over HTTPS, and since HTTPS is the most common encryption most people encounter daily, it is worth understanding what it actually protects.

What HTTPS Does
HTTPS stands for HTTP Secure. It wraps the normal web protocol (HTTP) in a layer of encryption called TLS (Transport Layer Security, the successor to SSL). When your browser connects to a website over HTTPS, a hybrid encryption handshake occurs:

1. Your browser and the server agree on a shared secret using asymmetric encryption (typically X25519 or ECDHE).
2. All subsequent data is encrypted with that shared secret using symmetric encryption (typically AES-256-GCM or ChaCha20-Poly1305).
3. The server proves its identity using a digital certificate signed by a Certificate Authority (CA) that your browser trusts.

The result: everything between your browser and the server is encrypted. The content of the pages you view, the data you submit in forms, the cookies your browser sends, all of it travels through an encrypted tunnel that your ISP, your Wi-Fi operator, and anyone else on the network path cannot read.
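You can watch all three steps happen locally with Python's standard `ssl` module. The sketch below is demo code under loose assumptions: a throwaway self-signed certificate, and verification disabled only because we issued that certificate ourselves (a real browser verifies the CA chain at that point). It prints the TLS version and the symmetric cipher the handshake negotiated.

```python
import datetime, os, socket, ssl, tempfile, threading
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# Throwaway self-signed certificate for localhost (demo only).
key = ec.generate_private_key(ec.SECP256R1())
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "localhost")])
now = datetime.datetime.now(datetime.timezone.utc)
cert = (x509.CertificateBuilder()
        .subject_name(name).issuer_name(name)
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=1))
        .sign(key, hashes.SHA256()))

tmp = tempfile.mkdtemp()
cert_path, key_path = os.path.join(tmp, "cert.pem"), os.path.join(tmp, "key.pem")
with open(cert_path, "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))
with open(key_path, "wb") as f:
    f.write(key.private_bytes(serialization.Encoding.PEM,
                              serialization.PrivateFormat.PKCS8,
                              serialization.NoEncryption()))

srv_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
srv_ctx.load_cert_chain(cert_path, key_path)
listener = socket.create_server(("127.0.0.1", 0))
port = listener.getsockname()[1]

def serve():
    conn, _ = listener.accept()
    with srv_ctx.wrap_socket(conn, server_side=True) as s:  # handshake happens here
        s.recv(16)

t = threading.Thread(target=serve)
t.start()

cli_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
cli_ctx.check_hostname = False       # demo only: we made this certificate ourselves
cli_ctx.verify_mode = ssl.CERT_NONE  # a real client verifies the CA signature instead
with cli_ctx.wrap_socket(socket.create_connection(("127.0.0.1", port))) as s:
    negotiated = (s.version(), s.cipher())
    s.sendall(b"ping")
t.join()
listener.close()

print(negotiated)  # e.g. ('TLSv1.3', ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256))
```

The asymmetric handshake (certificate plus key exchange) happens inside `wrap_socket`; everything after that, including the `b"ping"`, travels under the symmetric cipher reported by `cipher()`.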
What HTTPS Does Not Do ⚠ These are common misconceptions
HTTPS does not hide which website you are visiting. Your ISP can see the domain name (for example, orionprivate.com) because the DNS query that resolves the domain is typically unencrypted, and the TLS handshake itself includes the server name in plaintext. HTTPS hides the page you are on and the content you are viewing, but not the fact that you visited the site at all. To hide the domain, you need encrypted DNS (DoH or DoT) and Encrypted Client Hello (ECH).

HTTPS does not protect data after it reaches the server. TLS encryption covers data in transit. Once the data arrives at the server, it is decrypted and the server can read it. If you submit a form over HTTPS, the website operator can see exactly what you typed. This is client-to-server encryption, not end-to-end encryption. End-to-end encryption (like Signal or the PGP form on this site) means only the intended recipient can decrypt the data, not even the server that carries it.

HTTPS does not mean the website is trustworthy. A padlock means the connection is encrypted. It says nothing about who is on the other end. Phishing sites use HTTPS too.

Part 4: Where Encryption Stops Protecting You

This is the most important section in this entire post. Encryption is extraordinary technology. The math is sound. AES-256 will not be broken in your lifetime, your children's lifetime, or the lifetime of the sun. But encryption only protects what it covers. And the places where it does not cover you are exactly the places where real-world surveillance happens.

Metadata Is Not Encrypted ⚠ This is the single most important concept in practical privacy
End-to-end encryption protects the content of your message. It does not protect the fact that you sent it, who you sent it to, when you sent it, how often you communicate, from what device, or from what location. This information, called metadata, is available to the service provider and, under legal process, to law enforcement.

As we documented in our messaging comparison, the FBI's own "Lawful Access" document shows that WhatsApp provides pen register metadata (source and destination of every message) every 15 minutes to law enforcement. The message content is encrypted. The metadata is not. And as the Natalie Edwards case proved, metadata alone was sufficient to identify and convict a federal employee. [7]

Former NSA Director Michael Hayden put it more bluntly: "We kill people based on metadata."
A Compromised Device Makes Encryption Irrelevant ⚠ CRITICAL
Encryption protects data in transit (between devices) and at rest (on disk). It does not protect data in use (on your screen, in your app's memory, after decryption). If your device is compromised by spyware like NSO Group's Pegasus, the attacker reads your messages after they are decrypted, directly from the screen and memory of your phone. It does not matter if you are using Signal with AES-256 and X25519 and disappearing messages. Pegasus reads the decrypted text, records your screen, activates your microphone, and exfiltrates everything. Google's Project Zero called one Pegasus exploit "the most technically sophisticated exploit we've ever seen." [8]

Similarly, forensic extraction tools like Cellebrite and GrayKey, used by law enforcement worldwide, can bypass device locks and extract data from phones, including messages from encrypted apps, if they gain physical access to the device. [9]

The encryption did not fail. The device did.
Your Phone Takes Screenshots of Everything ⚠ MED: most people do not know this happens
Both iOS and Android take automatic screenshots of your apps for the app switcher (the screen you see when you swipe up to switch between apps). These preview images are stored locally on the device. On iOS, these snapshots are taken every time you leave an app or switch to another one. They capture whatever was on screen at that moment, including the contents of encrypted messaging apps, passwords visible in a password manager, sensitive documents, and anything else displayed.

These snapshots are stored in the device's filesystem and can be extracted by forensic tools during a physical examination of the phone. Your encrypted Signal message that disappears after 30 seconds may live on indefinitely as a screenshot in the app switcher cache.

Signal addresses this with its "Screen Security" setting (Settings → Privacy → Screen Security on Android; Screen Lock on iOS), which blocks the operating system from capturing these previews. When enabled, Signal appears as a blank screen in the app switcher. Other messaging apps do not offer this protection. If you are using WhatsApp, iMessage, or Telegram and care about forensic extraction, there is no built-in mitigation for this.

For iPhone users facing elevated threat levels, Apple's Lockdown Mode (iOS 16+) reduces the attack surface significantly by disabling features commonly exploited by spyware, including certain iMessage attachment types, FaceTime from unknown callers, and shared albums. It is the single most effective step an iPhone user can take against targeted surveillance short of switching to a dedicated security-focused operating system.
Cloud Backups Undo Encryption
As covered in our messaging comparison: if your encrypted iMessages are backed up to iCloud without Advanced Data Protection enabled, Apple stores the messages with the encryption key and hands both over under a search warrant. If your WhatsApp messages are backed up to Google Drive or iCloud without encrypted backup enabled, the content is stored unencrypted in the cloud.

The encryption worked perfectly. The backup defeated it. This is not a flaw in the encryption. It is a flaw in the system design, and in the defaults that most users never change.
Operational Security (OPSEC) Is the Real Perimeter
Encryption is a tool. Operational security is the discipline of using that tool correctly within the full context of your threat model. The best encryption in the world does not help if:

You use a personal recovery email on your encrypted mail account (the Catalan activist case: Proton Mail's encryption held, but the recovery email led to identification via Apple). [10]

You pay for a privacy service with a credit card (the Stop Cop City case: the FBI identified a Proton Mail user solely through payment metadata). [11]

You carry a phone full of ad-supported apps with location permissions enabled while the FBI is purchasing location data from commercial brokers (confirmed by FBI Director Kash Patel in Senate testimony, March 18, 2026). [12]

You discuss sensitive topics in a group chat where one participant has iCloud backup enabled. Your encryption is only as strong as the weakest device in the conversation.

Encryption protects the pipe. OPSEC protects everything around it. The adversaries who matter do not attack the math. They attack the human.

Quick Reference: What to Remember

Symmetric encryption (AES-256, ChaCha20):
Same key encrypts and decrypts. Fast. Used for the actual data. AES-256 is quantum-resistant. The NSA recommends it for classified data. A computer operating at the thermodynamic limits of physics would need 300,000 years to brute-force it even after Grover's quantum speedup.
Asymmetric encryption (RSA, ECC/X25519):
Two keys: public encrypts, private decrypts. Slow. Used for key exchange and digital signatures. Vulnerable to quantum computers running Shor's algorithm. Will need to be replaced with post-quantum algorithms (CRYSTALS-Kyber, CRYSTALS-Dilithium) within the next 10 to 20 years. The "harvest now, decrypt later" threat is real: data encrypted with RSA today could be stored and decrypted once quantum computers are available.
Hybrid encryption:
Every real-world system uses both together. Asymmetric for the handshake, symmetric for the data. HTTPS, Signal, WhatsApp, VPN tunnels, and Tuta Mail all work this way. Post-quantum migration means replacing the asymmetric handshake layer while keeping the symmetric data layer.
HTTPS:
Encrypts traffic between your browser and the server. Protects content in transit. Does not hide which domain you visit (use encrypted DNS for that). Does not protect data after it reaches the server (that requires end-to-end encryption). Does not mean the website is trustworthy.
Where encryption stops:
Metadata (who, when, how often) is not encrypted by most services. Compromised devices read messages after decryption. App switcher screenshots capture encrypted content. Cloud backups store encrypted messages unencrypted. Operational security failures (recovery emails, payment data, location tracking) expose you through channels encryption was never designed to protect. The math is unbreakable. The human is not. Plan accordingly.

References

[1] Grover, L.K., "A fast quantum mechanical algorithm for database search." Proceedings of the 28th Annual ACM Symposium on the Theory of Computing, 1996.
[2] Fortinet, "Understanding Shor's and Grover's Algorithms." | fortinet.com
[3] Grassl, M. et al., "Applying Grover's algorithm to AES: quantum resource estimates." arXiv:1512.04965, 2015. | arxiv.org
[4] Lloyd, S., "Ultimate physical limits to computation." Nature 406, 1047-1054 (2000). MIT Department of Mechanical Engineering. | nature.com
[5] Kryptera, research paper on quantum resource requirements for AES-256, 2019. Cited in Fierce Electronics.
[6] NSA, Commercial National Security Algorithm (CNSA) Suite. AES-256 recommended for classified data protection.
[7] FBI, "Lawful Access" document, Jan. 7, 2021. Obtained via FOIA by Property of the People. Published by Rolling Stone, Nov. 29, 2021.
[8] Google Project Zero, Beer, I. and Groß, S., "A deep dive into an NSO zero-click iMessage exploit." Dec. 2021. | securityweek.com
[9] EFF / Digital Rights Bytes, "Can the Government Read My Text Messages?" | digitalrightsbytes.org
[10] TechCrunch, "Encrypted Services Apple, Proton and Wire Helped Spanish Police Identify Activist." May 2024. | techcrunch.com
[11] 404 Media, "Proton Mail Helped FBI Unmask Anonymous 'Stop Cop City' Protester." March 2026. | 404media.co
[12] NPR, "Your data is everywhere. The government is buying it without a warrant." March 25, 2026. | npr.org
[13] PostQuantum.com, "Grover's Algorithm and Its Impact on Cybersecurity." | postquantum.com
Policy Commentary

The FTC was never enough: How America's last privacy enforcer was designed to fail, then gutted on purpose

The Federal Trade Commission has been called America's "de facto privacy regulator." For two decades, its consent decrees against Google and Facebook were the closest thing Americans had to enforceable privacy law. In 2025, the President fired the Democratic commissioners, deleted 300 blog posts of enforcement guidance, and the remaining commissioners began reopening and setting aside consent orders. What happened is not a surprise. The institutional design of the FTC made this outcome structurally inevitable. The academic work of Chris Jay Hoofnagle, spanning three papers from 2012 to 2018, diagnosed every failure point with extraordinary precision. This article traces his analysis through the institutional collapse now underway and asks the question his work leaves open: if the institution designed to protect your privacy was never equipped to do so, and is now being actively dismantled, what remains?

D David
February 2026
Analysis
~20 min read
The United States does not have a federal privacy law. The EU has the GDPR, built on an explicit fundamental rights framework with independent enforcement authorities. The US has a patchwork: HIPAA for health, FERPA for education, COPPA for children, and for everything else, the Federal Trade Commission.

The FTC's authority over privacy comes from Section 5 of the FTC Act, which prohibits "unfair or deceptive acts or practices." That is it. The agency has no dedicated privacy mandate, no general power to levy civil penalties for privacy violations, and no rulemaking authority specific to data protection. Its privacy role is not written into law. It is inferred from a statute about commercial fairness that was never designed to regulate the collection, processing, or sale of personal data.

For over two decades, the FTC made this work through what can only be described as institutional improvisation. Staff attorneys identified privacy cases, framed them as deceptive practices under Section 5, and settled them through consent decrees imposing 20-year behavioral requirements on companies like Google, Facebook, and Twitter. It worked, but it was always fragile. The whole system depended on staff attorneys willing to push boundaries, a Bureau of Economics that would go along with it, and five commissioners who were politically aligned enough to vote for enforcement.

In 2025, all three of those dependencies failed at the same time.

Part 1: The Three-Layer Diagnosis

Professor Chris Jay Hoofnagle of UC Berkeley has produced what may be the most complete structural critique of U.S. privacy enforcement. Across three papers spanning six years, he identified three distinct failure modes that, taken together, explain why the FTC was never equipped to protect consumer privacy at scale, and why the current collapse was predictable.

Layer 1: The Industry Frames Privacy Protection as "Paternalism" (2012)
In "Post Privacy's Paternalism" (2012), Hoofnagle pinpointed the rhetorical strategy that Silicon Valley uses to resist privacy regulation: the argument that we live in a "post-privacy world" where controls on personal information are unnecessary and even harmful to human freedom, and that imposing privacy rules on businesses is therefore paternalistic. [1]

Hoofnagle's response was precise. The proponents of this argument, he observed, "often hail from the most socially liberal companies in the world. As employees, their situation is rare, because they are generally highly sought-after and well-compensated white men." They do not face drug testing, time clocks, or transparent backpack inspections. "In this world, the prospect of facing retaliation for acts of deviance (short of violence) is improbable." Meanwhile, teachers were being fired for posting vacation photos with alcohol on Facebook.

The "paternalism" frame served a dual purpose. It positioned privacy regulation as an obstacle to freedom while obscuring the fact that the companies themselves were the real paternalists. For example, Facebook unilaterally altered information flows through Newsfeed, Beacon, and Instant Personalization; while Google banned pseudonyms on Google Plus to leverage real names for advertising and payment networks. Both companies bounded the amount of privacy their users could have. When these companies called regulation "paternalistic," Hoofnagle argued, they were recognizing their own behavior. As Tacitus recorded Cremutius Cordus observing: "when you resent a thing, you seem to recognize it."

Hoofnagle's original survey research, conducted via telephone with no corporate sponsorship, found that 92% of Americans wanted a right to delete personal data, 68% wanted a right of access, and 69% chose the highest monetary penalty offered ($2,500) for privacy violations. The "consumers don't care about privacy" narrative was not supported by evidence. It was a strategy.
Layer 2: Platforms Sell Data Through Developers and Call It Something Else (2018)
In "Facebook and Google Are the New Data Brokers" (2018), Hoofnagle dismantled the central claim of both companies: that they do not sell user data. His argument was structural, not semantic. Google and Facebook pay developers with data. A developer who builds a trivial app on Facebook's platform receives access to user profiles, friend lists, and photos. A developer who integrates "Sign in with Google" receives an API that delivers name, email, and profile photo of every user who authenticates. These are not gifts. They are payments. A "sale" under contract law only requires a transfer of value, and barter qualifies. [2]

This is how Cambridge Analytica's Aleksandr Kogan accessed tens of millions of Facebook profiles: by building a survey app that gave Facebook a small amount of user engagement, in exchange for which Facebook gave Kogan access to its Open Graph. If Kogan had tried to buy the same data on the open market from traditional data brokers, the lower-bound cost would have been thousands of dollars. The comparative cost gives us the "price" Facebook paid Kogan for user engagement.

The developer-platform relationship also explains why oversight was absent. Katherine Losse, an early Facebook employee, wrote in The Boy Kings that Zuckerberg treated developers as trusted insiders, "look[ing] away from the fact that almost all of Facebook users' data was available to them through the platform." The Wall Street Journal later reported that Google could not even determine what happened to platform data exposed to developers because it lacked audit rights. Google never called or visited any of its developers to verify data handling. Yet when Google's own intellectual property was at stake (the Uber autonomous driving theft case), the company could identify specific computers used, who used them, and what data was taken. The asymmetry is telling: Google's forensics are detailed when protecting its own assets and nonexistent when protecting yours.

To mask the third-party nature of these relationships, both companies employed what Hoofnagle called "definitional guile." Facebook sometimes called developers "service providers." Both companies used "partner" in expansive, misleading ways incongruent with the legal definition of partnership. In Google's parlance, "partner" meant an arms-length third party. In law, a partner has shared risk, shared reward, and fiduciary duty. Under Facebook's usage, you could call the dog walker a "partner."
Layer 3: Inside the FTC, Economists Systematically Zero Out Privacy Harm (2017)
In "The Federal Trade Commission's Inner Privacy Struggle" (2017), Hoofnagle revealed the institutional dynamic that most privacy advocates overlooked entirely. Inside the FTC, every privacy and security case is assigned to a consumer protection economist from the Bureau of Economics (BE). While the Bureau of Consumer Protection (BCP) attorneys select cases and push for expanded privacy norms through enforcement, the BE economists evaluate those cases and recommend remedies. And the BE systematically concludes that privacy violations cause no measurable harm, driving monetary penalties to zero. [3]

Hoofnagle identified several reasons for this. First, the BE does not believe there is a functioning market for privacy. Without a market price, there is no basis for calculating consumer injury from a privacy violation. When Google was fined $22.5 million for tracking Safari users despite promising not to (an intentional deception while already under a consent decree), a BE analyst could view the fine as disproportionately high because no consumer paid money for the privacy feature that was violated. That reasoning justified a relatively small fine, since "any price on the tracking would be speculative."

Second, the BE's economics of privacy was shaped by a narrow, conservative disciplinary lens. Hoofnagle obtained the BE's training materials and literature review through a Freedom of Information Act request and found that the reading list had "a distinctly laissez faire bent, with the first paper listing the product of an industry think tank supported by five- and six-figure donations from telecommunications companies and Silicon Valley firms." The diverse array of traditional and empirical economic work exploring welfare gains from privacy protection was absent.

Third, and most structurally significant: the BCP lawyers learned to route around the BE by framing privacy cases as deception rather than unfairness. "Deception cases receive much less economic attention," Hoofnagle observed. The BE cannot block a deception case the way it can challenge an unfairness case. This created a dynamic where the FTC's privacy enforcement was built almost entirely on deception theory, avoiding the cost-benefit analysis that the BE's unfairness framework would require. This worked as long as the lawyers controlled case selection. It became a critical vulnerability the moment they did not.

Part 2: The Prediction and the Reality

Hoofnagle's 2017 paper concluded with a prediction: "We should expect President Donald Trump's administration to expand the role of the BE and to make its role more public. With newfound powers, the BE will argue that more cases should be pled under the unfairness theory. This will have the effect of blunting the lawyers' attempts to expand privacy rights through case enforcement." He was right. But what happened in 2025 went further than even his analysis anticipated.

What Hoofnagle Predicted
The prediction was that the Bureau of Economics would gain influence over case selection (not just evaluation), that the unfairness framework (which requires proof of "substantial injury" and invites cost-benefit analysis) would become the dominant theory, and that this would make it harder for BCP attorneys to bring privacy cases because the BE would require demonstrated consumer harm that its own methods were designed to find nonexistent. The lawyers' strategy of clothing unfairness cases in deception garb would become harder to sustain. Privacy enforcement would contract not because the law changed, but because the internal power balance shifted.
What Actually Happened: The 2025 Restructuring ⚠ This goes beyond what Hoofnagle predicted
In January 2025, Andrew Ferguson became FTC Chairman. In a leaked two-page memo to the incoming Trump administration, Ferguson had emphasized repealing "burdensome regulations," ending "novel and legally dubious consumer protection cases," and stopping the FTC's attempts to regulate AI. He positioned himself as a "Trump-aligned Chairman" willing to terminate uncooperative bureaucrats. [4]

In February, Ferguson appointed Christopher Mufarrige as director of the Bureau of Consumer Protection, signaling a move away from the expansive use of Section 5 unfairness that had characterized the Khan-era FTC. In March, President Trump fired the two Democratic commissioners, Alvaro Bedoya and Rebecca Kelly Slaughter, leaving the FTC with only two members, both Republicans. Bedoya wrote: "The President just illegally fired me. Now, the President wants the FTC to be a lapdog for his golfing buddies." Both filed lawsuits challenging their removal. [5]

The same day the firings were announced, the FTC removed more than 300 blog posts published during the Biden administration, including several focused on consumer protection, AI compliance, and privacy enforcement guidance. No formal explanation was given. [6]

In December 2025, the FTC reopened and set aside its consent order against Rytr LLC, a generative AI company. The order had been issued under former Chair Lina Khan over the dissent of then-Commissioner Ferguson and Commissioner Holyoak. The set-aside was unusual: it was initiated by the FTC itself, not by the respondent company. The commission cited the Trump administration's AI Action Plan and concluded that the original complaint "failed to satisfy the legal requirements of the FTC Act." Mufarrige stated: "Condemning a technology or service simply because it potentially could be used in a problematic manner is inconsistent with the law and ordered liberty." [7]

The Rytr set-aside is significant not for what it says about AI but for what it signals about the consent decree framework itself. If the FTC can reopen and vacate a finalized consent order at the direction of the White House, the entire enforcement model that Hoofnagle described (attorneys selecting cases, BE economists evaluating them, consent decrees imposing 20-year behavioral requirements) becomes contingent on political alignment rather than legal process.
Beyond Prediction: Institutional Capture
Hoofnagle predicted that the BE would gain power within the existing FTC structure, making privacy enforcement harder through the normal internal dynamics he documented. What happened instead was a restructuring of the institution itself: the removal of dissenting commissioners, the deletion of enforcement guidance, and the reopening of consent decrees at White House direction. This is not the BE winning an internal debate. This is the elimination of the debate entirely.

The distinction matters. In Hoofnagle's model, the BCP lawyers could still select cases, still frame them as deception, and still push for expanded privacy norms even against BE resistance. The system produced imperfect outcomes, but it produced outcomes. The lawyers brought over 150 privacy and security cases. They extracted consent decrees from Google, Facebook, Twitter, and Microsoft. The $5 billion Facebook settlement in 2019 was the largest privacy penalty in FTC history. Those outcomes were possible because the institutional machinery, however flawed, allowed attorneys to act.

The 2025 restructuring removed the conditions that made those outcomes possible. With only two commissioners (both Republicans appointed under the current administration), no case can proceed without unanimous agreement. With the Bureau of Consumer Protection under new leadership explicitly committed to "targeting fraud and tangible consumer harm" rather than "speculative harms," the case selection function that Hoofnagle identified as the lawyers' primary power has been narrowed. With consent decrees subject to White House review and potential set-aside, the enforcement tool itself is no longer durable.

Part 3: Three Vulnerabilities of the Consent Decree Model

The FTC's reliance on consent decrees was understandable given its limited toolkit. But as Hoofnagle's work and the 2025 events demonstrate, the model had three fundamental vulnerabilities that made it unsuitable as a long-term framework for privacy protection.

Consent Decrees Do Not Create Precedent
Because consent decrees are settlements, they do not produce judicial opinions defining the FTC's authority. As Hoofnagle noted, "Few companies challenge the FTC in court, and so there are few modern court opinions defining the agency's authority. Instead, the wrongness of the corporate act is simply declared in a complaint and consent decree agreed to by the respondent company." [3] This meant the FTC's privacy authority was never tested in court, never clarified by judges, and never hardened into doctrine. Van Eijk, Hoofnagle, and Kannekens observed in their comparative analysis with EU unfair commercial practices law that "the contours of the FTC Act are unclear, because Congress gave the agency a broad mandate, but also because the FTC typically does not explain why a practice is deceptive or unfair in great detail." [8] This ambiguity was an asset when the agency was expanding its privacy reach (flexibility allowed the attorneys to bring novel cases) and a liability the moment political leadership decided to contract it (there was no binding precedent to prevent the contraction).
Consent Decrees Depend on Continuous Institutional Will
A consent decree is a contract between the FTC and the respondent. To enforce it, the FTC must bring suit. The burden is on the FTC to show noncompliance. In some circuits, the FTC must show "substantial noncompliance," not just a technical violation. This enforcement model requires active, sustained institutional commitment over the 20-year life of the decree. When Google violated its consent decree by tracking Safari users, the resulting $22.5 million fine was so small relative to Google's revenue that Hoofnagle argued it "created incentives for bad behavior by setting such a low penalty for intentional misbehavior." If the institution lacks the will or the resources to enforce, consent decrees become pieces of paper.

The Rytr set-aside demonstrates something more concerning: the FTC can not only fail to enforce consent decrees but can actively undo them. If this precedent extends to privacy-specific orders (the AI Action Plan directive requires review of "all FTC final orders, consent decrees, and injunctions"), the entire body of FTC privacy enforcement built over two decades could be subject to retroactive dissolution.
Consent Decrees Arrive After the Damage Is Done
Hoofnagle identified this timing problem as particularly acute in the platform economy. "Remedies are unlikely to be effective by the time the FTC gets involved, investigates a case, and litigates it. The delay involved in FTC processes gives respondents time to establish their platform and shut out competitors." [3] Both Facebook and Google used what Hoofnagle described as a form of bait and switch: attracting users with promises of privacy, then relaxing those features once network effects made defection impractical. By the time the FTC intervened, the platform was already dominant, the competitors were already gone, and the consent decree addressed yesterday's harm while today's data practices continued to evolve.

Part 4: What Remains When Institutions Fail

Hoofnagle's work describes a system where the FTC's internal structure (the BE zeroing out privacy harm), its external constraints (no general privacy statute, no civil penalty authority), and the industry's rhetorical strategies ("post-privacy," "paternalism," "we don't sell data") combined to produce enforcement that was better than nothing but structurally inadequate. The 2025 events did not create these weaknesses. They exploited them. The question that follows is practical: if the institutional framework was always insufficient and is now being actively dismantled, what is left?

State Enforcement Is Filling the Gap (Partially)
California, Texas, and other states have stepped into the enforcement vacuum. California's CCPA/CPRA provides a statutory definition of "sale" that includes barter, consistent with the definition Hoofnagle argued for in 2018. Texas has brought enforcement actions against companies for selling driving data without consent. These state actions are meaningful but fragmented, jurisdiction-limited, and subject to federal preemption if Congress ever passes a federal privacy law with a preemption clause (as several proposed bills have included). [9]
Legislative Action Remains Stalled
The Government Surveillance Reform Act (introduced March 13, 2026) would close the data broker loophole by requiring warrants for government data purchases. The Fourth Amendment Is Not For Sale Act has bipartisan support. Neither has passed. The FISA Section 702 reauthorization deadline (April 20, 2026) is the best legislative vehicle, but the White House is pushing for a clean reauthorization with no amendments. No comprehensive federal privacy statute is on the horizon. [10]
Architectural Protection: The Only Layer That Does Not Depend on Institutional Will
Hoofnagle's 2017 paper concluded with a call for the BE to "foster a market for privacy" and for academics to document privacy markets so the BE could assign value to privacy violations. This was the right prescription for an institution that was undervaluing privacy but still functioning. It is the wrong prescription for an institution that is being dismantled.

What remains when the regulatory institution fails is the architecture of the tools themselves. End-to-end encryption cannot be subpoenaed if the provider does not hold the keys. Metadata that is never collected cannot be handed over under a court order or sold to a data broker. On-device processing means the server never sees the data. Open-source code means the architecture can be verified. These are not policy preferences. They are engineering decisions that produce privacy outcomes independent of any regulatory framework.

This is why the distinction between a service that claims not to sell your data (a policy that can change, as Facebook's has, repeatedly) and a service that cannot access your data (an architecture that holds regardless of who is in office) is not academic. It is the difference between privacy that depends on institutional goodwill and privacy that depends on mathematics. The FTC's consent decree with Facebook did not prevent Facebook from altering its privacy settings in 2009. Ente's encryption architecture prevents Ente from seeing your photos regardless of what any government demands. Signal's protocol prevents Signal from producing message content regardless of what court order is served. These are not the same kind of protection. One is a promise. The other is a constraint.

Conclusion: Promises vs. Constraints

Hoofnagle's body of work provides the intellectual framework for understanding why American privacy enforcement was always fragile and why the current moment, while alarming, was structurally inevitable. His analysis should not be read as a counsel of despair. It should be read as a diagnosis that points toward a specific kind of remedy.

The institutional lesson
The FTC's privacy enforcement was a remarkable improvisation: dedicated staff attorneys working inside an institution that was never designed for the task, evaluated by economists trained to find no harm, limited to a toolkit with no civil penalties, and dependent on political leadership that turned over every four to eight years.

That this system produced as much as it did, over 150 cases, the $5 billion Facebook settlement, consent decrees against every major platform, is a testament to the people who ran it. That it could be dismantled so quickly is a testament to the structure they were running inside.

Institutions built on discretion can be redirected by new leadership. Institutions built on inferred authority can have that authority reinterpreted. Enforcement frameworks that never produced binding precedent can be contracted without overturning a single court decision.

The lesson is not that institutions do not matter. They do. The lesson is that privacy protection built entirely on institutional discretion is privacy protection built on sand.
The architectural response
None of this diminishes the importance of legislative reform, FTC funding, or the restoration of bipartisan commission membership. All of those things should happen. But the events of 2025 and 2026 have demonstrated that individuals cannot rely exclusively on any of them. Privacy tools that are architecturally private (end-to-end encrypted, zero-knowledge, open source, on-device processing) provide a form of protection that does not depend on who chairs the FTC, whether the Bureau of Economics considers privacy harm "speculative," whether Congress passes a comprehensive privacy statute, or whether a consent decree survives the next administration. This is not a substitute for institutional reform. It is a necessary complement to it. And in the current moment, where the institutional framework is at its weakest point in two decades, it may be the only layer of protection that is reliably operative.

Hoofnagle wrote in 2012 that "left to the market, advertisers will code privacy into oblivion because they see individuals as objects." [1] He was right. The counter-move is to code privacy into existence: to build and use tools where the architecture itself enforces the protections that institutions have failed to provide. The math does not change with the administration. The encryption does not expire with a political appointment. The metadata that was never collected cannot be purchased by a data broker, regardless of what the Bureau of Economics concludes about the value of your privacy.

References

  1. [1] Hoofnagle, C.J., "Post Privacy's Paternalism." European Data Protection Law Review (EDPL), 2012. SSRN: 2468322.
  2. [2] Hoofnagle, C.J., "Facebook and Google Are the New Data Brokers." 2018. SSRN: 2901526. | ssrn.com
  3. [3] Hoofnagle, C.J., "The Federal Trade Commission's Inner Privacy Struggle." In The Cambridge Handbook of Consumer Privacy (Cambridge University Press). SSRN: 2901526.
  4. [4] Alston and Bird, "First 100 Days: Federal Privacy and Cybersecurity Regulation and Enforcement Under the Second Trump Administration." May 2025. | alston.com
  5. [5] CNN, "Trump fires Democratic FTC commissioners." March 19, 2025. PBS NewsHour interview with Commissioner Bedoya, March 28, 2025. | cnn.com
  6. [6] Goodwin, "Trump 2.0 Tech Policy Rundown: 100 Days In." May 2025. Noting deletion of 300+ blog posts. | goodwinlaw.com
  7. [7] FTC, "FTC Reopens and Sets Aside Rytr Final Order in Response to the Trump Administration's AI Action Plan." Dec. 22, 2025. | ftc.gov
  8. [8] Van Eijk, N., Hoofnagle, C.J., and Kannekens, E., "Unfair Commercial Practices: A Complementary Approach to Privacy Protection." European Data Protection Law Review (EDPL) 3/2017.
  9. [9] WilmerHale, "Year in Review: The Top Ten US Data Privacy Developments from 2025." Jan. 2026. Perkins Coie, "Privacy Law Recap 2025: FTC Enforcement." Jan. 2026. | wilmerhale.com
  10. [10] NPR, "Your data is everywhere. The government is buying it without a warrant." March 25, 2026. Government Surveillance Reform Act, introduced March 13, 2026.
  11. [11] The Record, "Trump admin's removal of Democratic FTC commissioners could shift its privacy efforts." March 19, 2025. | therecord.media
  12. [12] Perkins Coie, "Privacy Law Recap 2025: FTC Enforcement." Jan. 2026. Noting FTC under Ferguson "moved away from novel theories under Section 5." | perkinscoie.com
  13. [13] Paul Weiss, "2025 Year in Review: Cybersecurity and Data Protection." Noting FTC finalized order against Mobilewalla (Jan. 2025). | paulweiss.com
  14. [14] Reed Smith, "Rewriting Rytr: The FTC sets aside a Final Order." Dec. 2025. Noting AI Action Plan requires review of "all FTC final orders, consent decrees, and injunctions." | reedsmith.com
Investigative Report

The FBI director used Gmail. Iranian hackers read his email for years.

On March 27, 2026, an Iranian state-linked hacking group published over 300 emails, personal photographs, and documents from the personal Gmail account of FBI Director Kash Patel. The FBI confirmed the breach. The emails were not obtained by breaking Google's encryption or exploiting a zero-day vulnerability. They were obtained because the director of the Federal Bureau of Investigation kept a personal Gmail account that had been exposed in prior data breaches, and apparently did not take the steps necessary to prevent reuse of those credentials. This is not a story about a sophisticated cyberattack. It is a story about the gap between what people assume their email protects and what it actually does.

D David
March 28, 2026
Investigative Report
~10 min read
Last verified: March 28, 2026
The head of the FBI had his personal email hacked. Not his classified systems. Not the Bureau's internal network. His personal Gmail. The account contained emails dating from 2010 to 2022, personal photos, travel receipts, family correspondence, apartment leasing inquiries, tax-related messages, and at least one email from 2014 in which Patel appears to have used his Department of Justice email to send himself a link, copying both his FBI address and his personal Gmail. The breach was claimed by Handala Hack Team, an Iranian state-linked group that the U.S. Department of Justice has formally accused of operating on behalf of Iran's Ministry of Intelligence and Security (MOIS). Cybersecurity researchers believe the compromise likely predates 2026 and that the hackers sat on the material, waiting for a strategically useful moment to release it. The Gmail address Handala claimed to have accessed matches an address linked to Patel in previous data breaches cataloged by the dark web intelligence firm District 4 Labs. This article examines what happened, how it likely happened, why Gmail's architecture made it possible, and what the incident means for anyone who still uses a mainstream email provider for anything they would not want published on the internet.

What Happened

The Breach
On March 27, 2026, the Handala Hack Team published a cache of files on its website claiming to have breached the personal email of FBI Director Kash Patel. The release included personal photographs of Patel, a resume, and a sample of over 300 emails spanning 2010 to 2022. The emails contained personal, family, business, and travel correspondence, including flight and hotel receipts, apartment leasing inquiries in Washington D.C., tax-related messages, and photos exchanged with family members. [1] [2]
Confirmation
The FBI confirmed the breach in an official statement: "The FBI is aware of malicious actors targeting Director Patel's personal email information, and we have taken all necessary steps to mitigate potential risks associated with this activity. The information in question is historical in nature and involves no government information." TechCrunch independently verified that at least some of the leaked emails were from Patel's Gmail account by confirming information contained in the message headers. CNN confirmed the authenticity of the images with a source familiar with the incident. [2] [3]
The Cross-Contamination ⚠ HIGH - this is the detail that matters most for operational security
NBC News reported that in one email from 2014, when Patel worked in the Department of Justice's National Security Division, he appears to have used his official DOJ email to send himself a link, copying both his FBI address and his personal Gmail. This means a single compromised Gmail account could expose not only personal correspondence but also reveal official government email addresses and demonstrate the habit of bridging personal and professional communications through a consumer email service with no end-to-end encryption. [4]
Who Is Handala
Handala Hack Team presents itself as a pro-Palestinian vigilante hacking collective. Western cybersecurity researchers and the U.S. Department of Justice consider the group to be a front for Iran's Ministry of Intelligence and Security (MOIS). The DOJ formally accused MOIS of operating the Handala group. Earlier in March 2026, the group claimed responsibility for a destructive cyberattack against medical device company Stryker that reportedly wiped tens of thousands of employee devices. The FBI responded by seizing several Handala-operated domains on March 19, 2026. Handala stated the Patel release was retaliation for that seizure. [5] [6]
The Timing
Cybersecurity researchers believe the actual compromise occurred well before the March 2026 publication. Alex Orleans, head of threat intelligence at Sublime Security, told NBC News that the release appeared to be material the group had been holding. "Looks like something they had sitting around. Iranian actors sit on all kinds of odds and ends for a rainy day." The metadata from the leaked files indicates they were last organized on May 21, 2025. Patel was informed in late 2024, before becoming FBI director, that he had been targeted by Iranian hackers and that some of his personal communications had been accessed. That earlier incident was part of a broader campaign targeting incoming Trump officials including Deputy Attorney General Todd Blanche and Donald Trump Jr. [3] [4]

How It Likely Happened

No technical details about the method of compromise have been officially confirmed. But the available evidence points clearly in one direction, and it is not a sophisticated one.

The Gmail Address Was in Prior Data Breaches ⚠ CRITICAL - this is almost certainly the entry point
Reuters reported that the personal Gmail address Handala claims to have breached matches an address linked to Patel in previous data breaches cataloged by the dark web intelligence firm District 4 Labs. This means Patel's email address and potentially an associated password were already circulating in breach databases available on criminal marketplaces. [1]

The most probable attack vector is credential stuffing. The hackers obtained Patel's email and password from an older breach, tested it against Gmail, and found it still worked, either because the password had not been changed or because a similar password was in use. This is not speculative. It is the single most common method by which Gmail accounts are compromised, and it is exactly how John Podesta's Gmail was accessed in 2016.
No Evidence of a Gmail Infrastructure Breach
Google's servers were not compromised. This was not a failure of Gmail's infrastructure. It was a failure of account-level security on a consumer email service. The distinction matters: Gmail encrypts data in transit (TLS) and at rest on its servers, but Google holds the decryption keys. If an attacker obtains valid credentials, they have the same access Google does. They can read every email, download every attachment, and browse every conversation, because Gmail is not end-to-end encrypted. The attacker does not need to break any encryption. They log in as the user.
The Likely Kill Chain
Step 1: Obtain the target's email address and associated password from a prior breach database. These are commercially available on dark web marketplaces at embarrassingly low cost.

Step 2: Test the credentials against Gmail. If the password was reused or never changed (which is often the case), the login succeeds. If two-factor authentication is not enabled, the account is wide open.

Step 3: Access the account, download the full email archive using Google Takeout or IMAP, and exfiltrate everything. Google's own data export tools make this trivial once you are authenticated.

Step 4: Encrypt and store the data. Wait for a strategically useful moment. Release it for maximum impact.
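The kill chain above can be reduced to a toy simulation. Everything in this sketch is invented for illustration (the addresses, passwords, and the `stuff` helper are mine, not from the incident); the point is how a single reused password cascades from an old breach into a live account.

```python
# Toy model of credential stuffing: replay breached email/password pairs
# against live logins and keep the ones that still work.

breach_dump = {                                  # step 1: bought from a marketplace
    "target@example.com": "Summer2014!",
    "other@example.com": "hunter2",
}

live_credentials = {                             # provider-side logins
    ("target@example.com", "Summer2014!"),       # password never rotated
    ("other@example.com", "correct-horse-battery-staple"),  # rotated: safe
}

def stuff(dump: dict[str, str], live: set) -> list[str]:
    """Step 2: test each breached pair; return accounts that still open."""
    return [addr for addr, pw in dump.items() if (addr, pw) in live]

print(stuff(breach_dump, live_credentials))      # → ['target@example.com']
```

Only the account whose owner never rotated the breached password falls. Rotation, or never reusing passwords in the first place, breaks the chain at step 2.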

Reuters described the breach as "relatively unsophisticated," consistent with the type of low-level hack that U.S. intelligence assessments predicted from Iranian proxies. Check Point's Gil Messing described the operation as part of Iran's strategy to embarrass U.S. officials. [1]
KEY POINT The director of the FBI was not hacked by breaking encryption, exploiting a zero-day vulnerability, or compromising Google's infrastructure. He was hacked because he had a consumer Gmail account with credentials that appeared in prior data breaches. The encryption on Gmail's servers is irrelevant when the attacker has the password.

What Gmail Does and Does Not Protect

This incident is a concrete illustration of a point we have made repeatedly on this site: encryption that protects data in transit and at rest is not the same as encryption that protects data from the provider or from anyone who obtains your credentials.

Gmail's Encryption Model
In transit (TLS): Gmail encrypts email in transit between your browser and Google's servers, and between Google's servers and the recipient's server. This prevents network-level interception.
At rest (encrypted, Google holds keys): Gmail encrypts stored email on Google's servers. But Google holds the decryption keys. Google can read your email. Google does read your email (for spam filtering, malware scanning, possibly AI training, and so on). And anyone with your credentials can read it too.
What This Means in Practice ⚠ CRITICAL - this is what happened to Patel
Gmail's encryption protects against network interception (someone sniffing your Wi-Fi with something like Wireshark) and physical theft of Google's servers. It does not protect against:

Compromised credentials. If an attacker has your email and password, they are you. Gmail cannot distinguish between you logging in and an Iranian intelligence operative logging in with the same credentials.

Legal compulsion. Google holds the keys. A search warrant compels Google to decrypt and hand over your email. As we documented in our messaging comparison, this is the fundamental difference between services where the provider holds the keys and services where only you hold the keys.

Data export by an authenticated user. Google provides tools (Google Takeout, IMAP access) that allow any authenticated user to download a complete archive of the account. An attacker with valid credentials can export years of email in minutes.
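To make the data-export point concrete, here is a minimal Python sketch of the loop any authenticated IMAP session can run. The `export_mailbox` helper is illustrative (my name, not a real tool's), and `imap.gmail.com` is Gmail's standard IMAP endpoint; the server has no way to distinguish a legitimate backup tool from an attacker holding a breached password.

```python
# Minimal sketch: any authenticated IMAP session can enumerate and download
# the entire mailbox. The server cannot tell a backup tool from an attacker.
import email
import imaplib
from email.message import Message

def export_mailbox(client, mailbox: str = "INBOX") -> list[Message]:
    """Fetch every message in a mailbox over an already-authenticated session."""
    client.select(mailbox, readonly=True)          # open without marking read
    _, data = client.search(None, "ALL")           # match every message
    messages = []
    for num in data[0].split():
        _, parts = client.fetch(num, "(RFC822)")   # full raw RFC 822 source
        messages.append(email.message_from_bytes(parts[0][1]))
    return messages

# Against a live account (illustrative credentials, never do this to
# an account that is not yours):
#   client = imaplib.IMAP4_SSL("imap.gmail.com")
#   client.login("user@gmail.com", "password-from-an-old-breach")
#   archive = export_mailbox(client)   # years of email in one loop
```

Nothing in this loop touches encryption. The plaintext is served to whoever authenticates, which is exactly the failure mode described above.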
How End-to-End Encrypted Email Would Have Changed This
It is impossible to say for certain whether Patel's situation would have been different had he used an end-to-end encrypted email provider like Proton Mail or Tuta Mail.

On Proton Mail, the email content is encrypted with the user's private key, which is itself encrypted with a key derived from the user's password. An attacker who obtains the login password could potentially access the account, but Proton's architecture offers an additional layer of protection: a separate mailbox password, if configured. I highly recommend this configuration on both Proton Mail and Proton Pass.

On Tuta Mail, the encryption scope is even broader: subject lines, sender names, and the full address book are end-to-end encrypted. Tuta does not support two-password authentication like Proton, though it has indicated the feature is in development.

Neither service would have prevented a credential compromise. But both could have limited what the attacker could read once inside, because the email content is encrypted with keys that the server does not hold. On Gmail, once you are in, you can read everything.

This Is a Pattern, Not an Anomaly

The Patel breach is not an isolated incident. It fits a documented pattern of state-linked hackers targeting the personal accounts of senior officials, and of those accounts being on consumer email services with no end-to-end encryption.

The Pattern
2015 - CIA Director John Brennan: teenage hackers broke into Brennan's personal AOL account and leaked data about U.S. intelligence officials. The director of the CIA was using AOL for personal email. [1]
2016 - John Podesta (Clinton campaign chairman): Russian-linked hackers accessed Podesta's personal Gmail account via a phishing email. The contents were published to WikiLeaks and arguably altered the course of the presidential election. [1]
2024 - Incoming Trump officials: Iranian and Chinese hackers targeted personal accounts of incoming administration officials including Patel, Deputy AG Todd Blanche, and Donald Trump Jr. Patel was informed of the compromise before taking office. [3]
2026 - Patel again: the same Gmail account, apparently still active, was accessed and its contents published by a different Iranian-linked group. Patel was warned about a compromise of this account in 2024, yet it was still accessible in 2025 (when the metadata shows the files were organized). That raises serious questions about follow-through on basic security hygiene.
It Is Not Just Government Officials
The same techniques used against Patel, Podesta, and Brennan are used every day against professors, journalists, corporate executives, attorneys, healthcare workers, and private citizens. Iranian state hackers specifically targeted university faculty and researchers in Operation SpoofedScholars. Chinese APT groups have targeted law firms handling sensitive merger and acquisition transactions. Russian groups have targeted journalists covering the war in Ukraine. Some of these attackers are far more sophisticated than credential stuffers. But the vulnerability remains the same: a consumer email account with reused credentials, no hardware security key or 2FA, and no end-to-end encryption.

You do not need to be the FBI director for your email to be targeted. You need to be someone who has information that someone else wants. That includes lawyers handling sensitive cases, academics with controversial research, executives with access to proprietary data, healthcare professionals with patient information, journalists with source communications, and activists under government scrutiny.

What You Should Do

The Patel breach is a case study in what goes wrong when someone relies on the default security posture of a consumer email service. Every mitigation below addresses a specific failure point illustrated by this incident.

Check if your credentials are in breach databases
Go to haveibeenpwned.com and enter every email address you use. If your address appears in any breach (and it almost certainly will), assume the associated password is compromised. Change it immediately. Do not reuse any password that has ever appeared in a breach, even a modified version of it. Use a reputable password manager. This is likely the exact step that would have prevented the Patel breach.
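Passwords themselves can be checked the same way without ever transmitting them, via the k-anonymity range endpoint of haveibeenpwned's Pwned Passwords API. The endpoint and response format below are the documented API; the helper function names are mine.

```python
# k-anonymity check against the Pwned Passwords API: only the first five hex
# characters of the SHA-1 hash ever leave your machine.
import hashlib
from urllib.request import urlopen

def hash_parts(password: str) -> tuple[str, str]:
    """Split the uppercase SHA-1 hex digest into (5-char prefix, 35-char suffix)."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest[:5], digest[5:]

def pwned_count(password: str) -> int:
    """Return how many times this password appears in known breach corpora."""
    prefix, suffix = hash_parts(password)
    with urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        for line in resp.read().decode().splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

# pwned_count("password") returns a count in the millions; a freshly
# generated random password returns 0.
```

The service receives only the hash prefix, which matches hundreds of unrelated passwords, so it learns nothing about which one you checked. Any nonzero result means the password is in attacker wordlists and must never be used again.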
Use a hardware security key for two-factor authentication ⚠ This is the single most effective protection against account takeover
SMS-based two-factor authentication can be bypassed through SIM swapping or SS7 interception. App-based TOTP codes can be phished in real time, though this is harder and typically happens in targeted attacks, not credential stuffing. A hardware security key (YubiKey, Google Titan, SoloKeys) requires physical possession of the device. It cannot be phished, intercepted, or remotely compromised. If Patel had had a hardware security key on his Gmail account, the stolen password alone would not have been enough. Google's own internal data shows that after deploying hardware security keys to all employees in 2017, the company experienced zero successful phishing attacks against employee accounts.
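Why does possession defeat a stolen password? A deliberately simplified analogy: real FIDO2/WebAuthn uses per-origin public-key signatures and attestation, none of which is modeled here. This sketch only captures the one property that matters for credential stuffing: the server demands a response that can only be computed by the physical device.

```python
# Simplified possession analogy (NOT the real FIDO2 protocol). An HMAC over
# a fresh challenge stands in for "something only the hardware can compute."
import hashlib
import hmac
import os

device_secret = os.urandom(32)      # never leaves the hardware key

def sign_challenge(secret: bytes, challenge: bytes) -> bytes:
    return hmac.new(secret, challenge, hashlib.sha256).digest()

challenge = os.urandom(16)          # fresh per login attempt, so replays fail
response = sign_challenge(device_secret, challenge)

# Server-side check: a breached password alone yields no valid response.
assert hmac.compare_digest(response, sign_challenge(device_secret, challenge))
```

An attacker replaying Patel's credentials from a breach dump would fail at this step: they hold the password, but not the device, and the challenge is different on every attempt.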
Use a password manager with unique passwords for every account
The entire credential stuffing attack model depends on password reuse. If every account has a unique, randomly generated password stored in an encrypted password manager, a breach at one service cannot cascade to others. The password for your email should be a long, random string that you never type from memory and never use anywhere else.
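Under the hood, this is all a password manager's generator does: draw one long secret per account from the operating system's CSPRNG and never reuse it. A minimal stdlib sketch:

```python
# One long random secret per account, drawn from the OS CSPRNG (the secrets
# module), never reused anywhere. This is the core of any password manager.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 24) -> str:
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# A unique password per account means one breach cannot cascade:
vault = {site: generate_password() for site in ("email", "bank", "forum")}
```

A 24-character password over a 94-symbol alphabet carries roughly 157 bits of entropy, far beyond brute force, and because it appears in no breach corpus, credential stuffing against it is a dead end by construction.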
Migrate to end-to-end encrypted email
This is the architectural fix. Gmail's encryption model means that anyone who authenticates as you can read everything you have ever sent or received. End-to-end encrypted providers (Proton Mail, Tuta Mail) encrypt your mailbox with keys derived from your password. Even if credentials are compromised, the attacker faces an additional encryption layer that the server itself cannot bypass. We have published detailed comparisons of Proton Mail vs Tuta Mail covering encryption scope, jurisdiction, court order compliance, and real-world cases. The Patel breach is a live example of why the distinction between "encrypted at rest by the provider" and "end-to-end encrypted by the user" is not academic.
Never bridge personal and professional email
Patel's 2014 email, in which he copied both his DOJ address and his personal Gmail, is a textbook example of cross-contamination. A compromised personal account should not reveal your professional email addresses, your workplace communication patterns, or the fact that you forward work-related material to a consumer email service. Use separate accounts for separate purposes. Never CC your personal email on work correspondence. If you must access work material on a personal device, use a secure method that does not leave copies in a consumer email archive.
Delete old email you do not need
Patel's Gmail contained emails dating back to 2010. More than a decade of personal correspondence, travel receipts, family photos, and professional contacts just sitting in a consumer email account with no end-to-end encryption. If you do not need an email from 2014, delete it. If you need to retain records, export them to encrypted local storage and remove them from the server (more on this at a later time). Every email sitting in your Gmail inbox is an email that can be read by anyone who obtains your credentials, by Google under a warrant, and by any future breach of your account. Delete them!
THE TAKEAWAY The FBI director was not hacked because Gmail's encryption failed. He was hacked because Gmail's encryption is not designed to protect you from someone who has your password. It is designed to protect Google's servers. End-to-end encryption protects you. The distinction is not technical trivia. It is the difference between a breach that exposes 300 emails and a breach that exposes nothing readable. The tools to prevent this exist. They are available to everyone. The director of the FBI chose not to use them.

References

[1] Satter, R. and Bayoumy, Y., "Iran-linked hackers breach FBI Director Kash Patel's personal email, publish excerpts online." Reuters / CNBC, March 27, 2026. | cnbc.com
[2] Franceschi-Bicchierai, L., "Iranian hackers claim breach of FBI director Kash Patel's personal email account." TechCrunch, March 27, 2026. | techcrunch.com
[3] Lyngaas, S., "Iran-linked hackers have breached FBI Director Kash Patel's personal emails." CNN, March 27, 2026. | cnn.com
[4] Collier, K. and Kosnar, M., "Iranian hackers publish emails allegedly stolen from Kash Patel." NBC News, March 27, 2026. | nbcnews.com
[5] "Pro-Iranian group claims credit for hacking into FBI Director Patel's personal account." PBS News / Associated Press, March 27, 2026. | pbs.org
[6] Sommerville, M., "FBI director Kash Patel's email hacked: What has been leaked?" Newsweek, March 27, 2026. | newsweek.com
[7] Raman, S., "Iran-linked group claims hack of FBI Director Kash Patel." Axios, March 27, 2026. | axios.com
Services

Privacy Consulting

Privacy consulting for individuals and small operations with real reasons to care. Threat modeling, jurisdiction analysis, encrypted infrastructure migration, and operational security guidance.

✓ Who I work with
Professionals in sensitive fields. Individuals navigating difficult personal situations. Everyday people who have decided their digital privacy matters. I do not ask why you need privacy, and I do not require justification.
✕ Not yet offered
Privacy program management for businesses with employees or IT infrastructure. Regulatory compliance work (HIPAA, SOC 2, CCPA operational implementation). Legal advice, documents, or representation. I refer out for these today. Business privacy services are on the roadmap once CIPM certification is complete.
On qualifications: B.A. in Legal Studies (minor in Public Policy), UC Berkeley. CIPP/US certification in progress through the International Association of Privacy Professionals. Not an attorney. Nothing here is legal advice. Orion Private LLC carries errors and omissions insurance and cybersecurity coverage. My work is privacy research, threat modeling, jurisdiction analysis, and implementation. Custom reports and education materials are tailored to each client's background, whether technical or not.
Services & Pricing
How It Works
01
Encrypted intake form (free)
Fill out the secure form on the contact page. No commitment. I respond within 36 hours with an honest assessment of whether my services are a good fit.
02
Paid consultation ($60, 30 min)
If it is a good fit, we go deeper into your needs and threat model. If you move forward with any service, the $60 is subtracted from your total.
03
Scoping & proposal
Written scope of work with exactly what I will do, what you receive, and total cost. No hourly billing.
04
Execution
Research, configuration, migration, hardening. Hands-on work happens over screen share so you stay in control.
05
Handoff & training
Written documentation, walk-through, and operational security training. Support continues 30–90 days.
Common Questions
Free, no commitment. I respond within 36 hours.
About

About Orion Private LLC

O
Orion Private LLC
Founded by David
privacy@orionprivate.com
Legal Studies (UC Berkeley)Privacy ResearchEncryptionDigital Rights
PGP Fingerprint
B4A7 B0FF 8F75 F17E A28E 056F F025 2FAB 7CB1 AA0C
Download public key →
For encrypted communication, use the intake form or encrypt with this key in your email client.

I am David. I run Orion Private LLC, a privacy consulting practice for professionals and everyday people who have decided their digital lives are worth protecting properly. My background is in Legal Studies and Public Policy from UC Berkeley, and that training shapes every engagement. I read statutes, analyze court opinions, trace jurisdiction, and translate regulatory frameworks into clear, actionable guidance. The practice carries errors and omissions coverage and general liability insurance because accountability matters when someone trusts you with their digital life.

I trace where your data is stored, who has legal authority over it, what a court order can compel a provider to hand over, and what is actually protected by the math versus what is just marketing. I analyze metadata exposure at the field level, map encryption protocols against documented court orders, and publish everything in plain language.

Nothing here runs on third-party platforms that consume your information. Client records live in encrypted spreadsheets I built myself, stored in VeraCrypt containers on local hardware, backed up to encrypted cloud storage. Invoices, receipts, and bookkeeping are handled the same way. No client name or engagement detail has ever touched a CRM, a SaaS dashboard, or a billing platform that monetizes the data passing through it. I built the operational infrastructure from scratch because the tools that exist were not built for the people I serve.

Looking Ahead

Orion Private is built with a long-term trajectory in mind. The long-term goal is law school and eventually a practice that applies this same thinking to private trusts, estate planning, and property law. Every home purchase generates a data trail. Every estate plan contains sensitive personal information. Most of that exposure is not inevitable. It is just how things have always been done. With the right legal strategy and technical awareness, even deeply personal assets can be structured to preserve privacy by design rather than by accident.

I am currently pursuing CIPP/US certification through the International Association of Privacy Professionals. That credential validates what the research and client work already demonstrate, but it matters for the clients and institutions that need to see it formalized. CIPM is next, which opens the door to working with businesses on privacy program management. The certifications are part of the same trajectory as the practice itself.

Anyone paying attention to legal education knows the math changed in 2025. The One Big Beautiful Bill Act eliminated the Grad PLUS loan program and capped federal borrowing for law students at $50,000 per year. Most law schools cost more than that. Public service loan forgiveness survived, but income-driven repayment for new borrowers was effectively replaced. This practice is how I am building the foundation to pursue that education without massive debt that would dictate the kind of law I practice.

Research Classification
Field Assessment Technical comparisons, tool evaluations, provider breakdowns
Regulatory Analysis Policy failures, regulatory gaps, government overreach
Technical Brief Educational explainers, foundational concepts
Investigative Report Exposure pieces, tracking research, hidden practices revealed
Policy Commentary Opinion, commentary on law, engagement with scholarship

Sources include official documentation, court records, transparency reports, academic research, and established reporting. No affiliate links. No sponsorship.

About This Site

This entire website is a single file. No frameworks, no templates, no WordPress. One HTML document that contains every page, every article, and every animation. It is self-hosted on infrastructure I control in a European data center covered by EU data protection law. There are no third-party tracking scripts and no external requests except the OpenPGP library that powers the encrypted contact form.

The site runs Umami, a self-hosted open-source analytics tool, on the same server. No cookies, no personal data collection, no visitor profiles. It counts total visitors and page views. The data never leaves my server.

It loads fast because there is nothing to fetch. It cannot leak your browsing behavior because no third parties are involved.

Contact

Get in Touch

Whether you need consulting, have a research question, or want to flag something I got wrong, I am happy to hear from you.

Encrypted Intake Form

Free, no commitment. Your message is encrypted with PGP in your browser. Only I can decrypt it.

Encryption happens in your browser. Nothing is sent until you email it.
Anonymous Encrypted Contact

No email. No phone number. No account. Check it out.

https://smp15.simplex.im/a#xZA93oZyUT6tQ7lAHF7K_45JNwSjTMQHzewneBQn8cU
Need SimpleX? Download here →
Encrypted Email*
privacy@orionprivate.com
PGP Fingerprint:
B4A7 B0FF 8F75 F17E A28E 056F F025 2FAB 7CB1 AA0C
Download public key →
*Or skip the setup and use the automated encrypted form.
How This Form Works
Your message is encrypted with PGP in your browser using OpenPGP.js. The plaintext never touches a server. You copy the ciphertext and email it. Only my private key can decrypt it.
Getting Started
Intake form (free) → I respond in 36 hours → Paid consultation ($60, 30 min) → Fee credited toward any service.
Response Time
Within 36 hours