It's not.
Your email provider holds the keys to your inbox.
Your carrier tracks your location to within yards and stores it for years.
Your data is sold by companies you have never heard of.
The encryption that could stop all of it is under legislative attack.
Privacy is not a product you buy.
It is an architecture you build.
Privacy research, investigative reporting, and field-level analysis of encryption protocols and jurisdiction. Written by David for Orion Private LLC.
On March 27, 2026, an Iranian state-linked group published over 300 emails from FBI Director Kash Patel's personal Gmail. The FBI confirmed the breach. The emails were not obtained by breaking encryption. They were obtained because the credentials for a consumer email account had appeared in prior data breaches.
The FTC's consent decrees were the closest thing Americans had to enforceable privacy law. In 2025, commissioners were fired, guidance was deleted, and consent orders were reopened. Chris Jay Hoofnagle's academic work predicted every failure point.
Both encrypt your inbox. But jurisdiction, metadata exposure, encryption scope, and real-world court orders tell a more complicated story.
End-to-end encryption protects your note content. It does not protect the metadata your notes generate.
The FBI confirmed to Congress this month that it purchases location data from commercial brokers. No warrant. No judge. No notification. This has been happening for years, across multiple agencies and administrations. Here is the full story.
In 2021 the FBI produced an internal guide detailing exactly what data it can legally obtain from nine messaging apps. The document was obtained via FOIA. Here is a field-by-field analysis of five of the most common services.
Private browsing does not make you private. A UC Berkeley study found Google on almost every adult site, search terms leaked in plaintext, and sexual preferences encoded in cookies.
Every photo records your GPS coordinates, device serial number, and exact timestamp. When you upload to the cloud, the question is who can see it, who can be compelled to hand it over, and whether AI is training on it.
Symmetric vs asymmetric, AES-256 vs quantum computers, what HTTPS actually protects, and why none of it matters if your phone is compromised. The foundational reference for everything else on this site.
Every VPN claims no logs. Court records tell a different story.
Which messenger leaks the least?
Both encrypt your inbox. Both market themselves as the antidote to Gmail. But jurisdiction, metadata exposure, encryption scope, and real-world court orders tell a more complicated story. Here is a field-by-field comparison.
Company & Jurisdiction
Jurisdiction determines what a government can legally force a provider to do. It is arguably more important than the encryption itself, because encryption protects content, but jurisdiction determines what happens to everything around it.
Encryption Protocols - The Technical Layer
Both providers use end-to-end encryption. The difference lies in which standard each chose, what that standard can and cannot encrypt, and how far each has moved on post-quantum readiness. These are not cosmetic differences; they determine the scope of what the server can never see.
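The hybrid idea behind a protocol like TutaCrypt can be sketched in a few lines: run two independent key agreements, one classical (X25519) and one post-quantum (CRYSTALS-Kyber), then combine the outputs so that an attacker must break both to recover the session key. This is an illustrative sketch only, not Tuta's actual key schedule; the function name and the SHA-256 combiner are assumptions, and real protocols use a vetted KDF such as HKDF with domain separation.

```python
import hashlib
import os

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes) -> bytes:
    """Derive one session key from two independent shared secrets.

    If either input stays secret (X25519 against classical attackers,
    Kyber against quantum ones), the derived key stays secret too.
    NOTE: illustrative combiner only, not any provider's real KDF.
    """
    return hashlib.sha256(b"hybrid-kem-v1" + classical_ss + pq_ss).digest()

# Stand-ins for the outputs of the two key agreements:
classical_ss = os.urandom(32)
pq_ss = os.urandom(32)
session_key = combine_shared_secrets(classical_ss, pq_ss)
assert len(session_key) == 32
```

The design point is that the hybrid construction is a hedge: if Kyber turns out to have a flaw, security falls back to X25519, and if large quantum computers arrive, security falls back to Kyber.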
What Gets Encrypted - Field by Field
This is where the choice of encryption protocol creates real, measurable differences. Proton's adherence to OpenPGP means certain email headers are structurally excluded from end-to-end encryption. Tuta's proprietary protocol lets it encrypt fields that PGP cannot. Every field listed here as "not E2EE" is a field the provider can theoretically read or hand over under court order.
Jurisdiction Under Pressure - Real Cases
Marketing copy is written for good days. Court orders arrive on bad ones. These are the documented cases where each provider's jurisdiction was tested by law enforcement, listed chronologically. Every case here is drawn from court records, official company statements, or investigative journalism.
French authorities, working through Europol and Swiss MLAT channels, obtained a Swiss court order compelling Proton to begin logging the IP address of a specific Proton Mail account associated with the Youth for Climate movement in Paris. The activist was involved in occupying buildings in the Place Sainte Marthe area. Proton complied, as it had no legal mechanism to refuse. [4]
A blackmail email was sent from a Tuta account to an auto supplier. The Cologne Regional Court ordered Tuta to develop a monitoring function for the specific account, copying unencrypted incoming and outgoing emails before they were encrypted. Tuta argued it was not a telecommunications provider and should not be subject to telecom interception laws. The Hanover Regional Court had previously agreed with this position. But the German Federal Court of Justice (BGH) ruled against Tuta, finding that "over the top" email services qualify as telecoms under German criminal procedure law. [17]
Spanish police (Guardia Civil) sought to identify a pseudonymous member of the Catalan pro-independence movement known as "Xuxo Rondinaire," suspected of planning protest actions related to King Felipe VI's visit. The request went through Europol to Swiss authorities, who issued a binding order to Proton. Proton handed over the only user-identifiable information it had: the recovery email address on the account, which was an Apple iCloud address. Apple then provided Spanish authorities with the individual's full name, two home addresses, and a linked Gmail account. [15]
The FBI, investigating arson, vandalism, and doxxing linked to the Stop Cop City movement in Atlanta, used the MLAT process to compel Proton through Swiss authorities to hand over payment data associated with a specific Proton Mail account affiliated with Defend the Atlanta Forest. The payment data, a credit card transaction, was sufficient to identify the account holder. The individual does not appear to have been charged with a crime at the time of the disclosure. [16]
The pattern across all four cases is consistent: end-to-end encryption held every time. No government obtained the contents of a single encrypted email from either provider. What was obtained in every case was metadata, including IP addresses, recovery emails, payment data, and non-E2EE message copies. The lesson is not that these providers failed. The lesson is that encryption protects content, and only content. Everything around it, every data point you provide at signup, every payment method you choose, every field that falls outside the encryption envelope, is fair game under a court order.
Features, Ecosystem & Pricing
Security architecture is the priority. But people also need to use these products daily. This section covers the practical differences that affect everyday use.
Per-Provider Summary
BEST: Ecosystem breadth. VPN, Drive, Calendar, Pass, and Wallet under one account, the closest thing to a privacy-respecting Google replacement that exists.
BEST: PGP interoperability. You can exchange encrypted email with any PGP user on Earth, not just other Proton users. This matters for journalists and researchers who correspond with varied sources.
PRO: Swiss jurisdiction requires MLAT for foreign requests, creating meaningful procedural friction. Switzerland is not in any intelligence-sharing alliance.
PRO: Tor onion site for IP-anonymous access. Combined with Proton VPN, the IP logging risk is fully mitigable by the user.
CON: Subject lines are NOT end-to-end encrypted. Under a valid Swiss court order, Proton can hand over the subject lines of every email in your inbox. This is a PGP limitation, not a Proton decision, but it is a real exposure.
CON: Recovery email is a deanonymization vector. The 2024 Catalan case proved this. Recovery email is optional, but Proton prompts users to add one.
CON: Payment data exposure. The 2026 Stop Cop City case showed that credit card payment metadata alone was sufficient for the FBI to identify a user.
WARN: ~11,000 legal orders in 2024 with a ~94% compliance rate. Partly a function of scale, but volume matters.
BEST: Encryption scope. Subject lines, sender names, recipient names, attachment data, and the entire address book are E2EE. Tuta encrypts more fields than any other email provider on the market.
BEST: Post-quantum encryption, live since March 2024. TutaCrypt's hybrid protocol (CRYSTALS-Kyber + X25519) is the first production deployment of quantum-resistant email encryption.
BEST: No recovery email concept. Uses a recovery code instead. The deanonymization vector from the Catalan case does not exist in Tuta's architecture.
PRO: Argon2 key derivation, more resistant to GPU/ASIC brute-force attacks than Proton's bcrypt.
PRO: ECHR encryption protection. As an EU-based provider, Tuta is covered by the 2024 European Court of Human Rights ruling against laws that weaken E2EE.
CON: Germany is a Fourteen Eyes member. Intelligence sharing between allied nations means German agencies could theoretically receive and act on foreign intelligence.
CON: BGH ruling created a real-time monitoring precedent. Tuta can be ordered to copy future non-E2EE mail before encryption for specific accounts.
CON: No PGP support. You cannot exchange encrypted email with PGP users outside the Tuta ecosystem.
WARN: Narrower ecosystem. Email + Calendar only. No VPN, no file storage (yet), no password manager.
End-to-end encryption protects your note content. It does not protect the metadata your notes generate. Here is an exhaustive breakdown of every metadata field collected by Standard Notes, Notesnook, Cryptee, and Joplin - and what it reveals.
Timestamps
Timestamps are the most operationally significant metadata a notes app collects. They do not reveal what you wrote - but they reveal when, how often, and for how long. In an investigative context, that is often enough.
| Field | Standard Notes | Notesnook | Cryptee | Joplin |
|---|---|---|---|---|
| created_at Exact timestamp when a note was first created ⚠ HIGH - reveals when you began writing about a topic | STORED | STORED | NOT STORED | STORED |
| updated_at / user_updated_time Every modification timestamp, used for sync conflict resolution ⚠ HIGH - reveals editing cadence and co-editing patterns | STORED | STORED | VERSION IDs only | STORED |
| user_created_time vs. server_created_time SN and Joplin store both client-reported AND server-recorded timestamps as two separate records ⚠ MED - discrepancy between the two can reveal offline editing periods | BOTH STORED | BOTH STORED | N/A | BOTH STORED |
| deleted_at When a note was trashed - item is marked deleted, not immediately purged ⚠ MED - reveals when you removed sensitive material | STORED | STORED | NOT CONFIRMED | STORED |
| sync_token / cursor_token SN's internal sync clock - increments on every create or update event, acting as a proxy for edit frequency even without readable timestamps ⚠ MED - editing frequency is inferable from token progression | STORED (SN only) | N/A | N/A | N/A |
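The operational significance of the table above is that a server holding only modification timestamps can reconstruct your working rhythm. A minimal sketch, using hypothetical `updated_at` values: group timestamps into editing sessions separated by a gap threshold, with no access to any note content.

```python
from datetime import datetime, timedelta

# Hypothetical server-side updated_at values for one encrypted note
stamps = [
    datetime(2026, 3, 1, 9, 0),
    datetime(2026, 3, 1, 9, 7),
    datetime(2026, 3, 1, 9, 12),
    datetime(2026, 3, 4, 22, 30),
]

def editing_sessions(stamps, gap=timedelta(hours=1)):
    """Group modification timestamps into editing sessions.

    Requires no content at all: consecutive edits closer than `gap`
    are treated as one sitting.
    """
    sessions, current = [], [stamps[0]]
    for t in stamps[1:]:
        if t - current[-1] <= gap:
            current.append(t)
        else:
            sessions.append(current)
            current = [t]
    sessions.append(current)
    return sessions

# Yields two sessions: a morning burst on Mar 1, a late-night edit on Mar 4.
```

For a journalist, "a burst of edits the night before a story broke" is exactly the kind of inference this enables.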
Item Identifiers & Structure
Identifiers are low-risk in isolation. Their value to an adversary comes from correlation - mapping UUIDs across requests, sessions, and time to build a behavioral profile without reading a single word.
| Field | Standard Notes | Notesnook | Cryptee | Joplin |
|---|---|---|---|---|
| UUID / Item ID Unique identifier per note, tag, and notebook. Stored in plaintext as a database key. ✓ LOW alone - meaningless without content, but correlatable across requests | STORED | STORED | STORED | STORED |
| content_type Whether an item is a Note, Tag, Component, or Preference. Stored in plaintext for server-side routing. ⚠ MED - server knows you have X notes, Y tags, Z preferences without reading any of them | STORED | ENCRYPTED | PARTIAL | STORED |
| items_key_id (SN only) Associates each encrypted note payload with the Items Key UUID that encrypted it ✓ LOW - only reveals which key encrypted which note. Useful for key rotation, not revealing. | STORED | N/A | N/A | N/A |
| parent_id / notebook relationship The structural link between a note UUID and its notebook UUID ⚠ MED - reveals organizational grouping even when folder names are encrypted | ENCRYPTED | ENCRYPTED | FOLDER COUNT VISIBLE | STORED (E2EE off) / ENCRYPTED (E2EE on) |
| tag relationships UUID-to-UUID associations between notes and tags ✓ LOW if tag names are encrypted - count is inferable but meaning is not | ENCRYPTED | ENCRYPTED | ENCRYPTED | STORED (E2EE off) / ENCRYPTED (E2EE on) |
| is_deleted flag Soft-delete marker. The item remains in the database as a tombstone until a hard purge runs. ⚠ MED - deleted notes persist server-side. SN: 14 days. Others: unstated. | 14-day tombstone | Window unstated | NOT CONFIRMED | Window unstated |
Note Content Fields
This is where all four products perform equally well. Title, body, tags, and folder names are encrypted in every product when E2EE is properly active. The server holds ciphertext and cannot read any of it.
| Field | Standard Notes | Notesnook | Cryptee | Joplin |
|---|---|---|---|---|
| Note title The visible title of the note ✓ Encrypted in all four when E2EE is active | ENCRYPTED | ENCRYPTED | ENCRYPTED | ENCRYPTED (E2EE on) |
| Note body / content Full note text ✓ Encrypted in all four when E2EE is active | ENCRYPTED | ENCRYPTED | ENCRYPTED | ENCRYPTED (E2EE on) |
| Tag names Human-readable labels applied to notes ✓ Server sees UUIDs only - names are encrypted | ENCRYPTED | ENCRYPTED | ENCRYPTED | ENCRYPTED (E2EE on) |
| Notebook / folder names Names of organizational containers ✓ Server sees UUIDs only - names are encrypted | ENCRYPTED | ENCRYPTED | ENCRYPTED | ENCRYPTED (E2EE on) |
Structural & Size Metadata
Encrypted blob sizes correlate with content length even after encryption. A 200KB blob versus a 2KB blob reveals relative note length. Attachment types reveal what kind of material you work with. None of this requires decrypting anything.
| Field | Standard Notes | Notesnook | Cryptee | Joplin |
|---|---|---|---|---|
| Encrypted payload byte size Size of the encrypted blob on the server - correlates with note length even after encryption ⚠ MED - short vs. long writing is inferable from blob size | VISIBLE | VISIBLE | EXPLICITLY STORED | VISIBLE |
| Total note count Number of items stored under the account - inferable from item listings ✓ LOW - reveals productivity habits but not topics | INFERABLE | INFERABLE | STORED PER FOLDER | INFERABLE |
| Attachment MIME types File type of attachments - PDF, image, audio, etc. - known before encryption wrapping ⚠ MED - "this note has an audio attachment" or "a scanned document" is contextually revealing | NOT CONFIRMED | NOT CONFIRMED | EXPLICITLY STORED | STORED |
| Attachment count per note How many files are attached to a given note ✓ LOW alone - correlatable with other behavioral patterns | INFERABLE | INFERABLE | VISIBLE | STORED |
| Folder color / archive status (Cryptee) Whether a folder is archived and its color - structural decoration metadata ✓ LOW - cosmetic only | N/A | N/A | STORED (stated in policy) | N/A |
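The size leak described above is mechanical. AEAD ciphers such as AES-GCM or XChaCha20-Poly1305 add a small fixed overhead (nonce plus authentication tag), so a server can subtract that constant and recover the plaintext length exactly, with no decryption. The 12-byte nonce and 16-byte tag below are typical values, not any specific provider's storage format.

```python
def plaintext_len_from_blob(blob_len: int, nonce_len: int = 12,
                            tag_len: int = 16) -> int:
    """Recover plaintext length from an AEAD blob's size (no decryption needed)."""
    return blob_len - nonce_len - tag_len

# A 2 KB note and a 200 KB note stay distinguishable after encryption:
assert plaintext_len_from_blob(2048 + 28) == 2048
assert plaintext_len_from_blob(204800 + 28) == 204800
```

Some protocols pad plaintexts to size buckets before encrypting precisely to blunt this inference; none of the four products documents doing so.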
Revision History Metadata
Revision history is perhaps the most underappreciated metadata risk. Every saved version carries its own timestamp, creating a granular editing timeline that persists on the server independently of the note's content.
| Field | Standard Notes | Notesnook | Cryptee | Joplin |
|---|---|---|---|---|
| Revision history stored server-side Whether previous versions of a note are held on the server ⚠ HIGH - each revision carries a timestamp, creating a full editing timeline server-side | 1yr (Productivity) / Unlimited (Professional) | Limited history | NOT STATED | LOCAL ONLY |
| Timestamp per revision Every stored version carries the modification time of that specific edit ⚠ HIGH - for a journalist, this shows exactly when a draft was touched relative to external events | YES - per revision | YES - per revision | N/A | LOCAL ONLY |
| Revision content encrypted Whether stored past versions are encrypted on the server | YES - encrypted blobs | YES - encrypted blobs | N/A | N/A - local only |
| Nightly email backup (SN paid, opt-in) SN paid plans can send an encrypted nightly backup to your email - this creates a second metadata trail at the email provider ⚠ MED - timing, recipient address, and volume are visible to your email provider even if content is encrypted | OPT-IN - paid only | NO | NO | NO |
Joplin-Specific Note Fields
Joplin's data model was inherited partly from Evernote's ENEX format and carries more metadata fields per note than any other product here. Several of these fields are unique to Joplin and have no equivalent in the other three apps.
| Field | Standard Notes | Notesnook | Cryptee | Joplin |
|---|---|---|---|---|
| latitude / longitude / altitude GPS coordinates embedded directly into note properties. On by default in the mobile app. ⚠ CRITICAL - exact location at note creation. Embedded in the note body itself, not just a server log. Travels with exports. Must be disabled manually in Settings → Note → Geolocation. | NOT COLLECTED | NOT COLLECTED | NOT COLLECTED | DEFAULT ON (mobile) |
| source / source_application Which Joplin client created the note - e.g. "joplin-desktop" or "net.cozic.joplin-mobile". Stored in note properties. ⚠ MED - device-type fingerprinting at the note level. Syncs plaintext if E2EE is off. | NOT COLLECTED | NOT COLLECTED | NOT COLLECTED | EMBEDDED IN NOTE |
| source_url When a note is created from a web clip, the source URL is embedded in note properties ⚠ HIGH - if E2EE is off, syncing a web-clipped note exposes the exact URL in plaintext. Directly reveals research activity. | NOT COLLECTED | NOT COLLECTED | NOT COLLECTED | EMBEDDED IN NOTE |
| author field Optional field that can be populated from your OS account username automatically ⚠ HIGH - if auto-populated from system username, your OS account name may be embedded in every note you create | NOT COLLECTED | NOT COLLECTED | NOT COLLECTED | CHECK IF AUTO-POPULATED |
| is_todo / todo_due / todo_completed Todo status, due date, and completion flag ✓ LOW if encrypted. Reveals task patterns if not. | N/A | ENCRYPTED | N/A | STORED (E2EE off) / ENCRYPTED (E2EE on) |
| markup_language Whether a note uses Markdown or HTML. Stored in note JSON properties. ✓ LOW - cosmetic formatting preference only | N/A | N/A | N/A | STORED in note JSON |
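Joplin's raw export writes these properties as `key: value` lines in a metadata block after the note body, so they can be audited or blanked before an export leaves your machine. A minimal sketch; the field list and the sample layout are assumptions based on Joplin's raw (.md) export format, so verify against your own export before relying on it.

```python
import re

# Properties worth auditing in a Joplin raw export (one "key: value" per line)
SENSITIVE = re.compile(
    r"^(latitude|longitude|altitude|source_url|author): ?.*$",
    re.MULTILINE,
)

def scrub_export(text: str) -> str:
    """Blank the values of location and source fields, keeping the keys."""
    return SENSITIVE.sub(lambda m: m.group(1) + ":", text)

sample = (
    "Meeting notes\n\n"
    "id: abc123\n"
    "latitude: 48.85830000\n"
    "longitude: 2.29450000\n"
)
scrubbed = scrub_export(sample)
assert "48.85" not in scrubbed and "latitude:" in scrubbed
```

Scrubbing an export does not remove the values from the local SQLite database or from copies already synced; disabling geolocation in settings is the real fix.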
Sync & Operational Metadata
Sync events are visible at the API level regardless of content encryption. Active editing bursts, multi-device conflicts, and session counts paint a behavioral picture even when nothing is readable.
| Field | Standard Notes | Notesnook | Cryptee | Joplin |
|---|---|---|---|---|
| Sync frequency / timing How often notes are pushed to the server - inferable from API request logs even when content is encrypted ⚠ MED - active editing bursts are visible as sync events with no content required | INFERABLE | INFERABLE | INFERABLE | INFERABLE |
| Conflict records Created when two devices edit the same note simultaneously - links two item versions server-side ✓ LOW - reveals multi-device use pattern, not content | STORED | STORED | NOT STATED | STORED |
| Active session count How many concurrent device sessions are authenticated to the account ✓ LOW - reveals device count, not content | STORED | STORED | STORED | STORED |
Per-Product Summary
BEST: Most transparent documentation. The sync API spec is publicly available and explicit about every stored field.
BEST: Private username mode decouples all timestamps from a real identity - the data exists but is linked to a meaningless hash, not a person.
WARN: Revision history timestamps are the biggest note-level risk. The Professional plan stores an unlimited timestamped edit history indefinitely on Proton's servers.
WARN: Sync token progression is unique to SN - a proxy clock that reveals editing frequency independently of timestamps.
WARN: Nightly email backup (opt-in, paid) creates a second metadata trail at your email provider - timing, volume, and recipient visible to them even if content is encrypted.
BEST: content_type fully encrypted - the server cannot distinguish a note from a tag from a preference. More thorough than SN and Joplin on this specific field.
PRO: Per-note sync toggle - notes kept local-only generate zero server-side metadata whatsoever.
WARN: Timestamps are tied to your email address at registration. With no private-username equivalent, modification times are always attributable to an identifiable account.
CON: Device fingerprint metadata captured at the network level makes sync events more attributable than any other product here. Notesnook collects device IDs, OS type, and IP address in its own systems - the only app here that does all three.
BEST: No ISO timestamps on notes. Sequential version IDs are used for conflict resolution - readable as "this came after that," not as a datetime.
BEST: No server-side revision history. Previous versions of notes are not retained on the server at all.
WARN: Folder count and file sizes explicitly stored. The number of documents per folder and attachment MIME types are visible to the server, per the published privacy policy.
WARN: Ghost Folders partially mitigate the folder-count exposure - hidden folders are excluded from visible metadata - but the overall structure is still known to the server.
BEST: No server-side revision history. All past edits stay on device. The cleanest revision-metadata posture of the four.
CON: GPS coordinates embedded by default on mobile - the single most dangerous note-level metadata field across all four products. It is in the note body, not a server log, so it travels with every export regardless of encryption state.
CON: source_url field - web-clipped notes embed the origin URL in note properties. With E2EE off, this syncs in plaintext.
CON: source_application field - every note records which Joplin client created it. Device-type fingerprinting at the per-note level.
WARN: With E2EE properly enabled, the dangerous fields above are encrypted in sync and at rest on the server. They remain unencrypted in the local SQLite database regardless.
On March 18, 2026, FBI Director Kash Patel told the United States Senate that the FBI purchases commercially available location data from data brokers. He declined to commit to stopping. This is not new. It has been happening for years, across multiple administrations, involving the FBI, the Department of Defense, ICE, the IRS, and the Secret Service. But for the first time, the current FBI director said it out loud, on the record, under oath. Here is what is happening, how long it has been happening, who it affects, and what you can do about it.
This Has Been Happening for Years
The NPR report from today, March 25, 2026, is not a revelation. It is the latest chapter in a pattern that stretches back at least six years. Here is a condensed timeline of what has been publicly documented.
Why This Is Legal (and Why That Should Alarm You)
You might assume that if the Supreme Court said warrants are required for location data, the government cannot just buy it instead. The legal reality is more disturbing than that.
AI Makes This Exponentially Worse
Data broker purchases are not new. What is new is the ability to process them. Artificial intelligence transforms bulk location data from a filing cabinet into a searchable intelligence platform.
The Legislative Response (and Why It Might Actually Happen This Time)
Why This Matters for You (and What You Can Do Right Now)
You may be reading this and thinking: "I have nothing to hide." That is not the point. The point is that a system exists where the government can purchase a detailed record of your physical movements, your associations, your habits, and your patterns of life, without ever asking a judge for permission, without ever notifying you, and without any meaningful oversight. Whether or not you have something to hide, you have a Fourth Amendment right not to be subjected to this. Here is what you can do today.
Private browsing does not make you private. A UC Berkeley study cataloged the tracking infrastructure on the most popular adult websites in the US and found Google on almost all of them, search terms leaked in plaintext, and sexual preferences encoded in cookies. Here is what the research actually found, what has changed since, and what has not.
The Study
The researchers examined all eleven adult sites in the Alexa US Top 500, both manually in Firefox using mitmproxy to capture every connection, and through Mezzobit, a cloud-based tool that maps third-party communications. They supplemented the work with Netograph and Palantir Contour for link and statistical analysis. The paper was submitted to the FTC's PrivacyCon 2017.
Key Findings - What The Researchers Found
The Nuance Most People Miss
What Has Changed Since 2016
What Has Not Changed
What Actually Protects You
If you take away one thing from this analysis, it is that private browsing is a local-only protection. It hides your history from someone who picks up your device. It does not hide your activity from the network, the site, or the third parties the site sends your data to. Here is what does.
In January 2021, the FBI produced an internal guide titled "Lawful Access" detailing exactly what data it can legally obtain from nine messaging apps. The document was obtained via FOIA by the nonprofit Property of the People and published by Rolling Stone. It is the single most useful primary source available for evaluating messaging privacy. Here is a field-by-field analysis of five of the most common services, based on that document, official privacy policies, court records, and published law enforcement capabilities.
Encryption Architecture
The foundation. Whether your messages can be read by anyone other than you and your recipient depends entirely on the encryption model. These five services use fundamentally different approaches, and the differences determine everything that follows.
Metadata Collection: What Each Service Knows About You
Even when message content is encrypted, every service collects operational metadata to function. The critical question is: how much, and is it retained?
The Uncomfortable Truth: Encryption Is Not the Ceiling
Everything above analyzes what happens when law enforcement works within the system: subpoenas, warrants, pen registers, data requests to providers. That is the normal case. But if you are individually targeted by a government, the analysis changes fundamentally. Encryption protects data in transit and at rest. It does not protect data on a compromised device. And governments do not need to break your encryption when they can simply hack your phone.
The Data Broker Loophole: Buying What a Warrant Would Require
In 2018, the Supreme Court ruled in Carpenter v. United States that law enforcement must obtain a warrant to access cell phone location data from carriers. That ruling was supposed to be a privacy landmark. Eight years later, the FBI is buying the same data, and more, from commercial data brokers. No warrant required.
Per-Service Summary
CON: No encryption whatsoever. Carriers can read every message. Content is transmitted in plaintext over SS7.
CON: Full identity tied to every message. Your phone number, legal name, billing address, and location are attached to every text.
CON: Stingray interception. Law enforcement can intercept SMS content in real time without carrier involvement using cell-site simulators.
CON: Metadata retained for years. Sender, recipient, timestamp, and cell tower location are all retained by the carrier and available under routine subpoena.
CON: Forensic recovery. "Deleted" SMS messages are recoverable from the device using tools like Cellebrite until overwritten by new data.
WARN: RCS is replacing SMS and offers encryption in transit, but the ecosystem is still fragmented and cross-platform E2EE is not guaranteed.
PRO: E2EE by default between Apple devices. Strong encryption in transit.
CON: iCloud backup defeats E2EE. If iCloud backup is enabled (it is by default), Apple stores message content with the encryption key and hands it over under warrant.
CON: Falls back to SMS when messaging non-Apple devices. Green bubble means no encryption.
CON: 25 days of iMessage lookup data available to law enforcement, showing who searched for your contact info in iMessage.
CON: Closed source. Apple's encryption implementation cannot be independently audited.
WARN: Advanced Data Protection (launched Dec 2022) enables E2EE for iCloud backups, but it is opt-in, not on by default, and most users have never enabled it.
PRO: E2EE by default using the Signal Protocol. Message content is protected in transit and at rest on WhatsApp's servers.
CON: Near real-time metadata to the FBI. WhatsApp is the only service that provides pen register data: source and destination of every message, every 15 minutes. This is live surveillance.
CON: Extensive metadata collection. Meta collects your contacts, usage patterns, device info, IP address, location, and behavioral data.
CON: Cloud backup bypass. Unencrypted iCloud/Google Drive backups can expose message content. Encrypted backup is opt-in.
CON: Owned by Meta. WhatsApp shares data with Facebook's advertising infrastructure. The privacy policy explicitly permits this.
WARN: The Natalie Edwards case proved that WhatsApp metadata alone, without any message content, was sufficient to convict a federal employee of leaking classified documents.
PRO: Secret Chats are E2EE with self-destructing messages and no server-side storage.
PRO: Minimal FBI access. The FBI document shows no message content and no contact info available under standard legal process.
CON: Default chats are NOT end-to-end encrypted. Telegram holds decryption keys for all cloud chats. Every group chat, every channel, and every default one-on-one conversation is readable by Telegram.
CON: Proprietary, unaudited server. Server code is closed source. Telegram's claims about distributed key storage cannot be verified.
CON: MTProto protocol criticized by Johns Hopkins cryptographer Matthew Green and others for non-standard design, lack of independent audit, and implementation concerns.
WARN: Post-Durov-arrest policy change. After Pavel Durov's arrest in France (Aug 2024), Telegram announced it would begin cooperating with law enforcement under court order, reversing its prior stance.
BEST: The FBI's own document confirms it. Signal provides: date/time of registration, date of last use. That is the entire list. No message content. No contacts. No metadata. No IP addresses. No pen register capability.
BEST: E2EE on everything, no exceptions. All messages, group chats, calls, voice notes, file transfers. No cloud backup integration. No fallback to unencrypted modes.
BEST: Full open source. Client AND server. The only service on this list where both are auditable.
BEST: Operated by a nonprofit. Signal Foundation. No advertising business model. No incentive to collect data.
PRO: Disappearing messages, sealed sender, screen security. Advanced privacy features enabled at the user's discretion.
CON: Requires a phone number to register. This is Signal's one identifiable data point. It is a real limitation for users who need full anonymity.
CON: Smaller user base. Most of your contacts are probably on WhatsApp or iMessage. Network effects work against Signal's adoption.
WARN: Signal does not protect against device compromise. If Pegasus or Cellebrite gains access to your phone, Signal's encryption is irrelevant. Signal protects the pipe. It cannot protect the endpoints.
Every photo you take records your exact GPS coordinates, the device you used, the time down to the second, and in many cases the direction you were facing. When you upload those photos to a cloud service, the question is not whether that data exists. It is who can see it, who can be compelled to hand it over, and whether AI is being trained on it without your meaningful consent. This is a field-by-field comparison of five configurations across four providers: Ente Photos, iCloud Photos with Advanced Data Protection, iCloud Photos without it, Google Photos, and Microsoft OneDrive.
Part 1: What a Single Photo Reveals About You
Before comparing providers, it is worth understanding exactly what is at stake. Photo metadata is not abstract. It has been used in criminal investigations, stalking cases, corporate leak investigations, and intelligence operations. The EFF documented a case where the FBI identified an Anonymous hacker ("w0rmer") solely through GPS coordinates embedded in a photo posted to Twitter. The photo was taken with an iPhone 4, and the EXIF data contained the exact latitude and longitude of the house where it was taken, which led directly to the suspect's arrest. [1]
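The GPS fix in that EXIF data is stored as degrees, minutes, and seconds plus a hemisphere reference. Turning it into the decimal coordinates a map expects is one line of arithmetic. The sketch below uses illustrative, made-up values, not coordinates from any real case.

```python
def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert EXIF GPS degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # South latitudes and west longitudes are negative by convention.
    return -value if ref in ("S", "W") else value

# Illustrative values in the format a phone camera writes into GPS EXIF tags:
lat = dms_to_decimal(37, 52, 18.0, "N")
lon = dms_to_decimal(122, 16, 22.0, "W")
print(f"{lat:.6f}, {lon:.6f}")  # precise enough to identify a single building
```

That level of precision is the point: six decimal places of latitude is roughly a tenth of a meter.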
Part 2: Encryption Architecture, Provider by Provider
The core question: who can see your photos and their metadata? The answer depends entirely on the encryption model. Some providers encrypt your photos but hold the keys (meaning they can decrypt them under legal process). Some encrypt photos end-to-end (meaning only you can decrypt them). And some do not encrypt stored photos at all in any meaningful sense.
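The difference between those models comes down to who generates and holds the key, not how strong the cipher is. A deliberately insecure toy sketch of the two custody models (XOR in place of a real cipher, hypothetical variable names throughout):

```python
import secrets

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # XOR stand-in for a real cipher; insecure, illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

photo = b"JPEG bytes, GPS EXIF and all"

# Model 1: provider-held keys. The server encrypts at rest, but it also
# generated and stores the key, so it can decrypt under legal process.
server_key = secrets.token_bytes(32)
at_rest = toy_cipher(photo, server_key)
assert toy_cipher(at_rest, server_key) == photo  # provider recovers plaintext

# Model 2: end-to-end. The client encrypts before upload; only ciphertext
# ever reaches the server, and the key never does.
client_key = secrets.token_bytes(32)
uploaded = toy_cipher(photo, client_key)
# A warrant served on this server yields `uploaded`, which without
# client_key cannot be turned back into the photo.
```

Both models can truthfully advertise "encrypted storage." Only the second means the provider has nothing meaningful to produce.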
Part 3: AI, Your Photos, and Who Benefits
Every major cloud photo provider now offers AI-powered features: face recognition, object search, memory curation, image enhancement, and more. The privacy question is where that AI processing happens (on your device or on the provider's servers) and whether your photos are used to train the company's broader AI models.
Part 5: Ente's Encryption Architecture (and Its One Noted Limitation)
Since Ente is the only provider on this list with full end-to-end encryption of photos by default, its architecture deserves a closer look. It also has one documented limitation worth discussing.
Per-Provider Summary
BEST: Full E2EE for everything. Photos, videos, all EXIF metadata, album names, descriptions, tags, and ML indexes. Ente cannot see any of your data.
BEST: On-device AI only. Face recognition and search run locally. ML indexes are encrypted before syncing. No server-side photo processing. No AI training on your data.
BEST: Full open source. Client and server. Audited twice by Cure53. CERN-sponsored infrastructure audit.
PRO: Self-hostable. You can run the entire Ente server on your own hardware.
PRO: 3x replication across providers in the EU, including an underground facility in Paris.
WARN: Shared links lack forward secrecy. Use password protection, expiration, and account-to-account sharing for sensitive content.
CON: Paid service. 10 GB free, then paid plans starting at $1.49/month (50 GB). No free unlimited tier.
PRO: E2EE when ADP is enabled. Apple does not hold the keys. Photos and most metadata are protected.
PRO: Deep Apple ecosystem integration. Seamless with iPhone, iPad, Mac.
CON: ADP is opt-in. Not enabled by default. Most users have never turned it on.
CON: Some metadata remains under standard protection even with ADP enabled (modification dates, checksums).
CON: Closed source. Encryption implementation cannot be independently audited.
CON: Shared Albums and collaborative features do not support ADP. Photos in Shared Albums use standard encryption only.
CON: Disabled in the UK after the Home Office ordered Apple to provide backdoor access (Feb 2025).
CON: Apple holds the encryption keys. All photos and metadata are accessible to Apple and producible under warrant.
CON: This is the default. Every iCloud account that has not explicitly enabled ADP operates in this mode.
CON: The iCloud backup loophole. For years, law enforcement obtained iPhone data not from the device itself, but from the unencrypted iCloud backup. Photos were a primary target.
WARN: Enabling ADP is the single most important step an iPhone user can take for photo privacy. Settings → [Your Name] → iCloud → Advanced Data Protection → Turn On.
CON: No end-to-end encryption. Google holds the keys to every photo.
CON: Server-side AI processing. Google's systems analyze your photos, faces, locations, objects, and scenes. All of this data is indexed and searchable on Google's servers.
CON: Actively scans for CSAM. Automated scanning means Google's systems are looking at your photos.
CON: Fully producible under warrant. Photos, metadata, face groupings, AI-generated labels, location history derived from photos.
CON: Part of the Google advertising ecosystem. Google's broad privacy policy grants extensive data usage rights.
WARN: Google received nearly 40,000 law enforcement data requests in the first half of 2020 alone, complying with 83% of subpoenas.
CON: No end-to-end encryption for photos. Microsoft holds the keys. Personal Vault adds an authentication layer but not E2EE.
CON: Server-side processing. Microsoft processes photos for organizational features and Copilot integration.
CON: Producible under warrant. All photos and metadata accessible to Microsoft and law enforcement.
WARN: OneDrive is not primarily a photo service. It is a general file storage platform. For users who store photos in OneDrive because it came bundled with Microsoft 365, the privacy implications are the same as with Google Photos: the provider can see everything.
This is the post I will keep pointing you to. If you have read anything else on this site and found yourself wondering what AES-256 actually means, what end-to-end encryption is doing under the hood, whether quantum computers can break all of it, or why encryption alone is not enough, this is where those questions get answered. No prerequisites. No dumbing down. Just the actual mechanics, explained clearly.
Part 1: The Two Fundamental Types of Encryption
Every encryption system in use today falls into one of two categories, or combines both. Understanding the difference between them is the single most useful thing you can learn about cryptography.
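A toy sketch of both categories, using XOR as a stand-in for a real symmetric cipher and textbook RSA with tiny primes for the asymmetric side. Both halves are insecure and purely illustrative; the point is the key relationship, not the math's strength.

```python
# Symmetric: one shared secret both encrypts and decrypts.
# (XOR keystream as a stand-in for real ciphers like AES-256; insecure.)
def sym(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"shared-secret"
ciphertext = sym(b"meet at noon", key)
assert sym(ciphertext, key) == b"meet at noon"  # the same key reverses it

# Asymmetric: a public key encrypts, a different private key decrypts.
# Textbook RSA with tiny primes (p=61, q=53); insecure, illustration only.
n, e, d = 61 * 53, 17, 2753      # d is e's inverse mod (60 * 52)
m = 65                           # message encoded as a number below n
c = pow(m, e, n)                 # anyone holding (n, e) can do this
assert pow(c, d, n) == m         # only the holder of d can undo it
```

Real systems combine the two: asymmetric cryptography to agree on a key, symmetric cryptography to move the data.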
Part 2: Can AES-256 Be Cracked? The Quantum Question, Answered With Physics
This is the section I wrote because I keep seeing the same fear repeated online: "quantum computers will break all encryption." That statement is half true and half dangerously misleading. Quantum computers pose a real, existential threat to asymmetric encryption (RSA, ECC). They pose essentially zero practical threat to AES-256. Here is why, quantified in terms that are hard to forget.
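The arithmetic behind that claim is short enough to run yourself. Grover's algorithm offers at best a quadratic speedup against a symmetric key search, so AES-256 retains roughly 128 bits of effective strength. The guess rate below is an invented, deliberately generous assumption.

```python
# Grover's algorithm gives at best a quadratic speedup against a symmetric
# key search, so a 256-bit keyspace costs roughly 2**128 quantum operations.
classical_keyspace = 2 ** 256
grover_operations = 2 ** (256 // 2)      # effective strength: 128 bits

# Grant a fantasy adversary a trillion guesses per second (pure assumption):
rate = 10 ** 12                          # guesses per second
years = grover_operations / rate / (365.25 * 24 * 3600)
age_of_universe = 1.38e10                # years, approximately

print(f"effective strength under Grover: {grover_operations.bit_length() - 1} bits")
print(f"time to search: {years / age_of_universe:.2e} lifetimes of the universe")
```

Even with the quadratic speedup and an absurd guess rate, the search takes hundreds of millions of universe lifetimes. RSA and ECC get no such reprieve, because Shor's algorithm breaks them outright.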
Part 3: How HTTPS Encryption Works (and What It Does Not Do)
Every time you see the padlock icon in your browser or a URL starting with https://, you are using encryption. Since this site is a progressive web app served over HTTPS, and since HTTPS is the most common encryption most people encounter daily, it is worth understanding what it actually protects.
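Python's standard library exposes the same machinery the browser uses. A minimal sketch inspecting the defaults of a TLS client context, no network connection required:

```python
import ssl

# The padlock negotiates two things: authentication of the server's
# certificate and encryption of the channel. Python's default client
# context reflects both.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED  # certificate must validate
assert ctx.check_hostname is True            # and match the hostname

# Broken legacy protocols are refused outright.
print(ctx.minimum_version)  # TLSVersion.TLSv1_2 on recent Python builds

# What none of this hides: the hostname still travels in cleartext via
# DNS and the TLS SNI extension, and the server sees everything after
# decryption. HTTPS protects the pipe, not the endpoints.
```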
Part 4: Where Encryption Stops Protecting You
This is the most important section in this entire post. Encryption is extraordinary technology. The math is sound. AES-256 will not be broken in your lifetime, your children's lifetime, or the lifetime of the sun. But encryption only protects what it covers. And the places where it does not cover you are exactly the places where real-world surveillance happens.
The Federal Trade Commission has been called America's "de facto privacy regulator." For two decades, its consent decrees against Google and Facebook were the closest thing Americans had to enforceable privacy law. In 2025, the President fired the Democratic commissioners, deleted 300 blog posts of enforcement guidance, and the remaining commissioners began reopening and setting aside consent orders. What happened is not a surprise. The institutional design of the FTC made this outcome structurally inevitable. The academic work of Chris Jay Hoofnagle, spanning three papers from 2012 to 2018, diagnosed every failure point with extraordinary precision. This article traces his analysis through the institutional collapse now underway and asks the question his work leaves open: if the institution designed to protect your privacy was never equipped to do so, and is now being actively dismantled, what remains?
Part 1: The Three-Layer Diagnosis
Professor Chris Jay Hoofnagle of UC Berkeley has produced what may be the most complete structural critique of U.S. privacy enforcement. Across three papers spanning six years, he identified three distinct failure modes that, taken together, explain why the FTC was never equipped to protect consumer privacy at scale, and why the current collapse was predictable.
Part 2: The Prediction and the Reality
Hoofnagle's 2017 paper concluded with a prediction about the FTC's Bureau of Economics (BE): "We should expect President Donald Trump's administration to expand the role of the BE and to make its role more public. With newfound powers, the BE will argue that more cases should be pled under the unfairness theory. This will have the effect of blunting the lawyers' attempts to expand privacy rights through case enforcement." He was right. But what happened in 2025 went further than even his analysis anticipated.
Part 3: The Consent Decree Was Always a Structural Weakness
The FTC's reliance on consent decrees was understandable given its limited toolkit. But as Hoofnagle's work and the 2025 events demonstrate, the model had three fundamental vulnerabilities that made it unsuitable as a long-term framework for privacy protection.
Part 4: What Remains When Institutions Fail
Hoofnagle's work describes a system where the FTC's internal structure (the BE zeroing out privacy harm), its external constraints (no general privacy statute, no civil penalty authority), and the industry's rhetorical strategies ("post-privacy," "paternalism," "we don't sell data") combined to produce enforcement that was better than nothing but structurally inadequate. The 2025 events did not create these weaknesses. They exploited them. The question that follows is practical: if the institutional framework was always insufficient and is now being actively dismantled, what is left?
Conclusion: Promises vs. Constraints
Hoofnagle's body of work provides the intellectual framework for understanding why American privacy enforcement was always fragile and why the current moment, while alarming, was structurally inevitable. His analysis should not be read as a counsel of despair. It should be read as a diagnosis that points toward a specific kind of remedy.
On March 27, 2026, an Iranian state-linked hacking group published over 300 emails, personal photographs, and documents from the personal Gmail account of FBI Director Kash Patel. The FBI confirmed the breach. The emails were not obtained by breaking Google's encryption or exploiting a zero-day vulnerability. They were obtained because the director of the Federal Bureau of Investigation kept a personal Gmail account that had been exposed in prior data breaches, and apparently did not take the steps necessary to prevent reuse of those credentials. This is not a story about a sophisticated cyberattack. It is a story about the gap between what people assume their email protects and what it actually does.
What Happened
How It Likely Happened
No technical details about the method of compromise have been officially confirmed. But the available evidence points clearly in one direction, and it is not a sophisticated one.
What Gmail Does and Does Not Protect
This incident is a concrete illustration of a point we have made repeatedly on this site: encryption that protects data in transit and at rest is not the same as encryption that protects data from the provider or from anyone who obtains your credentials.
This Is a Pattern, Not an Anomaly
The Patel breach is not an isolated incident. It fits a documented pattern of state-linked hackers targeting the personal accounts of senior officials, and of those accounts being on consumer email services with no end-to-end encryption.
What You Should Do
The Patel breach is a case study in what goes wrong when someone relies on the default security posture of a consumer email service. Every mitigation below addresses a specific failure point illustrated by this incident.
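One concrete mitigation is checking whether a password has appeared in a known breach without ever sending the password anywhere. Have I Been Pwned's Pwned Passwords API uses k-anonymity for exactly this; the sketch below builds the query locally and omits the HTTP call itself.

```python
import hashlib

def hibp_range_query(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 into the 5-char prefix HIBP receives
    and the suffix you match locally against the API's response."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    # Only the prefix is sent: GET https://api.pwnedpasswords.com/range/<prefix>
    # The response lists every suffix sharing that prefix, with breach counts.
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_query("password123")
print(prefix)  # the only data that would ever leave your machine
```

The service never learns which password you checked; it only sees a 5-character hash prefix shared by hundreds of unrelated passwords.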
Privacy consulting for individuals and small operations with real reasons to care. Threat modeling, jurisdiction analysis, encrypted infrastructure migration, and operational security guidance.
I am David. I run Orion Private LLC, a privacy consulting practice for professionals and everyday people who have decided their digital lives are worth protecting properly. My background is in Legal Studies and Public Policy from UC Berkeley, and that training shapes every engagement. I read statutes, analyze court opinions, trace jurisdiction, and translate regulatory frameworks into clear, actionable guidance. The practice carries errors and omissions coverage and general liability insurance because accountability matters when someone trusts you with their digital life.
I trace where your data is stored, who has legal authority over it, what a court order can compel a provider to hand over, and what is actually protected by the math versus what is just marketing. I analyze metadata exposure at the field level, map encryption protocols against documented court orders, and publish everything in plain language.
Nothing here runs on third-party platforms that consume your information. Client records live in encrypted spreadsheets I built myself, stored in Veracrypt containers on local hardware, backed up to encrypted cloud storage. Invoices, receipts, and bookkeeping are handled the same way. No client name or engagement detail has ever touched a CRM, a SaaS dashboard, or a billing platform that monetizes the data passing through it. I built the operational infrastructure from scratch because the tools that exist were not built for the people I serve.
Orion Private is built with a long-term trajectory in mind. The goal is law school and eventually a practice that applies this same thinking to private trusts, estate planning, and property law. Every home purchase generates a data trail. Every estate plan contains sensitive personal information. Most of that exposure is not inevitable. It is just how things have always been done. With the right legal strategy and technical awareness, even deeply personal assets can be structured to preserve privacy by design rather than by accident.
I am currently pursuing CIPP/US certification through the International Association of Privacy Professionals. That credential validates what the research and client work already demonstrate, but it matters for the clients and institutions that need to see it formalized. CIPM is next, which opens the door to working with businesses on privacy program management. The certifications are part of the same trajectory as the practice itself.
Anyone paying attention to legal education knows the math changed in 2025. The One Big Beautiful Bill Act eliminated the Grad PLUS loan program and capped federal borrowing for law students at $50,000 per year. Most law schools cost more than that. Public service loan forgiveness survived, but income-driven repayment for new borrowers was effectively replaced. This practice is how I am building the foundation to pursue that education without massive debt that would dictate the kind of law I practice.
Sources include official documentation, court records, transparency reports, academic research, and established reporting. No affiliate links. No sponsorship.
This entire website is a single file. No frameworks, no templates, no WordPress. One HTML document that contains every page, every article, and every animation. It is self-hosted on infrastructure I control in a European data center covered by EU data protection law. There are no third-party tracking scripts and no external requests except the OpenPGP library that powers the encrypted contact form.
The site runs Umami, a self-hosted open-source analytics tool on the same server. No cookies, no personal data collection, no visitor profiles. It counts total visitors and page views. The data never leaves my server.
It loads fast because there is nothing to fetch. It cannot leak your browsing behavior because no third parties are involved.
Whether you need consulting, have a research question, or want to flag something I got wrong, I am happy to hear from you.
Free, no commitment. Your message is encrypted with PGP in your browser. Only I can decrypt it.
No email. No phone number. No account. Check it out.