
Europe Pulled the Plug on Mass Message Scanning. The Proposal Isn't Dead.

The EU Parliament voted to let the legal basis for mass-scanning private messages expire. Google, Meta, Microsoft, and Snap were all operating under that safe harbor. It's the most significant institutional win for encrypted communications in years — and the pressure hasn't dissolved.

The EU Parliament voted this month to let the legal basis for mass-scanning private messages expire. Google, Meta, Microsoft, and Snap had been operating under that legal safe harbor — scanning Gmail, Messenger, Outlook, and Snapchat messages for illegal content, inside Europe, without the consent of the people being scanned. That safe harbor is now gone.

For the first time, a major democratic institution formally said no to AI-powered surveillance of private communications at scale.

That's the win. Here's what's still running.

What Just Expired

The legal mechanism that died was called an interim derogation from the EU's e-Privacy Directive. The e-Privacy rules generally prohibit companies from scanning the content of private electronic communications. Your email provider technically cannot read your messages to target ads without your explicit consent. That's the baseline.

Starting in 2021, the EU carved out a temporary exception. Service providers could voluntarily scan private messages to detect child sexual abuse material (CSAM) and other illegal content, despite the e-Privacy prohibition. The logic was pragmatic: Microsoft, Google, Meta, and others were already running these scanning programs in the United States, where no equivalent privacy law exists. Forcing them to stop in Europe while a permanent legal framework was negotiated would disrupt ongoing CSAM detection operations. So the derogation let them continue, under a sunset clause, until a permanent solution was found.

No permanent solution was found. The interim derogation expired. The EU Parliament voted not to renew it.

The organizations actively scanning under this exception included Google (Gmail), Meta (Facebook Messenger), Microsoft (Outlook and OneDrive), and Snap (Snapchat). According to EFF, all four signaled they intend to continue scanning voluntarily even without the derogation — accepting the GDPR legal risk themselves, apparently betting that a court challenge to voluntary CSAM detection is one they'd win.

They may be right. But "voluntary" scanning without a legal safe harbor isn't the same situation. A company that was protected by the derogation is now operating in ambiguous territory, and that ambiguity creates real leverage for future legal challenges by privacy advocates and affected individuals.

What Chat Control Actually Proposed

The vote that just happened is connected to a three-year legislative fight that isn't over.

In 2022, the European Commission proposed what became known as "Chat Control" — a regulation that would require all messaging services to scan every private message for CSAM and terrorist content and report matches to authorities. Not just unencrypted services like email. Every service. Including Signal, WhatsApp, iMessage, and any other end-to-end encrypted messenger operating in the EU.

The proposal hit an immediate technical wall: you cannot scan the content of an end-to-end encrypted message without decrypting it. That's definitional. E2E encryption means only the sender and recipient hold the cryptographic keys. If any third party — including the service provider — can read the plaintext, the encryption is not end-to-end.
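That key property can be shown in a toy sketch. The following uses classic Diffie-Hellman and a throwaway XOR cipher, all Python stdlib — this is illustrative only, not real cryptography (Signal uses X25519 plus the Double Ratchet, and the parameters here are not secure) — but the structural point survives: the relay server only ever sees public keys and ciphertext, never the shared secret.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters -- NOT secure, demonstration only.
P = 2**127 - 1   # a Mersenne prime, standing in for a real DH group
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)            # (private key, public key)

def shared_key(my_priv, their_pub):
    secret = pow(their_pub, my_priv, P)     # same value on both ends
    return hashlib.sha256(str(secret).encode()).digest()

def xor_stream(key, data):
    # Toy symmetric cipher: SHA-256-derived keystream XORed with the data.
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Only the public keys cross the wire; the server relays ciphertext it
# cannot read, because the shared key never leaves the two endpoints.
k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob

ciphertext = xor_stream(k_alice, b"meet at noon")
assert xor_stream(k_bob, ciphertext) == b"meet at noon"
```

If any third party could compute `k_alice` without holding one of the private keys, the scheme would not be end-to-end — which is exactly the definitional wall the proposal ran into.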

The Commission's answer was client-side scanning. Instead of having a server decrypt messages, the scanning would happen on your device, before encryption, using software built into the messaging app. The app would analyze your message content, compare it against databases of known illegal material, flag matches for reporting to a central authority, and then encrypt and send the message as usual.
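That pipeline can be sketched in a few lines (all names hypothetical; real proposals use perceptual hashing such as PhotoDNA-style matching so near-duplicate images still hit, where exact SHA-256 stands in here to keep the sketch short):

```python
import hashlib

# Hypothetical on-device hash database and reporting channel.
FLAGGED = {hashlib.sha256(b"known illegal sample").hexdigest()}
reports = []   # stand-in for "report matches to a central authority"

def send_message(plaintext: bytes, encrypt) -> bytes:
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in FLAGGED:
        reports.append(digest)     # flagged BEFORE any encryption happens
    return encrypt(plaintext)      # then encrypted and sent "as usual"

# The wire still carries ciphertext; the surveillance happened upstream.
fake_encrypt = lambda m: bytes(b ^ 0x42 for b in m)
send_message(b"ordinary chat", fake_encrypt)           # no report
send_message(b"known illegal sample", fake_encrypt)    # one report
assert len(reports) == 1
```

Note where the scan sits: before `encrypt` is ever called. Everything downstream of that line is indistinguishable from a normal E2E message.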

From a privacy standpoint, this is functionally identical to breaking encryption. You've delegated the surveillance to a different point in the pipeline, but the effect is the same: an authorized party reads your plaintext content before it becomes ciphertext. That it happens on your hardware rather than a server doesn't change the authority being granted. It changes where the apparatus lives.

A coalition of cryptographers and security researchers wrote to EU institutions, repeatedly and over multiple years, explaining this: there is no client-side scanning implementation that cannot be repurposed, expanded, or abused. A capability built to detect known CSAM hashes can detect other content. The same mechanism that flags an image of child abuse can flag an image of a political demonstration, a medication guide, a journalist's source communication. The scanning infrastructure doesn't know or care about the policy document governing it. The next government, the next law, the next court order shapes what the capability actually does.
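The repurposing argument is mechanical, not rhetorical. Reduced to a sketch (invented names throughout), the matcher is just a membership test parameterized by whatever database it is handed — nothing in the mechanism constrains what that database contains:

```python
def scan(message_hash: str, database: set) -> bool:
    # The mechanism is identical regardless of what the database holds.
    return message_hash in database

csam_db = {"known_abuse_hash"}
protest_db = {"demo_flyer_hash"}   # hypothetical later expansion of scope

assert scan("demo_flyer_hash", csam_db) is False
assert scan("demo_flyer_hash", protest_db) is True
```

Changing what the capability does requires changing one input, not redesigning the system — which is why the policy document governing the database is the only real constraint, and policy documents change.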

The EU Parliament rejected mandatory client-side scanning. The EU Council — representing member states — attempted to pass a mandatory scanning version multiple times, most recently in 2025, and ultimately backed off. That branch of the proposal is, for now, shelved.

The interim derogation expiry is connected to that defeat. The legal framework that would have made scanning mandatory failed. The temporary safe harbor that let companies scan voluntarily while that framework was developed has now expired. Together, the two represent the most complete institutional retreat from EU-level mass scanning since the proposal began.

Why This Vote Matters Beyond the Procedure

The derogation expiry is not just an administrative sunset. It's the first time the EU's institutional structure has formally withdrawn a permission structure that enabled mass scanning of private communications.

That matters as precedent. For three years, the implicit EU policy position was: we want a permanent framework for this, and in the meantime, you can keep scanning. The derogation's expiry says: the interim period is over. If you want to keep scanning, do it without our blessing. The Parliament declined to keep the door open.

The EFF's assessment is measured but clear. Three things still require watching.

First: the interim derogation could be revived. New legislation could create a fresh temporary safe harbor under a different legal mechanism. The same political coalitions that pushed for scanning are still active. The European Commission has not changed its position that CSAM detection should be legally required; it has only changed its tactics in the face of Parliament opposition. A new derogation attempt, framed differently, is a real possibility.

Second: mandatory age verification is the current pressure point. Several member states and parts of the Commission have proposed age verification requirements for online services, framed as child protection policy. Age verification requires identifying users. Once a service knows who its users are, linking identities to communications becomes administratively straightforward. A mandatory age-verification regime is a half-step toward the surveillance infrastructure that Chat Control would have built — without the politically toxic "scan everyone's messages" framing.

Third: voluntary scanning without legal cover creates its own dynamic. When Google and Meta announce they'll continue scanning without legal authorization, they're making a business decision to maintain relationships with law enforcement and government stakeholders, absorbing GDPR risk to preserve those relationships. That "voluntary" behavior can become normalized. If every major messaging platform scans, platforms that don't will face political pressure to explain why they're the exception. Voluntary becomes expected. Expected becomes mandated. That's not a hypothetical progression — it's the documented pattern of how surveillance capabilities expand.

The Signal Problem

Chat Control has had a consistent adversary throughout this debate: Signal.

Signal is end-to-end encrypted, open-source, collects minimal metadata, and has no server-side scanning capability. Its architecture makes client-side scanning harder to implement undetected — the app's source code is public, and modifications would be visible to anyone reviewing it. Signal stated clearly, multiple times, that it would exit EU markets before implementing mandatory client-side scanning.

That threat carried weight in parliamentary debates. Several MEPs cited the prospect of Signal leaving Europe as evidence that the scanning mandate was too broad to be workable policy. WhatsApp and Apple (for iMessage) made similar statements, though less categorically.

This created an unusual dynamic: the most privacy-protective messaging app became the benchmark by which the scanning mandate was evaluated. Not "will this stop CSAM" — which is a separate question with a complicated answer — but "will this law destroy the services people actually rely on."

The derogation's expiry doesn't change Signal's legal status. Signal still doesn't scan, still stores no message content, still exits markets that require scanning. What it changes is the regulatory climate around Signal: for now, its architecture is compatible with EU operating law. That compatibility is not guaranteed to persist.

What to Do

This affects your decisions even if you're not an EU resident. The scanning programs that expired for European users may still run elsewhere. And the EU Chat Control proposal has been watched closely by governments in the UK, Australia, Canada, and India — all of which have their own pending or proposed legislation around message scanning. What the EU does, or fails to do, sets a template.

1. Use Signal for anything you need kept private.

This is not a general recommendation I'm adding out of habit. Signal stores no message content, no metadata beyond your phone number, and has publicly committed to challenging any scanning requirement in court rather than implementing it. WhatsApp uses the Signal protocol for message content encryption, but Meta retains substantial metadata and has not ruled out client-side scanning in future product decisions. The difference matters.

2. Understand the difference between E2E encryption and zero-knowledge architecture.

Many services advertise end-to-end encryption while retaining metadata — who you message, when, from where, and how often. Metadata can be as revealing as content. A surveillance profile built from metadata alone told ICE which protests a Cornell PhD student had attended, without reading a single message. True privacy requires both encrypted content and minimal metadata retention. Signal has both. Most services have at most one.
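A toy illustration of that point, with entirely invented metadata records: four (sender, recipient, timestamp, cell-tower) tuples, no message content at all, and that is already enough to surface a weekly gathering place and a closest contact.

```python
from collections import Counter
from datetime import datetime

# Hypothetical metadata records: (sender, recipient, timestamp, cell_tower).
# No content is ever read below this line.
records = [
    ("alice", "organizer",  datetime(2025, 3, 1, 18, 5),   "tower_city_square"),
    ("alice", "organizer",  datetime(2025, 3, 8, 18, 2),   "tower_city_square"),
    ("alice", "journalist", datetime(2025, 3, 9, 23, 40),  "tower_home"),
    ("alice", "pharmacy",   datetime(2025, 3, 10, 9, 15),  "tower_clinic"),
]

# Most-contacted party, and a recurring Saturday pattern at one location.
contacts = Counter(r[1] for r in records if r[0] == "alice")
saturdays_at_square = [r for r in records
                       if r[3] == "tower_city_square" and r[2].weekday() == 5]

assert contacts.most_common(1)[0][0] == "organizer"
assert len(saturdays_at_square) == 2   # a weekly pattern, content unseen
```

Scale this to months of records and millions of users and the "it's only metadata" defense stops meaning much.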

3. Check your iCloud settings.

Apple's iCloud Photos is not end-to-end encrypted by default. Apple holds the decryption keys, which means they can decrypt and scan stored photos. If you have iCloud Backup enabled, your photos leave the device and are decrypted at Apple's end. Apple does not use client-side scanning for iCloud Photos (it shelved that proposal in 2022 after intense backlash), but server-side scanning happens after the decryption that backup requires. If you use Apple devices and care about this: go to Settings → [Your Name] → iCloud → Advanced Data Protection and enable it. With Advanced Data Protection on, iCloud Backup becomes end-to-end encrypted and Apple cannot decrypt it.

4. Track EU Chat Control negotiations through the second half of 2026.

The permanent regulation that could mandate client-side scanning is still in active negotiation between the Commission, Parliament, and Council. Parliament has been the most privacy-protective of the three institutions. Council-level negotiations, driven by member state governments with law enforcement backing, are where scanning proposals keep reemerging. The next substantive vote is expected before the end of 2026. EDRi (European Digital Rights) maintains a public tracker of Chat Control's legislative progress — it's worth bookmarking.

5. If you're in the EU, contact your MEP before the next vote.

The outcome of the derogation vote reflects years of organized lobbying by civil society organizations and, less visibly, individual constituent contact. MEPs who voted against Chat Control did so in part because constituents and privacy advocates made the argument. The next vote will be contested. Making your position known to your representative costs almost nothing and has a track record of mattering.

The Institutional Moment

Something genuinely unusual happened when the EU Parliament voted not to extend the derogation. A democratic institution, presented with an opportunity to extend surveillance capabilities under the framing of child protection, declined.

That framing is designed to be hard to vote against. Every legislator who opposed Chat Control had to explain why they were blocking a tool marketed as finding child abusers. The defensible answer — that the tool works by scanning everyone's private communications, including people never accused of anything, using AI systems that produce false positives, under a legal framework that can be extended by future legislation — is accurate but not simple to deliver in a legislative floor speech or a press release.

The fact that Parliament landed there anyway, twice now, reflects something about how the technical arguments accumulated. When enough cryptographers, security researchers, and civil society organizations explain — consistently, over years — that there is no backdoor that only good actors can use, that argument eventually appears in committee reports, in floor debates, in the questions asked of the Commission during hearings. The expertise percolated into the institution. That doesn't happen automatically. It happens because people wrote papers, testified at committees, and organized public campaigns.

The derogation expiry didn't settle the matter. The Chat Control proposal continues. The tech companies are still scanning voluntarily. The political will for mandatory scanning among EU member states has not disappeared — it has been delayed and redirected.

But democratic institutions rarely formally withdraw surveillance permissions. This one just did it for the second time in two years on the same proposal. That's not nothing. It's worth noting, and it's worth protecting.

Sources: EFF — EU Parliament Blocks Mass-Scanning of Our Chats — What's Next?, EFF — We Need You: Our Privacy Cannot Afford a Clean Extension of Section 702

▸ TAGS
#eu #chat-control #end-to-end-encryption #mass-surveillance #client-side-scanning #privacy #eff