‘Chat control’: The EU’s controversial CSAM-scanning legal proposal explained

The European Union has a longstanding reputation for strong privacy laws. But a legislative plan to combat child abuse — which the bloc formally presented back in May 2022 — is threatening to downgrade the privacy and security of hundreds of millions of regional messaging app users.

The European Commission, the EU executive body that drafted the proposal, frames it as a plan to protect the rights of children online by combating the misuse of mainstream technology tools by child abusers, who it contends are increasingly using messaging apps to distribute child sexual abuse material (CSAM) and even gain access to fresh victims.

Perhaps as a result of lobbying from the child safety tech sector, the EU has adopted a techno-solutionist approach. The Commission’s initiative focuses on regulating digital services — principally messaging apps — by putting a legal duty on them to use technology tools to scan users’ communications in order to detect and report illegal activity.

For several years, mainstream messaging apps have had a temporary derogation from the bloc’s ePrivacy rules, which deal with the confidentiality of digital communications — the derogation runs until May 2025, per its last extension — so they can voluntarily scan people’s communications for CSAM in certain scenarios.

However, the child abuse regulation would create permanent rules that essentially mandate AI-based content scanning across the EU.

Critics of the proposal argue it would lead to a situation where messaging platforms are forced to use imperfect technologies to scan users’ private correspondence by default — with dire consequences for people’s privacy. They also warn it puts the EU on a collision course with strong encryption because the law would force end-to-end encrypted (E2EE) apps to degrade their security in order to comply with content screening demands.

Concerns over the proposal are so acute that the bloc’s own data protection supervisor warned last year that it represents a tipping point for democratic rights. The Council’s legal service also considers it incompatible with EU law, per a leaked assessment. Since EU law prohibits the imposition of a general monitoring obligation, the regulation would almost certainly face legal challenge if it passed.

So far, the EU’s co-legislators haven’t been able to agree on a way forward on the file. But the draft law remains in play — as do all the risks it poses.

Wide-ranging CSAM detection orders

The Commission’s original proposal contains a requirement that platforms, once served with a detection order, must scan people’s messages not just for known CSAM (i.e., images of child abuse that have been identified previously and hashed for detection) but also for unknown CSAM (i.e., new images of abuse). This would further ramp up the technical challenge of detecting illegal content with a high degree of accuracy and a low false-positive rate.
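
To illustrate the gap between those two tasks, here is a minimal Python sketch; the hash set and function names are hypothetical, and real deployments use perceptual hashes such as PhotoDNA (so re-encoded or resized copies still match) rather than exact cryptographic hashes. The point is that known-CSAM detection is a deterministic lookup, while unknown-CSAM detection can only ever be a probabilistic classification.

```python
import hashlib

# Hypothetical database of hashes of previously identified images.
KNOWN_HASHES: set[str] = {
    "placeholder-hash-entry",  # illustrative only, not a real database value
}

def matches_known_csam(image_bytes: bytes) -> bool:
    """Known-CSAM detection: a deterministic lookup that can only find
    images already present in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def score_unknown_csam(image_bytes: bytes) -> float:
    """Unknown-CSAM detection has no lookup table: it needs a trained
    classifier that outputs a confidence score, which is where false
    positives become unavoidable."""
    raise NotImplementedError("requires an ML model, with an error rate")
```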

A further component in the Commission’s proposal requires platforms to identify grooming activity in real time. This means, in addition to scanning imagery uploads for CSAM, apps would need to be able to parse the contents of users’ communications to try to understand when an adult user might be trying to lure a minor to engage in sexual activity.

Using automated tools to detect signs of behavior that might prefigure future abuse in general interactions between app users suggests huge scope for misinterpreting innocent chatter. Taken together, the Commission’s wide-ranging CSAM detection requirements would turn mainstream message platforms into mass surveillance tools, opponents of the proposal suggest.
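
To see how easily that can happen, consider a toy sketch of threshold-based flagging; the scorer and threshold below are invented for illustration and do not reflect any real vendor’s system. Whatever cut-off is chosen, it trades missed grooming against misread innocent chatter; no setting eliminates both error types.

```python
from typing import Callable

def flag_messages(messages: list[str],
                  score: Callable[[str], float],
                  threshold: float = 0.9) -> list[str]:
    """Flag every message whose model score crosses the threshold.
    Lowering the threshold catches more genuine grooming but sweeps up
    more innocent conversation; raising it does the reverse."""
    return [m for m in messages if score(m) >= threshold]

# Toy stand-in scorer; a real system would use a trained language model.
fake_model = lambda text: 0.95 if "meet" in text.lower() else 0.1

chats = ["Shall we meet after school?", "Nice weather today"]
print(flag_messages(chats, fake_model))  # flags the innocent first message
```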

“Chat control” is the moniker critics have coined to encompass concerns about the EU passing a law that demands blanket scanning of private citizens’ digital messaging — up to and including the screening of people’s text exchanges.

What about end-to-end encryption?

The original Commission proposal for a regulation to combat child sexual abuse does not exempt E2EE platforms from the CSAM detection requirements, either.

And it’s clear that, since E2EE means such platforms cannot access readable versions of users’ communications — they do not hold the encryption keys — secure messaging services would face a specific compliance problem if legally required to understand content they cannot see.

Critics of the EU’s plan therefore warn that the law will force E2EE messaging platforms to downgrade the flagship security protections they offer by implementing risky technologies such as client-side scanning as a compliance measure.

The Commission’s proposal does not mention specific technologies that platforms should deploy for CSAM detection; such decisions are offloaded to an EU center for countering child sexual abuse that the law would establish. But experts predict the law would most likely be used to force adoption of client-side scanning.
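
A minimal sketch of why client-side scanning sits awkwardly with E2EE, using hypothetical function names (this is the shape of the concern, not a real protocol): the content check has to run on the device before encryption, so plaintext is inspected, and potentially reported, outside the sealed sender-to-recipient channel.

```python
import hashlib

FLAGGED_HASHES: set[str] = set()  # stand-in for a CSAM hash database

def scan_on_device(plaintext: bytes) -> bool:
    # The check necessarily runs on the plaintext, before any encryption.
    return hashlib.sha256(plaintext).hexdigest() in FLAGGED_HASHES

def report_match(plaintext: bytes) -> None:
    # On a match, content leaves the channel that E2EE is supposed to seal.
    print("flagged content forwarded for review")

def e2ee_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR stand-in for real end-to-end encryption.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes) -> bytes:
    if scan_on_device(plaintext):
        report_match(plaintext)
    return e2ee_encrypt(plaintext, key)
```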

Another possibility is that platforms that have implemented strong encryption could choose to withdraw their services from the region entirely; Signal Messenger, for example, has previously warned it would leave a market rather than be forced by law to compromise user security. This prospect could leave people in the EU without access to mainstream apps that use gold-standard E2EE protocols to protect digital communications, such as Signal, Meta-owned WhatsApp, and Apple’s iMessage, to name three.

None of the measures the EU has drafted would have the intended effect of preventing child abuse, opponents of the proposal contend. Instead, they predict dire knock-on consequences for app users, as the private communications of millions of Europeans would be exposed to imperfect scanning algorithms.

That in turn risks scores of false positives being triggered, they argue; millions of innocent people could be erroneously implicated in suspicious activity, burdening law enforcement with a pipeline of false reports.
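
The scale of the problem follows from base-rate arithmetic. The numbers below are illustrative assumptions, not figures from the proposal, but they show how even a small false-positive rate applied to billions of daily messages swamps the genuine detections.

```python
# Back-of-envelope arithmetic; every input is an illustrative assumption.
daily_messages = 5_000_000_000   # assumed EU-wide daily message volume
false_positive_rate = 0.001      # assumed 0.1% FPR, optimistic for novel content
prevalence = 0.000001            # assumed share of messages that are actually illegal

false_alarms = daily_messages * (1 - prevalence) * false_positive_rate
true_hits = daily_messages * prevalence  # generously assuming perfect recall

print(f"false alarms per day: {false_alarms:,.0f}")   # ~5,000,000
print(f"true detections per day: {true_hits:,.0f}")   # 5,000
# Under these assumptions, roughly 999 of every 1,000 flags are innocent.
```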

The system the EU’s proposal envisages would routinely expose citizens’ private messages to third parties involved in checking suspicious content reports sent to them by platforms’ detection systems. Even if a flagged piece of content were cleared as non-suspicious earlier in the reporting chain and never forwarded to law enforcement, it would still, necessarily, have been looked at by someone other than the sender and their intended recipients. So RIP, comms privacy.

Securing personal communications that have been exfiltrated from platforms would also pose an ongoing security challenge: reported content could be further exposed if any of the third parties involved in processing content reports apply poor security practices.

People use E2EE for a reason, and keeping middlemen from touching their data is high on the list.

Where is this hella scary plan now?

Typically, EU lawmaking is a three-way affair, with the Commission proposing legislation and its co-legislators, in the European Parliament and Council, working with the bloc’s executive to try to reach a compromise they can all agree on.

In the case of the child abuse regulation, however, EU institutions have so far had very different views on the proposal.

A year ago, lawmakers in the European Parliament agreed their negotiating position, proposing major revisions to the Commission’s proposal. Parliamentarians from across the political spectrum backed substantial amendments aimed at shrinking the rights risks — including a total carve-out for E2EE platforms from scanning requirements.

They also proposed making the scanning far more targeted, adding a proviso that screening should only take place on the messages of individuals or groups suspected of child sexual abuse — rather than the law imposing blanket scanning on all of a platform’s users once it is served with a detection order.

A further change MEPs backed would restrict detection to known and unknown CSAM, removing the requirement that platforms also pick up grooming activity by screening text-based exchanges.

The parliament’s version of the proposal also pushed for other types of measures to be included, such as requirements on platforms to improve user privacy protections by defaulting profiles to non-public to decrease the risk of minors being discoverable by predatory adults.

Overall, the MEPs’ approach looks a lot more balanced than the Commission’s original proposal. However, EU elections have since revised the makeup of the parliament, and the views of the new intake of MEPs are less clear.

There is also still the question of what the Council of the European Union, the body made up of representatives of member states’ governments, will do. It has yet to agree a negotiating mandate on the file, which is why discussions with the parliament have not been able to start.


The Council ignored entreaties from MEPs last year to align with their compromise. Instead, member states appear to favor a position much closer to the Commission’s “scan everything” original. But there are also divisions between member states on how to proceed, and so far enough countries have objected to the compromise texts put forward by the Council presidency to block agreement on a mandate.

Proposals that have leaked during Council discussions suggest member states’ governments are still trying to preserve the ability to blanket-scan content. But a compromise text from May 2024 attempted to tweak how this was presented — euphemistically describing the legal requirement on messaging platforms as “upload moderation.”

That triggered a public intervention from Signal president Meredith Whittaker, who accused EU lawmakers of indulging in “rhetorical games” in a bid to eke out support for the mass scanning of citizens’ comms. That’s something she warned in no-nonsense tones would “fundamentally undermine encryption.”

The text that leaked to the press at that time also reportedly proposed that messaging app users could be asked for their consent to their content being scanned. However, users who did not agree to the screening would have key features of their app disabled, meaning they would not be able to send images or URLs.

Under that scenario, messaging app users in the EU would essentially be forced to choose between protecting their privacy or having a modern messaging app experience. Anyone opting for privacy would be downgraded to a basic dumbphone-style feature set of text and audio only. Yes, that is really what regional lawmakers have been considering.

More recently, there are signs that support within the Council for mass surveillance of citizens’ messaging may be ebbing. Earlier this month, Netzpolitik covered an announcement by the Dutch government saying it would abstain on another tweaked compromise, citing concerns about the implications for E2EE, as well as the security risks posed by client-side scanning.

Earlier this month, discussion of the regulation was also withdrawn from another Council agenda, apparently owing to the lack of a qualified majority.

But a large number of EU countries continue to back the Commission’s push for blanket message scanning. And the current Hungarian Council presidency appears intent on continuing to seek a compromise. So the risk hasn’t gone away.

Member states could still arrive at a version of a proposal that satisfies enough of their governments to open the door to talks with MEPs, which would put everything up for grabs in the EU’s closed-door trilogue discussions process. So the stakes for European citizens’ rights — and the bloc’s reputation as a champion of privacy — remain high.


