No deal was found on the controversial proposal for the Child Sexual Abuse (CSA) regulation – often referred to as 'Chat Control' – at the EU level earlier this week. Now, it seems like the European fight against child abuse images is in danger of stalling.
The debate around the CSA regulation has been complicated, as it pits the need to stop the spread of child sexual abuse images against people's right to privacy.
Proposed in May 2022, the legislation aims to prevent the spread of child sexual abuse material online through several measures, including a framework requiring digital platforms to detect and report images of child abuse in people's private messages.
In practice, this means that messaging services (such as WhatsApp, Instagram or even Signal) would scan for images of abuse in all users' private messages.

Opponents and privacy experts argue that the proposal would effectively allow mass scanning of all private digital communications – undermining end-to-end encryption and violating fundamental rights to privacy and data protection by enabling generalised, indiscriminate surveillance.
If the proposal were adopted, millions of innocent citizens would be monitored for the crimes of a very small minority.
Experts are particularly concerned about the detection aspect of the proposal: algorithms could easily misclassify content as child abuse material. They have concluded that there is currently no technological way to detect these images without "unacceptably high" error rates.
The proposal's fate has been uncertain since November 2025, when Denmark (which held the Presidency of the Council of the EU in the second half of last year) backed down on making the interception and scanning of content mandatory.
What happened?
Earlier this week, negotiations between the European Parliament and EU Member States about potentially extending a voluntary version of the 'Chat Control' regulation concluded without an agreement.
Digital rights activist and former German MEP Patrick Breyer (Pirate Party) called the outcome a "triumph" for civil society.
"We have stopped a broken and illegal system. Just as the postal service is not allowed to simply open our physical letters, the indiscriminate scanning of our private digital messages must remain strictly off-limits," he said.
Voluntary detection and reporting by digital companies have proved insufficient, with service providers facing different rules in different countries due to a lack of harmonisation at the EU level.
A compromise between the Parliament and Member States was meant to be put to a plenary vote on 26 March in Brussels, but no agreement has been reached.

Illustration picture shows apps including social media platforms on a phone. Credit: Belga/Bruno Fahy
Without an agreement next week, US tech giants such as Meta, Google, and Microsoft will have to stop indiscriminately scanning the private chats and photos of European citizens starting 4 April.
The Cyprus Presidency of the Council of the EU, which is leading the negotiations on behalf of the Member States, warned that the deadlock would create a gap – potentially hindering efforts to rescue victims and prosecute offenders.
Member States have blamed the stalemate on the European Parliament, which insisted that the exemption should not apply to end-to-end encrypted communications.
Additionally, the Parliament demanded that the analysis of private messages be restricted to content already identified by authorities, or to users flagged by law enforcement.
Most Member States, however, argued that this limits the scope of the regulation in ways that render it "ineffective".
Now what?
The European Parliament and EU Member States are continuing negotiations on a permanent child protection regulation. While EU governments are once again demanding supposedly "voluntary" mass scans, the EU Parliament is now championing a new approach: platforms should be obliged to protect children directly through safe design.
Under this 'Security by Design' approach, apps would be required to technically prevent sexual approaches to children (grooming) through strict default settings and warning features.
Additionally, the approach states that illegal material on the open web (and the darknet) must be actively tracked down and destroyed at the source via strict, immediate takedown obligations.
"Flooding our police with false positives from mass surveillance does not save a single child from abuse," said Breyer, who called the failed negotiations "a clear stop sign to this surveillance mania."
"Indiscriminate mass scanning of our private messages must finally give way to truly effective child protection that respects fundamental rights," he said. "That is what truly protects children."
The Brussels Times contacted the Cyprus Presidency of the Council of the EU for comment, but did not receive a reply by the time of publication.

