The Belgian Presidency of the Council of the EU is seemingly ending on a slightly controversial note as the vote on the Child Sexual Abuse (CSA) regulation – also called "chat control" – was taken off the agenda on Thursday, following harsh criticism over privacy concerns.
The debate on this file has been complicated by the fact it pits the need to stop the spread of images of child abuse against the right to privacy. Should messaging services like WhatsApp or (even a privacy-focussed app like) Signal monitor the photos people send to each other?
What would the regulation do?
End-to-end encryption on messaging platforms such as WhatsApp or Signal makes it much more difficult for police to intercept communications between suspected criminals, for example by tapping phones. Authorities across the EU have long been asking for exceptions to these privacy protections – specifically for cases involving child sexual abuse material.
The aim of the EU's Child Sexual Abuse legislation (or "chat control") is to tackle child abuse by stopping the circulation of child abuse images online. In 2022, the European Commission presented a proposal to this effect, which would oblige technology companies to detect and report images of child abuse in private messages.
In practice, this means that messaging services such as WhatsApp, Instagram or Signal would scan all users' private messages for images of abuse.
The initial proposal was rejected by the European Parliament last autumn, but Belgium – which has been closing legislative files at record speed during its EU presidency – still put forward a reworked version in the final days of its term, to be put to a vote.
The compromise proposal, however, is still about tracking users' private communications. This would not include regular texts or voice messages, but visuals (photos and videos) and URLs – which would all be scanned before being sent. While users would have to give explicit consent first, refusing this would come at the cost of no longer being able to send photos or videos.
Why are privacy experts worried?
Privacy experts are sounding the alarm as the proposal has the potential to violate numerous privacy laws. "It is a very intrusive idea, as if the police install a camera in all houses to see if anything happens," encryption specialist and professor Bart Preneel (KU Leuven) told De Morgen.
If adopted, this law would see millions of innocent citizens being monitored and paying for the crimes of a very small minority. "What was originally called chat control is now called upload moderation. Through such changes in language, Belgium is trying to make the proposal more manageable," he said. "But at its core, nothing is changing."
All photos and videos on messaging services would be compared to a list of known images of child sexual abuse, and the bill also states that artificial intelligence will be able to detect new images in the future.
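The matching described above – comparing attachments against a list of known abuse images – typically relies on image hashes rather than the images themselves. A minimal, illustrative sketch of hash-list matching (the hash list and function name are hypothetical; real systems use perceptual hashes such as PhotoDNA, which also match slightly altered copies, whereas exact hashing only catches byte-for-byte duplicates):

```python
import hashlib

# Hypothetical database of SHA-256 hashes of known illegal images.
# This entry is the SHA-256 of the placeholder bytes b"test".
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-image list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# Only an exact copy matches; any modified copy produces a different hash,
# which is one reason exact hashing alone is considered easy to evade.
print(matches_known_image(b"test"))   # prints True
print(matches_known_image(b"other"))  # prints False
```

This limitation of exact matching is why the proposal also envisages AI-based detection of new or altered images – the part experts consider most error-prone.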
Experts are particularly concerned about the detection of new images, since an algorithm could easily misjudge what constitutes child abuse. For example, if parents send each other photos of their child having a bath, this could be unfairly labelled as illegal footage.
"Scientists have made it clear that scanning with AI is technically impossible, but this was ignored," Preneel said. "People can now be accused of this horrible crime without doing anything wrong, resulting in major tragedies."
In an open letter, more than 270 scientists – as well as EDRi, the European Digital Rights advocacy group – called for a re-evaluation of the CSA proposal. They argue that the rebrand from 'chat control' to 'upload moderation' is a mere cosmetic change and fails to address the security and rights concerns on client-side scanning.
"Scanning at the upload point defeats the end-to-end principle of strong encryption, could easily be circumvented, and would create new security vulnerabilities that third parties could exploit," the letter reads. "In short, it will not solve the problem of the online spread of child sexual abuse material, but will introduce significant security risks for all citizens, companies, and governments."
Embarrassing how @EU2024BE ends its presidency by pushing #ChatControl, the umpteenth surveillance proposal that promises more safety. Real safety doesn't come with a quick tech fix that can have devastating consequences. Remember: function creep is real! https://t.co/IuKVWXmyJV
— Nathalie Van Raemdonck @eilah_tan@aoir.social (@eilah_tan) June 19, 2024
On social media, Belgian computer scientist Jeroen Baert underlined that the premise of choosing between privacy and child protection is false.
"Those with a genuine desire to share images of child sexual abuse will find new platforms that escape legislation to do so anyway. It will always be possible to get an encrypted message to or from someone else. No criminal will be stopped by this legislation," he said.
Baert added that it is "naive" to think that a secure backdoor can be built into end-to-end encryption to catch criminals. "You cannot compromise on that. Either everybody has secure communication or nobody does."
Why was the vote cancelled?
The Committee of Permanent Representatives (Coreper) of the 27 EU Member States was due to discuss the new compromise on Thursday, but the vote has been postponed for now.
The Council can only start negotiating the text with the Parliament once a qualified majority – at least 55% of Member States, representing at least 65% of the EU population – is reached in favour of the proposal.
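This double-majority threshold comes down to simple arithmetic. A sketch of the check, using an approximate EU population figure of 448 million (the vote counts below are hypothetical):

```python
def qualified_majority(states_in_favour: int,
                       population_in_favour: float,
                       total_states: int = 27,
                       total_population: float = 448.0) -> bool:
    """Council qualified majority: at least 55% of Member States
    representing at least 65% of the EU population (in millions)."""
    return (states_in_favour / total_states >= 0.55
            and population_in_favour / total_population >= 0.65)

# With 27 Member States, the 55% rule means at least 15 states in favour.
print(qualified_majority(15, 300.0))  # True: 15/27 ≈ 55.6%, 300/448 ≈ 67.0%
print(qualified_majority(14, 300.0))  # False: 14/27 ≈ 51.9% falls short
```

Because both conditions must hold, a handful of large countries – or one late defection such as Poland's – can block a proposal even when most states support it.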
It was already clear that the margin for reaching the qualified majority would be very slim, particularly following the harsh criticism from privacy experts on Wednesday and Thursday.
Indeed, several Member States changed their position at the last minute, such as Poland, which tipped the balance.
While speaking at the 20th anniversary summit of the European Data Protection Supervisor (EDPS) on Thursday, European Commissioner for Values and Transparency Věra Jourová admitted on record for the first time that the law would break end-to-end encryption. "The Commission proposed the method or the rule that even encrypted messaging can be broken for the sake of better protecting children," Jourová told the conference.
"[On Thursday morning], it soon became clear that the required qualified majority would just not be met. The Presidency therefore decided to withdraw the item from today's agenda, and to continue the consultations in a serene atmosphere," a Belgian EU Presidency source told The Brussels Times.
While Member States have diverging views on the exact method of stopping the spread of child sexual abuse images, they all agree on the importance of protecting children while also safeguarding people's privacy.
As EU regulations for big tech often tend to become international ones, the Council is aware of its responsibility to children across the world.
"This remains a high priority for the Council, and the work will continue to find a position and start the negotiations with the European Parliament," the source added. "This is a clear commitment from the Council to continue protecting the children from despicable crimes."
A new date for the vote has not yet been proposed.