A high-level panel on proposed EU regulation to tackle child sexual abuse and exploitation was held in Brussels on Monday. Guest speakers included MEP Hilde Vautmans and Hollywood actor and entrepreneur Ashton Kutcher.
The panel was led by the European Child Sexual Abuse Legislation Advocacy Group (ECLAG), which consists of the Brave Movement, ECPAT, Missing Children Europe, IWF, Terre des Hommes and Thorn. The European Union has proposed a new law that could force platforms to scan private messages for signs of child abuse; Kutcher is one of its most prominent advocates.
"At least one in five children in Europe is a victim of some form of sexual violence, often online," said MEP Vautmans. She went on to explain the responsibility of the European Parliament to ensure that the images are removed from the internet forever.
Vautman believes that it is important that the proposed legislation fights all forms of child sexual abuse material.
Ashton Kutcher, actor and co-founder of Thorn, made a point of discussing privacy and what privacy means, rather than dwelling on the devastating abuse itself. "I have yet to meet a person who cares about privacy deeply, that doesn't care about child rights," Kutcher said. "Yet, this conversation constantly becomes a political debate between child's rights and privacy."
Thorn is a technology company that provides detection services and tools to help law enforcement and companies better protect children online.
Three types of detection technology would be used to identify exploitation happening online and prevent the abuse from continuing. All three are already in daily use.
The first is homomorphic encryption, a technique often used to protect medical information. It allows law enforcement to detect child exploitation happening online without anyone actually looking at the image.
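To illustrate the general idea, below is a minimal Python sketch of the homomorphic property: arithmetic performed on ciphertexts carries over to the underlying plaintexts, so a result can be computed without anyone seeing the raw data. It uses textbook RSA with toy parameters purely for demonstration; it is not the scheme Thorn or any platform actually deploys.

```python
# Toy illustration of the homomorphic property using textbook RSA,
# which is multiplicatively homomorphic. Parameters are deliberately
# tiny and insecure; this only demonstrates the general idea that
# computation can happen on encrypted data without decrypting it.

p, q = 61, 53                       # small primes, demonstration only
n = p * q                           # RSA modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

def encrypt(m: int) -> int:         # textbook RSA encryption (no padding)
    return pow(m, e, n)

def decrypt(c: int) -> int:         # textbook RSA decryption
    return pow(c, d, n)

m1, m2 = 7, 6
c1, c2 = encrypt(m1), encrypt(m2)

# Multiplying the ciphertexts yields a ciphertext of the product of the
# plaintexts, even though no one ever handled m1 or m2 in the clear.
c_product = (c1 * c2) % n
assert decrypt(c_product) == (m1 * m2) % n
print(decrypt(c_product))           # 42
```

Real deployments use far more sophisticated schemes, but the principle is the same: the party doing the computation never holds the decrypted content.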
The second is hash matching. The same principle is at work whenever a password is created: the site stores only a hashed version of your password, so the institution never learns the password itself, even though it sits on their servers. Only you know your password.
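A minimal sketch of hash matching in Python, assuming a hypothetical database of known fingerprints: only digests are compared, so the checker never needs to view or store the images themselves. Production systems typically rely on perceptual hashes such as PhotoDNA, which tolerate small alterations to an image, rather than the plain cryptographic hash used here.

```python
# Minimal sketch of hash matching: a file's digest is compared against a
# database of digests of known abusive material, so the checker never has
# to view or retain the original content.
# "known_hashes" is a hypothetical placeholder, not a real database.
import hashlib

known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 fingerprint of a file, streaming it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_material(path: str) -> bool:
    # Only the fingerprint is compared; the file's content is never exposed.
    return sha256_of_file(path) in known_hashes
```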
The third is privacy-protected machine learning. It can recognise what something is while keeping the thing itself hidden. For example, a person knows what a pancake is without knowing the secret recipe used to make it.
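As a toy illustration of the idea the panel describes, the sketch below scores a numeric feature vector, never the raw content, and returns only a likelihood. The weights and features are invented for illustration; real privacy-preserving machine learning involves much more, such as on-device inference or federated learning.

```python
# Toy sketch of the "score, don't show" idea: a classifier sees only a
# numeric feature vector derived from content (never the content itself)
# and returns a likelihood score. Weights and features are invented.
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical trained weights for an equally hypothetical feature vector.
WEIGHTS = [0.8, -1.2, 2.5]
BIAS = -0.5

def likelihood(features: list[float]) -> float:
    """Return a probability-like score in [0, 1] without ever
    handling the underlying image or message."""
    z = BIAS + sum(w * f for w, f in zip(WEIGHTS, features))
    return sigmoid(z)

print(round(likelihood([0.1, 0.4, 0.9]), 3))  # e.g. 0.794
```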
"All of these techniques give a likelihood that what you're looking at is child sexual abuse material," Kutcher further explains. "They can all give you a quantifiable understanding of what you're looking at without a single human looking at it."
Panel members stressed that behind every image is a real child suffering real abuse. "We cannot allow the internet to be a safe haven for abusers," said Heidi De Pauw, CEO of the Belgian foundation Child Focus.
The panel hopes the legislation will prevent further exploitation, sexual abuse and the re-traumatisation of children. The proposal is the first victim-centred legislation of its kind in the world, according to Catharina Rinzema, MEP from the Netherlands.
Not everyone is a fan of the new law, notably privacy rights groups. The German digital rights outlet Netzpolitik has argued that it promotes intrusive technologies that would circumvent end-to-end encryption.