EU DECODED: How to balance privacy rights with combating online child sexual abuse?
An estimated one in five children is a victim of sexual crimes, both online and offline. EU member states made some progress at the end of 2024 with rules designed to criminalise such abuse, but they failed to agree on controls over online image sharing, since these could affect data privacy rights.
MEPs and member states are working on two rulebooks. The first is the Child Sexual Abuse Directive, which will define online crimes such as livestreaming sexual abuse and sharing paedophilic material, including images generated by artificial intelligence.
The second is the Child Sexual Abuse Material Regulation, which will set obligations for companies offering online services, particularly chat and messaging, where such offences often occur.
"The regulation would oblige social media and messaging platform to detect but also delete any sexual abuse material of children that they would encounter, reporting that to a new EU centre. This could also include scanning of encrypted messages that was, until now, the most private way to communicate online," says Romane Armangau, who is following the legislative process for Euronews.
Finding the right balance between acting against these crimes and protecting the data privacy rights of internet users has been highly divisive.
Giving regulators access to encrypted messages on services such as WhatsApp and Signal is proving especially controversial. Advocates for stronger action against child sexual abuse say it is vital to include those platforms.
"One needs to know that two thirds of the messages that contains child sexual abuse materials and representations are shared through private messaging. It is a critical area where the crime occurs and we cannot accept to leave children behind in this


