EU countries have reached a compromise on new rules to fight online child abuse after years of difficult talks. Justice ministers agreed on a common position that pushes platforms to remove illegal content, while dropping mandatory scanning of private messages.
Compromise after years of dispute
The plan aims to force social media companies to remove child sexual abuse material more quickly. It also proposes a new EU Centre on Child Sexual Abuse to support national authorities. Governments would gain powers to order companies to block or delete harmful content within the bloc.
The proposal has faced strong resistance since 2022. Several EU presidencies failed to build unity as arguments over detection orders and encrypted messaging created sharp divisions.
Encryption at the heart of the clash
Denmark, which held the rotating Council presidency, finally secured agreement by removing mandatory scanning of private communications. End-to-end encrypted messages will not be scanned by authorities, though platforms such as Instagram or Facebook Messenger may still check content themselves.
Many tech firms welcomed the shift, while industry groups urged negotiators to ensure the final rules protect both minors and the privacy of communications.
Campaigners warn of hidden risks
Privacy advocates remain uneasy. Former MEP Patrick Breyer argued the deal still opens the door to mass surveillance through “voluntary” scanning by major US-based platforms. Concerns also persist over high error rates in AI detection systems and the possible use of age-verification tools such as facial recognition.
Next steps for 2026
Despite opposition from some member states, the common position now moves to negotiations with the European Parliament and the Commission next year. Lawmakers must reach agreement before the temporary rules allowing voluntary scanning expire.