The European Union has just taken another step in the long saga of the "ChatControl" project.
After three years of intense discussions, member states have reached a compromise that abandons mandatory automatic detection of child sexual abuse material, yet leaves a persistent concern about the surveillance of private communications.
While this regulation was initially intended to harmonize the fight against online child sexual abuse, its new version upsets the balance between child protection, privacy, and digital trust.
A compromise that overturns European strategy
In a recent announcement, the EU Council approved a revised version of the text that abandons the requirement for systematic scanning of messages, including encrypted ones. This provision was the most contested by digital rights advocates and had led to several successive blockages of the project.
Instead, online platforms will now have to assess the risk of their services being used to distribute child sexual abuse material and deploy "risk reduction" measures.
These could take the form of reporting tools, enhanced default privacy settings, or mechanisms to limit the exposure of minors. According to the text, member states will be able to require the implementation of these measures and to sanction non-compliant companies.
Services will also be able to continue analyzing shared content on a voluntary basis after April 2026, when the temporary exemption allowing these checks without violating privacy rules expires. To support this mechanism, the EU also plans to create a European Centre to assist member states and victims.
The compromise is seen as favorable to tech giants: rather than imposing a uniform obligation, states have chosen to leave each government responsible for regulating how the text is applied.
A persistent threat to privacy
Even though the requirement for automatic scanning disappears, several provisions continue to fuel concerns. Article 4, in particular, requires messaging services to take "all appropriate measures" to mitigate risks.
As reported by BFM TV, former German MEP Patrick Breyer warns that this could lead national authorities to demand mass analysis of communications, including on end-to-end encrypted services.
Furthermore, platforms would have to identify minors "reliably," which could lead to widespread age verification via official documents or biometrics. This scenario is dreaded by advocates of anonymity, which is vital for activists, journalists, and victims seeking protection.
Finally, the ban on access to messaging apps and chat-based games for those under 17, except under strict conditions, remains another controversial measure. Some elected officials see it as a form of digital isolation for teenagers.