Law enforcement experts and policymakers are scheduled to convene on September 12th to deliberate on proposals that would mandate technology companies, including platforms like Signal and WhatsApp, to scan encrypted messages before they are transmitted. This meeting precedes a planned vote on the proposals, known as “Chat Control,” by October 14th, an initiative spearheaded by the Danish presidency of the EU Council.
The “Chat Control” proposals advocate for the mass scanning of mobile phones and computers to identify potential child abuse material within encrypted communications services. However, this initiative has sparked significant opposition from security experts and privacy advocates.
On September 9th, over 500 cryptographers and security researchers issued an open letter cautioning that the proposals are technically unfeasible and would “completely undermine” the security and privacy of European citizens. They argue that such measures would create vulnerabilities that could be exploited by hackers and hostile nation-states.
WhatsApp, a widely used encrypted messaging service, has also voiced concerns regarding the EU’s draft proposals. A spokesperson for WhatsApp stated that the proposals would compromise end-to-end encryption, thereby endangering the privacy, freedom, and digital security of users.
The European Commission first proposed requiring tech companies to scan emails and messages for potential child abuse content in 2022. However, the plans stalled after a blocking minority of member states objected that they would compromise the security and privacy of EU citizens.
In July 2025, the Danish presidency introduced a compromise aimed at balancing the security of encrypted communications with the need to identify potentially illegal content. This compromise asserts that the proposed regulation should not be interpreted as prohibiting, weakening, or circumventing encryption, and it permits technology companies to continue offering end-to-end encrypted services.
However, the compromise also requires technology companies to implement “vetted technologies” on devices to scan messages for images, videos, or URLs potentially associated with known child abuse content before encryption and transmission. These companies would also be required to deploy artificial intelligence (AI) and machine learning algorithms to detect previously unknown abuse images.
As of September 10th, 15 member states supported the Danish proposals, while six remained undecided and six opposed them. Opposing states, including Belgium, Poland, Finland, and the Czech Republic, have raised concerns about the mass surveillance of citizens’ communications. Supporters include France, Italy, Spain, and Sweden, while Germany remains undecided. Under the Council’s qualified-majority voting rules, each member state’s weight is tied to its population, so the positions of large, undecided states such as Germany could prove decisive.
The Danish compromise agreement outlines specific requirements regarding encryption:
- Publicly available messaging services using end-to-end encryption would be required to detect abuse material before transmission.
- Providers should remain free to offer services using end-to-end encryption and should not be obliged to decrypt data or create access to end-to-end encrypted data.
- Users of encrypted services would be asked to consent to having images, videos, and URLs they send monitored.
- Users who do not consent may be able to send messages without images, videos, or URLs.
- Detection technologies for end-to-end encrypted services would be certified and tested by an EU center to verify that their use does not weaken encryption.
- The EU Commission would have the power to approve detection technologies.
- Providers of detection services should have human oversight to reduce false positives and false negatives.
- Detection technologies must not “introduce cyber security risks for which it is not possible to take any effective measures to mitigate such risk”.
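The scheme the compromise describes — on-device matching against a database of known material before encryption — can be illustrated with a minimal sketch. This is not any vendor’s actual implementation: real deployments would use perceptual hashes, which match near-duplicate images, rather than the exact cryptographic hash used here, and the database below is a hypothetical placeholder.

```python
import hashlib

# Hypothetical database of digests of known material, standing in
# for the vetted database the envisioned EU centre would distribute.
KNOWN_CONTENT_DIGESTS = {
    # sha256(b"test"), used purely as a placeholder entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(attachment: bytes) -> bool:
    """Hash the attachment on the device and look it up in the
    distributed database of known material."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_CONTENT_DIGESTS

def send(attachment: bytes) -> str:
    """The scan runs before encryption, so the check sees plaintext
    even though the transport itself remains end-to-end encrypted."""
    if matches_known_material(attachment):
        return "blocked"
    return "encrypted and sent"
```

The critics’ core objection is visible even in this sketch: the check necessarily operates on plaintext before encryption, so whoever controls the database or the matching code controls what gets flagged, and the hook itself becomes an attack surface.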
Opponents of the proposals argue that “Chat Control” effectively introduces “suspicionless” mass surveillance for hundreds of millions of Europeans. The open letter from cryptographers and security researchers warns that on-device detection, also known as client-side scanning, inherently undermines the protections of end-to-end encryption without guaranteeing improved protection for children.
They argue that the detection mechanism would become a prime target for hackers and hostile nation-states, who could reconfigure it to target other types of data, such as financial or political interests. This would undermine the security of encrypted messaging apps used by politicians, journalists, human rights workers, EU civil servants, law enforcement officers, and ordinary citizens.
According to the letter, the proposals “unequivocally violate” the principles of end-to-end encryption and weaken its protections, threatening the public’s right to privacy. The scientists warn of potentially serious consequences for democracy and national security. They also claim that scanning technology could be repurposed by less democratic regimes to monitor dissidents and opponents or to censor communications, creating unprecedented capabilities for surveillance, control, and censorship.
The Danish proposals could lead to large numbers of innocent people being wrongly investigated for sending images incorrectly identified as suspicious. The researchers warn that existing detectors would yield unacceptably high false positive and false negative rates, making them unsuitable for large-scale detection campaigns. They also argue that there is no known machine-learning algorithm that can reliably identify illegal images without making large numbers of mistakes.
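The base-rate problem behind this warning can be made concrete with a back-of-the-envelope calculation. The prevalence and error rates below are illustrative assumptions, not figures from the researchers’ letter: when genuinely illegal images are a tiny fraction of traffic, even an accurate detector flags mostly innocent content.

```python
# Illustrative assumptions: illegal images are a tiny fraction of
# scanned traffic, and the detector is optimistically accurate.
prevalence = 1e-6   # fraction of scanned images that are actually illegal
tpr = 0.99          # detector catches 99% of illegal images
fpr = 0.005         # detector misflags 0.5% of benign images

# Bayes' rule: of all flagged images, what fraction is truly illegal?
truly_illegal = prevalence * tpr
misflagged = (1 - prevalence) * fpr
precision = truly_illegal / (truly_illegal + misflagged)
print(f"Share of flags that are correct: {precision:.3%}")
```

Under these generous accuracy assumptions, well over 99.9% of flagged images would be innocent — the “large numbers of mistakes” the researchers describe.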
German encrypted email provider Tuta Mail has stated that it would take legal action against the EU rather than compromise its users’ privacy by introducing backdoors into its encrypted messaging service. CEO Matthias Pfau believes the proposals would undermine trust in European technology, driving users to foreign tech giants.
Alexander Linton, president of the Session Technology Foundation, argues that it is impossible to introduce scanning without creating new security risks. He states that none of the available technologies meet the standard of not introducing unmitigable risks.
Matthew Hodgson, CEO of Element, a secure communications platform used by European governments, believes the proposed “Chat Control” regulation is fundamentally flawed and would put the privacy and data of 450 million citizens at risk. He argues that undermining encryption by introducing a backdoor for lawful intercept is deliberately introducing a vulnerability that will be exploited.
Hodgson referenced a years-long Chinese hacking operation, dubbed Salt Typhoon, which used law enforcement backdoors in the US public telephone network to access call records and unencrypted communications of US citizens. He noted that the US is still urging its citizens into end-to-end encrypted systems as a result.
Signal warned last year that it would pull its messaging service out of the European Union rather than undermine its privacy guarantees. Callum Voge, director for government affairs and advocacy at the Internet Society, said client-side scanning creates opportunities for bad actors to reverse engineer and corrupt scanning databases on devices. He likened client-side scanning to someone reading over your shoulder as you write a letter, as opposed to breaking encryption, which is like having the envelope ripped open.
Voge stated that even if AI scanning were 99.5% accurate, the sheer volume of messages sent daily would still lead to billions of wrong identifications every day, potentially overwhelming the system and leading to innocent people being incorrectly labeled as sharing illegal child abuse material.
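Voge’s point is fundamentally one about scale: a small error rate applied to enormous traffic still yields an enormous absolute number of errors. The daily message volume below is an assumed round figure for illustration, not a number from the article:

```python
# Assumed illustrative volume of messages scanned per day.
messages_per_day = 10_000_000_000   # 10 billion
error_rate = 1 - 0.995              # 0.5% of messages misidentified

wrong_identifications = messages_per_day * error_rate
print(f"{wrong_identifications:,.0f} wrong identifications per day")
```

Even at this assumed volume, a 0.5% error rate means tens of millions of wrong identifications every day, and the total scales linearly with traffic.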
The scientists argue that, rather than relying on a “technical fix,” governments should invest in education, reporting hotlines, and other proven techniques for tackling abuse. Voge suggests that policymakers should prioritize approaches that protect children while fostering an open and trusted internet. This includes increased resources for targeted approaches such as court-authorized investigations, metadata analysis, cross-border cooperation, support for victims, prevention, and media literacy training.
Apple previously abandoned its plans to introduce client-side scanning to detect child abuse on the iPhone after leading scientists published a paper finding that the approach would neither be effective against crime nor safeguard against surveillance.




