- In September and early October 2025, social media posts multiplied criticizing a piece of European Union legislation designed to combat child pornography. They claimed the so-called “chat control” draft law would allow a range of powerful forces, from the EU to unspecified governments to “artificial intelligence,” to read everyone’s private online messages, regardless of whether senders used end-to-end encryption.
- A proposal to scan people’s online messages does exist. The EU’s executive body, the European Commission, introduced it in 2022. It calls for social media and messaging platforms to scan all user communications in an effort to curb child pornography.
- However, the law would not allow government officials to do the scanning, nor would it necessarily give government officials access to the data, contrary to social media posts’ claims. The law would require the social media and texting companies to establish systems to do the scanning themselves. The law would require those companies to send suspicious content to law enforcement.
- The legislation has generated heated debate between supporters, such as Interpol and organizations committed to fighting the proliferation of child sexual abuse material online, and detractors, including digital-rights activist groups and texting and social media platforms like X, Signal and WhatsApp.
- The Council of the EU, one of the EU’s two legislative bodies (not the European Council, which does not legislate), prepared to vote on the draft law in mid-October 2025. Due to disagreement among member countries, however, the Council of the EU postponed the vote to late 2025.
In fall 2025, posts multiplied on social media expressing privacy concerns over a proposed piece of European Union legislation designed to combat the spread of child pornography.
People claimed the so-called “chat control” draft law would allow a variety of powerful forces — ranging from the EU to unspecified governments to “artificial intelligence” — to read everyone’s private online messages.
The posts claimed that even messages exchanged on end-to-end encrypted platforms like WhatsApp or Signal would be subject to the scans meant to flag child sexual abuse material.
For example, an Oct. 4, 2025, X post warned people about the so-called “chat control” law (archived).
Many posts on X shared the same claim. Some people, like the author of an August 2025 post on Reddit (archived), alleged the proposal would violate people’s privacy.
It is true that the European Union is considering legislation that would compel technology companies to scan all communications and stored content to flag child sexual abuse material. Supporters say technology companies already scan for illegal content; for example, the platforms monitor for content related to terrorism. By adding scans for child pornography, proponents argue, law enforcement could curb abuse by identifying offenders not yet known to them.
However, the law would not allow government officials to do the scanning, nor would it necessarily give government officials access to the data, contrary to social media posts’ claims. The law would require the social media and texting companies to establish systems to do the scanning themselves. The law would require those companies to send suspicious content to law enforcement.
The legislation has generated heated debate between supporters, such as Interpol and organizations committed to fighting the proliferation of child sexual abuse material online, and detractors, including digital-rights activist groups and texting and social media platforms like X, Signal and WhatsApp.
Some opponents claim the legislation leaves a door open for criminals and hackers to gain access to people’s data and undermines rights protecting people’s privacy and personal information, such as Articles 7 and 8 of the European Union’s Charter of Fundamental Rights.
Below is an explanation of the legislation’s history and the debate surrounding it.
The proposal’s history
The European Commission, the EU’s executive body, introduced the legislation in May 2022 to “prevent and combat child sexual abuse” by identifying and removing such material online and prosecuting those who seek or share it.
If passed, the law would compel online messaging services such as WhatsApp, Signal and Telegram, as well as email, cloud storage, social media and forum sites, to scan all types of content (emails, text messages, posts, links, videos and images) for child sexual abuse material.
To achieve this, companies that offer end-to-end encryption to guarantee users’ privacy (for example, platforms like WhatsApp and Signal) would need to weaken or break their current systems for shielding data.
Tech companies would use automated scanners powered by artificial intelligence to monitor massive amounts of data. Then, under the law, the tech companies would flag suspicious content to local authorities.
One legislative body, the European Parliament, voted in 2023 to protect encryption against mass surveillance, signaling the possibility that the legislation could change before a final vote.
On July 1, 2025, the day Denmark began its presidency of the Council of the EU, the country introduced a compromise text, a revised draft that replaced the 2022 “chat control” proposal with some changes. (EU member countries take turns presiding over the Council of the EU, the second legislative body, in six-month rotations. Before Denmark, the Czech, Spanish, Belgian, Hungarian and Polish presidencies tried and failed to secure a compromise on the “chat control” legislation. Denmark’s presidency will end on Dec. 31, 2025.)
Denmark’s revised draft law categorizes communications and storage platforms into “low, medium and high risk” using certain criteria, such as how proactive the platform is in protecting children. Only high-risk services would need to scan for child sexual abuse material, and that scanning would focus only on visual content and URLs, as opposed to text and voice messages — at least at first.
With that initial focus primarily on child pornography, cases of grooming would be out of scope, the legislation says, though that could change later:
The scope of detection orders covers known and new child sexual abuse material, while grooming is out of the scope subject to its possible inclusion in the future through a review clause.
In order to protect end-to-end encryption, Denmark’s compromise text proposes “client-side scanning,” a system that scans content before it’s sent and can block its transmission. This would require users to consent to the scanning, and the text says users who opt out would have limited access to platforms.
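The draft does not prescribe a particular technical design for client-side scanning, but the basic idea can be illustrated with a minimal sketch. In the hypothetical Python below, the hash database, function names and the use of SHA-256 are illustrative assumptions only; real systems would more likely rely on perceptual hashing and machine-learning classifiers, details the text leaves to providers.

```python
import hashlib

# Hypothetical set of hashes of already-known abuse images. In practice,
# providers would match against perceptual-hash databases maintained by
# clearinghouses, not exact SHA-256 digests.
KNOWN_MATERIAL_HASHES: set[str] = {"<placeholder hash of a known image>"}


def matches_known_material(attachment: bytes) -> bool:
    """Return True if the attachment's digest matches a known flagged image."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_MATERIAL_HASHES


def send_message(attachment: bytes, encrypt_and_transmit) -> str:
    """Scan on the sender's device, before end-to-end encryption is applied."""
    if matches_known_material(attachment):
        # Under the draft, a match would be blocked and reported for review
        # rather than silently delivered.
        return "blocked: content flagged for review"
    encrypt_and_transmit(attachment)  # encryption in transit stays intact
    return "sent"


# Example: a harmless photo passes the check and is handed to the encrypting sender.
print(send_message(b"holiday photo bytes", lambda data: None))  # -> "sent"
```

The point of the sketch is that the check happens on the user’s device before encryption, which is why critics argue it sidesteps, rather than breaks, end-to-end encryption in transit.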
Further, Denmark’s text encourages service providers to assess and verify users’ ages — something some of them already do.
Critics say Denmark’s suggested changes to the original 2022 proposal do not go far enough to protect digital rights and would allow scanning to expand if the system proves effective.
Weeks after Denmark introduced its text, in August 2025, a Denmark-based software engineer known as Joachim set up a website that emailed representatives of all EU member states at the European Parliament. Joachim’s efforts fueled a swarm of social media posts on the issue, including posts with misleading claims about what the legislation would do.
Who’s for and against so-called ‘chat control’
Critics say such a law would violate EU citizens’ right to privacy and could create a slippery slope to stronger government censorship and control.
Also, because tech companies would need to use automated scanners to monitor massive amounts of users’ data, critics worry about “false positives,” or instances of the scanners incorrectly flagging noncriminal content because they fail to interpret context.
Opponents include tech industry leaders like Elon Musk’s X, Meta’s WhatsApp (archived) and Signal, whose president, Meredith Whittaker, described the draft law as an “existential threat” to the platform (archived). Critics also include the European Digital Rights network (EDRi), a collective of European non-governmental organizations focused on protecting digital rights.
Proponents include Interpol, which in 2021 claimed encryption enabled the proliferation and circulation of child sexual abuse material, and Thorn, a U.S.-based organization co-founded by actors Ashton Kutcher and Demi Moore.
Thorn, which developed a proprietary system to scan communications, has lobbied EU officials since at least 2022 to implement the draft law. That year, the organization joined more than 70 organizations with similar missions in signing an open letter thanking the European Commission for introducing the proposal and urging the legislative bodies to pass it.
What’s next for the legislation?
At the Council of the EU, ministers from all EU member states meet periodically with their counterparts to vote on issues in their areas of focus. (The group is different from the European Council, which gathers heads of state and does not legislate.)
In the case of “chat control,” the group of justice and home affairs ministers, the Justice and Home Affairs Council, met in mid-October 2025 to vote on Denmark’s compromise text. But ongoing gridlock resulted in the group postponing the vote to late 2025.
For the legislation to pass the group, it must reach a “qualified majority” by meeting two conditions at once: 55% of EU member countries must vote in favor (15 countries out of 27), and that voting bloc must represent at least 65% of the EU population. If that happens, the legislation would go to the EU Parliament for a final vote.
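As a rough illustration of that double-majority rule, here is a minimal Python sketch using only the thresholds cited above; the example figures are hypothetical, not actual vote counts or population data.

```python
import math

EU_MEMBER_COUNT = 27


def qualified_majority(states_in_favor: int, population_share_in_favor: float) -> bool:
    """Council 'qualified majority': at least 55% of member states
    (ceil(0.55 * 27) = 15) AND at least 65% of the EU population."""
    enough_states = states_in_favor >= math.ceil(0.55 * EU_MEMBER_COUNT)
    enough_people = population_share_in_favor >= 0.65
    return enough_states and enough_people


# Hypothetical example: 15 states in favor representing only 60% of the
# population fails the population condition; 66% would pass.
print(qualified_majority(15, 0.60))  # False
print(qualified_majority(15, 0.66))  # True
```

Both conditions must hold at once, which is why the stance of a populous country such as Germany weighs heavily on the outcome.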
As of this writing, Germany, the largest country in the EU by population, opposes the draft law, along with eight other EU member states. France, Ireland, Spain, Portugal and Denmark are among 12 countries that support the draft law. Six countries are undecided, including Italy, Belgium, Sweden and Greece.



