Investigation exposes massive child abuse content network on X

A European investigation has uncovered a large, coordinated network distributing child sexual abuse material (CSAM) on the social media platform X.

Researchers from the nonprofit Alliance4Europe found at least 150 accounts promoting explicit images of children during a four-day period in July. They estimate the network began around 17 May and posted “millions” of times, operating with little interference.

The group’s report says criminals targeted pornography-related hashtags, flooding them with illegal content and using them to link users to other abusive accounts. Many posts included links to Telegram or Discord groups, dating sites, or pages selling folders of CSAM.

One linked Bitcoin wallet had received $660 (€573) in 23 transactions, suggesting buyers were paying for illegal content. Some videos were described as extremely graphic, depicting sexual assault and rape of children.

When researchers flagged posts, X began removing material more quickly and blocking underage users, though the network’s activity continued. They noted new accounts were created continuously, possibly through automated systems, allowing content to resurface.

The findings come as a US court revived part of a negligence lawsuit against X over its failure to promptly report child abuse videos. In the EU, lawmakers are debating how to tackle online CSAM without undermining privacy rights.

X told Euronews Next in June it had “zero tolerance” for child exploitation and has invested in “hash matching” technology to detect and remove CSAM. In 2024, the platform reported suspending 4.5 million accounts and making 686,176 reports to US authorities, leading to 94 arrests and one conviction.
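The "hash matching" X refers to can be illustrated in its simplest, exact-match form: each previously identified abusive image is hashed once into a reference list, and every new upload is hashed and checked against that list. The sketch below uses cryptographic hashes and illustrative names; it is not X's implementation, and production systems such as Microsoft's PhotoDNA use perceptual hashes instead, so that resized or slightly altered copies still match.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return the SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes, known_hashes: set[str]) -> bool:
    """Flag an upload whose hash appears in the reference list of known images."""
    return file_hash(data) in known_hashes

# Placeholder data: a previously identified file is flagged; a new one is not.
known = {file_hash(b"previously-identified-image-bytes")}
print(is_known_match(b"previously-identified-image-bytes", known))  # True
print(is_known_match(b"some-new-image-bytes", known))               # False
```

The trade-off this illustrates is why exact hashing alone is insufficient: changing a single byte of a file changes its cryptographic hash entirely, which is what perceptual-hashing systems are designed to overcome.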