A disturbing incident on a gaming platform recently led to the discovery of a violent plan by two minors. They were reportedly preparing an armed attack on a school, but thanks to the swift identification and reporting of their chat, authorities intervened in time. Experts say user reporting is vital to stopping such threats before they materialise.
Online gaming is one of the world’s biggest industries, with over 900 million players and yearly revenues in the tens of billions. Its rapid growth is fuelled not only by new games, but also by game-hosting services and communication tools used by players.
However, concern is growing about the misuse of gaming platforms by extremists and terrorists. These actors, often adept in digital spaces, are finding new ways to spread harmful messages and radicalise users.
At a recent event in Athens, held under the GEMS project, experts discussed how games are being misused. Extremist-made games often promote hateful worldviews, portraying members of certain communities as enemies and presenting violence as acceptable.
Daniella Pisoiu, scientific director at SCENOR in Austria, warned that extremists are now targeting children. She explained that even 12-year-olds are recruiting their peers, making extremism an issue among minors themselves.
The gaming industry is responding. Yari Peka Kaleva from the European Game Creators Federation said platforms are working to build safe, healthy communities.
The event also introduced the “Watchtower” tool, an AI system developed under GEMS and designed to detect and prevent extremist activity in gaming spaces.