UK Watchdog Launches Probe Into Telegram Over Child Abuse Allegations
The UK communications regulator, Ofcom, has opened a formal investigation into the Telegram messaging platform. The inquiry will examine whether Telegram is meeting its duties under the UK's Online Safety Act to prevent the sharing of child sexual abuse material (CSAM).
Evidence and Regulatory Action
Ofcom launched the investigation after receiving evidence from the Canadian Centre for Child Protection indicating that child sexual abuse material was present and being disseminated on Telegram. The regulator assessed the evidence before opening the formal inquiry, underlining the seriousness of the allegations.
Suzanne Cater, the director of enforcement at Ofcom, emphasized the importance of this action. "Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities," she stated. "We work closely with partners in law enforcement and child protection organisations to identify where these harms are occurring and hold providers to account where they're failing to meet their obligations."
Broader Context and Global Concerns
This investigation comes amid growing concerns about illegal content on user-to-user services. A report by AI Forensics identified 24,671 Telegram users in Italy and Spain actively sharing non-consensual intimate images, including child sexual abuse material. The report noted that perpetrators were predominantly young heterosexual men, and content was often monetized through fees or subscriptions.
Silvia Semenzin, a senior researcher at AI Forensics, urged regulators to act decisively. "Regulators must act with urgency and courage, listening to survivors' experiences and demands," she said. "They should mandate Telegram to cooperate with law enforcement and civil society, and to remove channels and groups sharing illegal content immediately and permanently – not after prolonged negotiation at victims' expense."
Legal Framework and Penalties
The UK Online Safety Act requires providers of user-to-user services, such as Telegram, to take proportionate measures to protect users from illegal content. Ofcom has the authority to impose significant penalties for non-compliance, including fines of up to £18 million or 10% of the company's worldwide revenue, whichever is greater. In extreme cases, courts could require advertisers or payment providers to withdraw their services from a platform, or order internet providers to block access to the site in the UK.
Telegram's Response and Ongoing Investigations
A Telegram spokesperson strongly denied the allegations. "Telegram categorically denies Ofcom's accusations," they stated. "Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with NGOs. We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy."
In related enforcement actions, Ofcom has also opened investigations into other platforms, such as Teen Chat and Chat Avenue, to examine whether they are protecting children from grooming. Previous proceedings led to hash-matching techniques being implemented on Pixeldrain, and Yolovit was made unavailable in the UK for failing to comply with the Online Safety Act.