Ofcom Launches Major Telegram Probe Over Child Abuse Content Concerns

The UK's online safety regulator, Ofcom, has launched a formal investigation into the messaging application Telegram to determine whether it has failed, or is currently failing, to adequately address and remove child sexual abuse material from its platform. The probe follows evidence submitted by the Canadian Centre for Child Protection, which alleged that such illegal content was present and being actively shared on Telegram's services.

Regulatory Action Under the Online Safety Act

Under the provisions of the UK's Online Safety Act, providers of user-to-user services, including Telegram, are legally mandated to assess and mitigate the risks of such horrific crimes being perpetrated on their platforms. Ofcom's investigation will specifically examine potential failings by Telegram to comply with its statutory duties concerning illegal content.

Suzanne Cater, the Director of Enforcement at Ofcom, emphasised the critical nature of this issue. "Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities," she stated. "We work closely with law enforcement and child protection organisations to identify where these harms are occurring and hold providers to account where they're failing to meet their obligations."

Consequences for Non-Compliance

Ofcom has warned that firms which fail to implement the necessary protections for children will "face serious consequences." Should the investigation identify breaches of the Online Safety Act, the regulator has the authority to impose substantial financial penalties: fines of up to £18 million or 10% of the company's qualifying worldwide revenue, whichever is greater.

In the most severe cases, Ofcom can also seek a court order requiring internet service providers within the UK to block access to the non-compliant service entirely, effectively removing it from the British digital landscape.

Broader Scrutiny on Chat Services

In a related development announced concurrently, Ofcom has also opened investigations into two other chat service providers: Teen Chat and Chat Avenue. These probes will assess whether these platforms are taking appropriate steps to evaluate and mitigate the risks of UK users encountering illegal content and activities, including online grooming.

The watchdog indicated that its collaborative work with child protection agencies has raised specific concerns about the risks to children on these platforms. Both services feature chatrooms, private messaging functions, and media-sharing capabilities, which regulators fear could be exploited by predators.

Cater further elaborated on the ongoing challenge, noting, "Progress has undeniably been made, particularly with file-sharing services, which are too often used to share horrific child sexual abuse imagery. But this problem extends to big platforms too, and teen-focused chat services are too easily being used by predators to groom children. These firms must do more to protect children, or face serious consequences under the Online Safety Act."

This coordinated regulatory action underscores a heightened focus on enforcing online safety standards and protecting vulnerable users from digital harm.