UK Watchdog Ofcom Investigates 4chan and Kiwi Farms Under New Online Safety Act

In a landmark move for British digital regulation, communications watchdog Ofcom has launched its first formal investigations under the new Online Safety Act. The probes target two of the internet's most controversial platforms: the anonymous imageboard 4chan and the discussion forum Kiwi Farms.

The investigations will examine whether the sites failed to comply with their legal obligations to protect users from harmful content. Specifically, Ofcom will assess whether the platforms adequately implemented measures to prevent the spread of illegal material, particularly content related to terrorism and child sexual abuse.

A New Era of Digital Accountability

The Online Safety Act, which gained Royal Assent in October 2023, represents one of the most comprehensive attempts globally to regulate online spaces. It places legal duties on tech companies and platform operators to keep their users safe, with severe penalties for non-compliance including fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.

Ofcom's decision to target these particular platforms sends a clear message about the regulator's intent to tackle some of the internet's most challenging environments head-on.

The Platforms Under Scrutiny

4chan, founded in 2003, is an anonymous English-language imageboard where users can post without registration. The platform has been associated with various internet subcultures and has repeatedly faced criticism for hosting extremist content and coordinating harassment campaigns.

Kiwi Farms, a discussion forum known for tracking and harassing individuals, primarily figures in online communities, has faced widespread condemnation. Cloudflare, the internet infrastructure company, dropped the site as a customer in 2022, citing an immediate threat to human life.

What the Investigations Will Examine

Ofcom's investigation will focus on several key areas of compliance:

  • Assessment of illegal content risks, particularly terrorism content and child sexual abuse material
  • Adequacy of content moderation systems and processes
  • Effectiveness of user reporting mechanisms
  • Implementation of age verification measures where appropriate

The regulator has stated that these initial investigations will help shape how it approaches enforcement across a wider range of services in the future.

Potential Consequences

Should Ofcom find these platforms in breach of their duties, it could issue substantial fines, and senior managers can face criminal liability for certain failures under the Act. In extreme cases, the regulator has the power to seek court orders requiring internet service providers and other intermediaries to block access to non-compliant services within the UK.

This development marks a significant moment in the UK's approach to online governance, potentially setting precedents for how Western democracies balance free expression with user protection in digital spaces.