Elon Musk's X Under Fire as Hate Tweets Soar 500% in UK
Hate speech on X rockets 500% in UK, MPs warn

A shocking parliamentary investigation has laid bare a dramatic surge in hate speech on the social media platform X, formerly known as Twitter, since its acquisition by billionaire Elon Musk. The report, spearheaded by the cross-party Digital, Culture, Media and Sport (DCMS) Committee, reveals that hateful content has increased by a staggering 500% in the United Kingdom following Musk's takeover in October 2022.

A Platform in Freefall: The Evidence of Escalating Hate

The committee's findings are based on compelling data from organisations like the Center for Countering Digital Hate (CCDH). This evidence paints a grim picture of a platform where moderation has been severely weakened. MPs heard that not only has the volume of hateful posts skyrocketed, but the response from X's own safety mechanisms has become alarmingly slow and ineffective.

In one stark example, the committee was told that the average time for X to respond to user reports of hate speech has ballooned from under 24 hours to several days. Furthermore, the platform now appears to act on less than 4% of reported hate speech content, a figure that indicates a systemic failure in enforcement. This toxic environment, the report concludes, is a direct consequence of decisions made by Musk and his leadership team, which have included mass layoffs of trust and safety staff and the reinstatement of thousands of previously banned accounts.

MPs Demand Government Action on Online Safety

The DCMS Committee has issued a forceful call to the UK government, urging ministers to apply the full force of the Online Safety Act. The MPs argue that the current approach to regulating tech giants like X is insufficient and that the government must be prepared to take decisive action, including the imposition of substantial fines.

Committee Chair Dame Caroline Dinenage MP stated that the evidence is clear: X has allowed harmful and illegal content to proliferate on its watch. The report emphasises that the platform's policies under Musk have created a perfect storm, enabling bad actors and undermining the safety of UK users. The MPs are demanding that Ofcom, the communications regulator tasked with enforcing the new online safety laws, be robust and uncompromising in holding X accountable for the content it hosts.

The Human Cost and the Path Forward

Beyond the statistics, the report highlights the real-world impact of this unchecked hate speech. It cites testimony from advocacy groups detailing how the rise in antisemitic, Islamophobic, and racist rhetoric on X has contributed to a more hostile environment for targeted communities across the UK. The platform's reduction in moderation capacity, coupled with policy changes that favour 'absolute free speech', is seen as a primary driver of this deterioration.

The committee's recommendations are unequivocal. The MPs urge the government to:

  • Ensure Ofcom has the resources and mandate to investigate X's operations thoroughly.
  • Consider designating X as a platform of 'systemic risk' due to its scale and the severity of the issues.
  • Press for greater transparency from X regarding its content moderation data and algorithms.

As the Online Safety Act moves towards full implementation, this report places immense pressure on both the government and X's leadership. The message from Westminster is clear: the era of self-regulation for social media is over, and platforms that fail to protect users from hate and harm will face serious consequences under British law.