Government Scrambles to Justify 'Racist' Road Scheme Algorithm After Equality Impact Report Leak

The Department for Transport (DfT) is facing severe criticism after a leaked internal report revealed its own officials warned that a new algorithm used to fund road schemes would have a "racist" outcome and severely disadvantage deprived areas across England.

The controversial algorithm, a key component of the £1.7 billion Active Travel Fund, was designed to rank local bids for cycling and walking infrastructure. However, the leaked Equality Impact Assessment, obtained by The Independent, concluded the formula would entrench existing inequality by systematically favouring affluent, predominantly white rural areas over ethnically diverse and poorer urban centres.

‘Blatant Discrimination’ Buried in the Data

Officials explicitly stated that the algorithm's core metrics—which heavily weighted historic cycling frequency and existing infrastructure—were fundamentally flawed. The report bluntly warned the model would "worsen existing inequalities" and have "a disproportionately negative impact on ethnic minority groups."

Despite these stark warnings, issued in November 2020, the DfT pressed ahead with the algorithm. The result was a funding distribution that critics have labelled blatant discrimination. Dozens of deprived boroughs in cities such as London, Birmingham, and Manchester saw their bids rejected, while wealthier, less populated counties received significant allocations.

Government in Damage Control Mode

In response to the leak, a government spokesperson offered a thin defence, stating the algorithm was merely "one of a number of inputs" used to make decisions. This justification directly contradicts the leaked report, in which officials explicitly acknowledged that the model was the primary determinant of funding.

This scandal raises serious questions about transparency and accountability within Whitehall. The fact that a major policy with such clear discriminatory potential was implemented against explicit expert advice points to a profound failure in the policymaking process.

A Pattern of Problematic Policies

This is not an isolated incident. The controversy echoes previous government algorithm scandals, most notably the A-level results fiasco of 2020, which also disproportionately affected students from disadvantaged backgrounds. It suggests a systemic problem within government: a reliance on techno-solutionism that ignores real-world societal structures and the biases inherent in historical data.

Campaigners and local MPs from affected constituencies are now demanding a full inquiry and an immediate review of the allocated funds. The leak forces an uncomfortable question: how many other departmental decisions are being made by flawed algorithms that perpetuate inequality under the guise of data-driven neutrality?