Immigration judges across the United Kingdom are now using artificial intelligence to help draft legal rulings, after receiving official permission to use chatbots to review their decisions. The judges have undergone specialised training to operate a restricted version of Microsoft's Copilot tool, which assists with hearing preparation and the creation of skeleton judgments.
Record Backlog Strains Justice System
The British justice system is under unprecedented strain from a record backlog of immigration appeals, significantly hindering government efforts to deport individuals living in the country without legal authorisation. The number of asylum seekers challenging rejected claims has nearly doubled within the past year, reaching 104,400 cases.
Those engaged in the appeals process are permitted to remain in taxpayer-funded accommodation, which includes hotel placements, while their cases undergo review. This situation has created substantial pressure on immigration tribunals to expedite proceedings while maintaining judicial integrity.
Government Support for AI Integration
In February, a government advisor advocated for the implementation of artificial intelligence to better assess risks associated with releasing criminal offenders. Martyn Evans, who chairs the Sentencing and Penal Policy Commission, emphasised that AI should play a significant role within the criminal justice framework, potentially assisting judges in determining appropriate sentencing measures.
Justice Secretary David Lammy confirmed last year that the system was actively testing transcription technologies within courts and tribunals. He specifically noted that immigration and asylum chamber judges were already utilising AI to formulate notes and draft judicial remarks.
AI Applications in Judicial Proceedings
According to recent reports, training materials distributed to immigration judges encourage the application of artificial intelligence to generate comprehensive case outlines that summarise evidence presented by all parties involved. The technology can also produce bundle summaries that establish chronological timelines of events and delineate each side's legal arguments.
Furthermore, AI systems can compile lists of disputed issues and utilise this information to create decision templates that structure judicial rulings. In an instructional video, Lord Justice Dingemans, the senior president of tribunals, explained that judges could employ AI's decision-making tree functionality to summarise findings regarding anonymity requests, case backgrounds, witness statements, and legal arguments.
"All of that work is pre-done," Lord Justice Dingemans stated. "What that will do is mean that when you get to the hearing, you will be a better judge because you're completely on top of the issues."
Limitations and Responsibilities
Judges are expected to deliver their decisions within two weeks of hearings and have received explicit instructions that artificial intelligence must not be used for analysis or the balancing of evidence. Judges retain sole responsibility for their final judgments, though chatbots may review decisions against evidence summaries and legal submissions.
The technology can additionally provide commentary regarding how comprehensively decisions address matters raised in evidence and submissions, potentially identifying any omissions. HM Courts and Tribunals Service has clarified that AI will not contribute to analytical processes or the weighing of evidence and arguments presented during proceedings.
A spokesperson explained that chatbots might be utilised to convert judicial audio recordings into text format, with judges required to verify all transcriptions before official issuance. "HMCTS welcomes the appropriate use of artificial intelligence in supporting an efficient and effective courts and tribunals system," the spokesperson affirmed.
"However, while technology may assist in some legal work and associated administrative tasks, it cannot replace the pivotal judgement and responsibilities required to make decisions on cases."
Previous AI Controversies in Legal Practice
This development follows an incident in October involving an immigration barrister accused of employing artificial intelligence to prepare for an asylum case. The legal professional reportedly confused a judge by citing legal precedents that were either entirely fictitious or completely irrelevant to the matter at hand.
Chowdhury Rahman, an experienced immigration barrister, was found to have utilised software including ChatGPT for legal research preparation. The tribunal determined that he not only employed AI for his preparatory work but subsequently failed to conduct proper accuracy verification of the generated content.