Palantir Extends Influence in UK with Access to Critical FCA Intelligence
The US artificial intelligence firm Palantir, co-founded by billionaire and former Donald Trump supporter Peter Thiel, has been granted a significant contract to access highly sensitive data from the Financial Conduct Authority (FCA) in the United Kingdom. This three-month trial arrangement, valued at over £30,000 per week, allows Palantir to analyse the regulator's extensive "data lake" as part of efforts to combat financial crimes such as fraud, money laundering, and insider trading.
Deepening Ties and Mounting Controversies
This latest deal marks a further expansion of Palantir's footprint within British governmental operations, adding to existing contracts exceeding £500 million with entities like the NHS, military, and police forces. The company will deploy its AI platform, Foundry, to process vast quantities of information, including case files marked as highly sensitive, reports on problematic firms, and consumer complaints lodged with the financial ombudsman. Data sources encompass phone call recordings, emails, and social media posts, raising immediate alarms among privacy advocates and campaign groups.
Internal and external concerns have surfaced regarding the ethical implications of this partnership. One source within the FCA questioned Palantir's reliability, asking, "Once Palantir understands how we detect money-laundering threats, how do we know that they are ethically reliable enough not to share that information?" This sentiment echoes broader criticisms from leftwing MPs in the House of Commons, who have labelled Palantir "highly questionable" and "ghastly" due to its involvement with the Israeli military and with the US Immigration and Customs Enforcement agency (ICE).
Balancing Innovation with Data Protection
Professor Michael Levi, a renowned expert on money laundering at Cardiff University, acknowledged the potential benefits of AI in tackling financial crime, noting a "serious under-exploitation" of regulatory data. However, he raised critical questions about whether Palantir's owners might inadvertently disclose detection methodologies to associates, emphasising the need for clear protocols governing data usage.
In response to these apprehensions, the FCA has outlined stringent safeguards. Palantir will operate as a "data processor" rather than a "data controller," meaning it can only act under direct instruction from the regulator. The FCA retains exclusive control over encryption keys for the most sensitive files, with all data hosted and stored within the UK. Upon contract completion, Palantir must destroy any data, and intellectual property derived from the analysis will remain with the FCA. Additionally, the regulator confirmed that Palantir cannot use the data to train its AI products.
Legal and Privacy Implications Under Scrutiny
Christopher Houssemayne du Boulay, a partner at Hickman & Rose specialising in financial crime defence, underscored the privacy risks involved. He explained that FCA investigations often compel firms to surrender extensive data, potentially including bank account details and personal information from innocent individuals. "If you ingest that data and use it to train an AI system, there are very significant privacy concerns," he stated, advocating for robust confidentiality measures.
Despite guidelines promoting the use of synthetic data in pilots, the FCA opted for real data to ensure a meaningful test, arguing that effective technology deployment is crucial in the fight against financial crime. A spokesperson for the FCA defended the procurement process, stating, "We ran a competitive procurement process and have strict controls in place to ensure data is protected." Palantir referred requests for comment to the FCA, maintaining its stance on respecting human rights and highlighting past contributions, such as scheduling additional NHS operations and aiding UK police in domestic violence cases.
As Palantir's role in the British state continues to grow, this contract with the FCA underscores the ongoing tension between leveraging advanced AI for public good and safeguarding individual privacy in an increasingly digital age.