UK officials uneasy about data safety with tech partner Palantir
John Kjorstad
Analysts say it is nearly impossible to verify whether Palantir is complying with the terms of its NHS contract. The trust built through a nominal £1 deal during the pandemic has eroded further since the company published a controversial 22-point manifesto, and concerns are mounting that a defence contractor with militarised values may not be suited to manage the UK's most sensitive patient data.
London, United Kingdom – Trust, once lost, is hard to regain. For Palantir Technologies, the top U.S. defence and intelligence software firm, the trust it built in the UK through a £1 contract with the National Health Service (NHS) during the COVID-19 pandemic in March 2020 – later evolving into a six-year, nearly £400 million ($546 million) relationship – is now severely eroding.
Much of the reason stems from Palantir's own conduct. Recently, the company's X account posted a 22-point manifesto calling for compulsory military service and the development of “AI weapons,” sparking concerns among critics about whether a firm with such militarised values is fit to handle the most sensitive patient data.
“Palantir is essentially seen as a defence contractor,” said Duncan McCann, head of technology and data at the Good Law Project. “If they were just in that space, people might accept it. But a defence company has different core values from a health organisation like the NHS, and that's where the concern arises.”
Opposition to the flagship £330 million ($450 million) Federated Data Platform (FDP), which Palantir operates for the NHS, has shifted from a niche worry to a serious governance issue for NHS England and the UK government. Officials are now considering ending the contract in 2027.
On Monday, Palantir faced further scrutiny after the Financial Times reported, based on an internal document, that NHS England had allowed Palantir employees “unrestricted access” to patient data.
Palantir's roots lie in defence. Its Gotham platform is used by intelligence, military and police communities worldwide. Foundry – the firm's civilian solution – is the foundation for the NHS's FDP. Despite the different names, a 2020 review by Privacy International and No Tech For Tyrants showed the two systems share the same “Palantir DNA”. This shared architecture is at the heart of governance problems that critics say have never been adequately addressed.
According to NHS England, Palantir “only operates under the direction of the NHS when processing data on the platform” and “will not own or be allowed to access, use or share data for its own purposes.” Palantir insists it “does not use patient data or any NHS data for its own purposes. Palantir only acts as a data processor under the guidance of the NHS.”
Charles Carlson, a Palantir UK representative, told Al Jazeera: “When verifying, auditors assess our controls and compliance, and we undergo many audits.” He noted that “clients themselves, with support from the NCSC [National Cyber Security Centre], also carry out their own validation.”
Even though audits may show Palantir meets industry standards for data protection, observers question the level of real compliance. “We won't know if Palantir is doing anything nefarious with NHS data,” said Eerke Boiten, professor of cybersecurity at De Montfort University. “But that's also the case with Microsoft, Google and other US tech firms providing IT solutions to the NHS or anyone else.” Boiten advocates “technical realism” and argues these companies are so large and their products so complex that clients must trust them not to exploit the situation.
Legal pressure from the Good Law Project forced NHS England to release a less redacted version of the FDP contract, but according to McCann, about 100 pages remain withheld. These pages directly concern the method of anonymising patient data before it enters the platform – the sole element of the data protection framework in the contract that the public and experts cannot scrutinise.
Analysts broadly agree that the FDP's goals are sound and that alternatives exist: Greater Manchester's health coordination board spent six years building its own analytics platform without Palantir. The question is not whether the NHS can manage data effectively, but whether it needs Palantir to do so.
“Palantir's political leanings, as expressed in its language, make them a potential security risk,” Boiten said. A less discussed risk is data aggregation. Palantir's Foundry platform supports contracts with at least ten UK government agencies, but the company denies any possibility of merging these datasets. “Every client interaction is contractually, operationally and technically separate,” Carlson said, stressing that unauthorised data sharing is illegal.
Two senior systems engineers from the UK Ministry of Defence have warned that by aggregating data across government datasets, Palantir could derive top-secret information from unclassified sources. Sarah Simms, senior policy expert at Privacy International, argues this risk is borne out by the company's conduct abroad. “Trust is fundamental in health care delivery and the NHS,” she said. “People need to believe their data is handled safely and ethically. If not, the consequences could be devastating for healthcare for everyone.”