A small London law firm has reshaped the conversation over tech regulation in the UK over the course of just a few months.
Foxglove, a law firm started by Cori Crider and Martha Dark less than two years ago, has been instrumental in the government backtracking on the A-levels grading algorithm and abandoning the Home Office’s discriminatory visa algorithm, as well as preemptively halting a Covid-19 data sharing agreement between the NHS and Palantir, a shadowy surveillance firm used by US Immigration and Customs Enforcement (ICE).
“What [the government is] doing is taking that which involves politics and power and trying to hide behind a curtain of technical neutrality,” explains Crider. “What we’re trying to do with the cases is sort of like that moment at the end of the Wizard of Oz, where you pull back the curtain and you’re like, no, it’s not the computer, it’s the guy with a hand crank.”
‘In the name of efficiency, we are potentially going to sacrifice fairness and justice.’
Foxglove is part of a growing movement calling for more democratic accountability in the tech industry, with antitrust investigations ramping up in the US, and the European Union prepared to announce a major overhaul in the regulation of digital platforms.
The firm, which operates as a not-for-profit organisation, has done its most impactful work through investigations into algorithmic tools and data practices employed by governments. With a barebones staff and limited funds, Foxglove has been punching well above its weight with its challenges to government departments and multinational tech companies.
While algorithmic governance has become common practice internationally, these issues have become particularly pronounced in the UK, where a highly technocratic government has faced push-back from Foxglove and other specialised legal outfits leading high-profile cases. Working alongside the likes of Leigh Day and Matrix Chambers, and groups like the Good Law Project, Foxglove faces a government intent on embedding new technologies in everyday decision-making, particularly as part of a Covid-19 recovery strategy.
“Some special advisors and ministers are really driving and pushing forward the idea of ‘digital-by-default’ within [UK] government,” says Dark, “and some of them very clearly have the ear of the [prime minister].”
“The key thing to understand is that it isn’t just grades and isn’t just visas,” adds Crider. “This logic now is totally pervasive in government…in the name of efficiency, we are potentially going to sacrifice fairness and justice.”
‘Someone just needs to sue these people.’
Foxglove was started after Crider and Dark branched out from Reprieve, an international collective of lawyers, investigators and campaigners committed to fighting on behalf of those who have suffered extreme human rights abuses. During this time, the pair became increasingly concerned with injustices driven by new technology. “We felt that the unchecked power of huge technology platforms was the biggest next threat to society, social justice, and democracy,” explains Dark.
Crider, for her part, also cites an ongoing frustration with digital ethics conferences as a reason for the firm’s inception. Often headed up by executives and engineers in the tech industry, the conference discussions felt painfully out of touch. “Listening to these people bloviate, I just sat there and I thought: someone just needs to sue these people,” says Crider.
Debates over the use of algorithmic systems and surveillance technologies were “too consensus-driven” and cloistered within the tech industry, says Crider. In response, Foxglove is trying to democratise the process, involving those who are affected by the digital ethics issues in question, rather than leaving the decisions to those inside frosted glass meeting rooms and at gently moderated panel discussions.
‘Fuck the algorithm.’
There are few government scandals that could have been more effective at shifting the conversation around digital ethics than the A-levels algorithm.
Crider and Dark were coming off an investigation into a health data-sharing agreement between Palantir and the NHS when they came across a video of secondary school children protesting outside Whitehall. A response to the A-levels grading algorithm rolled out by the government following the cancellation of end-of-year exams, the protests – a mass of teenagers chanting “Fuck the algorithm” and “Fuck Eton” – marked an important turning point.
“It’s become kind of iconic,” says Crider. “For most people, the moment that you really feel it isn’t when you know about the listening, or even when the creepy ad follows you all over the internet, but when some information [that’s] taken from you has been used and weaponised towards an outcome that you don’t agree with.”
Foxglove set about investigating the statistical model used to grade A-level results, which had left students short of their university offer requirements and drastically downgraded marks with very little explanation. The real-world effects of the grading on students’ university admissions, and the clear biases against certain schools that played out in the algorithm’s outcomes, led to an uproar.
The firm launched a legal challenge on behalf of Ealing A-level student Curtis Parfitt-Ford and learnt that the algorithmic model had disproportionately impacted state school pupils, in part because of its use of previous national averages, the over-weighting of certain subjects (such as classics), and the way the algorithm’s designers had chosen to deal with small class sizes. The revelation prompted the government’s backtracking on the A-levels algorithm.
In achieving this win, Foxglove lawyers used a legal process called Judicial Review (JR) to gather more information about the A-levels grading process. Filing a JR with the government prompts it to provide legal justification for its decision-making.
“It’s much more useful than FOI at the moment, which the government feels perfectly able to ignore,” explains Crider. “[JRs] actually render an affirmative duty to disclose evidence.” In short, the onus is placed on the government to publicly lay out the technology it is using, instead of on activists to chase down the right documents, as is the case with FOIs. This strategy has been central to Foxglove’s success in interrogating Westminster’s algorithmic governance – despite worries from Crider and others that the government has been taking steps to limit JR with a set of constitutional reforms.
The government’s justification for the A-levels algorithm involved hundreds of pages of dense statistical workings. But what became known as the “racist algorithm” was in reality little more than a “fancy flowchart,” says Dark. Foxglove’s JR discovered that the visa application sorting system, which the Home Office had used since 2015, made immigration more difficult for applicants from certain countries, while offering what the firm deemed “speedy boarding for white people.”
‘Transparency is only the beginning of the conversation about justice.’
While Foxglove’s legal actions around algorithms are holding the UK government to account, Crider and Dark have also embarked on a series of ongoing legal battles against Big Tech. Inspired by actions like the 2018 Google employee walkout and “techlash” protests, Foxglove works closely with tech workers themselves, who are aware of how algorithms are being used on the inside. The firm has been part of a vital push against Facebook’s labour practices, working with the platform’s third-party content moderators, based mainly in Ireland, who have been fighting against unfair and often dangerous working conditions.
A broader suite of Foxglove’s legal actions has focused on tech companies’ harvesting of user data. Foxglove helped launch a class action case in October against the tech giant Alphabet, aimed at YouTube’s harvesting of children’s data for targeted advertising. Meanwhile, the firm announced last week that it would be representing two users of MuslimPro, a highly popular prayer app accused of selling location data to the US military without users’ consent.
But Crider and Dark are quick to emphasise that the technology sector’s lack of openness on questions of data harvesting is only one part of a much bigger problem. “I don’t want to say I hate transparency,” says Crider, “but… [it’s] the beginning of …the conversation about justice… not the end.”
‘I think the important thing is whether these systems should be used at all.’
While revelations from Edward Snowden’s NSA leaks started the conversation, Foxglove is expanding it, moving beyond a critique based in civil liberties and towards a shift in power that would allow for a fairer, more just and less intrusive design of technologies in the first place.
In the US, that discussion has been unfolding as part of investigations into the monopoly power of Facebook, Apple, Amazon and Alphabet. Crider and Dark are keen to see a spotlight pointed at the economic control wielded by the tech industry, but are wary of the limitations of the US antitrust regime – a system of laws focused on competition and economic control – in carrying out lasting change. Crider says that it’s the democratic foundations of the early 20th-century antitrust battles that remain the most promising in curbing the power of Big Tech.
The way governments use technology seems like a fitting place to start. The municipal governments of Amsterdam and Helsinki have both created public registries of the AI and algorithms used to make local government decisions. Foxglove is hoping its work can make the case for similar measures in the UK, so that technologies are subject to democratic consent before they are deployed, rather than becoming objects of scrutiny after the fact.
The firm is also hoping to expand its work to technology export controls. Much like the prohibition on arms sales to certain countries, there are currently regulations on UK exports of hacking and surveillance tools. However, the rules on selling these technologies to repressive regimes are often drastically under-enforced.
“There are laws on the books that would definitely suggest that some of the hacking and surveillance equipment that gets sold out of this country to, let’s say, the Philippines or the Egyptian regime, should never be sold, because they could never clear the human rights requirements. And yet still this stuff goes through.”
By calling into question the export of facial recognition and other forms of intrusive monitoring technology being produced in the UK, Foxglove would also be making a compelling argument against the use of these same technologies at home.
Crider and Dark are cautiously optimistic about changes to the tech regulatory landscape in the UK. Not only have they been able to rack up a series of legal victories this summer; issues of algorithmic governance have also become part of the public imagination.
“I think the important thing, in the long-term, is the question about whether these systems should be used at all, and on what terms,” says Dark, reflecting on Foxglove’s progress over the past few months.
The UK courts have become a crucial arena in the fight over governments’ use of technology. Foxglove’s wager is that a more democratic approach to data-collection and decision-making algorithms will win out.
Josh Gabert-Doyon is a writer and radio producer based in London.