DNA Phenotyping is the Latest Racist Addition to Policing Technology

Phrenology's sci fi cousin is back on the menu, boys.

by Sanjana Varghese

11 October 2022

DNA phenotyping is increasingly used by police forces in the West. Henry Romero/Reuters

Last week, the Edmonton Police Service (EPS) in Canada tweeted a 3D sketch of a young Black man it claimed was a suspect in a sexual assault investigation. The image, said the force, was the first it had produced with the help of DNA phenotyping.

First developed in the early 2000s, DNA phenotyping has been employed with increasing frequency in recent years by police forces across the US and Canada. Using DNA collected from a crime scene, phenotyping allows police to produce a “suspect profile” which, according to Parabon NanoLabs – the company that produced the Edmonton sketch – can predict traits such as eye and hair colour, face shape and ancestry.

Yet while the perpetrator Edmonton police seek remains at large, the person in the generic sketch generated by phenotyping – a young Black man – does not exist.

Unscientific suspicion. 

Police have been using DNA analysis since the 1980s, but phenotyping marks a clear departure from previous techniques. Usually, DNA retrieved from a crime scene is matched against DNA taken from a suspect, or against an existing DNA database. Both methods carry margins of error, but are relatively accurate.

DNA phenotyping is different: it uses genetic material to predict the physical traits of a hypothetical suspect rather than identifying an actual one. In a 2015 piece for the New Inquiry, information artist Heather Dewey-Hagborg details Parabon’s process, which, she explains, is premised “on the calculation of what is called genomic ancestry, friendlier terms for […] ‘the heritable component of race.’”

After assigning sex, DNA phenotyping calculates an individual’s percentages of four broad ‘ancestral’ types – African, east Asian, European and Native American – similar to the categories used by commercial DNA testing companies such as 23andMe.

This division, writes Dewey-Hagborg, “recapitulates the centuries-old racial categories of Caucasian, Mongoloid, and Negroid”.

One of the earliest uses of DNA phenotyping was by police in the Netherlands in 2002, after a young woman was murdered. At the time, the technique was unlawful in the country, but the public outcry the murder provoked led to its legalisation.

In EPS’s statement in response to the phenotyping incident, the force claimed it was following the example set by forces in Canada, the US and Sweden; in 2019, Parabon claimed the company had assisted authorities in all three countries with around 200 criminal cases. 

Science fiction.

For years, civil liberties advocates and researchers working in the area have pointed out that DNA phenotyping is closer to science fiction than fact. Yet facial recognition technology and other kinds of biometrics have become commonplace as part of policing, despite significant legal challenges and public discontent.

Investment in wide-scale surveillance, often in partnership with private companies and vendors that aren’t household names, is commonly sold as a fix for budget cuts – at least in the US and Canada – despite overarching trends towards increased police funding.

In a 2019 article for Slate on phenotyping, tech reporter Aaron Mak speculates that the difficulty of rolling out other forms of biometric policing may partly account for the success of phenotyping in the US: restrictions on police access to commercial DNA databases may have encouraged forces to employ phenotyping instead. DNA phenotyping may seem ‘legitimate’ – after all, it appears to draw on something immutable, DNA – and unlike predictive technology, it’s deployed after an incident has occurred.

But the hierarchies this technology reifies are part of the nexus of criminality that racialised communities already contend with at the hands of police and other state agencies. Techniques like DNA profiling and algorithmic policing lend credibility to flawed judgements – such as the 2020 wrongful arrest of Robert Julian-Borchak Williams after he was flagged by a facial recognition system – because they’re derived from what’s considered ‘value-neutral’ data.

Racism and police brutality within Canadian police forces are well known – the Edmonton Police Service has a long history of carrying out ‘starlight tours’, in which officers pick up Indigenous and First Nations people and abandon them on the outskirts of the city. Other police forces that have used Parabon’s phenotyping services, in Calgary and Saskatchewan, also have well-documented histories of systematically violating the rights of minoritised communities.

Phenotyping also provides additional justification for police attempts to intimidate and harass members of minoritised communities – officers can use composite sketches to pressure people into giving DNA samples they are under no obligation to provide, bringing ever larger swathes of the general population within the state’s reach.

Predictive policing technologies – many of which use geographical markers as proxies for race – discriminate, disproportionately leading to the over-policing of minoritised and racialised communities. Automatic number plate readers were packaged as a value-neutral technology, but there are indications that their use by a police force in London contributed to the death of Chris Kaba, whose family are still searching for answers. While predictive policing and number plate readers may seem dissimilar to DNA phenotyping, all of these technologies form part of the same system of surveillance of minoritised communities.

Where next? 

As more of these policing technologies become the subject of significant scrutiny and backlash, the parameters of what constitutes their lawful use keep shifting too. Once a methodology and its associated infrastructure are entrenched, it’s often very difficult to stop its spread, or even to call its viability into doubt.

In China, there are reports that the same kind of process is being used to track and identify members of minoritised communities, such as Uyghurs, through the forcible collection of blood samples. The underlying issue isn’t that Parabon NanoLabs is selling this technology to police forces, but that police forces are turning to these technologies in the first place.

EPS ended up apologising for its use of phenotyping in this instance. But this won’t be the last such case. On the off chance that Parabon NanoLabs goes out of business, there will be a host of companies ready to take its place. One unscientific, deeply flawed technology will be replaced by another, perhaps acting on a different set of biases or using different inputs, but contributing to the same vectors of oppression.

Sanjana Varghese is a writer, journalist and researcher based in London. 
