Automated Policing Helped Kill Chris Kaba

Flawed data + human prejudice = deadly violence.

by Griff Ferris

14 September 2022

Stormzy speaks as Black Lives Matter protesters march from Parliament Square to New Scotland Yard in central London, demanding justice for Chris Kaba, 24, who was shot dead by police in south London last week. Maja Smiejkowska/Reuters

Many details regarding last week’s police shooting of Chris Kaba, a young musician and expectant father, are still unknown or uncertain. 

But what we do know is this: Kaba was driving in south London. The car he was in was identified by a police Automated Number Plate Recognition (ANPR) camera, which alerted officers that the vehicle was linked to a previous “firearms incident” – despite the car not being registered to Kaba. Officers then rammed and blocked Kaba’s car, and killed him, in what looks close to an execution – a single shot through the windscreen. He was unarmed.

The brutality of Kaba’s death is another chilling and enraging example of racist state violence, in the same vein as Jermaine Baker, Anthony Grainger and Mark Duggan, all of whom were shot and killed by police while unarmed. But the killing of Chris Kaba is also a consequence of expanded automated policing. This authoritarian and dangerous new data-driven strand of policing is enacted through three major sites: surveillance, unnervingly detailed databases, and shadowy algorithms that predict or assess people’s ‘risk’ of criminality.

Combined, these create an expansive and flawed system used to exert power and control over people, especially minoritised communities, positioned as ‘threats’ to the state.

Surveillance state.

ANPR cameras are just one of the surveillance tools used in automated policing. There are around 13,000 across the UK, mostly located on roads and motorways, recording every car that passes, creating 60 million records every day. Due to the breadth of the network, this data also functions as a meticulous record of people’s movements; cars can be tracked across the country, either in real time or retrospectively. Many police vehicles also carry mounted ANPR cameras, checking every licence plate around them. Once a plate is captured, it’s checked against police databases to see what records are associated with that vehicle, including links to previous offences.
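To make the mechanism concrete, here is a minimal, purely hypothetical sketch – written in Python, with invented names and data rather than any force’s real system – of how an ANPR hit of this kind is generated: the captured plate is looked up in a database of vehicle ‘markers’, and an alert fires if any marker is attached to that registration, regardless of who is driving or whether the underlying intelligence was ever corroborated.

```python
# Hypothetical illustration only - not any police force's actual system.
# All names, fields and data here are invented.

from dataclasses import dataclass
from typing import Optional


@dataclass
class VehicleMarker:
    plate: str
    marker: str         # e.g. "linked to previous firearms incident"
    corroborated: bool  # whether the intelligence behind it was ever verified


# Toy 'database' of markers keyed by registration plate.
MARKERS = {
    "AB12CDE": VehicleMarker("AB12CDE", "linked to previous firearms incident", False),
}


def check_plate(plate: str) -> Optional[str]:
    """Return an alert if the captured plate carries any marker.

    Note what the lookup never considers: who is actually driving,
    how old the intelligence is, or whether it was ever corroborated.
    """
    record = MARKERS.get(plate)
    if record is None:
        return None
    return f"ALERT: vehicle {plate} - {record.marker}"


if __name__ == "__main__":
    # The alert fires on the registration alone; the driver is irrelevant.
    print(check_plate("AB12CDE"))
```

The point of the sketch is only that the alert is a property of the registration, not of the person behind the wheel.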

Similarly, police are increasingly using automated facial recognition surveillance in public places, scanning and checking people’s faces against police databases – targeting not just ‘wanted’ people, but also individuals with mental health issues and protesters. Police are also seeking to use facial recognition retrospectively, which brings into play Britain’s vast CCTV network.

Apart from being inherently invasive, giving huge and untrammelled power to the state, facial recognition surveillance systems don’t work. They misidentify Black people and people of colour at significantly higher rates than white people – resulting in wrongful stops, ID checks, questioning and searches, and widening the net of justification for police intervention and unwarranted criminalisation.

This form of automated policing does not reduce human fallibility; often, it exacerbates it. In Chris Kaba’s case, a police surveillance system generated an automated alert telling officers that the car he was driving was linked to a firearms offence, so they took it as fact that he was armed and dangerous. In addition to the officers’ own prejudices, the information provided by the automated system will have influenced every interaction that followed, up to Kaba’s killing. They saw a car marked as ‘dangerous’ and a young Black man – racialised as a threat – driving it. Instead of de-escalating or investigating further, they used the data they’d received as justification for the extreme violence that followed.

Dangerous databases.

Interlinked with these surveillance systems are the vast databases that police and state authorities retain.

Police databases hold millions of records about people, including not just details of convictions and cautions but images, addresses, linked vehicles, other property, and more. Perhaps most dangerous of all is police ‘intelligence’ – often uncorroborated information about people’s alleged involvement in crime or other activity. Under the new Policing Act, providers of education, housing, welfare and healthcare are also obliged to share people’s information with the police.

These databases are inherently biased, especially along race, class and gender lines: the data and records held do not represent an accurate record of criminality, as defined by the state, but are instead a document of the crimes, locations and groups that are most policed and criminalised within society. The data reflects the structural biases and inequalities of the UK, where, for example, Black people are disproportionately criminalised compared with white people on every metric: stop and search, arrest, prosecution, imprisonment and more.

The best – or worst – example is the Metropolitan police’s ‘gangs matrix’: a deeply racist database of alleged ‘gang’ activity, developed to address the perceived causes of the 2011 London riots. The database contains details on several thousand people believed to be involved, or ‘at risk’ of involvement, in ‘gang’ activity. 80% of those on the database are young Black men, many of whom are themselves victims of violence.

Populated by uncorroborated hearsay ‘intelligence’ from police, referencing social media and YouTube videos allegedly showing ‘gang’ activity or affiliation, the gangs matrix is data-based oppression that criminalises friendships between young Black men. Those on the database face constant monitoring, stop and search, exclusion from school, eviction of themselves and their families, immigration control, and the denial of welfare and employment. In Manchester, police have run a similar ‘gang’ initiative, XCalibre, and recently tried to ban groups of young Black men from the local carnival.

Violent algorithms. 

The final element in this oppressive nexus is the set of algorithms used by police in the UK and across Europe to ‘analyse’ this data – finding links, patterns and correlations, often based on prejudicial stereotypes and associations, and producing ‘predictions’ or assessments of people’s ‘risk’ of future criminality.

The young men on the gangs matrix are given ‘risk’ scores for their likelihood of committing gang violence. For years, Durham police ran an automated system assessing people’s risk of future criminality, using almost unbelievably racist data profiles and stereotypes such as “Asian Heritage”. West Midlands police are developing predictive algorithms to assess “serious violence”, despite a previous iteration being flawed and inaccurate. Similar systems are also used to decide which locations to patrol – leading to the same areas being aggressively policed over and over – as well as to influence sentences and decide when people can be released from prison.
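As a rough illustration of how such ‘risk’ scores inherit bias – again a hypothetical sketch, not the gangs matrix, Durham’s or West Midlands police’s actual models, with every feature, weight and threshold invented – consider a weighted sum over features that are themselves products of past policing decisions:

```python
# Hypothetical illustration only - not any real police scoring system.
# Features, weights and thresholds are invented to show the feedback loop.

FEATURE_WEIGHTS = {
    "prior_stop_and_searches": 0.4,      # reflects where police already patrol
    "police_intelligence_reports": 0.3,  # often uncorroborated hearsay
    "lives_in_flagged_postcode": 0.2,    # a proxy for race and class
    "social_media_associations": 0.1,    # e.g. appearing in a video with friends
}


def risk_score(person: dict) -> float:
    """Weighted sum of 'risk' features.

    Every input is a record of past police attention, so the score largely
    measures exposure to policing rather than any propensity for violence.
    """
    return sum(weight * person.get(feature, 0.0)
               for feature, weight in FEATURE_WEIGHTS.items())


def flag_for_monitoring(people: list, threshold: float = 2.0) -> list:
    """People above the threshold face more stops and surveillance - which
    raises their 'prior_stop_and_searches' count, and with it their next
    score: the self-reinforcing loop described above."""
    return [p for p in people if risk_score(p) > threshold]


if __name__ == "__main__":
    person = {
        "prior_stop_and_searches": 5,   # five past stops, none leading to charge
        "police_intelligence_reports": 1,
        "lives_in_flagged_postcode": 1,
        "social_media_associations": 2,
    }
    print(round(risk_score(person), 2))   # 2.7 - over the invented threshold
    print(flag_for_monitoring([person]))  # flagged for further attention
```

Because the highest-weighted input here is a count of past stops, anyone flagged and stopped more often scores higher next time: the model recycles police attention as ‘risk’.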

In this way, the structural racism and other prejudices reflected in policing and criminal justice data are automated. The likelihood of criminality is assessed via automatic, digitised stigmatising, stereotyping and profiling, which is then taken as ‘fact’ and acted upon by officers on the ground – a never-ending loop of discrimination.

The structural element is compounded by the very deliberate focus of these databases and algorithms on low-level street crime, property crime and racist notions of ‘gangs’.

There is a growing movement to ban data-driven predictive and profiling systems in Europe, as well as surveillance systems like facial recognition. Until those campaigns succeed, these automated systems and their pronouncements of ‘risk’ will continue to be used, their automated alerts and analysis taken as absolute and objective. They provide the suspicion, the reasonable cause and the justification for police intervention: to question someone, to stop and search, to arrest – and now, apparently, to shoot and kill.

This data-driven policing is reinforcing the criminalisation of minoritised people. The police are already an institutionally racist organisation – the biggest and most violent gang in any city or town in the country – but these surveillance tools, databases and algorithmic profiling systems reflect and exacerbate that racism and oppression, often with violent consequences. For Chris Kaba, they were deadly.

Griff Ferris researches and campaigns against the use and abuse of technology and data in policing and criminal justice.

Build people-powered media.

We’re up against huge power and influence. Our supporters keep us entirely free to access. We don’t have any ad partnerships or sponsored content.

Donate one hour’s wage per month—or whatever you can afford—today.
