
UK: Police forces ‘supercharging racism’ with crime-predicting tech – new report

Amnesty’s new report ‘Automated Racism’ reveals dangerous discrimination in police prediction tools

Almost three-quarters of police forces attempt to predict crime by racially profiling communities across the UK

‘These systems have been built with discriminatory data and serve only to supercharge racism’ – Sacha Deshmukh

A new 120-page report from Amnesty International UK, ‘Automated Racism – How police data and algorithms code discrimination into policing’, has exposed the grave dangers to society from ‘predictive policing’ systems and technology used across almost three-quarters of the UK’s police forces.

This is the first report to demonstrate how these systems are in flagrant breach of the UK’s national and international human rights obligations.

Amnesty found that at least 33 police forces across the UK – including the Met Police, West Midlands, Avon and Somerset, Greater Manchester and Essex police – have used predictive profiling or risk prediction systems. Of these, 32 have used geographic crime prediction, profiling, or risk prediction tools, and 11 have used individual prediction, profiling, or risk prediction tools.

Sacha Deshmukh, Chief Executive at Amnesty International UK, said:

“No matter our postcode or the colour of our skin, we all want our families and communities to live safely and thrive. 

“The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn’t there; the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores.

“These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background.

“These tools to ‘predict crime’ harm us all by treating entire communities as potential criminals, making society more racist and unfair.

“The UK Government must prohibit the use of these technologies across England and Wales, as should the devolved governments in Scotland and Northern Ireland. Right now, they can demand transparency on how these systems are being used. People and communities subjected to these systems must have the right to know about them and have meaningful routes to challenge policing decisions made using them.

“These systems have been built with discriminatory data and only serve to supercharge racism.”

There are two main types of racist predictive policing systems, both of which raise serious human rights concerns:

Location: these tools predict the likelihood of crimes being committed in particular geographic areas in the future. In every location studied, the systems specifically targeted racialised communities. The chair of the National Police Chiefs’ Council has publicly admitted that policing is ‘institutionally racist’. In the year ending March 2023 there were 24.5 stops and searches for every 1,000 Black people, 9.9 for every 1,000 people of mixed ethnicity, 8.5 for every 1,000 Asian people – and 5.9 for every 1,000 white people. Racialised people are over-represented in stop and search relative both to their share of the population and even to their appearance in police records of crime.

The vast majority of stops and searches in the UK – 69 per cent – lead to no further action.

Profiling: individuals are placed in a secret database and profiled as being at risk of committing certain crimes in the future.

Areas with large Black and racialised populations, such as London, the West Midlands and Manchester, are repeatedly targeted by police and therefore crop up disproportionately in police records. Black and racialised people are likewise over-represented in police intelligence, stop-and-search and other police records. Because prediction tools are trained on those same records, this biased data feeds straight back into where and whom the systems target, as the sketch below illustrates.
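This feedback loop can be made concrete with a toy simulation. The sketch below is a minimal illustration of the mechanism described above, not code from the report; all numbers and names are invented for the example. Two areas have identical underlying crime rates, but patrols are allocated according to recorded crime, and recorded crime depends on how many officers are present to record it:

```python
# Minimal illustration (not from the report) of the feedback loop:
# a 'hotspot' predictor trained on police records keeps sending patrols
# back to whichever area was already over-policed.

import random

random.seed(0)

TRUE_CRIME_RATE = [0.05, 0.05]  # both areas have the same underlying crime rate
patrols = [10, 30]              # area 1 starts out over-patrolled (historic bias)

for step in range(5):
    # Recorded crime depends on how many patrols are present to observe it,
    # not only on how much crime actually occurs.
    recorded = [
        sum(random.random() < TRUE_CRIME_RATE[area] for _ in range(patrols[area] * 20))
        for area in range(2)
    ]
    total = sum(recorded) or 1
    # The next allocation follows the records: more recorded crime, more patrols.
    patrols = [max(1, round(40 * recorded[area] / total)) for area in range(2)]
    print(f"step {step}: recorded={recorded} -> patrols={patrols}")

# Despite identical true crime rates, area 1's head start in police attention
# keeps it flagged as the 'hotspot' in every round.
```

The bias never corrects itself, because the only signal the allocation ever sees is the record of its own past activity.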

Forces using racist and failing systems

The Metropolitan Police Service’s Violence Harm Assessment profiles people based on intelligence reports about ‘suspects’ – an individual can be profiled without ever having offended or committed a crime.

An initial period of Risk Terrain Modelling-influenced policing targeted the north of the boroughs of Lambeth and Southwark from September 2020 onwards. Between December 2020 and October 2021, Lambeth had the second-highest volume of stop and search of all London boroughs. In the same period, people of ‘black ethnic appearance’ (as defined by the Metropolitan Police Service) had the highest rate of stop and search encounters per 1,000 population of any ethnic group: they were stopped and searched at more than four times the rate of people of white ethnic appearance, and 80 per cent of these stops and searches resulted in no further action. In the same period, Lambeth also had the second-highest volume of police uses of force of all London boroughs, and police used force most often against people recorded as ‘black or black British’.

In Southwark, in the year ending March 2021, Black people were stopped and searched at 3.3 times the rate of white people. Police used force against people in Southwark at least 8,924 times between September 2020 and September 2021, and in 45 per cent of those cases it was against ‘black or black British’ people. (p67)

West Midlands Police has deployed predictive crime mapping tools to predict knife crime and serious violence since 2021 and 2022, respectively. These tools have been funded by the Home Office Grip ‘hotspot’ policing programme and are part of West Midlands Police’s ‘Project Guardian’ team, which focuses on youth violence and knife crime. The system got its predictions wrong eight times out of ten.

Influenced by the knife crime prediction tool, West Midlands Police continues to conduct racial profiling and discriminatory policing. In the force area in 2024, white people were stopped and searched at a rate of 2.3 per 1,000, while Black or Black British people were stopped and searched at 10.3 per 1,000 – almost five times as often. (p44)
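For readers who want to check the arithmetic behind figures like ‘almost five times’, here is a minimal sketch (illustrative only; the function name is ours, the rates are those quoted above) that converts per-1,000 stop-and-search rates into disproportionality ratios:

```python
# Illustrative only: turn per-1,000 stop-and-search rates into
# disproportionality ratios relative to the white rate.

def disparity_ratio(group_rate: float, baseline_rate: float) -> float:
    """How many times the baseline rate a group is stopped at."""
    return group_rate / baseline_rate

# National rates, year ending March 2023 (stops per 1,000 people)
national = {"Black": 24.5, "Mixed": 9.9, "Asian": 8.5, "White": 5.9}
for group, rate in national.items():
    print(f"{group}: {disparity_ratio(rate, national['White']):.1f}x the white rate")

# West Midlands, 2024: 10.3 vs 2.3 per 1,000 -> ~4.5x ('almost five times')
print(f"West Midlands Black/white ratio: {disparity_ratio(10.3, 2.3):.1f}x")
```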

Essex Police’s Knife Crime and Violence Model uses data on ‘criminal associates’, criminalising people by association without any evidence of criminality. Its use of data on people’s mental health and drug use treats health issues as markers of criminality – in other words, people are criminalised for health problems. In the Essex Police force area in 2024, Black people were on average almost three times more likely to be stopped than white people, and in some areas of Essex as much as six or seven times more likely.

There is no conclusive evidence from the Essex Police pilot or subsequent studies of the implementation that the use of so-called hotspot mapping had any impact on crime. There is, however, evidence that the use of the system reinforced and contributed to racial profiling and racist policing. (p38)

Greater Manchester Police’s gang profiling is based on suspicion or even ‘perception’, without objective evidence of offending – or indeed any evidence of offending at all.

The disproportionate representation of Black and racialised people on the ‘gang profiling’ XCalibre database is discriminatory and evidences the racial profiling that XCalibre conducts. This police tactic is also a clear infringement of these young people’s right to freedom of association. It continues the targeting of Black cultural and music events, as with the Metropolitan Police’s Form 696, which required event spaces to provide details to the police about the type of music played and the ethnic background of attendees.

The Greater Manchester Police tactic of banning people from events in Manchester because they were perceived to be linked with gangs is one element of its so-called gang profiling. The XCalibre Task Force sought to exclude people from a cultural event based on its data-based profiling of their alleged involvement in gangs. (p91)

Human rights violations exposed

Racial profiling: The use of these systems by police results, directly and indirectly, in racial profiling and the disproportionate targeting of Black and racialised people and people from lower socio-economic backgrounds. This in turn leads to their increased criminalisation, punishment and exposure to violent policing.

No presumption of innocence: Predictive systems target people and groups before they have actually offended, which risks infringing the presumption of innocence and the right to a fair trial.

Mass surveillance: This is indiscriminate and can never be a proportionate interference with the rights to privacy, freedom of expression, and freedom of association and peaceful assembly.

Zara Manoehoetoe, Kids of Colour and Northern Police Monitoring Project, said:

“The way in which these systems work is that you’re guilty until you can prove yourself innocent. Criminalisation is a justification for their existence. There is the presumption that people need to be surveilled and that they need to be policed.”

Chilling effect 

People who live in areas targeted by predictive policing will seek to avoid them, producing a chilling effect. Participants in the Essex discussion group said that if police were targeting certain areas, they would avoid those areas.

Recommendations

  • A prohibition on predictive policing systems.
  • Transparency obligations for data-based and data-driven systems used by authorities, including a publicly accessible register with details of the systems in use.
  • Accountability obligations, including a right and a clear forum to challenge a predictive, profiling or similar decision, or consequences arising from such a decision.

Secrecy, scare tactics and surveillance – the view from those affected

An anonymous contributor to the report said:

“It’s not fair to over-police areas that have these challenges because of intentional underfunding, and to now [be] adding police to a situation that you’ve created as a part of the state system, is just adding to the problems of the community that you claim you want to protect.”

John Pegram, Bristol Copwatch, said:

“It doesn’t matter if you offended 13 or 14 years ago for something, you’re known to us for this, and therefore we’re going to assign a score to you. It’s risk scoring, it’s profiling, often racist profiling.”

Hope Chilokoa-Mullen from the 4Front Project said:

“We’ve had members who have been stopped and told: ‘You’ve been stopped because you’re on a database.’ They don’t know what database it is. I suppose that’s the point of it, you’re not really meant to know how it’s used.”

Another anonymous contributor said:

“It targets and profiles entire areas. It targets you based on the community you live in. It’s a clear example of how racism structures policing.”

See the full report here.
