Wales: Police secrecy in attempting to predict crime by racist profiling of communities
Amnesty’s new report ‘Automated Racism’ reveals dangerous discrimination in police prediction tools
All police forces in Wales attempt to predict crime by profiling communities
Black people more likely to be stopped and searched than white people in Wales, and six times more likely in Gwent
‘Police forces need to be clear and transparent about the systems they’re using and the impact they’re having’ – Glenn Page
A new report from Amnesty International UK, ‘Automated Racism – How police data and algorithms code discrimination into policing’, has exposed the grave dangers to society from ‘predictive policing’ systems and technology used by all police forces in Wales.
This is the first report to demonstrate how these systems are in flagrant breach of the UK’s national and international human rights obligations.
Amnesty found that all police forces in Wales have used geographic crime prediction, profiling, or risk prediction tools.
All police forces in Wales have confirmed their use of ‘hotspot mapping’, a risk assessment tool that targets people based on geographic area. South Wales Police also confirmed they are trialling individual risk assessment mapping, but their data and usage remain veiled in secrecy, raising concerns about human rights violations.
Black people are more likely than white people to be subjected to stop and search by police in every area of Wales, and more than six times more likely in Gwent.
Glenn Page, Amnesty International Wales Government & Political Relations Manager, said:
“Amnesty’s report reveals an alarming rise in the targeting of racialised communities and those from more deprived socio-economic areas, and raises serious concerns over the methods used and the lack of transparency by police forces in Wales.
“Police forces need to be clear and transparent about the systems they’re using and the impact they’re having.
“The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn’t there; the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores.
“These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background.
“These tools to ‘predict crime’ harm us all by treating entire communities as potential criminals, making society more racist and unfair.
“The UK Government must prohibit the use of these technologies. Right now, they can demand transparency on how these systems are being used. People and communities subjected to these systems must have the right to know about them and have meaningful routes to challenge policing decisions made using them."
There are two main types of racist predictive policing systems that raise several human rights concerns:
1. Location: these systems predict the likelihood of crimes being committed in geographic locations in the future. In all locations, the systems specifically targeted racialised communities. The vast majority of stops and searches in the UK – 69 per cent – lead to no further action.
2. Profiling: individuals are placed in a secret database and profiled as being at risk of committing certain crimes in the future. Areas with high populations of Black and racialised people are repeatedly targeted by police and therefore crop up in those same police records. Black and racialised people are likewise repeatedly targeted and therefore over-represented in police intelligence, stop-and-search, and other police records.
Recommendations
- A prohibition on predictive policing systems
- Transparency obligations on data-based and data-driven systems being used by authorities, including a publicly accessible register with details of systems used
- Accountability obligations, including a right and a clear forum to challenge a predictive, profiling, or similar decision, or the consequences arising from such a decision