Predictive policing

Liberty report exposes police forces’ use of discriminatory data to predict crime

Posted on 04 Feb 2019

At least 14 UK police forces have used or intend to use discriminatory computer programs to predict where crime will be committed and by whom, according to new research published today by Liberty.

The new report, “Policing by Machine”, collates the results of 90 Freedom of Information requests sent to every force in the UK, laying bare the full extent of biased ‘predictive policing’ for the first time – and how it threatens everyone’s rights and freedoms.

It reveals that 14 forces are using, have previously used or are planning to use shady algorithms which ‘map’ future crime or predict who will commit or be a victim of crime, using biased police data.

The report exposes:

  • police algorithms entrenching pre-existing discrimination, directing officers to patrol areas which are already disproportionately over-policed
  • predictive policing programs which assess a person’s chances of victimisation, vulnerability, being reported missing or being the victim of domestic violence or a sexual offence, based on offensive profiling
  • a severe lack of transparency, with the public given very little information about how predictive algorithms reach their decisions – and even the police do not understand how the machines come to their conclusions
  • the significant risk of ‘automation bias’ – a human decision-maker simply deferring to the machine and accepting its indecipherable recommendation as correct.

Hannah Couchman, Advocacy and Policy Officer for Liberty, said: “Predictive policing is sold as innovation, but the algorithms are driven by data already imbued with bias, firmly embedding discriminatory approaches in the system while adding a ‘neutral’ technological veneer that affords false legitimacy.

“Life-changing decisions are being made about us that are impossible to challenge. In a democracy which should value policing by consent, red lines must be drawn on how we want our communities to be policed.”

Biased machines

Predictive policing algorithms analyse troves of historical police data – but this data presents a misleading picture of crime due to biased policing practices.

The computer programs are not neutral. Some are even capable of machine learning, becoming more autonomous in their predictions and entrenching pre-existing inequalities, all while disguised as cost-effective innovation.

Predictive mapping programs

Predictive mapping programs use police data about past crimes to identify “hot spots” of high risk on a map. Police officers are then directed to patrol these areas – many of which will already be subject to policing interventions that are disproportionate to the level of crime in that area.
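
As a rough illustration of this mechanism and of how it can feed on itself, the toy sketch below ranks grid cells purely by previously recorded offences. The data and function names are invented for illustration only and do not describe any force’s actual software.

```python
from collections import Counter

# Toy data: (grid cell, offences recorded last month). These figures are
# invented; the recorded counts already reflect where officers were sent.
historical_records = [
    ("cell_A", 40),  # heavily patrolled area: more offences get recorded
    ("cell_B", 12),
    ("cell_C", 5),   # lightly patrolled area: offences go unrecorded
]

def predict_hotspots(records, top_n=1):
    """Rank grid cells by past recorded crime and flag the top ones."""
    counts = Counter(dict(records))
    return [cell for cell, _ in counts.most_common(top_n)]

# Officers are directed to the predicted hot spots...
print(predict_hotspots(historical_records))  # ['cell_A']

# ...where the extra patrols record extra offences, so next month's data
# ranks the same cells even higher: the prediction reinforces itself.
```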

The following police forces have used or are planning to use predictive mapping programs:

  • Avon and Somerset Police
  • Cheshire Constabulary
  • Dyfed Powys
  • Greater Manchester Police
  • Kent Police
  • Lancashire Constabulary
  • Merseyside Police
  • Metropolitan Police
  • Northamptonshire Police
  • Warwickshire and West Mercia Police
  • West Midlands Police
  • West Yorkshire Police

Individual risk assessment programs

Individual risk assessment programs predict how people will behave, including whether they are likely to commit – or even be victims of – certain crimes.

Durham Constabulary has used a program called the Harm Assessment Risk Tool (HART) since 2016. It uses machine learning to assess the likelihood of a person committing an offence, and is deliberately designed to err on the side of overestimating that risk.

HART bases its predictions on 34 pieces of data, including personal characteristics such as age, gender and postcode, which could encourage dangerous profiling. It has also considered factors such as “cramped houses” and “jobs with high turnover” when assessing the probability of a person committing a crime.
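
The report does not publish HART’s internal workings, but a “built to overestimate” design can be illustrated with a standard cost-sensitive decision threshold. Everything below, the cost values and the function alike, is a hypothetical sketch and not HART’s actual code.

```python
# Hypothetical sketch of a "cautious" risk classifier: the cost of
# missing a genuinely high-risk person is weighted more heavily than the
# cost of wrongly flagging a low-risk one. These costs are assumptions
# for illustration, not values taken from HART.
COST_FALSE_NEGATIVE = 5.0  # missing a true high-risk case
COST_FALSE_POSITIVE = 1.0  # wrongly labelling someone high risk

def classify(risk_probability: float) -> str:
    """Turn a model's probability estimate into a 'high'/'low' label.

    The cost-minimising threshold for a binary decision is
    C_FP / (C_FP + C_FN). With symmetric costs that is 0.5; skewing
    the costs pushes it down, so the tool over-predicts 'high' risk.
    """
    threshold = COST_FALSE_POSITIVE / (COST_FALSE_POSITIVE + COST_FALSE_NEGATIVE)
    return "high" if risk_probability >= threshold else "low"

# A person the model rates as only 20% likely to offend is still
# flagged high risk, because the threshold here is about 0.17.
print(classify(0.20))  # high
```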

Avon and Somerset Police’s risk assessment program even predicts the likelihood of a person perpetrating or suffering serious domestic violence or violent sexual offences.

Individual risk assessment programs are being used by:

  • Avon and Somerset Police
  • Durham Constabulary
  • West Midlands Police

Threat to rights

Like any system built on data drawn from society, predictive policing programs reflect pre-existing patterns of discrimination – further embedding them in policing practice.

Mapping programs direct officers to attend already over-policed areas, while individual risk assessment programs encourage an approach to policing based on discriminatory profiling – lending unwarranted legitimacy to these tactics.

Predictive algorithms also encourage reliance on ‘big data’ – the enormous quantities of personal information accumulated about everyone in the digital age – which is then analysed to make judgments about people’s character, violating their privacy rights.

This problem is compounded by the fact that the public – and the police – do not know how the programs arrive at a decision. This means they are not adequately overseen, and the public cannot hold them to account or properly challenge the predictions they make.

Recommendations

The report makes a number of recommendations, including:

  • Police forces in the UK must end their use of predictive mapping programs and individual risk assessment programs.
  • At the very least, police forces in the UK should fully disclose information about their use of predictive policing programs.
  • Where decision-making is informed by predictive policing programs or algorithms, this information needs to be communicated to those directly impacted by their use, and the public at large, in a transparent and accessible way.
  • Investment in digital solutions for policing should focus on developing programs that actively reduce biased approaches to policing. A human rights impact assessment should be developed in relation to new digital solutions, which should be rights-respecting by default and design.
