Policing by Machine – Predictive Policing and the Threat to Our Rights

Policing by Machine – Predictive Policing and the Threat to Our Rights collates the results of 90 Freedom of Information requests sent to every police force in the UK, laying bare for the first time the full extent of biased ‘predictive policing’ – and how it threatens everyone’s rights and freedoms.

It reveals that 14 forces are using, have previously used or are planning to use opaque algorithms which ‘map’ future crime or predict who will commit crime or become a victim of it, drawing on biased police data.

The report exposes:

  • police algorithms entrenching pre-existing discrimination, directing officers to patrol areas which are already disproportionately over-policed
  • predictive policing programs which assess a person’s chances of victimisation, vulnerability, being reported missing or being the victim of domestic violence or a sexual offence, based on offensive profiling
  • a severe lack of transparency, with the public given very little information as to how predictive algorithms reach their decisions – and even the police themselves do not understand how the machines come to their conclusions
  • the significant risk of ‘automation bias’ – a human decision-maker simply deferring to the machine and accepting its indecipherable recommendation as correct.

Download Policing by Machine Report – February 2019 (PDF)