Police surveillance technology

The police are rolling out a range of sinister new technologies – and they’re doing it without proper trials, consultation or even law.

New police technologies are being “trialled” through active deployment, without consent from the public, leading to widespread national use before proper guidance can be developed. Many of these technologies threaten to interfere with our right to privacy.

It is often unclear which oversight bodies should lead in relation to new developments, leaving new technologies without proper scrutiny and open to abuse.

Liberty has also questioned the discriminatory impact of new policing technologies, which often rely on computer programs that exacerbate pre-existing inequalities.

Facial recognition

Facial recognition technology works by matching live images of people walking past special cameras with “probe images” on a watch list.

The watch list is put together by the police from the custody images database – which contains images of people who have come into contact with the police, including thousands of images of innocent people. Police may also draw watch list images from other sources.
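The matching step described above can be sketched in miniature. Real systems reduce each face image to a numeric "embedding" vector and compare the live embedding against those of the probe images on the watch list; the names, vectors and threshold below are invented for illustration and are not drawn from any actual police system.

```python
import math

def euclidean_distance(a, b):
    """Distance between two face embeddings; smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_watch_list(live_embedding, watch_list, threshold=0.6):
    """Return the closest watch list entry if it falls within the threshold.

    The threshold trades false alerts against missed matches - tightening it
    reduces wrongful flags but misses more genuine matches, and vice versa.
    """
    best_name, best_dist = None, float("inf")
    for name, probe_embedding in watch_list.items():
        d = euclidean_distance(live_embedding, probe_embedding)
        if d < best_dist:
            best_name, best_dist = name, d
    if best_dist <= threshold:
        return best_name, best_dist
    return None, best_dist

# Toy 3-dimensional embeddings (real systems use hundreds of dimensions).
watch_list = {
    "probe_A": [0.1, 0.9, 0.3],
    "probe_B": [0.8, 0.2, 0.5],
}
passerby = [0.12, 0.88, 0.31]

name, dist = match_against_watch_list(passerby, watch_list)
print(name)  # probe_A - close enough to trigger an alert
```

Note that everyone walking past the camera has an embedding computed and compared – the biometric capture happens whether or not a match is found, which is why the technique is indiscriminate by design.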

There is no law giving the police the power to use facial recognition, and no policy from the Home Office – leaving individual police forces to make it up as they go along.

Liberty has long raised concerns about the use of facial recognition in public spaces. It is a hugely disproportionate crime-fighting technique, violating the privacy of everyone within range of the cameras by capturing deeply personal biometric data.

This lawless technology is alarmingly inaccurate and biased – it is least accurate when it tries to identify black people and women.

Even if facial recognition systems become more accurate, this enormously invasive technology has no place on our streets.

Challenging facial recognition in the courts

In March this year, police deployed facial recognition technology at a protest for the first time.

Liberty is representing Ed Bridges, a local activist who attended the protest, as he mounts a legal challenge against South Wales Police. He says their indiscriminate use of facial recognition technology on our streets makes our privacy rights worthless and will force us all to alter our behaviour.

Find out more about Liberty’s legal challenge.

IMSI catchers

It’s not just our biometric information that is vulnerable to being monitored – it’s our private messages and phone calls too.

IMSI catchers are sinister pieces of tech that can locate every switched-on mobile phone at a protest or public event. They work by mimicking mobile phone towers, tricking nearby phones into connecting with them and revealing personal data.

They can also be used to intercept and monitor your calls and messages – and even change their content – without you ever knowing it was happening.

The police “neither confirm nor deny” that they use these devices – but the Metropolitan Police are known to have purchased the technology in 2008/2009, and other police forces have paid for it too.

Liberty is representing Privacy International in their appeal challenging police forces' refusal to disclose information on their purchase and use of IMSI catchers. Read more about the case.

Digital stop and search

Police need a warrant to search your home – but not your phone. The police are now using mobile phone “extraction technology”, which can download all the data from your phone – messages, photos, videos, contacts and visited webpages, even those that are encrypted or deleted.

This can be done without your consent or even your knowledge when your phone is handled by the police in the course of an investigation – even if you are a victim or a witness. They can store that data indefinitely.

Mobile fingerprint scanners

As technology develops, it becomes more portable – making it easier for police to use it on people when out and about, living their everyday lives. 

The latest technology to become mobile is fingerprint scanning – the Home Office announced in February 2018 that West Yorkshire Police will roll out a scheme letting officers armed with portable scanners check anyone’s fingerprints against both criminal and immigration databases. More recently, the Met Police have announced the development of their own version of mobile fingerprint scanners, which can also check immigration and criminal records databases.

These schemes are incredibly invasive, and there has been no discussion of consent, or of access to legal advice, for the people whose fingerprints are scanned.

This “pop-up” surveillance technology is dangerous, particularly for children and vulnerable adults. And it’s open to abuse – especially when the Government is so keen to create a hostile environment for migrants.

Predictive Policing

Predictive policing software uses algorithms – sets of instructions designed to perform specific tasks – to attempt to predict future criminal activity or behaviour.

There are deep concerns about the use of algorithms in the criminal justice system, which can exacerbate pre-existing inequalities and lead to the continued over-policing of certain communities. 

For example, a company called PredPol has developed technology which predicts where crimes are likely to take place. It directs officers to police these areas – often communities which are already over-policed, entrenching pre-existing inequalities. 
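The feedback loop described above can be made concrete with a minimal sketch. This is not PredPol's actual, proprietary algorithm – just a toy "hotspot" predictor that ranks map grid cells by historical incident counts and sends patrols to the top-ranked cells; the grid coordinates are invented.

```python
from collections import Counter

def predict_hotspots(incidents, top_n=2):
    """incidents: list of (grid_x, grid_y) cells where crimes were recorded.

    Returns the top_n most frequently recorded cells as patrol targets.
    Because the input is *recorded* crime, heavily patrolled areas generate
    more records and therefore rank higher - a self-reinforcing loop.
    """
    counts = Counter(incidents)
    return [cell for cell, _ in counts.most_common(top_n)]

# Recorded incidents cluster in two cells - partly because those areas
# were already patrolled more heavily, so more incidents were observed there.
history = [(1, 1), (1, 1), (1, 1), (4, 2), (4, 2), (0, 3)]
print(predict_hotspots(history))  # [(1, 1), (4, 2)]
```

Sending more officers to cells (1, 1) and (4, 2) produces more recorded incidents in exactly those cells, which raises their rank in the next prediction round – the system confirms its own forecasts regardless of where crime actually happens.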

Another example of predictive policing is the Harm Assessment Risk Tool (HART) used by Durham Police. This programme predicts the risk presented by a detained person – based on information such as their postcode and financial data the police purchased from Experian. These predictions are then used to make decisions about the detained person – for example, whether they are referred to a rehabilitation programme instead of being prosecuted formally.
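A toy version of this kind of scoring shows why postcode and financial inputs are troubling. This is emphatically not the real HART model (a proprietary machine-learning classifier); every feature name, weight and threshold below is invented to illustrate how such inputs act as proxies for poverty and place rather than behaviour.

```python
def risk_score(person):
    """Higher score = higher predicted 'risk'.

    Postcode and financial indicators are proxies for where someone lives
    and how poor they are - not for anything they have done.
    """
    score = 0.0
    score += 2.0 * person["prior_arrests"]
    score += 1.5 if person["postcode_flagged"] else 0.0   # area-based proxy
    score += 1.0 if person["low_credit_score"] else 0.0   # wealth-based proxy
    return score

def recommend(person, threshold=2.5):
    """Divert to rehabilitation only if the score is below the threshold."""
    return "rehabilitation" if risk_score(person) < threshold else "prosecute"

# Two people with identical records, differing only in proxy attributes.
better_off = {"prior_arrests": 1, "postcode_flagged": False, "low_credit_score": False}
poorer_area = {"prior_arrests": 1, "postcode_flagged": True, "low_credit_score": True}

print(recommend(better_off))   # rehabilitation (score 2.0)
print(recommend(poorer_area))  # prosecute (score 4.5)
```

Two people with the same record get different outcomes purely because of where they live and their financial history – and when the real model's weights are a trade secret, neither person can see or challenge why.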

These tools are often owned by a company, which means that the precise way that they work is kept secret. This makes it very difficult – if not impossible – to hold decision-makers in the criminal justice system to account.