
“Not a fool-proof system”: Facial recognition in action

Posted by Hannah Couchman on 29 Jun 2018

What we saw in Stratford when police were using privacy-abusing automated facial recognition tech.

A police officer stands in front of a van fitted with facial recognition cameras

As the afternoon sun beat down on East London on Thursday, the people of Stratford were being watched.

Two CCTV cameras had been placed on the bridge linking the train station to Westfield Shopping Centre. But these cameras were a little different. They were loaded with facial recognition technology to identify members of the public.

This was the latest Metropolitan Police deployment of privacy-abusing automated facial recognition tech on the capital’s streets. Liberty was invited to observe. We were not impressed.

Computers making decisions

Facial recognition technology works by matching live images of people walking past the cameras with “probe images” on a watch list put together by the police. On Thursday, the probe images were of people wanted by the police or the courts, taken from the custody images database. But images can be taken from anywhere, including social media. In other instances, police have targeted people suspected of having mental health issues.
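For readers curious about the mechanics, the sketch below illustrates the general idea behind this kind of watch-list matching: a live face is reduced to a numerical “embedding” and compared against the embeddings of the probe images, with an alert raised when similarity clears a threshold. This is a minimal illustration using assumed names and values, not the Met’s actual system.

```python
# A minimal sketch of watch-list matching - NOT the Met's actual system.
# Assumes faces have already been converted to fixed-length "embedding"
# vectors by some face-recognition model; all names here are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(live_face, watchlist, threshold=0.6):
    """Return the best watch-list entry if it clears the threshold, else None.

    watchlist maps a person's ID to their probe-image embedding.
    The threshold trades false alerts against missed targets: lower it
    and more passers-by are wrongly flagged; raise it and real targets
    slip through unnoticed.
    """
    best_id, best_score = None, -1.0
    for person_id, probe_embedding in watchlist.items():
        score = cosine_similarity(live_face, probe_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else None

# Example: random 128-dimensional embeddings standing in for probe images.
rng = np.random.default_rng(0)
watchlist = {f"probe_{i}": rng.normal(size=128) for i in range(3)}
passer_by = rng.normal(size=128)
print(match_against_watchlist(passer_by, watchlist))
```

Everything hangs on where that threshold is set, and on how well the underlying model copes with crowds, angles and lighting. As Thursday showed, a match score is not a judgment a human would make.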

Facial recognition risks taking the decision of who is stopped by police out of human hands and giving it over to computer algorithms. In a busy environment with moving crowds, officers might apprehend first and check the accuracy of the match later.

We were told this would never happen – that human intervention was key, and an officer would always check the images to confirm the match.

This was not the case.

Instead, officers and security staff spoke of the need to move quickly. When a match alert came up on the computer – a young black man who bore little resemblance to the probe image from the watch list – officers were immediately radioed with a description, and we watched as the man was stopped and searched, his ID checked.

The ordeal went on for some time while they radioed back and forth, eventually concluding he was not known to the police. Afterwards, he expressed his frustration: he had done nothing wrong and his time had been wasted. Officers admitted that the person apprehended was clearly not the gentleman in the probe image – it was not, as one noted, “a fool-proof system”.

A covert operation?

The Met had been keen to emphasise that this deployment was overt – it was not intended to be secret or to catch people out. Although the operation made for an intimidating scene with a line of police officers and dogs alongside a knife-arch – a sort of walkthrough metal detector – there was actually alarmingly little information about the use of facial recognition technology.

Having been told there would be plenty of posters and information leaflets, we saw just two small posters, positioned below people’s sightlines, and a single leaflet being given out – to the man who was incorrectly apprehended, after the fact.

Not everyone was asked to go through the knife-arch, but the police made clear that anyone deliberately avoiding it would be monitored. And because the cameras stood right next to the arch, there was no way of swerving them – and keeping your face from being scanned – without being treated with suspicion and subjected to a police interaction.

“Misinformation”

The police believe the public have been “misinformed” about the accuracy of their tech, insisting it is highly accurate at correctly identifying people on the watch list. But what matters to the people stopped is how often an alert is wrong – and 98 per cent of “matches” are false. The two claims are not contradictory: a system can flag every genuine target and still be wrong almost every time it raises an alert, simply because the overwhelming majority of faces scanned belong to people who are not on the list.
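A quick back-of-the-envelope calculation makes the point. The numbers below are assumptions chosen for illustration, not the Met’s actual operating figures:

```python
# Illustrative arithmetic only - the figures below are assumed, not the
# Met's real numbers. They show how a "highly accurate" system can still
# produce mostly false matches when genuine targets are rare in a crowd.
faces_scanned = 10_000     # passers-by scanned in an afternoon (assumed)
people_on_list = 5         # genuine watch-list targets in the crowd (assumed)
hit_rate = 1.0             # system flags every real target (generous)
false_alert_rate = 0.025   # 2.5% of innocent faces wrongly flagged (assumed)

true_alerts = people_on_list * hit_rate
false_alerts = (faces_scanned - people_on_list) * false_alert_rate
share_false = false_alerts / (true_alerts + false_alerts)

print(f"{true_alerts:.0f} true alerts, {false_alerts:.0f} false alerts")
print(f"{share_false:.0%} of all 'matches' point at the wrong person")
```

With those assumed figures, even a system that never misses a real target generates roughly 250 false alerts for every five true ones – about 98 per cent of “matches” pointing at the wrong person.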

It’s important to note that the Met’s figures on successful matches include alerts triggered by their own officers on the scene, whose pictures are added to the watch list and who are encouraged to walk past the cameras to test accuracy at different walking speeds.

We observed only one alert relating to a non-officer – the gentleman who was mistakenly stopped, searched and then kindly leafleted.

Privacy invasion

The police’s creeping rollout of facial recognition is not authorised by any law, guided by any official policy or scrutinised by any independent body. But even if it were, that couldn’t allay concerns about the enormous invasion of privacy the technology’s use represents.

Every person in range of a facial recognition camera will have their face scanned and their personal biometric information stored so they can be watched, monitored and judged. It is likely to have a chilling effect on where we go and who we spend time with – encouraging us to self-police when we attend a football match, go to a concert or even pop to the shops.

The Met aren’t alone in using facial recognition. Liberty is taking South Wales Police to court to end its ongoing deployments – a case which will have repercussions for forces up and down the country.

Thursday’s operation proved beyond doubt this sort of surveillance technology has no place on our streets.
