Facial recognition consultation: have your say
Posted on 30 Jan 2026
Many years on from the first use of facial recognition technology by police in the UK, and half a decade since the Court of Appeal ruled against South Wales Police in Liberty’s world-first legal challenge to the technology, the public is finally being given the chance to have its say on facial recognition.
For all this time, this powerful and dangerous technology has been in operation without any specific legislation to govern its use. We have had no dedicated law in place to set guardrails, implement safeguards, and keep us all safe.
Now the Government has announced plans to roll out live facial recognition across the whole country, with 40 new vans equipped to carry out deployments, up from the current fleet of ten. It is more important than ever that we get the regulation right.
What you can do
A consultation is now live, running until Thursday 12 February, and asking how you feel about this technology and what sort of law you think should be put in place to protect your rights. At the links below you can find:
- the consultation document
- an online tool for responding
- the main page with all of the details and extra documents for further reading, as well as an email address if you prefer to send the Home Office a document, rather than filling out the tool.
There are 17 questions, and you can answer as many or as few as you like.
We strongly urge anyone with a view on facial recognition to respond to the consultation in their own words, outlining how they feel and what safeguards they think should be put in place. This is a rare opportunity to influence the creation of these laws, and we should not pass it up.
How Liberty is responding
We at Liberty will be answering all of the questions in the consultation, basing our response around the principles that we have developed relating to the policing use of facial recognition technology. Below, we will outline some of them and show how they relate to the questions being asked.
If facial recognition is to be used, we need strict safeguards to protect people’s privacy
Question 6 of the consultation acknowledges the impact that facial recognition has on people’s privacy, outlining several factors that should be taken into account when assessing that interference, including questions of consent, the handling of data, and the size and nature of the database or watchlist.
These are all very important factors, but they must be translated into specific safeguards to protect people’s privacy under the framework. For example, advance notification of deployments must be made more meaningful than the current practice of a sign placed near a van and a social media post that morning. Where good practice exists, such as the instantaneous deletion of data that does not produce a match, it must be put into legislation. And ‘watchlists’ (the bank of people searched for at each deployment) must be bespoke, targeted, limited and transparent.
In December, Liberty Investigates found that hundreds of children as young as 12 have been included on watchlists for deployments. The police were not able to say exactly how many children had been included, or provide the reasons that they appeared on these lists.
Many members of the public may feel that the impact of live facial recognition technology on people’s privacy is so great that it should never be used. That is a legitimate view, but with the Home Office and the police committed to using it, we will be using the opportunity of the consultation to get the strongest safeguards possible in place to reduce that interference.
Facial recognition should never be used at protests
Question 7 asks how the new legal framework should protect other rights, specifically naming the right to freedom of expression and freedom of assembly. The question asks how facial recognition can be used in a “balanced way” at protests that is justified and proportionate. We do not believe there is a way.
Liberty’s position is that the use of facial recognition at protests would create such a ‘chilling effect’, dissuading people from exercising their rights to freedom of expression and assembly, that it should not be allowed.
Protest is the lifeblood of our democracy. Our freedoms of expression and assembly allow us to stand up for our other freedoms when they are threatened. We have seen many restrictions on protest these past few years – we should not allow another one.
Facial recognition should only be used where there is serious need
Question 8 asks if respondents agree that seriousness of harm should be a factor in deciding how and when law enforcement organisations can use facial recognition. We agree strongly that it should.
At present, facial recognition is used for a wide variety of reasons, often targeting people wanted for very low-level offences. We believe there should be a higher bar.
Under the EU’s AI Act for example, live facial recognition may only be used for:
- the targeted search for specific victims of abduction, human trafficking or sexual exploitation, or missing people
- the prevention of a specific, substantial and imminent threat to life, or a genuine threat of a terrorist attack
- the localisation or identification of a person suspected of having committed a serious criminal offence, listed in the Act.
If the Government and police are completely committed to the ongoing use of facial recognition technology, restricting its use in this way to the most serious circumstances appears to be a sensible compromise.
Using facial recognition should require independent authorisation
Questions 10 and 11 ask whether some uses of facial recognition should require more senior authorisation, and whether in some cases that authorisation should come from an independent body, as opposed to a policing figure. Under current practice, authorisation comes from an officer of at least the rank of superintendent.
We believe that all deployments of live facial recognition technology should require authorisation from an independent authority. The police should have to outline the details of their proposed deployment, the policing need for it, and how they will minimise the interference with people’s rights, and if authorisation is not forthcoming, the deployment should not take place.
As above, this mirrors a provision in the EU AI Act, which states that any use of live facial recognition by law enforcement shall be subject to a “prior authorisation granted by a judicial authority or an independent administrative authority whose decision is binding of the Member State in which the use is to take place”. The Act makes provision for a deployment to start without authorisation in a “duly justified situation of urgency”, but a request must then be made without delay, within 24 hours at the latest, and the deployment must be stopped if the request is denied.
We believe this should be replicated here in our new legal framework for the use of these technologies.
Rigorous, effective, meaningful oversight is necessary
Questions 14 and 15 ask about the creation of a new body to oversee law enforcement use of facial recognition and similar technologies. The consultation document outlines several functions such a body could undertake and asks what powers or obligations it should have.
Liberty believes the following factors are absolutely necessary in the creation of a new body:
- It must be truly independent of government and law enforcement
- It must be properly funded, and have adequate capacity, staffing and expertise to carry out its functions
- It must have full, free and immediate access to the information it needs to carry out its work without obstruction
- It must itself be fully transparent, publishing findings and ensuring accountability
- It must have the power to ensure compliance with the rules that it lays down, with enforceable sanctions where necessary.
Everything possible must be done to tackle bias and discrimination
The final two questions, 16 and 17, ask how to guard against bias and discrimination in the use of facial recognition, and whether – and if so, how – the new oversight body should address this.
We strongly agree that the body should do all it can to tackle bias and discrimination in the use of facial recognition technology.
Bias and discrimination remain significant problems in the use of facial recognition. The technology may improve over time, and setting these tools to a higher ‘threshold’ can help to reduce false positives, but this requires rules to be set and the police to follow them.
In December, the Home Office revealed that the retrospective facial recognition tool the police had been using to search the Police National Database was extremely racially disproportionate at certain settings, with false positive rates at 0.1% for white women and 9.9% for black women. Liberty Investigates found that the police not only knew about this, but actually lobbied to use the biased settings.
It is unlikely that bias will ever be fully removed from these systems, but the new body must set a high threshold for these tools and ensure that the police stick to it.
Your response
Above are just a few of the principles that will be guiding our response to the consultation. We have not covered everything or touched on every question, but we hope it will help you to think about your own response.
We have been calling for this opportunity for a long time, and it may be even longer before another comes along, so please do have your say. Whether you want to answer all of the questions, pick out one or two, or just send some thoughts through to the Home Office, we encourage you to let them know what you think.
Follow the links above to read the consultation document, find the tool to respond, and get your response in by 12 February.
This has been years in the making. Let’s not miss our chance.