MPs Call To Stop Police Facial Recognition

Following criticism of police use of facial recognition technology over privacy, accuracy, bias, and the management of the image database, the House of Commons Science and Technology Committee has called for a temporary halt to the use of facial recognition systems.

Database Concerns

Some of the key concerns of the committee were that the police database of custody images is not being correctly edited to remove pictures of unconvicted individuals, and that innocent people's pictures may be illegally included in facial recognition “watch lists” that are used by police to stop and even arrest suspects.

While the committee accepts that this may be partly due to a lack of resources for manually editing the database, it has also expressed concern that the images of unconvicted individuals are not being removed after six years, as the law requires.

Figures indicate that, as of February last year, there were 12.5 million images available to facial recognition searches.

Accuracy

Accuracy has long been a concern with facial recognition. For example, in December last year, ICO head Elizabeth Denham launched a formal investigation into how police forces use facial recognition technology (FRT) following high failure rates, misidentifications, and worries about legality, bias, and privacy. The trial of ‘real-time’ facial recognition technology by South Wales and Gwent Police on Champions League final day in Cardiff in June 2017, for example, was criticised for costing £177,000 yet resulting in only one arrest, of a local man, on a matter unconnected to the event.

Also, after trials of FRT at the 2016 and 2017 Notting Hill Carnivals, the police faced criticism that FRT was ineffective, racially discriminatory, and confused men with women.

Bias

In addition to gender-bias issues, the committee also expressed concern that a government advisory group had warned (in February) that facial recognition systems could produce inaccurate results if they had not been trained on a sufficiently diverse range of data, such as faces from different ethnic groups, e.g. Black, Asian, and other ethnic minorities. The concern was that if faces from some ethnic groups are under-represented in live facial recognition training datasets, the system could make more errors for those groups. Compounding this, human operators/police officers, who are supposed to double-check any match made by the system by other means before acting, could simply defer to the algorithm’s decision without doing so.

Privacy

Privacy groups such as Liberty (which is awaiting a ruling on its challenge to South Wales Police’s use of the technology) and Big Brother Watch have been vocal and active in highlighting the possible threats posed to privacy by police use of facial recognition technology. Even Tony Porter, the Surveillance Camera Commissioner, has criticised trials by London’s Metropolitan Police over privacy and freedom concerns.

Moratorium

The committee of MPs has therefore called for the government to temporarily halt the use of facial recognition technology by police pending the introduction of a proper legal framework, guidance on trial protocols, and the establishment of an oversight and evaluation system.

What Does This Mean For Your Business?

Businesses use CCTV for monitoring and security purposes, and most are aware of the privacy and legal compliance (GDPR) aspects of using such systems and of how and where the images are managed and stored.

As a society, we are also used to being under surveillance by CCTV systems, which can have real value in helping to deter criminal activity, locate and catch perpetrators, and provide evidence for arrests and trials. The Home Office has noted that there is general public support for live facial recognition where it is used, for example, to identify potential terrorists and people wanted for serious violent crimes. These, however, are not the reasons why the MPs’ committee has expressed its concerns, or why ICO head Elizabeth Denham launched a formal investigation into how police forces use FRT.

It is likely that while businesses would support the crime-fighting and terrorism-prevention aspects of police use of FRT, they would also need to feel assured that the correct legal framework and evaluation system are in place to protect the rights of all and to ensure that the system is accurate and cost-effective.
