A man was given a public order fine after being stopped by police because he covered his face during a trial of facial recognition cameras in Romford, London.
What Facial Recognition Trial?
A deliberately “overt” trial of live facial recognition technology by the Metropolitan Police took place in the centre of Romford, London, on Thursday 31st January. This was supposed to be the first day of a two-day trial of the technology, but the second day was cancelled over concerns that forecast snow would mean low footfall in the area.
Live facial recognition trials of this kind use vehicle-mounted cameras linked to a watchlist of selected images taken from the police database. Officers are deployed nearby so that they can stop those persons identified and matched with suspects on the database.
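In highly simplified terms, systems of this kind compare a numeric "signature" of each face seen by the camera against signatures of the people on the watchlist, flagging a match only when the similarity clears a threshold. The sketch below is purely illustrative and makes several assumptions: real systems derive these signatures from deep face-embedding models, whereas here the vectors, the hypothetical suspect names and the threshold value are invented for demonstration.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return the best-scoring watchlist entry above the threshold, or None.

    The threshold trades off false matches against missed matches; the
    value 0.9 here is an arbitrary illustrative choice, not a real setting.
    """
    best_id, best_score = None, threshold
    for person_id, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Hypothetical watchlist: each entry maps a person to a pre-computed
# feature vector (in reality, an embedding of their custody photograph).
watchlist = {
    "suspect_A": [0.9, 0.1, 0.3],
    "suspect_B": [0.2, 0.8, 0.5],
}

# A camera frame producing a vector close to suspect_A's reference
# is matched; an unrelated face falls below the threshold.
print(match_against_watchlist([0.88, 0.12, 0.31], watchlist))  # suspect_A
print(match_against_watchlist([0.10, 0.10, 0.90], watchlist))  # None
```

The choice of threshold is where the reliability concerns discussed later arise: set it too low and innocent passers-by are wrongly flagged; set it too high and genuine matches are missed.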
In the Romford trial, the facial recognition filming was reported to have taken place from a parked police van and, according to the Metropolitan Police, the reason for the use of the technology was to reduce crime in the area, with a specific focus on tackling violent crime.
Why The Fine?
The trial also attracted the attention of human rights groups, such as Liberty and Big Brother Watch, members of which were nearby and were monitoring the trial.
It was reported that the man who was fined, who hasn’t been named by police, was observed pulling his jumper over part of his face and putting his head down while walking past the police cameras, possibly in response to having seen placards warning that passers-by were being filmed by police automatic facial recognition cameras.
It has been reported that the police then stopped the man to talk to him about what they may have believed was suspicious behaviour and asked to see his identification. According to police reports, it was at this point that the man became aggressive and made threats towards officers, and he was issued with a penalty notice for disorder as a result.
8 Hours, 8 Arrests – But Only 3 From Technology
Reports indicate that the eight-hour trial of the technology resulted in eight arrests, but only three of those arrests were as a direct result of facial recognition technology.
Some commentators have criticised this and other trials for being shambolic, for not providing value for money, and for resulting in mistaken identity.
Research Questions Reliability
Research by Cardiff University examined the use of facial recognition technology across several sporting and entertainment events in Cardiff over more than a year, including the UEFA Champions League Final and the Autumn Rugby Internationals. The research found that, for 68% of submissions made by police officers in the ‘Identify’ mode, the image quality was too low for the system to work. The research also found that the ‘Locate’ mode of the FRT system failed to correctly identify a person of interest 76% of the time.
Also, in December 2018, ICO head Elizabeth Denham was reported to have launched a formal investigation into how police forces use facial recognition technology (FRT) after high failure rates, misidentifications and worries about legality, bias, and privacy.
What Does This Mean For Your Business?
It has been reported that, despite over £200,000 being spent on six deployments of facial recognition trials between August 2016 and July 2018, no arrests were made. On the surface, these figures suggest that, although the technology has the potential to add value and save costs, and although businesses in town centres are likely to welcome efforts to reduce crime, the trials to date don’t appear to have delivered value for money to taxpayers.
There was also criticism of the facial recognition system used in Soho, Piccadilly Circus and Leicester Square over two days in the run-up to Christmas. Freedom campaigners such as Big Brother Watch and Liberty were concerned about mixed messages from police over how those who turn away from facial recognition cameras mounted in or on police vans, because they don’t want to be scanned, could be treated.
Despite some valid worries and criticism, most businesses and members of the public would probably agree that CCTV systems have real value in helping to deter criminal activity, locate and catch perpetrators, and provide evidence for arrests and trials. There are, however, several concerns, particularly among freedom and privacy groups, about just how facial recognition systems are being (and will be) used as part of policing e.g. overt versus covert use, issues of consent, possible wrongful arrests due to system inaccuracies, and the widening of the scope of its purpose beyond the police’s stated aims. Issues of trust where our personal data is concerned are still a problem, as are worries, for many people, about a ‘big brother’ situation.