The UK’s data protection watchdog, the Information Commissioner’s Office (ICO), has said that it will investigate the use of facial recognition cameras at King’s Cross by the property development company Argent.
Following reports in the Financial Times newspaper, the ICO says that it is launching an investigation into the use of live facial recognition in the King’s Cross area of central London. It appears that the property development company Argent had been using the technology for an as-yet-undisclosed period, with an as-yet-undisclosed number of cameras. In a statement reported by the Financial Times, Argent said that it had been using the system to “ensure public safety”, and that facial recognition is one of several methods the company employs to this end.
The ICO has said that, as part of its enquiry, as well as requiring detailed information from the relevant organisations (Argent in this case) about how the technology is used, it will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.
The data protection watchdog has made it clear in a statement on its website that if organisations want to use facial recognition technology, they must comply with the law and do so in a fair, transparent and accountable way. The ICO will also require those companies to document how and why they believe their use of the technology is legal, proportionate and justified.
The main concern for the ICO, and for privacy groups such as Big Brother Watch, is that people’s faces are being scanned to identify them as they lawfully go about their daily lives, all without their knowledge or understanding. This could be considered a threat to their privacy. Also, with GDPR in force, it is important to remember that a person’s face (when filmed, e.g. by CCTV) is part of their personal data, so the handling, sharing and security of that data also become an issue.
An important concern for the ICO in this case is that a private company is using facial recognition, because such use by private companies is difficult to monitor and control.
Problems With Police Use
Following criticism of police use of facial recognition technology in terms of privacy, accuracy, bias, and management of the image database, the House of Commons Science and Technology Committee recently called for a temporary halt in the use of facial recognition systems. This follows an announcement in December 2018 by the ICO’s head, Elizabeth Denham, that a formal investigation was being launched into how police forces use facial recognition technology (FRT), after high failure rates, misidentifications, and worries about legality, bias, and privacy.
What Does This Mean For Your Business?
The use of facial recognition technology is being investigated by the ICO, and a government committee has even called for a halt in its use over several concerns. The fact that a private company (Argent) was found to be using the technology in this case has caused even more concern and has highlighted the possible need for more regulation and control in this area.
Companies and organisations that want to use facial recognition technology should therefore take note that the ICO will require them to document how and why they believe their use of the technology is legal, proportionate and justified, and to make sure that they comply with the law in a fair, transparent and accountable way.