ICO Warns Police on Facial Recognition

In a recent blog post, Elizabeth Denham, the UK’s Information Commissioner, said that the police need to slow down and justify their use of live facial recognition technology (LFR) in order to strike the right balance between reducing our privacy and keeping us safe.

Serious Concerns Raised

The ICO cited how its investigation into trials of live facial recognition (LFR) by the Metropolitan Police Service (MPS) and South Wales Police (SWP) raised serious concerns about the use of a technology that relies on large amounts of sensitive personal information.

Examples

In December last year, Elizabeth Denham launched the formal investigation into how police forces use FRT, following high failure rates, misidentifications and worries about legality, bias and privacy.  For example, the trial of ‘real-time’ facial recognition technology by the South Wales and Gwent Police forces in Cardiff on Champions League final day in June 2017 was criticised for costing £177,000 yet resulting in only one arrest, of a local man on a matter unconnected to the event.

Also, after trials of FRT at the 2016 and 2017 Notting Hill Carnivals, the police faced criticism that the technology was ineffective, racially discriminatory, and confused men with women.

MPs Also Called To Stop Police Facial Recognition

Back in July this year, following criticism of police use of facial recognition technology in terms of privacy, accuracy, bias, and management of the image database, the House of Commons Science and Technology Committee called for a temporary halt to the use of facial recognition systems.

Stop and Take a Breath

In her blog post, Elizabeth Denham urged police not to move too quickly with FRT but to work within the model of policing by consent. She makes the point that “technology moves quickly” and that “it is right that our police forces should explore how new techniques can help keep us safe. But from a regulator’s perspective, I must ensure that everyone working in this developing area stops to take a breath and works to satisfy the full rigour of UK data protection law.”

Commissioner’s Opinion Document Published

The ICO’s investigations have now led the Commissioner to produce and publish an Opinion document on the subject, as allowed by the Data Protection Act 2018 (DPA 2018), s116(2) in conjunction with Schedule 13(2)(d).  The Opinion has been prepared primarily for police forces and other law enforcement agencies using live facial recognition technology (LFR) in public spaces, and it offers guidance on how to comply with the provisions of the DPA 2018.

The key conclusions of the Opinion document (which you can find here: https://ico.org.uk/media/about-the-ico/documents/2616184/live-frt-law-enforcement-opinion-20191031.pdf) are that the police need to recognise the strict necessity threshold for LFR use, that there needs to be more learning within the policing sector about the technology, that public debate about LFR should be encouraged, and that a statutory, binding code of practice should be introduced by government at the earliest opportunity.

What Does This Mean For Your Business?

Businesses, individuals and the government are all aware of the positive contribution that camera-based monitoring technologies can make in deterring criminal activity, in locating and catching perpetrators (in what should be a faster and more cost-effective way with live FRT), and in providing evidence for arrests and trials.  The UK’s Home Office has also noted that there is general public support for live FRT where it is used, for example, to identify potential terrorists and people wanted for serious violent crimes.  However, the ICO’s apparently reasonable point is that moving too quickly with FRT, without sufficient knowledge or a Code of Practice, and without respecting the strict necessity threshold that should govern its use, could reduce public trust in the police and in the technology itself.  Greater public debate about the subject, which the ICO seeks to encourage, could also help raise awareness of FRT, show how a balanced approach to its use can be achieved, and clarify the extent to which FRT could impact upon our privacy and data protection rights.
