
Regulation of biometrics debated

To mark the beginning of an independent review on the governance of biometric data, Ada hosted a debate on UK biometrics regulation

30 January 2020

Reading time: 5 minutes

The Facial Recognition and Biometrics – Technology and Ethics event, held in London on 29 January 2020, was jointly organised with the Foundation for Science and Technology and chaired by Baroness Beeban Kidron OBE.

It was attended by over 150 people from the public sector and industry with an interest in biometrics, including academics, parliamentarians, regulators, scientists and technologists. Speeches were given by Matthew Ryder QC, who is leading the independent review, Carly Kind for the Ada Lovelace Institute, James Dipple-Johnstone for the Information Commissioner’s Office, and Carsten Maple from the University of Warwick. Guests included the Forensic Science Regulator Gill Tully, the Biometrics Commissioner Paul Wiles and the Metropolitan Police Commissioner Cressida Dick.

The debate considered the wide range of uses of new digitised forms of biometric recognition and the challenges and risks surrounding them. In particular, live facial recognition – or LFR – emerged as the elephant in the room: enabled by advances in technology, but still waiting to be dealt with legally and ethically, one step at a time. When other new forms of biometrics are also considered, the scope for legal analysis is vast, and it is rousing interest at all levels of government and in companies.

Here are some of the themes that emerged in the debate:

Public persuasion vs trustworthiness

The frames of ‘protect and defend’, ‘nothing to hide, nothing to fear’ and ‘convenience’ shape public acceptance of biometric technologies, but they also tend to crowd out concerns about rights and justice.

For example, a survey commissioned in 2019 by the Ada Lovelace Institute found that the public are mindful of both security threats and the need for convenience in airports, and are more readily accepting of facial recognition technology in that setting than in others, such as daily public transport (the relative infrequency of visiting an airport may also be a factor).

More than once in the debate, speakers drew a distinction between inspiring the public to trust (by invoking, for example, security and protection from danger) and ensuring that governance processes are trustworthy (by assessing, for example, the effectiveness of the technology and its accordance with the law).

Outreach and engagement

Increasingly principled approaches to data protection may, in time, underpin a more consistent legal framework for biometrics. Without outreach and engagement, however, businesses and the public may remain mystified about how to approach this little-understood technology, especially if there is a distance between the rigour of legal analysis and the public’s knowledge and power.

Concerted efforts are needed to bridge that gap, and this is where the Ada Lovelace Institute’s support for the Citizens’ Biometrics Council comes in. The Council is intended to produce its own recommendations on which forms of governance for biometrics are acceptable and unacceptable, and on the questions the public should ask of those who govern the technology.

Police trials

Arguments for and against the regulation of live facial recognition technology have been strongly influenced by trials in some police forces. The police’s power to scan faces passing through designated areas, checking them against ‘watchlists’ or custody images, for example, has attracted public protests and challenges in the courts. A legal challenge to South Wales Police’s use of the technology was dismissed in the High Court, but the government’s strategy for oversight remains open to question.

A perpetual line-up

The scope for unwanted retention of custody images or watchlists, and for images to be taken without a person’s knowledge, gives rise to concern about being monitored in perpetuity. The concern applies in private settings as much as in public-sector policing, with shops and bars excluding some customers.

A parallel was drawn with the earlier days of police retention of DNA profiles. Under past, more permissive approaches to retention, the holding of DNA data after charges had been dropped or the person acquitted was challenged in the courts. One complainant was eleven years old at the time of his arrest; he was acquitted of the offence, yet even after his acquittal, requests to delete his DNA were refused.

The power of analysis

Biometric data is sensitive because it is derived from people’s biological characteristics. Biometric analysis should therefore be regarded differently from some other types of analysis: unlike the monitoring of mobile phones or other devices, what is being observed cannot be separated from a person’s body.

Under EU data protection law, the sensitivity of different forms of biometrics seems mainly to be assessed through an ‘identity’ frame. This may give the impression that non-identifying uses are less sensitive. However, biometric profiling and assessment tools extend to inferring other intimate personal details, such as race, gender and behaviour.

In addition, while the automated monitoring of video feeds for signs of conflict or threat tries to identify behaviours in a crowd (rather than identifying individuals), it could still have implications for people’s liberties.

The case for pausing sales

The Ada Lovelace Institute has sought to influence supply-side thinking, by recommending that companies pause sales of facial recognition technologies until there is a known regulatory infrastructure that satisfies conditions for public trust and appropriate governance.

In recent weeks, Google’s Sundar Pichai announced his support for a moratorium – a temporary ban – on sales of facial recognition technology, pending regulation. Supply-side steps against excessive uses of biometrics are in this way vying for attention. However, as some companies’ support for regulation grows, industry observers warn that firms able to shape regulation to their advantage could steer the market towards anti-competitive ends. A blog by a member of our board, Azeem Azhar, weighs up this possibility in his independent newsletter Exponential View.

The Ryder Review

Matthew Ryder QC is leading an independent review of the governance of biometric data. The review will examine the existing regulatory framework and identify options for reform to protect people from misuse of their biometric data, such as facial characteristics, fingerprints, iris prints and DNA. The Review’s evidence-gathering period will run from spring to summer 2020, with a report due in October. Further information on the Ryder Review and its working Terms of Reference can be found here.

Speakers’ remarks given in presentations during the Facial Recognition and Biometrics – Technology and Ethics debate have been reported in full by the Foundation for Science and Technology.

Speech by Carly Kind, Director of the Ada Lovelace Institute

Speech by Matthew Ryder QC, Review Chair

More audio and video presentations from the event are available on the Foundation for Science and Technology’s website.
