Automated facial recognition


The High Court in Cardiff has now found that South Wales Police's use to date of automated facial recognition (AFR) has been consistent with the requirements of the Human Rights Act and data protection legislation. It also found that the current legal regime is adequate to ensure the appropriate and non-arbitrary use of the AFR system that has been used in trials by South Wales Police. It should be noted that this is a judgment at first instance and that the claimant now intends to appeal. This is not surprising given the fundamental rights at stake and that, as is stated in the judgment, this is the first time that any court in the world has considered AFR.

The judgment itself considers solely the use of AFR by the police rather than any other public or indeed private bodies. Moreover, the judgment is specific to the particular circumstances in which South Wales Police used their AFR system. Whilst South Wales Police should be commended for the thoughtful, considered way in which they have carried out their trials to date, there remains a wider issue, which is not limited to whether there is a legal basis for the police to carry out trials of AFR. The bigger question going forward is whether there should be a specific legal framework for the police (and others) to routinely deploy new biometrics, including not only AFR but also voice recognition, gait analysis, iris analysis and other new biometric technologies as they emerge. The judgment in this case does not provide the answer to this, which is, in my view, for Ministers and Parliament to decide.

My role as Biometrics Commissioner was created under the Protection of Freedoms Act 2012. The Act's title reflected the fact that it aimed to ensure that police use of DNA and fingerprints delivered public benefits for law enforcement and national security, whilst the intrusion into the individual's liberty and freedom to have a private life without state interference was limited and regulated by law.

Since that Act was passed, new biometrics such as facial image matching have been rapidly adopted by both law enforcement agencies and private sector companies. This is not simply the use of CCTV or other camera systems that have existed for some time, but the linking of cameras with a biometric technology that enables the identification and matching of individuals, and is also claimed to identify a person's emotional reactions to stimuli such as advertising or to predict future actions and threats. This is happening because three technologies - biometric matching, artificial intelligence and big data analytics - are reinforcing each other and producing technical improvements very rapidly. Facial matching is just one example, but an important one because our faces are always on display and so can easily be captured.

Just like DNA and fingerprints, all such systems are an intrusion into an individual's privacy and potentially into their liberty, and therefore it is not surprising that groups such as Liberty and Big Brother Watch have challenged their legality or called for them to be banned.

Given that these new technologies have multiple and widespread uses, the question of whether we allow such systems to be used, for what purposes and within what legal controls, will shape the nature of our social and political world well into the future. For that reason the choices that are now before us about the use of biometric systems are strategic decisions about the future world we want to live in.

China has already made such a strategic choice and is trialling biometric systems that are designed to allow the state to constantly monitor the behaviour of its citizens and control their future actions and thinking. China is also aiming to be the leading country in the development of these new technologies and to export its technology to other countries.

I am not suggesting that the UK will want to make the same strategic choice as China, but simply that we also have to decide how we wish to see the new technologies used and what kind of future world we want thereby to create.

Up until now, insofar as there has been a public debate, it has been about the police trialling of facial image matching in public places and whether this is lawful or whether in future it ought to be lawful. As Biometrics Commissioner I have reported on these police trials, and the legal and policy questions they have raised, to the Home Secretary and to Parliament. However, the debate has now expanded as it has emerged that private sector organisations are also using the technology for a variety of different purposes. Public debate is still muted, but that does not mean that the strategic choices can therefore be avoided, because if we avoid them our future world will be shaped in unknown ways by a variety of public and private interests: the very antithesis of strategic decision making in the collective interest that is the proper business of government and Parliament.

The use of biometrics and artificial intelligence analysis is not the only strategic question the country presently faces. However, that is no reason not to have an informed public debate to help guide our lawmakers. I hope that ministers will take an active role in leading such a debate in order to examine how the technologies can serve the public interest whilst protecting the rights of individual citizens to a private life without the unnecessary interference of either the state or private corporations. As in 2012, this again is about the 'protection of freedoms'.
