Facial Recognition: U.K. Plans to Monitor Migrant Offenders Are Unethical – and They Won’t Work

By Namrata Primlani

Published 18 August 2022

The UK Home Office plans to make migrants convicted of criminal offences scan their faces five times a day using a smart watch equipped with facial recognition technology. This is a mistake: researchers have repeatedly found that facial recognition software developed by big technology companies is more accurate at recognizing lighter skin tones than darker ones, so such a system is likely to fail the very people it would be used to monitor.

One afternoon in our lab, a colleague and I were testing a new prototype of our facial recognition software on a laptop. The software used a video camera to scan our faces and guess our age and gender. It correctly guessed my age, but when my colleague, who was from Africa, tried it out, the camera didn’t detect a face at all. We tried turning on lights in the room and adjusting her seating and background, but the system still struggled to detect her face.

After many failed attempts, the software finally detected her face, but it got both her age and her gender wrong.
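To make that failure mode concrete, below is a minimal sketch of the kind of webcam face-detection loop such a prototype might be built on. The article does not describe our prototype’s implementation, so this is purely illustrative: it assumes Python with OpenCV and the stock Haar cascade detector that ships with it.

# Minimal webcam face-detection loop (illustrative sketch only; the
# prototype described in the article is not this code).
import cv2

# Pre-trained frontal-face detector bundled with OpenCV. This is an
# assumption for illustration; the original prototype's detector is
# not named in the article.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detectors like this rely on grayscale contrast patterns, so poor
    # lighting or low contrast between face and background can mean no
    # face is detected at all: the failure described above.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # Age and gender estimation would run on the cropped face here;
        # it needs a separately trained model and is omitted.
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

A detector like this learns its notion of a face from training data; if that data under-represents darker skin tones, or if lighting produces low contrast, detection can fail outright before age or gender estimation is ever attempted.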

Our software was only a prototype, but its difficulty with darker skin tones reflects the experiences of people of color who try to use facial recognition technology. In recent years, researchers have demonstrated the unfairness of facial recognition systems, finding that software and algorithms developed by big technology companies are more accurate at recognizing lighter skin tones than darker ones.

Yet recently, the Guardian reported that the UK Home Office plans to make migrants convicted of criminal offences scan their faces five times a day using a smart watch equipped with facial recognition technology. A spokesperson for the Home Office said facial recognition technology would not be used on asylum seekers arriving in the UK illegally, and that the report on its use on migrant offenders was “purely speculative”.

Get the Balance Right
There will always be a tension between national security and individual rights, and security for the many can sometimes take priority over privacy for a few. For example, after the terrorist group ISIS attacked Paris in November 2015, killing 130 people, the Paris police found a phone that one of the terrorists had abandoned at the scene and read the messages stored on it.

But there is a lot of nuance to this issue. We must ask ourselves: whose rights are curbed by a breach of privacy, to what degree, and who judges whether that breach is proportionate to the severity of a criminal offence?