You look at a machine; the machine looks back.
It feels as though 2001: A Space Odyssey has returned with a creeping vengeance, but the reality, as seen at the Hong Kong protests and at football matches, is that facial recognition technology is already being used by police forces and private companies around the world to comb through crowds and pick out a select few.
With the technology developing faster than the laws meant to govern it, the Metropolitan Police's decision to start using facial recognition, despite poor results in trials, has raised considerable concern.
The decision to roll out any technology should be carefully considered, especially if that technology is used invasively, in a way that is at odds with our rights, and produces overwhelmingly incorrect results. All of this has been true of facial recognition, with 96% of matches across eight trials in London proving to be false positives.
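It is worth pausing on why a figure that high is almost inevitable when scanning crowds for a small watchlist. A rough illustrative calculation makes the point; the numbers below are assumptions chosen for clarity, not the Met's actual trial parameters:

```python
# Illustrative sketch: why most alerts can be false positives when the
# people being sought are rare in the crowd. All figures here are
# assumptions for illustration, not the Met's actual trial parameters.

crowd_size = 10_000          # faces scanned at an event (assumed)
watchlist_in_crowd = 5       # people present who are genuinely on the watchlist (assumed)
true_positive_rate = 0.80    # chance the system flags a genuine match (assumed)
false_positive_rate = 0.001  # chance it wrongly flags an innocent face (assumed)

true_alerts = watchlist_in_crowd * true_positive_rate
false_alerts = (crowd_size - watchlist_in_crowd) * false_positive_rate

share_false = false_alerts / (true_alerts + false_alerts)
print(f"False alerts: {false_alerts:.0f} of {true_alerts + false_alerts:.0f} "
      f"total alerts ({share_false:.0%})")
# With these assumptions, roughly 71% of all alerts are false positives,
# even though only 0.1% of innocent faces are misidentified.
```

Because genuine targets are vastly outnumbered by innocent passers-by, even a tiny per-face error rate swamps the handful of real matches, and the proportion of alerts that are wrong balloons.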
If those results were not troubling enough, research by graduate student Joy Buolamwini suggests that this technology struggles to tell people of colour apart, performing markedly worse on darker-skinned faces.
Labour MP Chi Onwurah has warned that facial recognition ‘automates the prejudices of those who design it and the limitations of the data on which it is trained’, a claim strongly supported by Buolamwini's findings and by subsequent studies. Algorithmic discrimination of this kind perpetuates and validates racial prejudice in ways that have genuine, palpable consequences in people's lives.
Not only does crass, blatant discrimination sit at the heart of the algorithms that power these machines, working in tandem with the Metropolitan Police's institutional racism, but the technology has already led to racial profiling: a 14-year-old black schoolboy was fingerprinted after being misidentified.
How can the Met justify the decision to deploy a technology like this?
It has to be accepted that, as technology advances, those who fight crime should have access to the best and most up-to-date systems and equipment to assist their work. However, the tools used by the authorities should be free of ingrained biases, and should not be deployed if they rest on and perpetuate discriminatory practices. It is deeply troubling that a police force would continue to use faulty and invasive surveillance such as facial recognition in the name of preventing crime.