A report published by the Independent newspaper has revealed that the expensive facial recognition software being trialled by a UK police force is hardly ever accurate, with nearly 100 per cent of its alerts proving wrong.
The Independent obtained the information through a Freedom of Information request, which reveals that the facial recognition software used by the UK Metropolitan Police (the biggest single police force in the UK, and one of the largest in Europe) has returned false positives in more than 98 per cent of the alerts it generated.
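To see what a false-positive rate above 98 per cent means in practice, consider a hypothetical run of such a system. The figures below are illustrative assumptions chosen to produce a rate in that range, not the Met's actual numbers:

```python
# Hypothetical figures: 104 alerts raised, of which only 2 correctly
# identified a person of interest. (Illustrative, not the Met's data.)
total_alerts = 104
true_matches = 2
false_positives = total_alerts - true_matches  # 102 incorrect alerts

false_positive_rate = false_positives / total_alerts
print(f"False positive rate: {false_positive_rate:.1%}")  # → 98.1%
```

In other words, for every genuine identification the system makes, dozens of innocent passers-by can be wrongly flagged.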
How does (or should) it work?
Facial recognition software works by analyzing photos of people's faces and attempting to pinpoint certain biometric identifiers, such as the distance between the eyes, skin colour, and the length of the person's nose. The results are then checked against police databases for positive matches. However, the system is expensive to operate and requires high-quality cameras to work effectively, and its proponents have been accused of heralding the arrival of a 1984-style 'Big Brother' state.
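The matching step described above can be sketched as a nearest-neighbour comparison of biometric measurements. Everything below is an illustrative assumption: the feature names, the toy database, and the distance threshold are invented for the example, and real systems use learned face embeddings rather than two hand-picked measurements; the sketch only shows the principle, including why a generous threshold drives up false positives.

```python
import math

def extract_features(face_photo):
    # A real system would run face detection and landmarking here;
    # we assume the photo has already been reduced to measurements
    # (normalised units), e.g. inter-eye distance and nose length.
    return [face_photo["eye_distance"], face_photo["nose_length"]]

def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_database(face_photo, database, threshold=0.1):
    """Return names of database entries within `threshold` of the probe.
    A tight threshold misses real matches; a loose one generates the
    kind of false-positive alerts the report describes."""
    probe = extract_features(face_photo)
    return [
        name for name, stored in database.items()
        if euclidean_distance(probe, stored) <= threshold
    ]

# Toy "watch list" of stored feature vectors (entirely hypothetical).
database = {
    "person_a": [0.62, 0.48],
    "person_b": [0.71, 0.55],
}
probe_photo = {"eye_distance": 0.63, "nose_length": 0.47}
print(match_against_database(probe_photo, database))  # → ['person_a']
```

The quality of the probe photo matters directly here: blurry or poorly lit CCTV frames distort the measured features, pushing innocent faces inside the match threshold, which is why the technology depends on expensive high-quality cameras.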
Becoming an Orwellian state?
Silkie Carlo, director of the UK-based pressure group Big Brother Watch, said that "…it is alarming and utterly reckless that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to basic democratic freedoms. It must be dropped."
The software has been defended as a prototype that is still in a trial phase, but academics have raised further concerns about the use of facial recognition software, arguing that there is a real danger that UK and European law will be unable to adapt to the ever-changing use of technology and the pace at which it is being rolled out.
“In terms of governance, technical development and deployment is running ahead of legislation and these new biometrics urgently need a legislative framework, as already exists for DNA and fingerprints”, the UK’s independent biometrics commissioner, Paul Wiles said in an interview.