
Met Police’s facial recognition tech is worse than using a magnifying glass yourself

THE UK’S BIOMETRICS REGULATOR (because we have one) has warned that the facial recognition used by the Metropolitan Police is ‘98 per cent inaccurate’.

The technology, which was developed to help find faces in crowds like some sort of dystopian Where’s Wally? book on CCTV, has so far produced 104 alerts. A Freedom of Information request showed that only two were positive IDs.

South Wales Police has a similar system, which has been deployed on 15 occasions since launch and yet got only 234 of 2,400 identifications right. In spite of all this, the first arrest based on facial recognition was made nearly a year ago.
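Taking the article’s figures at face value, the headline percentages fall out of some very simple arithmetic: the ‘98 per cent inaccurate’ claim is just false alerts divided by total alerts. A quick sketch (the variable names are ours, and the figures are as reported above, not official force statistics):

```python
# Rough arithmetic behind the quoted accuracy figures.
# Numbers are those reported in the article, taken at face value.

# Metropolitan Police: 104 alerts, of which only 2 were positive IDs
met_alerts = 104
met_true = 2
met_false_rate = (met_alerts - met_true) / met_alerts  # share of alerts that were wrong

# South Wales Police: 234 correct identifications out of 2,400
sw_total = 2400
sw_true = 234
sw_false_rate = (sw_total - sw_true) / sw_total

print(f"Met false alert rate: {met_false_rate:.0%}")          # ~98%
print(f"South Wales false alert rate: {sw_false_rate:.0%}")   # ~90%
```

Note that this is the rate of wrong matches among alerts raised, not the chance of any given passer-by being misidentified, which is a much smaller number and the one the forces prefer to quote.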

The Met argues that these are not “false positives” because they were double-checked. Which is a bit like choosing the wrong card during a card trick, and then saying you deserve to be in the Magic Circle because it’s definitely in the deck somewhere.

Meanwhile, Professor Paul Wiles, the Biometrics Commissioner, warned: “I have told both police forces that I consider such trials are only acceptable to fill gaps in knowledge and if the results of the trials are published and externally peer-reviewed. We ought to wait for the final report, but I am not surprised to hear that accuracy rates so far have been low as clearly the technology is not yet fit for use.”

He went on to warn that facial recognition should be regulated in the same way as other biometrics, such as retinal and fingerprint technology.

The technology had been under development for over 10 years before most of us knew it was a thing.

Described as ‘intrinsically Orwellian’ by campaigners, the system faces mounting calls to be scrapped altogether. Silkie Carlo, director of Big Brother Watch, argues:

“It is alarming and utterly reckless that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to basic democratic freedoms. It must be dropped.”

Others warn that the technology could be used for exploitative surveillance and cite the “social credit” system being used in China as an example.

Source: Inquirer

