
Amazon’s Rekognition tech misidentifies 28 Congress members as criminals

EVERYTHING DO-ER Amazon’s controversial facial recognition technology falsely identified 28 members of Congress as people who have been arrested for crimes, according to the American Civil Liberties Union (ACLU).

The ACLU, which used Amazon’s open Rekognition API to scan the faces of all 535 members of Congress against 25,000 public mugshots, found that the software incorrectly matched 28 of them as criminals.

The tests also showed indications of racial bias, with 11 of the 28 false matches identifying people of colour as criminals, despite the fact that people of colour make up only 20 per cent of those in Congress.

“An identification — whether accurate or not — could cost people their freedom or even their lives,” the ACLU said. “Congress must take these threats seriously, hit the brakes, and enact a moratorium on law enforcement use of face recognition.”

The ACLU’s report follows revelations from May that Amazon has been selling the Rekognition technology to US police, causing privacy advocates to raise concerns about “automating mass surveillance”. 

“This technology shouldn’t be used until the harms are fully considered and all necessary steps are taken to prevent them from harming vulnerable communities,” the ACLU added.

Amazon, naturally, defended its Rekognition software in a statement, saying that the ACLU’s results could “probably be improved” if the test had used a higher “confidence threshold”.

The ACLU used an 80 per cent confidence threshold, but Amazon said in its statement: “When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95 per cent or higher.”
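The dispute turns on that one parameter: Rekognition returns a similarity score (a percentage) for each candidate match, and the caller decides the cutoff above which a candidate counts as a “match”. A minimal sketch of why the threshold matters, using hypothetical scores (real values would come from the Rekognition API for a given probe photo):

```python
# Sketch of how a confidence threshold filters face-match results.
# The scores below are hypothetical; in practice each candidate's
# percentage comes back from Amazon Rekognition per probe image.

def filter_matches(candidates, threshold):
    """Keep only candidates at or above the confidence threshold."""
    return [c for c in candidates if c["confidence"] >= threshold]

# Hypothetical scores for one probe face against a mugshot database.
candidates = [
    {"name": "mugshot_0412", "confidence": 83.1},
    {"name": "mugshot_1776", "confidence": 96.4},
    {"name": "mugshot_2093", "confidence": 81.7},
]

# At the ACLU's 80 per cent threshold, all three count as matches...
print(len(filter_matches(candidates, 80)))  # 3
# ...but at Amazon's recommended 95 per cent, only one survives.
print(len(filter_matches(candidates, 95)))  # 1
```

Both sides can therefore be describing the same underlying scores: a lower threshold produces more matches, and with them more false positives of the kind the ACLU reported.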

“We remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement,” Amazon added.

The ACLU’s report comes after UK privacy campaign group Big Brother Watch this week launched a legal challenge against the Met’s “dangerously authoritarian” facial recognition cameras.

The organisation is calling on the UK government and Metropolitan Police to “immediately end” the force’s use of real-time facial recognition cameras, claiming rozzers don’t have a lawful basis for using the “China-style” technology.


Source: Inquirer

