NOBODY LIKES THE FEELING that they’re being spied on, and fewer still like to know that footage of them is being used to make said surveillance easier. So a report from The Intercept highlighting IBM’s secret arrangement with the New York Police Department (NYPD) should make extremely uncomfortable reading.
When a company boasts about its ability to recognise traits like age, gender and ethnicity, it's worth asking how it got so good at it. The answer, it turns out, is by tapping into CCTV cameras around New York for practice on unsuspecting citizens between 2007 and 2016.
Originally, the NYPD acquired the video analytics software from Vexcel – a Microsoft subsidiary – with a view to testing it in the counterterrorism command centre. At this point it was a relatively minor part of the NYPD’s plans, being used on “fewer than 50” of the 512 cameras in use.
Fast forward five years and IBM was “testing out the video analytics software on the bodies and faces of New Yorkers, capturing and archiving their physical data as they walked in public.” The kind of physical data examined was straight out of a game of ‘Guess Who’ – hair colour, gender, facial hair, age and skin tone.
The NYPD, for its part, states that it didn’t use the more controversial of these profiling features.
“While tools that featured either racial or skin tone search capabilities were offered to the NYPD, they were explicitly declined by the NYPD,” it said.
But former IBM researcher Rick Kjeldsen casts some doubt on that response, suggesting that the features wouldn’t be there if the NYPD hadn’t expressed an interest.
"We would have not explored it had the NYPD told us, 'We don't want to do that,'" he said. "No company is going to spend money where there's not customer interest."
Kjeldsen wants the morality of these systems being developed on the quiet to be part of a public conversation – which must absolutely delight his former employers.
"Are there certain places on the boundaries of public spaces that have an expectation of privacy? And then, how do we build tools to enforce that? That's where we need the conversation. That's exactly why knowledge of this should become more widely available — so that we can figure that out."
Source: Inquirer