
Uber car software detected woman before fatal crash but failed to stop

In March, 49-year-old Elaine Herzberg became what’s believed to be the first pedestrian killed by a self-driving car.

It was one of Uber’s prototypes that struck Herzberg as she walked her bicycle across a street in Tempe, Arizona on a Saturday night. There was a human test driver behind the wheel, but video from the car’s dash cam published by SF Chronicle shows that they were looking down, not at the road, in the seconds leading up to the crash.

Police say that the car didn’t try to avoid hitting the woman.

The SF Chronicle reports that Uber’s self-driving car was equipped with sensors, including video cameras, radar and lidar, which uses laser pulses rather than radio waves to measure distance. Given that Herzberg was dressed in dark clothes, at night, the video cameras might have had a tough time: they work better with more light. But the other sensors should have functioned well during the nighttime test.

But now, Uber has reportedly discovered that the fatal crash was likely caused by a software bug in its self-driving car technology, according to what two anonymous sources told The Information.

Uber’s autonomous software detects objects in the road. Its sensitivity can be fine-tuned so that the car responds only to true threats and ignores the rest – for example, a plastic bag blowing across the road would be dismissed as a false positive, not something to slow down or brake for.

The sources who talked to The Information said that Uber’s sensors did, in fact, detect Herzberg, but the software incorrectly identified her as a “false positive” and concluded that the car did not need to stop for her.

The Information’s Amir Efrati reported on Monday that self-driving car technology has to make a trade-off: either the car rides slowly and jerkily, slowing down or slamming on the brakes to avoid objects that pose no real threat, or it delivers a smoother ride at the risk of the software dismissing real objects – potentially making the catastrophic decision that a pedestrian isn’t one.
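That trade-off can be pictured as a single confidence threshold on the detector’s output. The sketch below is purely illustrative – it is not Uber’s actual software, and the object labels and confidence values are hypothetical – but it shows how raising the threshold smooths the ride while increasing the risk of dismissing a real hazard.

```python
def should_brake(detection_confidence: float, threshold: float) -> bool:
    """Brake only if the detector's confidence that the object is a
    genuine obstacle meets or exceeds the tuning threshold."""
    return detection_confidence >= threshold

# Hypothetical detections: (label, classifier confidence that it's a real obstacle)
detections = [
    ("plastic bag", 0.20),
    ("lane-divider pole", 0.45),
    ("pedestrian with bicycle", 0.55),
]

cautious_threshold = 0.30    # jerky ride: brakes even for the pole
permissive_threshold = 0.60  # smooth ride: dismisses even the pedestrian

for label, confidence in detections:
    print(f"{label}: cautious={should_brake(confidence, cautious_threshold)}, "
          f"permissive={should_brake(confidence, permissive_threshold)}")
```

With the cautious threshold the car brakes for both the pole and the pedestrian; with the permissive threshold it brakes for neither – the pedestrian is treated as a false positive, exactly the failure mode the sources describe.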

Efrati pointed to GM’s Cruise self-driving cars as being prone to falling on the overly cautious end of the spectrum, as they “frequently swerve and hesitate.”

[Cruise cars] sometimes slow down or stop if they see a bush on the side of a street or a lane-dividing pole, mistaking it for an object in their path.

In March, Uber settled with Herzberg’s family, avoiding a civil suit and thereby sidestepping questions about liability in the case of self-driving cars, particularly after they’re out of the test phase and operated by private citizens.

Arizona halted all of Uber’s self-driving tests following the crash. Other companies, including Toyota and Nvidia, voluntarily suspended autonomous vehicle tests in the wake of Herzberg’s death, while Boston asked local self-driving car companies to halt ongoing testing in the Seaport District.


Source : Naked Security
