The first known pedestrian fatality caused by an Uber self-driving vehicle has prompted major concerns about the safety and moral responsibility of autonomous vehicles.

Who is really at fault when we transfer responsibility from a human driver to an automated system, and what are the implications when tragedy strikes?

US woman Elaine Herzberg, 49, was walking her bicycle on the crosswalk of a four-lane road in the Phoenix suburb of Tempe, Arizona, at about 10pm local time on Sunday. Police say Herzberg was struck by the Uber vehicle, which was travelling at about 65 km/h in autonomous mode with a human operator behind the wheel.

Herzberg was taken to hospital but later died from injuries sustained in the crash. Local television footage showed a crumpled bike and a Volvo XC90 SUV with a smashed-in front.

As a result, Uber immediately suspended its self-driving car operations in Tempe, Pittsburgh, Toronto, and San Francisco.

“Our hearts go out to the victim’s family. We are fully cooperating with local authorities as they investigate this incident,” Uber said in a statement.

Today’s deadly accident followed an earlier self-driving car accident reported in mid-2016, involving a Tesla. The Tesla Model S, cruising on autopilot, failed to detect a crossing tractor-trailer against a bright sky, killing the driver. It later emerged that the driver had kept his hands off the wheel for extended periods despite automated warnings not to do so.

On top of the widespread damage to Uber as a brand, the incident has reignited concerns that the industry is moving too fast to deploy self-driving vehicles safely. One of the primary aims of the autonomous sector is to improve road safety and reduce traffic fatalities, most of which stem from human error, and fatal incidents like this one undermine confidence that the technology can deliver on that promise.

A growing body of research suggests that autonomous vehicles are lulling drivers into a false sense of security. Current autopilot systems are intended for use only with a fully attentive driver, but studies have shown that those in the driver’s seat are often slower to react when required to intervene.

“People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees,” said Nidhi Kalra, senior information scientist at the Rand Corporation.

“Additional autonomy gives people a sense that something else is in control, and we have a tendency to overestimate the technology’s capabilities.”

Conversely, Tesla CEO Elon Musk remains bullish about the future of driverless vehicles, having previously said that by 2019 drivers would “be able to sleep in their cars”.

The future is very clearly an autonomous one, but the greater challenge for the industry is how to make the transition in the safest possible way. Fatal incidents involving Tesla and now Uber indicate that the transition might be longer and more complex than originally thought.