Uber Suspends Driverless-Car Program After Pedestrian Death

A self-driving car operated by Uber Technologies Inc. struck a woman in Tempe, Ariz., who later died, police say, in what is believed to be the first pedestrian fatality caused by a driverless vehicle.

Following the accident late Sunday night, Uber is temporarily pulling its self-driving cars off the roads in Tempe, San Francisco, Pittsburgh and Toronto, where it is testing them, a spokeswoman said. She said Uber is investigating the incident and cooperating with authorities.

Tempe police said the Uber vehicle, which had a human safety operator behind the wheel, struck 49-year-old Elaine Herzberg while she was walking her bicycle across the street outside a crosswalk. She later died of her injuries, according to the police statement.

“Some incredibly sad news out of Arizona,” wrote Uber CEO Dara Khosrowshahi on Twitter Monday. “We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.”

Both the National Transportation Safety Board and the National Highway Traffic Safety Administration said they were dispatching teams to Tempe to investigate the accident. NHTSA said it was in contact with Uber, state and local authorities as well as Volvo, the car maker Uber relies on for its self-driving vehicles.

Volvo, a unit of China’s Zhejiang Geely Holding Group Co., said in a statement: “We are aware of this incident and our thoughts are with the family of the woman involved.”

The first known pedestrian fatality caused by an autonomous vehicle threatens to spur regulators and damage public perception of driverless vehicles, a critical project for auto makers and technology companies that believe they can reduce deaths and costs by eliminating human error. Uber has called its self-driving-vehicle efforts “existential” and just wrapped up a costly lawsuit from rival Alphabet Inc. over allegedly stolen trade secrets.

Missy Cummings, a professor at Duke University, cautioned Congress in 2016 about companies rushing to put systems into widespread deployment and warned that a death could set back development of the potentially lifesaving technology.

“There is no question that someone is going to die in this technology,” she said to Congress in 2016. “The question is when and what can we do to minimize that?” On Monday, Ms. Cummings repeated that message, saying, “It is a day that we knew would come.”

Congress, seeking to balance safety with encouraging technology development, has been mulling legislation to clear up regulatory questions about autonomous-vehicle deployment. The legislation appeared to be moving quickly before stalling in the Senate this year amid concerns about the safety of the technology.

Tesla Inc. became the first auto maker to face significant government scrutiny over a semi-autonomous driving system after a man died in May 2016 when his Model S electric car, operating with the company’s Autopilot system, collided with a truck on a Florida highway.

Ultimately, NHTSA concluded Tesla’s technology didn’t contain a safety defect, while the NTSB decided the company shared blame for the crash by failing to include enough safeguards.

Tesla has said Autopilot makes its vehicles significantly safer and that the company would continue to evaluate recommendations as the technology evolves, while ensuring drivers understand the system doesn’t make cars fully self-driving.

While the technology is still largely unproven, a range of auto makers including Volvo, General Motors Co. and Ford Motor Co. and tech giants such as Uber and Alphabet, parent of Google, are racing to put driverless systems on the road and claim a stake in the roughly $2 trillion in annual auto-related revenue, according to Deloitte Consulting. Alphabet’s Waymo, which has been testing vehicles in the Phoenix metro area without humans behind the wheel, plans to begin commercial robot-taxi service there this year.

These companies have braced for the inevitability of a fatality caused by an autonomous vehicle. But car and tech executives contend that while people are bound to die in the pursuit of fully driverless vehicles, the technology ultimately could save thousands of lives.

According to government figures, 94% of crashes involve human error. The number of lives lost on U.S. roads surged nearly 6% to 37,461 in 2016, according to the most recent government data. A recent nonprofit study found motor-vehicle deaths remained near decade-high levels in 2017.

U.S. highway safety regulators, in the absence of codified rules, are working on updating guidance for autonomous vehicles that began under the Obama administration. The Trump administration has taken pains to emphasize that safety assessment letters the guidance suggests companies submit to regulators are voluntary.

Regulators maintain a role overseeing safety of automated driving systems, but they are “not going to be top-down, we are not going to be command and control, we’re tech neutral,” said Transportation Secretary Elaine Chao during a January stop in Detroit at the city’s annual auto show. She put the onus on auto makers and Silicon Valley firms to persuade consumers that driverless-car systems are safe lest motorists lose confidence in them, which could hinder companies’ technology development and growth.

While robot cars are designed to follow traffic rules, interactions with humans continue to present hurdles. Pedestrians in particular can confuse the systems because they are unpredictable.

Even human drivers struggle to deal with pedestrians. About 50% of pedestrian fatalities involve people running into the road, failing to yield the right of way or otherwise crossing improperly, according to research by Duke’s Ms. Cummings. The Governors Highway Safety Association estimates nearly 6,000 pedestrians were killed in the U.S. last year, representing about 16% of all motor-vehicle deaths.

Autonomous cars typically fuse data from cameras, radar and laser sensors, which artificial-intelligence software interprets to identify other cars, pedestrians and obstacles.

“The computer vision systems are incredibly brittle in these cars,” Ms. Cummings said Monday. “There’s a strong, high probability that the computer vision system failed to detect the person.”

For example, the 2016 Tesla crash involved the Model S hitting a truck that was crossing the road; the company said its car’s system couldn’t distinguish the truck’s white trailer against a bright sky. It isn’t clear whether Uber’s laser sensors failed to detect the pedestrian.

This is the second crash in Tempe, a city of about 180,000 that is home to Arizona State University, to rock Uber’s autonomous-car program in the past year. Almost exactly a year ago, Uber briefly suspended its testing after one of its Volvo SUVs collided with another vehicle at an intersection, flipping the Uber vehicle onto its side. No one was seriously injured, and police determined the Uber vehicle wasn’t at fault. The company resumed operations within three days.

Uber began testing autonomous vehicles in Tempe last year after California regulators revoked the cars’ registrations because the company had failed to get a permit to operate them there. Uber has since paid $150 for a California permit and is testing there without customer passengers. It has also been testing in Pittsburgh since September 2016, and last year it put autonomous vehicles on the road in Toronto.