Don’t Overreact to a Couple of Accidents, Self-Driving Cars Can Save Thousands of Lives

Commentary


Driverless cars don’t need to be perfect in order to be dramatically better than human drivers.

A couple of recent fatal crashes have put a spotlight on the technological advancement of cars and how we’re going to transition to the promising future of completely self-driving cars. In the Bay Area last month, a Tesla driver was killed while his car was on Autopilot. In that crash, Tesla reported, “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.”

And in Arizona a few weeks ago, a self-driving Uber car hit and killed a pedestrian while the human safety driver was in the vehicle. After the fatal accident, Uber suspended its autonomous car testing and decided it would not try to renew its permit to test self-driving cars in California.

Details of the Uber accident raise important questions about how Uber’s testing is being conducted, how we should think about the safety of driverless cars compared to the cars and human drivers of today, and how much regulation there should be for autonomous vehicles.

Before focusing on the future though, we should be realistic about the present. Human error causes 90 percent of today’s car crashes. There are more than 35,000 traffic fatalities a year in the United States, with over 3,500 traffic deaths in California in 2017.

Autonomous vehicles are a promising part of efforts to reduce traffic accidents and save lives. The calls for more regulation of self-driving cars in the wake of these accidents need to keep perspective on the number of fatalities on our roads today. There is nothing about the testing of autonomous vehicles thus far that indicates they present more of a threat than human drivers. Nor do they deserve additional regulatory attention.

In fact, I’d argue that the system is working very well. Tesla reported that “Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40 percent.” And the company has been so open and transparent with the public about the details of the March 23 accident that the National Transportation Safety Board actually said it was “unhappy” with the release of too much “investigative information by Tesla.”

And in Arizona, Uber is suffering consequences from its mistakes. The governor indefinitely suspended Uber from testing in the state, and Uber halted its testing elsewhere. In effect, Uber has given up on being “first to market” with a driverless car, a very lucrative goal. The company has taken a reputational hit and will have to invest a great deal of money to get its testing program up and running again if it decides to go forward.

With Uber being punished for the accident, imposing new regulations on the rest of the industry would be a self-defeating form of group punishment. The numerous companies testing driverless cars are taking very different approaches. Treating them with flexibility is fair and more likely to encourage the advancement of the technology. Continuing the current approach to testing driverless cars looks like the best way to protect public safety and see if the technology can develop solutions to some of our automotive safety problems.

Pedestrian deaths from cars, for example, are overwhelmingly problems of driver inattention. And accident rates among elderly drivers are rising faster than the growth in their population, a problem very hard to address without a change in technology to something like autonomous vehicles. Self-driving cars provide the opportunity to address these problems in ways we simply can’t today.

It’s important to remember that driverless cars don’t need to be perfect in order to be dramatically better than human drivers. It would be an epic tragedy for society to accept tens of thousands of deaths caused by human drivers every year because we want to hold autonomous vehicles to the impossible standards of zero accidents and zero deaths. Self-driving cars won’t be perfect, but they offer amazing safety potential that must be pursued.

This column originally ran in The Orange County Register.