In a well-publicized incident, former Navy SEAL Joshua Brown was killed when his self-driving Tesla failed to notice a truck crossing its path at a right angle. The car slid under the tractor-trailer, with the bottom of the truck crushing the vehicle's windshield.
The incident is a tragedy for the Brown family. But its long-term legacy may be how it changes government officials' approach to AVs. Tesla's failure to fix all the bugs in its system before releasing it may set back autonomous vehicle development.
Joshua Brown was an automated vehicle enthusiast who liked to show off his car's features on YouTube. He ran his own channel, where he uploaded some 25 videos of the car operating in Autopilot mode; one showed it narrowly avoiding a sideswipe from a truck. Brown was known to take risks and was hardly a model driver, with eight speeding citations over the previous six years. Witnesses report that he was watching a Harry Potter video on an auxiliary DVD player at the time of the crash.
Tesla, for its part, noted that CEO Elon Musk has advised drivers to keep their hands on the steering wheel at all times. Brown said the same thing in one of his YouTube videos, yet many of his other videos show him taking his hands off the wheel for long stretches.
This crash occurred for several reasons, and even though Brown bears much of the blame, a more responsible approach by Tesla could have prevented the accident. The crash happened because the Tesla software could not distinguish the white truck crossing perpendicular to the car from the bright sky behind it. Considering that self-driving vehicles will need to handle situations far subtler than telling white from blue, this is a troubling failure.
Tesla is correct that Brown should have been paying attention to the road. But humans do stupid things, and ultimately human behavior is one of the major factors AV makers need to consider.
Originally, Google, Apple, and the traditional automakers planned to proceed with automation one level at a time, following a taxonomy developed by the Society of Automotive Engineers (SAE). Most of today's luxury cars have SAE Level 2 features, in which the car can take control of specific driving tasks such as steering or braking. Carmakers originally planned to add SAE Level 3, in which the system handles all aspects of driving but the human must remain ready to take over on request; then Level 4, in which the system handles all driving without human intervention, but only under limited conditions; and finally Level 5, in which humans never need to be able to take control.
Around 18 months ago, during its testing, Google noticed that once Level 3 systems were available, drivers stopped paying attention to driving. The company observed again and again that once the car was automated to a certain level, drivers' minds wandered and the chances of a crash rose sharply. So the company shifted its goal from building vehicles with Level 3 features to producing only vehicles with Level 5 features. Traditional automakers such as GM and Toyota made similar observations and switched their goals as well.
Yet Tesla stuck with its original step-by-step plan and released a software update providing Level 3 automation in its vehicles. We now know that Tesla's software has a hole big enough to drive a truck through. One of the major differences between the software world and the automotive world is what happens when the technology fails. In the software world, when Microsoft Word crashes, it's annoying but not dangerous. When automated vehicle software fails, somebody dies. It's ironic that the internet company Google grasped this concept while Elon Musk, who runs a car company, failed to understand it.
The bigger worry is what will happen now. The likely answer is that NHTSA will promulgate new regulations for the AV industry. Researchers across the political spectrum have been advising NHTSA to take a light touch, arguing that AV technology is still developing and that draconian regulations would blunt its creative potential. However, safety advocates have long been concerned about the safety of AVs, and now they have a reason to justify more regulation. Ironically, if regulations significantly delay the release of fully automated vehicles, they could actually lead to more deaths, since Level 5 automated vehicles will be significantly safer drivers than their human counterparts. If that thought bothers you, write to NHTSA and urge the agency not to make a rushed decision. Then write to Elon Musk and ask him why he thought Level 3 automated vehicles were safe when every other carmaker was convinced they were not.