Even though the public only learned of the fatal accident involving a Tesla Model S on Autopilot almost two months after it occurred, the incident raised serious questions about the safety of driverless cars and risked further eroding public trust in autonomous driving technology.
Tesla released a statement explaining the event only after the National Highway Traffic Safety Administration (NHTSA) announced that it had launched an investigation into the tragic accident, a sequence of events that only made matters worse and did little to bolster public confidence in self-driving vehicles.
Failure of the Self-Driving Technology
While Tesla offered a logical and detailed explanation of the events that led to the accident, the fact remains that the Model S's Autopilot feature failed to perform a basic task, detecting an obstacle on the road, resulting in a collision with fatal consequences.
According to Tesla, the crash occurred while the Model S was traveling along a divided highway with Autopilot activated and collided with a tractor-trailer that was crossing the highway in front of the all-electric sedan. Autopilot did not detect the tractor-trailer because the trailer's white side was “aligned against a brightly lit sky”, so the car's self-driving software never applied the brakes.
In other words, the sedan's autonomous driving equipment, which includes forward-facing cameras and radar, failed to detect an object that should have been identified as an obstacle. The tractor-trailer went undetected because of its color, which, combined with the conditions at the time, rendered it practically invisible to the Model S's radar and cameras.
Tesla also stated that the trailer's “high ride height” was an additional reason Autopilot was unable to see and avoid it.
Reliability of Autonomous Cars Brought into Question
Although the Model S's autonomous driving system clearly failed in this case, other aspects of the incident should be taken into consideration. The investigation is expected to determine whether blame should be attributed to the driver of the Model S, as he may have been distracted when the accident occurred and therefore unable to take control of the car to perform an evasive maneuver or bring it to a stop.
The NHTSA clearly states that Tesla's Autopilot is designed to help a driver with certain driving tasks, and the driver should keep their hands on the steering wheel at all times and be ready to assume control of the vehicle at any given moment.
What's more, Tesla itself warns Model S owners that it is their responsibility to maintain control of their vehicles, even when the Autopilot is engaged.
These two facts could shield Tesla from liability and help the company avoid legal consequences.
However, Autopilot's inability to see an obstacle directly in front of the car highlights a major flaw in the technology, one that proved fatal in this highly publicized event.
The safety and reliability concerns that surfaced following this accident are well founded, given that the collision cost a life. It is yet another indication that autonomous driving technology is far from perfect, and that automakers still have a lot of work to do before features like Tesla's Autopilot are reliable enough for drivers to use without putting their lives at risk.
In the wake of such a serious incident, the NHTSA will certainly review current autonomous driving regulations and reconsider the rules that govern the use of features such as Autopilot on the Model S. That review may well lead to new guidelines that are stricter and clearer than the ones in effect today. The requirements that automakers and other companies developing self-driving vehicles must meet before testing the technology on public roads will also probably be tightened, to ensure that manufacturers thoroughly test self-driving systems before implementing them in their vehicles.
One thing is for sure, though: the auto industry, along with some of the world's leading tech companies, will now have to work much harder to convince the public that autonomous driving technology is safe and reliable. That was a tough task even before this accident occurred, and it has just become that much more difficult.