Three Recent Crashes May Park Tesla’s Self-Driving Dreams

The most significant challenge fully autonomous vehicles face in the real world is ensuring the safety of their passengers and everyone around them. Three recent fatalities from a pair of Tesla crashes have forced the company to confront its system’s failures. Tesla’s Autopilot is designed to keep vehicles in their lanes and driving safely, but only with the aid of an attentive driver behind the wheel. These three deaths continue a worrying trend of drivers relying too heavily on the automated system instead of paying proper attention to the road.

Three Dead in Two Crashes

Both crashes occurred on January 3rd. In California, a Tesla Model S left the freeway, ran a red light, and struck a Honda Civic, killing both occupants of the Civic. In Indiana, a Tesla Model 3 hit a parked firetruck, killing the sole occupant of the Tesla. These crashes come on the heels of a December 7th incident in which a Model 3 struck a police cruiser in Connecticut, though that collision caused no injuries or deaths. The National Highway Traffic Safety Administration (NHTSA) is investigating both fatal crashes to determine their cause and potential liability.

Is Tesla’s Autopilot Ready for the Real World?

These recent crashes are part of a broader pattern of thirteen reported collisions dating from 2016 to 2020. Critics have urged the NHTSA to move beyond investigation and take action, arguing that these vehicles may pose a threat to pedestrians and other drivers alike. The crux of the issue is that Autopilot still requires drivers to pay attention to the road around them, and critics are concerned that the company is not doing enough to compel that attention. Of the thirteen crashes, five (including the two on January 3rd) resulted in fatalities. The NHTSA has noted that most of these collisions stemmed from drivers depending too heavily on Autopilot to operate the vehicle.