Computers have become integral to modern cars. Whether it is a rear-facing camera for reversing, alerts about nearby vehicles on the road, or lane-assist sensors that keep you in your lane, this increased reliance on technology can be a lifesaver.
However, there are also some complications with further technological advancements. As cars become more independent, politicians and lawyers alike question the legality of self-driving cars.
“We need to think of the answers to these problems before they begin,” says Kevin Roach, founder of The Law Offices of Kevin Roach, LLC. “That way, we can be prepared for the way of the future to keep people safe on the roadways.”
To simplify the legal issues around self-driving cars, let’s focus on two specifics. First, this case review will focus on Tesla vehicles equipped with the automated driving feature. Though other car brands have begun manufacturing their own versions of self-driving technology, most cases so far involve Tesla vehicles, so they will be examined here extensively.
Second, we are going to focus on cases where that automated driving feature was engaged, which raises critical questions. What happens when an automated vehicle ends up in a crash? Who is considered ‘at fault’ in legal terms? Below are a few possible answers to these questions and the issues with each one.
The Car Owner
Some Tesla vehicles come with an automated driving feature that still requires the driver to take control of the car in emergencies. It would make sense, then, that if the system fails, fault falls on the human who chose to drive recklessly and ignore the warnings.
One prominent case is the “TikTok Crash” of May 2021. In summary, a driver was filming a TikTok video to flaunt the self-driving capabilities of his Tesla Model 3 when the car crashed into an overturned truck. The Tesla driver, the truck driver, and another motorist were all killed in the crash.
It is clear that the driver – having taken his eyes off the road to film a video for social media – was acting recklessly and contributed to the crash. However, the question remains whether the driver was solely at fault or whether the manufacturer failed to design and market a safe vehicle.
Another notable case is a driverless crash from April 2021, in which two men were riding in a 2019 Tesla Model S. While speeding around a curve, the car skidded 100 feet off the road and crashed into a tree. Both men died as a result.
Firefighters needed four hours and 30,000 gallons of water to extinguish the resulting fire, a job that would typically take a few minutes. This Tesla’s battery extended across the bottom of the vehicle, a design that could have exacerbated the fire: burning batteries can be much hotter and more dangerous than a typical post-crash fire.
Once the fire was extinguished, first responders reported that neither occupant was in the driver’s seat, meaning no one could have reacted to the crash appropriately. Some would view this as reckless driving: the men chose not to be behind the wheel in case of an emergency. However, their decision may have been fueled by misleading advertising on Tesla’s part.
Tesla has actively marketed its cars using terms like “Autopilot” and “Full Self-Driving” that imply to customers that the automated driving feature is, in fact, fully autonomous. Though Tesla also issues warnings against relying on the system entirely, its flashy marketing may convince a consumer that it is safe to ride without anyone in the driver’s seat.
Misleading marketing may have contributed to the deaths in the driverless crash. The wives of the two men who died reported that their husbands were discussing the feature before leaving on their drive. Had they been adequately informed about the dangers of fully trusting the self-driving system, they might have avoided the crash entirely.
After both the TikTok and driverless crashes, news organizations tried to reach Tesla Motors for comment. However, the company had disbanded its public relations team, leaving no one to respond to either crash.
Instead of a traditional response from a PR team, Tesla Motors CEO Elon Musk responded via Twitter. In his tweet, he stated that “data logs recovered so far show Autopilot was not enabled,” and that the car did not have Full Self-Driving (FSD) installed. Musk went on to claim that “standard Autopilot would require lane lines to turn on, which this street did not have.”
Many have been unhappy with his explanation, as it does not account for how a car can drive itself with no one in the driver’s seat. This lax manner of responding to serious accusations against a multi-billion-dollar company has many concerned about how Tesla Motors and other carmakers will respond to future crashes involving self-driving cars.