Accountability of Autopilot: Self-Driving Cars and Liability

By: Wade Shaver

 

Autonomous vehicles represent an exciting technological advancement, but this development raises serious safety concerns and legal questions.  Recently, a Tesla Model S in “Full Self-Driving” mode hit and killed a motorcyclist in Washington state.  The driver admitted he was looking at his cell phone when the accident occurred.  In 2023, a Tesla Model Y utilizing the Autopilot feature (which is less advanced than Full Self-Driving mode) struck a high school student in North Carolina, resulting in life-threatening injuries, though the student ultimately survived.

These incidents are not isolated, with the Los Angeles Times reporting that 11 people died nationwide in accidents involving vehicles with automated driving systems during a four-month period in 2022.  As companies like Tesla and Waymo continue advancing in this field, questions arise as to who should bear the costs and be held liable for these collisions and any resulting injuries.

Two of the main companies in the self-driving vehicle industry are Waymo, formerly Google’s self-driving car project, and Tesla.  These and other similar companies have poured billions of dollars into the venture to make self-driving cars more prevalent, useful, and safe.  According to the Society of Automotive Engineers (“SAE”), there are six different levels of automation within the realm of self-driving vehicles, ranging from “Level 0: No Driving Automation” all the way to “Level 5: Full Driving Automation.”  For example, Tesla’s “Autopilot” feature falls under “Level 2: Partial Driving Automation,” while Waymo’s robotaxis are classified as “Level 4: High Driving Automation.”  Self-driving cars use sensors and cameras to gather information about their surroundings and relay it to onboard computers, which analyze the data and then guide the vehicle’s systems to respond accordingly.  Recently, Waymo’s use of robotaxis has surged, with over 100,000 automated rides taken per week.

Despite the significant time and funds invested in the automated self-driving vehicle field, recent research shows that self-driving vehicles are over twice as likely to be involved in accidents as non-self-driving vehicles.  Many states have either passed laws addressing self-driving vehicles or their respective governors have issued executive orders relating to their use.  The number of states with laws governing self-driving vehicles will likely continue to grow, as cars with at least some autonomous features are expected to comprise nearly 25% of the global market by 2040.  Additionally, S&P Global Mobility’s September 2024 Autonomy Forecasts projects that as many as 230,000 vehicles with autonomous capabilities will be sold in 2034 in the United States, 1.5 million in China, and 37,000 in Europe.

In accidents involving self-driving vehicles, multiple parties could potentially face liability, including the person behind the wheel, the owner of the vehicle, the manufacturing company, the developer of the technology that allows for self-driving, and potentially even government entities.  Some argue that this issue should fall under products liability, while others argue that it is an agency issue.  Ultimately, liability for accidents involving self-driving vehicles should depend on the level of driving automation involved.  For levels 0 through 3, the driver/user of the vehicle should bear responsibility for accidents involving self-driving cars where the operation or technology is found to be at fault.  Partial driving automation technology requires close user attention; if a user acts negligently and causes an accident, they should be liable for the resulting damages and any injuries.

For vehicles featuring Level 4 and Level 5 automation (high and full driving automation), determining liability becomes more complex.  In cases where high or full driving automation causes an accident, a products liability claim is likely to arise, which would require the plaintiff to prove that a defect in the product was the proximate cause of their injuries.  The plaintiff would also have to show a reasonable expectation that, if functioning normally, these automation features would not have posed undue risks.  This shift in liability from users to manufacturing companies and software developers signals a “fundamental change in legal liability.”  Both Michigan and Tennessee have enacted laws holding manufacturers liable when the automated driving system in the vehicle contains defects and was in use at the time of the accident.  This seems to be the future of liability for self-driving vehicles.

As the use of self-driving vehicles becomes more common and the technology evolves and expands, incidents like the fatality in Washington state and the serious injury in North Carolina will likely increase in frequency.  The issue of liability in self-driving vehicle accidents is nuanced and varies depending on the level of automation involved in the accident.  Currently, accidents caused by self-driving cars with higher levels of automation will likely lead to products liability claims from affected parties.  It will be interesting to see how this issue develops as self-driving vehicles become more prevalent in society.

 

Student Bio: Wade Shaver is a third-year law student at Suffolk University Law School. He is a staff member for the Journal of High Technology Law.  Wade graduated from the University of North Carolina at Chapel Hill in 2019 where he double majored in History and the interdisciplinary study of Peace, War and Defense (“PWAD”).

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.