Who Is Liable In A Car Crash When No One Is Driving?

By: Meg Cotter

A self-driving car is a vehicle that can operate completely without human involvement.  What once sounded impossible will likely become the norm in transportation within the next few years.  Right now, very few cars are capable of fully driving themselves, but many have some type of automation.  More than 40 companies are actively investing in autonomous vehicle technology.


Statistics from the National Highway Traffic Safety Administration (NHTSA) show that human error is the greatest cause of traffic accidents, stemming from speeding, errors of judgment, miscalculations, driving under the influence, and cell phone use.  According to NHTSA, 94% of serious crashes are caused by human error.  The use of fully autonomous vehicles arguably would take human error out of the equation, thereby reducing accidents and keeping everyone on the road safer.  Critics, however, do not trust the technology enough to let it take the wheel: these systems may not be ready yet and could still fail, and safety experts question the abilities of automated vehicles.


The Society of Automotive Engineers (SAE) defines six levels of driving automation, ranging from no driving automation (Level 0) to full driving automation (Level 5).

Level 1: Driver Assistance.  These cars have at least one driver automation system that can help the driver by controlling speed or direction.  The driver still performs all other driving tasks and supervises.  (Ex: adaptive cruise control or lane centering)

Level 2: Partial Driving Automation.  These cars have two or more driving automation systems that can control speed and direction, but a human driver is still in control of the systems and all other driving tasks.  (Ex: Tesla’s Autopilot)

Level 3: Conditional Driving Automation.  These cars can handle driving tasks on their own under certain conditions, but the driver must remain available to take the wheel if the driving automation system requests intervention or malfunctions.  (Ex: the Mercedes-Benz Drive Pilot system, approved in 2023.)

Level 4: High Driving Automation.  These cars can perform all driving functions under certain conditions or in certain locations.  (Ex: Cruise and Waymo operate robotaxis along specific routes in select cities in states such as Arizona and Texas.)

Level 5: Full Driving Automation.  A Level 5 vehicle can operate without human involvement under all conditions.


Most vehicles currently on the road are between Levels 0 and 2.  Traditional rules of car accident liability apply to accidents involving these vehicles: the driver is likely liable.  But as the level of automation increases and these cars become more common, the legal system will need to find new ways to compensate victims of accidents involving autonomous cars.  Liability may shift away from human drivers toward car manufacturers and software developers.


How liability is determined will likely depend on the vehicle’s level of automation.  If the car sits between fully manual and fully automated, as at Levels 2 and 3, liability will likely still be governed by traditional negligence rules, making the vehicle operator responsible.  If the owner of the vehicle is not the driver, the owner may also share some responsibility.


If a self-driving car crashes because of the negligence of the vehicle manufacturer, the injured person may sue the manufacturer under a “product liability” theory of fault.  Product liability cases involving driverless vehicles will likely focus on design defects (such as sensor placement), manufacturing defects, and inadequate instructions and warnings.


Software developers may also be liable, as software plays a key role in autonomous vehicles.  These high-technology cars use software to collect and process environmental information from a combination of sensors, cameras, radar, and artificial intelligence.  Without this software, the cars could not operate without a human driver.  If a defect in the software causes a crash, the software developer may be liable for damages.


If the dream of completely self-driving cars becomes a reality, the companies responsible for maintaining the vehicles may be on the hook as well.  In addition, if the accident is not the fault of the self-driving vehicle, such as when it is rear-ended, the other driver on the road would be negligent and responsible for damages.


Proving fault in a self-driving car accident will likely be similar to proving fault in any other crash.  However, the data generated by these technological advancements will probably tell the story of what happened.  For example, the Institute of Electrical and Electronics Engineers (IEEE) created the world’s first data storage system for automated vehicles in 2023.  The data generated from IEEE’s storage system would be available for crash investigations and for making safety improvements.


No U.S. state currently outlaws the use of self-driving vehicles, but 29 states have enacted legislation specifically concerning their use.  Companies working on these vehicles include Tesla, Cruise, Waymo by Alphabet Inc., and Aurora.  Billions of dollars will be invested in these projects over the next 10 years, and pressure from investors creates the expectation that updated models will be released soon.


Student Bio: Meg Cotter is a 2L at Suffolk University Law School.  She is a staff writer on the Journal of High Technology Law.  She received a Bachelor of Arts Degree in English & Textual Studies from Syracuse University.


Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
