By Margaret Russell
As autonomous vehicles become increasingly popular, a new complication becomes evident: in an accident, who is criminally liable? The Society of Automotive Engineers has outlined various levels of automated vehicles. Most vehicles on the road today give the driver some automated features or assistance (levels 0 – 2). Many manufacturers are looking to develop vehicles that are able to perform all driving functions under all conditions (level 5) or certain conditions (level 4). Tesla’s autonomous vehicles offer conditional assistance (level 3), where the driver does not need to monitor the environment but must be ready to take control at any time. A recent executive order in Arizona, allowing for the testing of autonomous vehicles, requires operators to be able to “direct the vehicle’s movement if necessary.” Many states have followed this framework, which allows for potential criminal liability when a semi-autonomous vehicle (levels 3 – 5) is in an accident.
The user-in-charge of an autonomous vehicle that offers full assistance (levels 4 – 5) should not be held criminally liable. The user has placed their trust in the assistance system and is not in control. In the case of a vehicle that does not allow for user operation, the user would not be able to prevent an accident from occurring. Punishing the user would neither deter the behavior nor satisfy the requirements of criminal culpability.
If a technical malfunction were to occur, the manufacturer should be held liable, as it is at fault for its system’s failure. In a recent IEEE article, the senior technical leader for safety and driver support technologies at Volvo explained, “if we made a mistake in designing the brakes or writing the software, it is not reasonable to put the liability on the customer…we say to the customer, you can spend time on something else, we take responsibility.” Manufacturers have trended toward accepting liability for accidents resulting from technical malfunctions. Recently, Volvo released a press statement claiming it would accept full liability when its vehicle is in autonomous mode.
Conditional assistance (level 3) presents a new type of problem: how to make sure a distracted driver will pay attention when the vehicle alerts them of the need for driver intervention. In an instance where the vehicle attempts to hand back control, there may be criminal culpability for an act of omission. Where the user is given an appropriate amount of time to respond to the alert and either negligently or deliberately fails to take control, the user has met the elements of criminal culpability. However, this is a fact-dependent inquiry that can be complicated by factors such as how the accident occurred, whether and when the vehicle alerted the user, and the user’s understanding of the vehicle.
Some of these elements were at issue in a March 2018 crash in Tempe, Arizona, where a pedestrian was struck and killed by an automated Volvo operated by an Uber driver. The driver of the car, operating in autonomous mode (level 3), was watching a video when the pedestrian quickly stepped into the road. The vehicle alerted the driver of the need for intervention 1.3 seconds before the collision, and an interior video camera recorded the driver’s failure to act. The court found Uber was not criminally liable, focusing on the fact that the car had notified the driver of the need for intervention. The vehicle was following Arizona law, as the driver was able to take control when necessary, meaning the driver was at fault and Uber would not be vicariously liable.
The case against the driver is still pending, but in determining criminal liability, the court will likely consider factors such as the time the driver had to respond to the alert and whether the driver was aware Uber had deactivated the manufacturer-installed emergency braking system. While it is unlikely that users of full assistance autonomous vehicles will be held criminally liable, users of conditional assistance vehicles could face prosecution for failing to act. So far, courts have held that manufacturers and third-party employers will not be liable so long as the vehicle allows the operator to act and notifies them of the need for intervention. While the legal framework for criminal liability is still evolving, Uber continues testing in Pennsylvania, and a number of manufacturers have moved forward in developing new autonomous technology.
Student Bio: Margaret Russell is a second-year law student at Suffolk University School of Law and a staff member of the Journal of High Technology Law. She graduated from Worcester Polytechnic Institute with a Bachelor of Science in Chemistry.
Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.