The United States believes autonomous cars drive safer than you.

By Andre A. Janiszewski

 

For anyone who has never driven a Tesla, the experience is surreal.  There is no sound, because the conventional internal combustion engine has been replaced by roughly 7,000 lithium-ion battery cells.  Even more impressive is a feature called Autopilot.  Putting a Tesla in Autopilot allows the car to take over most of the driving.  Using sophisticated sensors, the Tesla will keep a specified distance from the car in front of you, brake when you get too close, accelerate when the gap widens, and steer you around bends in the road.  The technology comes with a myriad of warnings, including instructions to keep your hands on the wheel and to remain awake and alert should anything go wrong.  In the past year, one person in the United States was killed while using this technology, yet on September 19, 2016, the Federal Government released a new autonomous vehicle policy.  Even President Obama endorsed the technology, signaling this country’s commitment to it going forward.  Driverless cars are here to stay.

 

On September 19, 2016, the Department of Transportation released guidelines designed to promote the safety of autonomous vehicles without stifling innovation through overregulation.  The guidelines cover four main areas: (1) setting safety standards; (2) encouraging states to maintain uniform policies regarding driverless cars; (3) detailing how current regulations can apply to future technologies; and (4) allowing new regulations to be created in the future.  The Department of Transportation has made clear that the guidelines leave automakers enough flexibility to continue innovating.  Automakers are asked to address fifteen requirements within the guidelines covering safety and testing procedures before putting autonomous cars on the road.  The National Highway Traffic Safety Administration is expected to seek approval from Congress to enforce these guidelines.

 

President Obama is also all in on autonomous vehicles: his administration has pledged $4 billion in federal funding for driverless car research over the next decade.  In an editorial for the Pittsburgh Post-Gazette, the President stressed that these vehicles could save tens of thousands of lives a year.  In fact, 2015 was the deadliest year on American roads in the last ten, with 38,300 deaths.  Large manufacturers such as Ford, GM, and Toyota have all embraced the Federal Government’s willingness to address the safety concerns associated with the new technology.

 

In May 2016, a Florida driver was killed in his Tesla while the vehicle was in Autopilot.  Tesla’s Chief Executive, Elon Musk, has noted that “perfect safety is an impossible goal” but stressed that “it is about improving the probability of safety.”  He stated that the newest update to Autopilot would likely have prevented that death, which occurred when a tractor-trailer crossed in front of the Tesla.  The update will require drivers to place their hands on the wheel after three minutes of driving at 45 miles per hour in Autopilot; drivers who repeatedly ignore those warnings within an hour will have the system shut off altogether.

 

While the government is ready and willing to embrace driverless cars, concerns remain.  What if they crash?  Car accidents have traditionally been governed by tort law, specifically negligence.  Most states use either comparative negligence or contribution to allocate damages based on percentages of fault.  But if, for example, two autonomous cars crash, which driver is at fault?  Or what if a driverless car crashes into a human-controlled car?  These questions are difficult, and there is no common law on the subject.  Further, the sheer volume of personal injury litigation could have a “chilling” effect on automakers, reducing their incentive to push forward.  It seems that Congress will have to take action, possibly by imposing strict liability.  In theory, the car’s computers can determine who caused the accident based on impact speed, braking, and steering angle.

 

Undoubtedly, this country is far from having purely driverless vehicles on the roads.  There will be a mix of autonomous and human-controlled cars on the road for many years to come.  While the questions of legal liability for the accidents to come remain unsettled, one thing is clear: humans are more dangerous than computers and cause more fatal accidents.  While skepticism about Autopilot swirled after the death of the Florida driver, that crash remains the only Autopilot fatality in a year of use in America.  Human drivers caused 38,300 deaths in 2015 alone.  While Congress may have to provide guidance for the courts regarding liability, there is no denying that driverless cars are safer at this time.

 

Student Bio: Andre is a staff member on the Journal of High Technology Law.  He is currently a 2L at Suffolk University Law School.  He holds a B.S. in Business Administration with a concentration in Marketing from Bryant University.

 

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.

 
