Tesla Recall: Ensuring Safety or Stifling Innovation?

By: Brenna Ryder

Tesla, Inc. (“Tesla”) is recalling nearly 54,000 vehicles because their “Full Self-Driving” software allows the vehicles to roll through stop signs without coming to a complete stop.  The recall covers Model S sedans and Model X SUVs from 2016 through 2022; Model 3 sedans from 2017 through 2022; and Model Y SUVs from 2020 through 2022.  The “rolling stop” feature allows the vehicles to pass through intersections with all-way stop signs at up to 5.6 miles per hour.  After two meetings with officials from the National Highway Traffic Safety Administration (“NHTSA”), Tesla agreed to the recall.  The NHTSA claimed that failing to stop for a traffic sign can increase the risk of a crash, but Tesla said that it knows of no crashes or injuries caused by the feature.  Still, documents posted by U.S. safety regulators say that Tesla will disable the feature with an over-the-internet software update, and owners will receive the required notification letters in late March.

The company first introduced the “rolling stop” feature in October 2020 as part of an update to its “Full Self-Driving” beta software.  This advanced version of Tesla’s Autopilot driver-assistance technology is available only to Tesla owners who pay $12,000 for the upgrade.  Despite its name, the “Full Self-Driving” software does not turn Teslas into fully autonomous vehicles; rather, it gives drivers access to semi-autonomous features.

A firmware release disabling the “rolling stop” feature is expected in early February, but in the meantime, selected Tesla drivers are “beta testing” the “Full Self-Driving” software on public roads.  Tesla has advised these drivers that they must be ready to take action at all times.  Safety advocates, however, contend that Tesla should not be allowed to test vehicles on public roads with untrained drivers, arguing that this testing puts other motorists and pedestrians in danger.  Many other automotive companies conduct similar software testing, but they do so with trained drivers.

Though Tesla claims that it knows of no crashes or injuries caused by the feature, a Tesla Model Y driver filed a complaint in Brea, California, alleging that the “Full Self-Driving” software caused a crash on November 3, 2021.  According to the complaint, the driver tried to turn the wheel to avoid other traffic, but the car “forced itself into the incorrect lane,” where it was hit by another vehicle.  The vehicle allegedly did not alert the driver until halfway through the turn.  No one was hurt in the crash, but the NHTSA has investigated the complaint.  This investigation follows earlier NHTSA investigations into Tesla, which led the company to update its less sophisticated “Autopilot” driver-assist system and to agree to stop allowing video games to be played on center touch screens while its vehicles are moving.

Taking his message to Twitter, Tesla Chief Executive Officer Elon Musk stated that there were no safety issues with the “rolling stop” feature.  He claimed, “[t]he car simply slowed to ~2 mph [and] continued forward if clear view with no cars or pedestrians.”  The NHTSA responded, explaining that federal law “prohibits manufacturers from selling vehicles with defects posing unreasonable risks to safety, including intentional design choices that are unsafe.”  Furthermore, the feature appears to violate state laws that require vehicles to come to a complete stop at stop signs.

Tesla appears to be pushing the boundaries of traffic safety laws, and this trend seems unlikely to change.  This is especially concerning because the NHTSA, which oversees recalls, can take action only after Tesla’s software has been released to drivers.  Because Tesla typically pushes updates to vehicles over the internet, the company is free to release and test new features while drivers are on the road.  Not only does this save Tesla money on testing, but it also allows the company to leave potentially dangerous software on public roads until the government intervenes or Tesla makes changes on its own.  While this approach lets technological innovation advance quickly, it raises substantial questions about how much leeway companies like Tesla should be given, considering the harm they may cause consumers.

U.S. Secretary of Transportation Pete Buttigieg recently made a virtual address on the government’s role in keeping cars safe, singling out autonomous car manufacturers like Tesla.  Buttigieg was clear that the government does not wish to quash innovation, but explained that it is the government’s duty to keep Americans safe once products are available for purchase.  Comparing the rise of autonomous vehicles to that of trains and planes, Buttigieg reminded listeners that those modes of transportation also raised serious safety concerns when they first came to market.  In each case, the government stepped in and instituted rules, such as requiring air brakes and automatic couplers on every train, which led to a marked drop in deaths and injuries.  Similarly, after automobiles became widespread, laws requiring seatbelts, airbags, and sober driving produced a drastic decrease in fatalities.  These examples helped Buttigieg make the persuasive point that, to protect public safety, the government should not have to wait until a new piece of technology is already endangering consumers.  Rather, he argued, the government should proactively plan to support the growth of new ideas while ensuring high safety standards.

Were the government to take a more proactive approach, as Buttigieg suggests, public roads might be safer, as this would likely limit Tesla’s ability to beta test its products.  On the other hand, it might stifle technological innovation because creating testing plans and bringing new products to market would take longer.  This tension will likely escalate as more Tesla products and features are investigated and potentially recalled.  Keeping consumers at the forefront of technology while maintaining appropriate safety standards will be a tough balancing act for regulators and innovators like Tesla.

Student Bio: Brenna Ryder is a third-year evening student at Suffolk University Law School.  She is a staff writer on the Journal of High Technology Law.  Brenna received a Bachelor of Science Degree in Business Administration and Management, with concentrations in Management Information Systems and Business Law, from Boston University’s Questrom School of Business.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
