Tesla’s Autopilot Heads to Trial
Litigation over a fatal crash will decide whether the EV maker’s marketing overstates the capabilities of automated driving systems.
By Malathi Nayak
13 September 2022 at 20:00 GMT+10
Source: https://www.bloomberg.com/hyperdrive
Ten seconds before Jeremy Banner’s Tesla Model 3 plowed into the underbelly of a tractor-trailer, he switched on Autopilot. The crash, which sheared off the top of his car, killed the father of three, so there’s no way to know exactly what happened that Friday in March three years ago. But an investigation by the National Transportation Safety Board found that Banner probably didn’t see the truck crossing a two-lane Florida highway on his way to work. Tesla’s driver assistance feature apparently didn’t see it either. At least not in time to save the 50-year-old’s life.
A court in Palm Beach County has set a February date for a jury to hear testimony on who was at fault, the first of potentially dozens of Autopilot collision trials. Until then, expect the Twittersphere to light up with passionate arguments over a question that’s been debated for years: Does the very name Autopilot lull drivers into a false sense of security that their cars will drive themselves? The trial offers “one of those watershed moments when we have lots of public attention on a verdict, if the jury is sympathetic to the driver and wants to send a message to Tesla,” says Bryant Walker Smith, a law professor at the University of South Carolina.
Tesla’s Autopilot didn’t see the truck in time to prevent the crash and save Banner’s life.
Source: National Transportation Safety Board
Whatever the verdict, it will add urgency to calls by legislators and auto safety advocates for regulatory intervention. A crackdown has been slow to materialize during Tesla’s eight-year experiment with automated driving, but the idea has gained steam under the Biden administration, with the National Highway Traffic Safety Administration conducting multipronged investigations.
Chief Executive Officer Elon Musk insists Teslas are the safest cars ever made, but the trial will feature a parade of technology experts testifying about the perils of marketing driver assistance in ways that lead to overconfidence. “A big part of the significance of the case is that it actually is being conducted in a public forum,” says Michael Brooks, chief counsel at the Center for Auto Safety, a consumer advocacy group.
Tesla has long said it’s clear about the system’s limits, citing strong language in its driver manuals. And the company’s website says Autopilot features “require active driver supervision and do not make the vehicle autonomous.” Musk insists that proper use of Autopilot by attentive drivers has saved far more lives than have been lost in crashes. “In investors’ minds, Tesla already has its defense, and they have always bought it: Humans are bad drivers,” says Gene Munster, managing partner of Loup Ventures, an investment firm that follows Tesla but isn’t connected with the case.
The NHTSA says at least 18 fatalities may be linked to driver assistance technology. The agency has investigated almost 200 crashes involving vehicles using the feature, including some in which Teslas have rear-ended police cars or firetrucks parked along roadsides. And California’s Department of Motor Vehicles in August accused the company of false advertising, saying it misleads customers into thinking Autopilot and enhanced “Full Self-Driving” features are more sophisticated than they are.
Lake Lytal, a lawyer for Banner’s family, calls the trial an opportunity “to finally hold Tesla and Elon Musk accountable for using the public roadways throughout our country as a testing ground for this company to try and fix their defective Autopilot system, which they know has killed and will continue to kill its customers.” Tesla and its legal team didn’t respond to requests for comment.
Musk is prickly about public criticism of Autopilot. In 2018 he hung up on Robert Sumwalt, then chairman of the NTSB, after Sumwalt took him to task for blog posts casting blame on the driver of a Model X for a fatal crash. Last October, when President Joe Biden appointed Duke University Professor Mary Cummings as a senior safety adviser to the NHTSA, Musk and thousands of Tesla fans protested on Twitter because of her history as a vocal skeptic of Autopilot, circulating a petition accusing her of bias.
More recently, a furor erupted over a viral YouTube video of a Tesla running over a child-size crash test dummy while purportedly operating in Full Self-Driving mode. In response, some Tesla owners posted videos of their vehicles safely stopping in front of real children, prompting the NHTSA to issue a “don’t try this at home” warning.
A Model 3 at a Tesla dealership in Chicago.
Photographer: Scott Olson/Getty Images
The trial will feature Tesla engineers and outside experts. These include Christopher “CJ” Moore, a former member of the company’s Autopilot executive team who now works for Apple Inc. After interviewing Moore last year, California DMV officials concluded that some of Musk’s tweets exaggerated Autopilot’s capabilities. One person the jury probably won’t hear from is Musk. A judge ruled that the billionaire entrepreneur didn’t have to sit for a deposition, rejecting the Banners’ argument that Musk has “unique knowledge” of the issues in the case.
Most problematic for Tesla may be what the proceeding reveals about the technology’s shortcomings. The public will get its first glimpse of troves of data Tesla collects, including granular information on Autopilot, according to Dana Taschner, a personal injury attorney who’s represented victims of crashes attributed to defects. “Tesla engineers will be on the stand testifying under oath about data and statistics that are very, very carefully guarded,” he says.
The trial also has ramifications for companies such as Apple and Alphabet Inc.’s Waymo, which are developing self-driving vehicles, and conventional carmakers such as BMW and Mercedes-Benz that are investing heavily in cars with automated driving features. For those manufacturers and their myriad suppliers, the trial is a cause for concern because of the precedent it could set on liability.
The Banner family’s lawsuit has its weaknesses, particularly a probe of the accident by the NTSB that found there was blame to go around. Investigators said the truck driver had failed to yield the right of way and faulted Banner for his “inattention due to over-reliance on automation.” But in a 2020 report, the agency criticized Tesla’s technology for insufficiently monitoring and enforcing driver engagement. “The Autopilot system did not send a visual or audible warning to the driver to put his hands back on the steering wheel,” the report said.
The NTSB reached much the same conclusions in a 2017 report on a strikingly similar accident in northern Florida, also a fatal crash into a semitruck that the car’s sensors didn’t detect. Last October, the agency chastised Tesla for failing to respond to its 2017 recommendations, including limiting where Autopilot can be activated and ensuring that drivers pay attention while using the feature.
The NTSB’s findings about what caused Banner’s crash aren’t allowed as evidence under federal law. But independent experts could use the report as a “road map” to reestablish the same conclusions, says Peter Goelz, a former NTSB managing director. For Musk, the stakes couldn’t be higher. He has, after all, said full self-driving technology is the difference between Tesla being “worth basically zero” and being one of the world’s most valuable corporations. “The technology is so fundamental to the appeal of the car,” Goelz says. “I don’t think Tesla is going to give an inch.”