Tesla to face twin Autopilot trials over fatal incidents

Key Points

  • 🚗 Tesla is facing twin trials in September and October to defend its Autopilot technology.
  • 🛑 First trial: A civil complaint alleges Autopilot caused a fatal crash, with accusations that Tesla sold a defective vehicle.
  • 🤖 Second trial: Another lawsuit alleges that Autopilot failed to prevent a fatal collision.
  • 🤝 Tesla denies liability in both cases, emphasizing driver responsibility and noting that there are no fully self-driving cars on the road.
  • 💼 The trial outcomes could set a precedent for future Autopilot-related cases.
  • 💰 Tesla CEO Elon Musk has linked the success of the self-driving program to the company’s financial success.
  • 🌉 San Francisco residents are divided on the adoption of self-driving vehicles, with some actively protesting against them.
  • ⚙️ Waymo and Cruise tout safety records, but criticism remains about vehicle malfunctions and inconveniences caused.
  • 🛣️ Autonomous technology’s readiness for mainstream use is questioned as trials and public opinions continue.

Tesla is poised to defend itself and its Autopilot technology in twin trials this coming September and October. The trials could affect the narrative surrounding Tesla and its self-driving program, which CEO Elon Musk has argued is close to being ready for mainstream use. 

Elon Musk has been open about his view that Tesla’s self-driving program will determine whether the company becomes a financial success. He has predicted several times that Tesla would achieve true self-driving, but those predictions have yet to come true.

As noted in a Reuters report, Tesla is currently facing two trials in quick succession. The first is scheduled for mid-September, and it involves a civil complaint alleging that Autopilot caused Micah Lee’s Model 3 to suddenly veer off a highway at 65 mph, strike a palm tree, and burst into flames. The 2019 crash resulted in Lee’s death and serious injuries to two passengers. 

The lawsuit against Tesla was filed by the passengers and Lee’s estate, with the plaintiffs accusing Tesla of promoting Autopilot despite knowing that the driver-assist suite and other safety systems were defective when it sold the vehicle. Tesla, for its part, has denied liability for the crash and argued that Lee had consumed alcohol before getting behind the wheel. Tesla also noted that it was not clear whether Autopilot was engaged at the time of the crash.

The second case is set for early October and involves the death of Stephen Banner, whose Model 3 crashed into the trailer of an 18-wheeler that had pulled into the road, shearing off the Tesla’s roof and killing Banner. According to the lawsuit, which was filed by Banner’s spouse, Autopilot failed to brake, steer, or do anything to avoid the collision.

As in Lee’s case, Tesla has maintained that driver error was behind the fatal incident. Tesla also noted in court documents that drivers must always pay attention to the road and keep their hands on the steering wheel while Autopilot is engaged. “There are no self-driving cars on the road today,” the company stated.

Matthew Wansley, a former general counsel of automated driving startup nuTonomy and an associate professor of law at Cardozo School of Law, noted that the results of the twin cases could set a precedent for other Autopilot complaints. “If Tesla backs up a lot of wins in these cases, I think they’re going to get more favorable settlements in other cases,” he said.

On the other hand, Bryant Walker Smith, a law professor at the University of South Carolina, noted that a loss in the cases could affect the narrative surrounding Tesla’s self-driving efforts. “A big loss for Tesla – especially with a big damages award” could “dramatically shape the narrative going forward,” he said. 
