Tesla Asks Judge to Toss $243M Autopilot Verdict

Tesla has filed a motion to set aside or retry a $243 million jury verdict stemming from a 2019 Florida crash involving its Autopilot system, arguing that the decision contradicts tort law and due process and that the driver’s reckless behavior was the primary cause of the crash. Plaintiffs push back, saying the verdict rightly recognizes Autopilot’s role and Tesla’s misleading claims.

Published August 29, 2025 at 03:08 PM EDT in IoT

Tesla filed a motion this week asking the court to set aside, or order a new trial on, the $243 million jury verdict tied to a 2019 Florida crash involving its Autopilot advanced driver-assistance system. The company says the verdict “flies in the face of basic Florida tort law, the Due Process Clause, and common sense,” and again places primary blame on driver George McGee.

The facts of the case are straightforward but tragic: McGee, driving a Tesla Model S at night with Autopilot engaged, ran a stop sign without braking and struck a perpendicularly parked SUV. The crash killed 20-year-old Naibel Benavides Leon and permanently injured her boyfriend, Dillon Angulo.

At trial the jury apportioned two-thirds of the fault to McGee and one-third to Tesla. Plaintiffs settled separately with McGee. Tesla’s filing reiterates that McGee admitted reaching for his phone and calls his conduct “extraordinary recklessness.” The company also notes it rejected a $60 million settlement offer months before the verdict.

In arguing for reversal, Tesla frames the case as a dangerous stretch of product liability law — warning that allowing the verdict to stand would deter safety innovation and punish manufacturers when drivers misuse safety features. The filing further accuses plaintiffs’ lawyers of presenting prejudicial evidence unrelated to the specific Model S collision, including material about data preservation and Elon Musk.

  • Legal precedent: A ruling for Tesla could narrow manufacturer liability for driver-assist systems; a denial would leave that broader exposure in place.
  • Industry response: OEMs may rethink disclosure, testing, and how driver-assist features are marketed to avoid similar suits.
  • Regulatory pressure: High-profile rulings influence lawmakers and safety regulators crafting clearer standards for ADAS claims and monitoring.

Plaintiffs’ lead attorney Brett Schreiber called Tesla’s motion another example of the company’s disregard for the human cost of its technology, and argued that the jury properly recognized shared responsibility while holding Tesla accountable for its misrepresentations about Autopilot.

What happens next rests with the judge. The court can toss the verdict, order a new trial, or let the jury’s decision stand. Each outcome carries ripple effects beyond this single case — affecting how manufacturers communicate feature limits, how juries evaluate shared fault, and how quickly the public will trust driver-assist technology.

For policymakers, automakers, and fleets, this case is a clear reminder: deploying advanced driver-assistance features is as much about careful messaging, robust data capture, and defensible risk models as it is about engineering. A courtroom is where technical choices become legal and reputational liabilities.

QuarkyByte’s approach is to translate complex telemetry and policy nuance into clear risk assessments and communication strategies that reduce uncertainty and prioritize people. Whether the court upholds the verdict or orders a new trial, data-driven reconstruction and policy-aligned insight will shape safer deployments and more defensible decisions industry-wide.

QuarkyByte can help automakers, regulators, and fleets translate crash data into clear, defensible risk models and communication strategies that reduce legal exposure and protect people on the road. Explore how targeted forensic analysis and policy-aligned insights could change deployment decisions and public messaging.