
Autonomous Vehicle Litigation Tracker: Tesla, Cruise, Waymo & Self-Driving Car Cases


The Autonomous Vehicle Liability Crisis

Self-driving cars were promised to eliminate human error and make roads safer. Instead, they have created a complex liability landscape where crashes, injuries, and deaths have triggered hundreds of lawsuits, billions in regulatory penalties, and fundamental questions about who bears responsibility when AI-controlled vehicles cause harm.

From Tesla’s Autopilot and “Full Self-Driving” systems, involved in dozens of fatal crashes, to Cruise’s dramatic implosion after pedestrian incidents, autonomous vehicle litigation represents the cutting edge of AI product liability law. These cases will establish precedents governing AI responsibility for decades to come.

Key Autonomous Vehicle Statistics
  • 956 crashes involving Tesla Autopilot investigated by NHTSA (as of 2024)
  • 52+ fatalities linked to Tesla Autopilot/FSD systems
  • $1.5 billion recalled value of Cruise vehicles after pedestrian incident
  • 17 ongoing NHTSA investigations into autonomous vehicle systems
  • $200 million+ in active AV litigation against major manufacturers

Understanding Autonomous Vehicle Technology Levels

SAE Automation Levels

The Society of Automotive Engineers defines six levels of vehicle automation:

| Level | Name | Description | Human Role |
|---|---|---|---|
| 0 | No Automation | Human performs all driving tasks | Full control |
| 1 | Driver Assistance | Vehicle assists with steering OR acceleration | Constant supervision |
| 2 | Partial Automation | Vehicle controls steering AND acceleration | Constant supervision |
| 3 | Conditional Automation | Vehicle performs all driving tasks in limited conditions | Ready to intervene |
| 4 | High Automation | Vehicle performs all tasks without human intervention in defined areas | Optional control |
| 5 | Full Automation | Vehicle performs all tasks everywhere | No control needed |
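
For readers who think in code, the supervision obligations that drive the liability analysis map directly onto the level number. A minimal Python sketch (the class and function names are my own, not SAE terminology):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels (0-5)."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2      # e.g. Tesla Autopilot/FSD as deployed
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4         # e.g. Waymo robotaxis, within a geofence
    FULL_AUTOMATION = 5

def requires_constant_supervision(level: SAELevel) -> bool:
    """Levels 0-2: a human driver must supervise at all times."""
    return level <= SAELevel.PARTIAL_AUTOMATION

def human_control_optional(level: SAELevel) -> bool:
    """Levels 4-5: no human intervention needed within the design domain."""
    return level >= SAELevel.HIGH_AUTOMATION
```

Much of the litigation discussed below turns on this boundary: Level 2 systems keep a supervising human in the causal chain, while Level 4+ systems shift the focus to the manufacturer.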

Current Market Status

Level 2 (Mass Market):

  • Tesla Autopilot / FSD (despite “Full Self-Driving” name)
  • GM Super Cruise
  • Ford BlueCruise
  • Mercedes-Benz Drive Pilot (Level 3 certified in limited areas)

Level 4 (Limited Deployment):

  • Waymo (robotaxi service)
  • Cruise (suspended operations)
  • Zoox (testing phase)
  • Aurora (trucking focus)

Level 5: No commercially available vehicles have achieved Level 5 automation.


Tesla Autopilot and FSD Litigation

The Tesla Safety Record

Tesla’s Autopilot and “Full Self-Driving” (FSD) systems have been implicated in hundreds of crashes, including over 50 fatalities. NHTSA has opened multiple investigations and ordered recalls, yet Tesla continues expanding these features to millions of vehicles.

Key Issues:

  • Marketing “Full Self-Driving” for a Level 2 system requiring constant supervision
  • Alleged defects causing phantom braking, failure to detect obstacles, and unintended acceleration
  • Driver monitoring systems insufficient to ensure attention
  • Over-reliance fostered by automation name and marketing claims

Fatal Tesla Autopilot Cases

Wrongful Death / Product Liability

Huang v. Tesla (Apple Engineer Death)

Confidential Settlement
Settled (2022)

Family of Apple engineer Walter Huang, killed when his Tesla Model X in Autopilot mode struck a highway barrier in 2018, sued Tesla for wrongful death. The vehicle failed to detect the concrete barrier and accelerated into it. Tesla argued Huang ignored warnings to keep hands on wheel. Case settled confidentially after extensive discovery into Autopilot defects.

Santa Clara County Superior Court 2022
Wrongful Death / Product Liability

Banner v. Tesla (Florida Fatal Crash)

Pending
Trial Scheduled 2025

Family of Jeremy Banner, killed when his Tesla Model 3 on Autopilot drove under a semi-truck in 2019, filed wrongful death suit. The vehicle failed to detect the truck crossing the highway, a scenario similar to the first fatal Autopilot crash in 2016. Tesla faces claims of design defect and failure to warn.

M.D. Florida 2024
Criminal Prosecution / Manslaughter

People v. Tesla (Glendale Fatal Crash)

Pending
Trial Pending

First criminal prosecution involving Tesla Autopilot. The driver, Kevin Aziz Riad, was charged with two counts of vehicular manslaughter after his Tesla on Autopilot ran a red light and killed two people in 2019. Tesla is not a defendant but the case raises questions about criminal liability for AV-involved deaths.

Los Angeles County Superior Court 2024
Product Liability / Personal Injury

Rossiter v. Tesla (FSD Beta Crash)

Pending
Discovery Ongoing

Plaintiff injured when a Tesla using FSD Beta suddenly swerved into oncoming traffic. Lawsuit alleges Tesla knew FSD Beta was defective and unsafe for public roads but released it anyway to thousands of untrained 'beta testers.' Claims include strict liability and negligent design.

N.D. California 2024

Tesla Class Actions

Consumer Fraud Class Action

In re Tesla Autopilot Marketing Litigation

Pending
Class Certification Briefing

Consolidated class action alleging Tesla defrauded consumers by marketing 'Full Self-Driving' capability since 2016 while knowing the technology could not deliver fully autonomous driving. Plaintiffs claim they paid $5,000-$15,000 for features that don't work as advertised. Tesla argues FSD is still in development and customers received value from existing features.

N.D. California (MDL) 2024
Consumer Class Action / Product Defect

Briggs v. Tesla (Phantom Braking)

Pending
Class Certification Granted (2024)

Class action certified for Tesla owners experiencing 'phantom braking': sudden, unexpected braking with no visible obstacle. Plaintiffs allege the defect creates dangerous driving conditions and that Tesla has failed to remedy it despite thousands of complaints. The class includes Model 3 and Model Y owners from 2021-2023.

N.D. California 2024

NHTSA Tesla Investigations

| Investigation | Subject | Status | Vehicles Affected |
|---|---|---|---|
| PE 22-020 | Autopilot crashes with emergency vehicles | Open | 765,000+ |
| PE 23-001 | Phantom braking | Open | 416,000+ |
| EA 23-003 | FSD Beta sudden steering | Open | 362,000+ |
| PE 21-020 | Autopilot crash data collection | Open | 765,000+ |
| Recall 23V-838 | FSD software inadequate controls | Completed | 2,000,000+ |

Tesla Recall Orders

Safety Recall

NHTSA Recall: Tesla Full Self-Driving Software (December 2023)

2,032,000 Vehicles
Software Update Required

NHTSA ordered Tesla's largest-ever recall, finding FSD software allowed drivers to become inattentive and enabled the vehicle to act in unpredictable ways at intersections, including running stop signs. Tesla issued over-the-air software update but NHTSA continues monitoring for compliance.

NHTSA 2023
Safety Recall

NHTSA Recall: Tesla Autopilot Driver Monitoring (April 2024)

1,849,000 Vehicles
Software Update Required

NHTSA found Tesla's Autopilot driver monitoring system failed to ensure driver attention during use, violating previous recall commitments. The agency determined the December 2023 FSD recall remedy was insufficient and ordered additional software updates and monitoring improvements.

NHTSA 2024

Cruise: The Rise and Fall

The October 2023 Pedestrian Incident

Cruise, GM’s autonomous vehicle subsidiary, experienced a catastrophic collapse following an October 2023 incident in which a Cruise robotaxi dragged a pedestrian 20 feet after she was struck by another vehicle and thrown into the robotaxi’s path.

What Happened:

  1. A hit-and-run driver struck a pedestrian, throwing her into the Cruise vehicle’s lane
  2. The Cruise robotaxi struck the pedestrian and stopped
  3. Unaware that the pedestrian was pinned beneath it, the AI classified the collision as a side impact and decided to “pull over” to safety
  4. The vehicle dragged the pedestrian 20 feet before stopping
  5. Cruise initially showed regulators edited video omitting the dragging

Aftermath:

  • California DMV revoked Cruise’s self-driving permit
  • NHTSA opened formal investigation
  • GM paused Cruise operations nationwide
  • CEO Kyle Vogt resigned
  • 24% of Cruise workforce laid off
  • Criminal investigation by San Francisco DA

Cruise Litigation

Personal Injury / Product Liability

Doe v. Cruise (Pedestrian Dragging)

Pending
Discovery Ongoing

The pedestrian injured in the October 2023 dragging incident filed suit against Cruise, GM, and the hit-and-run driver. Claims include product liability for the robotaxi's decision to move while a person was beneath it, and negligence for Cruise's safety protocols. The victim suffered severe injuries requiring extensive surgery.

San Francisco Superior Court 2024
Regulatory Penalty

CPUC v. Cruise (Regulatory Enforcement)

$1,500,000+
Consent Decree (2024)

CPUC fined Cruise and required compliance monitoring after investigation found the company failed to provide complete information about the pedestrian dragging incident. Cruise agreed to enhanced reporting requirements, third-party safety audits, and operational restrictions as conditions for eventual permit restoration.

California Public Utilities Commission 2024
Securities Fraud Class Action

Class Action: Cruise Investor Securities Fraud

Pending
Motion to Dismiss Pending

GM investors allege the company made materially false statements about Cruise's safety record and regulatory compliance, causing stock price decline when the pedestrian incident and video editing revelation emerged. Claims include violations of Securities Exchange Act Section 10(b).

S.D.N.Y. 2024

Other Cruise Incidents

| Date | Incident | Outcome |
|---|---|---|
| August 2023 | Cruise vehicle blocked fire truck responding to emergency | Fine, operational restrictions |
| September 2023 | Multiple Cruise vehicles congregated, blocking traffic for hours | Police intervention required |
| June 2023 | Cruise vehicle struck San Francisco bus | Minor injuries, investigation |
| April 2022 | Cruise vehicle pulled over by police, fled traffic stop | Software update required |

Waymo Litigation and Incidents

Waymo’s Regulatory Position

Waymo, Alphabet’s autonomous vehicle subsidiary, has taken a more conservative approach than Tesla or Cruise, operating true Level 4 robotaxis in geofenced areas (Phoenix, San Francisco, Los Angeles). While it has experienced fewer fatal incidents, Waymo still faces litigation and regulatory scrutiny.

Waymo Lawsuits

Personal Injury

Martinez v. Waymo (Pedestrian Injury)

Confidential Settlement
Settled (2023)

Pedestrian struck by Waymo robotaxi in Phoenix filed personal injury claim. Waymo argued the pedestrian entered the roadway unexpectedly; plaintiff alleged the vehicle failed to take evasive action a human driver would have attempted. Settled confidentially.

Maricopa County Superior Court 2023
Personal Injury / Product Liability

California Cyclist Collision Case

Pending
Discovery Ongoing

Cyclist injured when Waymo vehicle allegedly failed to yield during turn, striking the cyclist. Case involves questions about how Waymo AI interprets cyclist behavior and whether safety systems are adequate for urban cycling environments.

San Francisco Superior Court 2024

NHTSA Waymo Investigation

Preliminary Evaluation (PE 24-003)

NHTSA Investigation: Waymo Crash Patterns

Investigation
Open Investigation

NHTSA opened investigation into Waymo vehicles following 22 reported incidents, including 17 involving collisions. Investigation focuses on whether Waymo vehicles are properly detecting and responding to obstacles, particularly in complex traffic scenarios.

NHTSA 2024

Waymo Public Opposition

Waymo has faced organized community opposition in San Francisco, including:

  • Vehicles set on fire during protests
  • Cones placed on vehicle hoods to disable sensors
  • Petitions demanding operational restrictions
  • City supervisor resolution questioning expansion

Other Autonomous Vehicle Companies

Zoox (Amazon)

Preliminary Evaluation

NHTSA Investigation: Zoox Braking

Investigation
Open

NHTSA investigating two Zoox robotaxi incidents in which vehicles braked unexpectedly, causing rear-end collisions with motorcyclists. Both incidents resulted in minor injuries. Investigation examines whether Zoox's AI properly accounts for following vehicle behavior.

NHTSA 2024

Aurora Innovation

Intellectual Property / Trade Secret

Waymo v. Uber (AV Technology IP)

$245,000,000 (equity)
Settled (2018)

Background: Waymo sued Uber for misappropriating trade secrets via former Waymo engineer Anthony Levandowski. The case settled mid-trial, with Uber paying Waymo $245 million in equity and agreeing not to use Waymo technology. Uber later sold its self-driving unit to Aurora, creating ongoing IP complexity for Aurora.

N.D. California 2018

Ford/Argo AI

Status: Ford shut down Argo AI in October 2022, writing off its $2.7 billion investment. No major litigation pending, but demonstrates the financial risk of AV development.


NHTSA Regulatory Enforcement

Standing General Order (SGO) Reporting

In June 2021, NHTSA issued a Standing General Order requiring manufacturers of Level 2+ automation systems to report crashes within one day if they involve:

  • Fatality
  • Injury requiring hospitalization
  • Vehicle towing
  • Airbag deployment
  • Pedestrian/cyclist collision
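
The reporting trigger is disjunctive: any single listed condition starts the one-day clock. A hedged Python sketch of the rule (the field names are illustrative, not NHTSA's reporting schema):

```python
from dataclasses import dataclass

@dataclass
class CrashOutcome:
    """Outcomes of a crash involving a Level 2+ automation system."""
    fatality: bool = False
    hospitalization: bool = False       # injury requiring hospital treatment
    vehicle_towed: bool = False
    airbag_deployed: bool = False
    vulnerable_road_user: bool = False  # pedestrian or cyclist involved

def sgo_report_due_within_one_day(crash: CrashOutcome) -> bool:
    """True if any Standing General Order trigger applies."""
    return any([
        crash.fatality,
        crash.hospitalization,
        crash.vehicle_towed,
        crash.airbag_deployed,
        crash.vulnerable_road_user,
    ])
```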

Reported Crashes (as of December 2024):

| Manufacturer | Reported Crashes | Fatalities | Injuries |
|---|---|---|---|
| Tesla | 956 | 52+ | 200+ |
| Honda | 134 | 5 | 26 |
| Toyota | 52 | 3 | 12 |
| Ford | 35 | 1 | 7 |
| GM | 27 | 2 | 8 |
| Waymo | 22 | 0 | 2 |
| Cruise | 19 | 0 | 5 |
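
As rough arithmetic on the counts above, one can compute the share of each manufacturer's reported crashes that involved a fatality. This is illustrative only: SGO counts are not normalized for fleet size, miles driven, or operating conditions, so they do not support a safety ranking.

```python
# (reported crashes, fatalities) per manufacturer from the SGO table above,
# treating "52+"-style figures as lower bounds.
sgo_counts = {
    "Tesla": (956, 52),
    "Honda": (134, 5),
    "Toyota": (52, 3),
    "Ford": (35, 1),
    "GM": (27, 2),
    "Waymo": (22, 0),
    "Cruise": (19, 0),
}

# Print fatality share per reported crash, largest fleets first.
for maker, (crashes, fatalities) in sorted(
        sgo_counts.items(), key=lambda kv: kv[1][0], reverse=True):
    share = fatalities / crashes
    print(f"{maker:>7}: {fatalities:>3} fatalities / {crashes:>4} crashes = {share:.1%}")
```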

NHTSA Enforcement Actions

| Year | Action | Target | Result |
|---|---|---|---|
| 2024 | Investigation Upgrade | Tesla FSD | Engineering Analysis phase |
| 2024 | Recall Order | Tesla (2M vehicles) | Software update mandated |
| 2024 | Investigation | Cruise pedestrian incident | Criminal referral to DOJ |
| 2023 | Recall Order | Tesla (363K vehicles) | FSD software update |
| 2023 | Consent Order | Cruise | Enhanced reporting requirements |
| 2022 | Investigation | Tesla emergency vehicle crashes | Ongoing |

NHTSA Authority Gaps

NHTSA faces significant limitations in regulating autonomous vehicles:

No Pre-Market Approval: Unlike aircraft or pharmaceuticals, vehicles don’t require federal safety certification before sale. Manufacturers “self-certify” compliance with safety standards.

Outdated Standards: Federal Motor Vehicle Safety Standards (FMVSS) were written for human-driven vehicles. Standards for AI decision-making, cybersecurity, and software updates don’t exist.

Enforcement Delays: NHTSA investigations can take years. The agency has limited authority to order immediate operational suspensions.

Proposed Solutions:

  • AV START Act (pending): Would establish federal AV safety framework
  • NHTSA rulemaking on AV testing requirements (ongoing)
  • State-level operational permits (California, Arizona leading)

State-Level AV Regulation and Litigation

California: The Strictest Approach

California requires:

  • DMV permit for testing autonomous vehicles
  • Disengagement reporting (when human takes control)
  • Collision reporting within 10 days
  • Financial responsibility demonstrations
  • CPUC permit for passenger service

California AV Litigation:

Administrative Action

California DMV v. Cruise (Permit Revocation)

Permit Suspended
Suspension Upheld

California DMV suspended Cruise's autonomous vehicle permit following the pedestrian dragging incident, finding Cruise vehicles were 'not safe for the public's operation' and that the company 'misrepresented' information about the incident. Suspension remains in effect pending enhanced safety demonstrations.

California DMV 2023

Arizona: Permissive Regulation

Arizona allows AV testing with minimal requirements, making it a testing hub:

  • No permit required for testing
  • No disengagement reporting
  • Executive Order encouraging AV development
  • Limited liability framework

Arizona AV Incidents:

Criminal Prosecution

State v. Uber (Elaine Herzberg Death)

Criminal Charges Dropped
Charges Against Company Dropped; Safety Driver Convicted

First fatal autonomous vehicle crash involving a pedestrian. Uber's self-driving test vehicle struck and killed Elaine Herzberg in March 2018 while the safety driver was distracted. Prosecutors declined to charge Uber. Safety driver Rafaela Vasquez pleaded guilty to endangerment and received 3 years probation. Case highlighted gaps in criminal liability frameworks for AV incidents.

Maricopa County 2023

State-by-State AV Regulatory Comparison

| State | Permit Required | Reporting | Safety Driver | Liability Framework |
|---|---|---|---|---|
| California | Yes | Comprehensive | Required (most cases) | Manufacturer strict liability |
| Arizona | No | Minimal | Not required | Traditional negligence |
| Texas | No | Limited | Not required | AV-specific exemptions |
| Florida | No | Minimal | Not required | Owner/operator liability |
| Nevada | Yes | Moderate | Required initially | Manufacturer liability provisions |

Product Liability Theories for AV Cases

Design Defect Claims

Plaintiffs argue autonomous vehicles contain design defects when:

  • AI fails to detect obvious obstacles
  • Software makes decisions no reasonable human would make
  • Safety systems inadequate for foreseeable conditions
  • Marketing encourages over-reliance on imperfect automation

Risk-Utility Test: Most jurisdictions apply a risk-utility balancing test: does the danger of the design outweigh its benefits? AV manufacturers argue that aggregate safety improvements justify individual failures; plaintiffs counter that specific defects are unacceptable regardless of aggregate statistics.

Failure to Warn Claims

Warnings Issues:

  • Is “Full Self-Driving” an adequate warning that the car isn’t fully self-driving?
  • Do in-vehicle alerts sufficiently convey driver responsibility?
  • Should vehicles refuse to operate if drivers ignore warnings?

Manufacturing Defect Claims

Manufacturing defect claims (where an individual vehicle deviates from its intended design) are rare in AV cases because the AI software is identical across vehicles. However, sensor calibration issues or hardware variations may support such claims.

Negligence Claims

Beyond strict product liability, plaintiffs bring negligence claims alleging:

  • Negligent testing and development
  • Negligent marketing creating false safety impressions
  • Negligent failure to recall known defects
  • Negligent supervision of beta testing programs

Insurance and Autonomous Vehicles

Coverage Questions

AV crashes create complex insurance coverage questions:

Who Is Covered?

  • Traditional auto policies cover driver negligence
  • What if there’s no “driver”?
  • Product liability coverage shifts to manufacturers
  • Commercial policies for robotaxi operations

Emerging Insurance Products:

  • AV-specific liability policies
  • Manufacturer-backed insurance programs
  • Usage-based coverage for partial automation
  • Technology E&O for AV software

Insurance Litigation

Coverage Dispute

Progressive v. Tesla Owner (Autopilot Coverage Dispute)

Confidential
Settled

Insurance coverage dispute following Autopilot crash. Progressive argued policyholder voided coverage by using vehicle in manner not contemplated by standard auto policy. Case settled after Tesla provided data showing Autopilot was engaged. Highlights coverage ambiguity for ADAS-involved crashes.

Florida Circuit Court 2023

International AV Litigation

European Union

The EU has proposed comprehensive AV liability rules:

  • AI Act: AVs classified as “high-risk” AI systems requiring conformity assessment
  • Product Liability Directive (Proposed): Strict liability for AI-caused harm with reversed burden of proof
  • Motor Insurance Directive: Requires coverage for AV operation

United Kingdom

The UK Automated Vehicles Act (2024):

  • Establishes legal definition of “self-driving”
  • Creates authorized self-driving entity (ASDE) framework
  • Shifts liability to ASDE for driving decisions
  • Maintains human operator liability for handover failures

China

China has established AV testing zones with:

  • Government-supervised testing permits
  • Manufacturer liability for testing incidents
  • Social credit implications for safety violations
  • Insurance requirements for testing and deployment

The Future of AV Litigation

Expected Developments

| Area | 2025-2026 Predictions |
|---|---|
| Tesla Trials | First FSD wrongful death verdicts |
| Class Actions | Marketing fraud class settlements |
| Regulatory | NHTSA AV-specific safety standards |
| State Laws | Comprehensive liability frameworks in 5+ states |
| Criminal | More prosecutions involving AV deaths |
| Insurance | Standard AV coverage products emerge |

Unresolved Questions

Who Is Legally Responsible?

  • The vehicle owner?
  • The software developer?
  • The sensor/hardware manufacturer?
  • The mapping data provider?
  • The safety driver (if any)?

What Standard Applies?

  • Should AVs be safer than average human drivers?
  • Should AVs be safer than the best human drivers?
  • Should AVs be perfect?
  • How do we measure comparative safety?

How Should Damages Be Calculated?

  • Traditional wrongful death measures?
  • Higher damages for AV deaths (could have been prevented)?
  • Punitive damages for knowing deployment of unsafe AI?

Frequently Asked Questions

General AV Questions

Q: Are autonomous vehicles actually safer than human drivers?

A: The data is contested. Manufacturers claim better safety records, but critics note AVs primarily operate in favorable conditions (good weather, well-marked roads, limited geographic areas). When comparing equivalent conditions, the safety advantage is less clear. NHTSA data shows significant crash rates for Tesla Autopilot specifically.

Q: Who is liable when an autonomous vehicle causes a crash?

A: Liability depends on the automation level and circumstances. For Level 2 systems (Tesla Autopilot), courts generally hold drivers responsible but manufacturers may face product liability claims. For Level 4+ systems, liability increasingly shifts to manufacturers since there’s no human driver to blame.

Q: Can I sue Tesla if Autopilot causes a crash?

A: Yes. Multiple lawsuits have been filed against Tesla for Autopilot and FSD crashes. Claims typically include product liability (design defect, failure to warn) and sometimes fraud (misrepresentation of capabilities). Success depends on proving the defect and causation.

Legal Questions

Q: Does Tesla’s disclaimer protect them from lawsuits?

A: Tesla requires owners to accept terms acknowledging Autopilot and FSD require constant driver attention. However, these disclaimers may not shield Tesla from:

  • Product liability claims (can’t disclaim strict liability for defects)
  • Fraud claims (if marketing contradicts disclaimers)
  • Negligence claims for known defects

Q: What’s the difference between a recall and an investigation?

A: An investigation is NHTSA examining potential safety issues. A recall is a mandatory remedy ordered when NHTSA determines a safety defect exists. Investigations may or may not lead to recalls. Tesla has faced both: investigations remain open even as recalls have been ordered.

Q: Can the safety driver be criminally charged?

A: Yes. In the Uber/Elaine Herzberg case, safety driver Rafaela Vasquez was charged and convicted. Criminal liability for safety drivers depends on whether their inattention proximately caused the crash and state criminal law.


Resources and Further Reading

Key Cases

  • Huang v. Tesla, Fatal Autopilot crash settlement
  • Doe v. Cruise, Pedestrian dragging incident
  • State v. Vasquez, Safety driver criminal prosecution
  • In re Tesla Autopilot Marketing Litigation, Consumer fraud class action

Regulatory Resources

  • NHTSA Standing General Order Data (public database)
  • California DMV Autonomous Vehicle Reports
  • NHTSA Automated Vehicles Policy

Industry Resources

  • SAE J3016: Levels of Driving Automation
  • RAND Corporation AV Safety Studies
  • Insurance Institute for Highway Safety ADAS Ratings

This tracker is updated regularly as new incidents occur, cases are filed, and regulatory developments unfold. Last updated: January 2025.
