The Autonomous Vehicle Liability Crisis#
Self-driving cars were promised to eliminate human error and make roads safer. Instead, they have created a complex liability landscape in which crashes, injuries, and deaths have triggered hundreds of lawsuits, mounting regulatory penalties, and fundamental questions about who bears responsibility when AI-controlled vehicles cause harm.
From Tesla’s Autopilot and “Full Self-Driving” systems, involved in dozens of fatal crashes, to Cruise’s dramatic implosion after pedestrian incidents, autonomous vehicle litigation represents the cutting edge of AI product liability law. These cases will establish precedents governing AI responsibility for decades to come.
- 956 crashes involving Tesla Autopilot investigated by NHTSA (as of 2024)
- 52+ fatalities linked to Tesla Autopilot/FSD systems
- $1.5 billion value of Cruise vehicles recalled after the pedestrian incident
- 17 ongoing NHTSA investigations into autonomous vehicle systems
- $200 million+ in active AV litigation against major manufacturers
Understanding Autonomous Vehicle Technology Levels#
SAE Automation Levels#
The Society of Automotive Engineers defines six levels of vehicle automation (modeled in the sketch after this table):
| Level | Name | Description | Human Role |
|---|---|---|---|
| 0 | No Automation | Human performs all driving tasks | Full control |
| 1 | Driver Assistance | Vehicle assists with steering OR acceleration | Constant supervision |
| 2 | Partial Automation | Vehicle controls steering AND acceleration | Constant supervision |
| 3 | Conditional Automation | Vehicle performs all driving tasks in limited conditions | Ready to intervene |
| 4 | High Automation | Vehicle performs all tasks without human intervention in defined areas | Optional control |
| 5 | Full Automation | Vehicle performs all tasks everywhere | No control needed |
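Much of the liability analysis in the cases below turns on which side of the Level 2/Level 3 boundary a system falls, so the taxonomy is worth making precise. A minimal Python sketch (the enum and helper names are illustrative, not part of the SAE standard):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels (simplified)."""
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # steering OR acceleration assistance
    PARTIAL_AUTOMATION = 2      # steering AND acceleration; human supervises
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; human takes over on request
    HIGH_AUTOMATION = 4         # no human needed within a defined operating area
    FULL_AUTOMATION = 5         # no human needed anywhere

def requires_constant_supervision(level: SAELevel) -> bool:
    """Levels 0-2 leave the human fully responsible for monitoring the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION

# Tesla Autopilot and FSD remain Level 2 despite the "Full Self-Driving" name.
assert requires_constant_supervision(SAELevel.PARTIAL_AUTOMATION)
assert not requires_constant_supervision(SAELevel.HIGH_AUTOMATION)
```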
Current Market Status#
Level 2 (Mass Market):
- Tesla Autopilot / FSD (despite “Full Self-Driving” name)
- GM Super Cruise
- Ford BlueCruise
- Mercedes-Benz Drive Pilot (Level 3 certified in limited areas)
Level 4 (Limited Deployment):
- Waymo (robotaxi service)
- Cruise (suspended operations)
- Zoox (testing phase)
- Aurora (trucking focus)
Level 5: No commercially available vehicles have achieved Level 5 automation.
Tesla Autopilot and FSD Litigation#
The Tesla Safety Record#
Tesla’s Autopilot and “Full Self-Driving” (FSD) systems have been implicated in hundreds of crashes, including over 50 fatalities. NHTSA has opened multiple investigations and ordered recalls, yet Tesla continues expanding these features to millions of vehicles.
Key Issues:
- Marketing “Full Self-Driving” for a Level 2 system requiring constant supervision
- Alleged defects causing phantom braking, failure to detect obstacles, and unintended acceleration
- Driver monitoring systems insufficient to ensure attention
- Over-reliance fostered by automation name and marketing claims
Fatal Tesla Autopilot Cases#
Huang v. Tesla (Apple Engineer Death)
The family of Apple engineer Walter Huang, killed when his Tesla Model X in Autopilot mode struck a highway barrier in 2018, sued Tesla for wrongful death. The vehicle failed to detect the concrete barrier and accelerated into it. Tesla argued Huang ignored warnings to keep his hands on the wheel. The case settled confidentially after extensive discovery into Autopilot defects.
Banner v. Tesla (Florida Fatal Crash)
The family of Jeremy Banner, killed when his Tesla Model 3 on Autopilot drove under a semi-truck in 2019, filed a wrongful death suit. The vehicle failed to detect the truck crossing the highway, a scenario similar to the first fatal Autopilot crash in 2016. Tesla faces claims of design defect and failure to warn.
People v. Riad (Gardena Fatal Crash)
The first criminal prosecution involving Tesla Autopilot. Driver Kevin Aziz Riad was charged with two counts of vehicular manslaughter after his Tesla on Autopilot ran a red light and killed two people in 2019. Tesla is not a defendant, but the case raises questions about criminal liability for AV-involved deaths.
Rossiter v. Tesla (FSD Beta Crash)
Plaintiff injured when a Tesla using FSD Beta suddenly swerved into oncoming traffic. Lawsuit alleges Tesla knew FSD Beta was defective and unsafe for public roads but released it anyway to thousands of untrained 'beta testers.' Claims include strict liability and negligent design.
Tesla Class Actions#
In re Tesla Autopilot Marketing Litigation
Consolidated class action alleging Tesla defrauded consumers by marketing 'Full Self-Driving' capability since 2016 while knowing the technology could not deliver fully autonomous driving. Plaintiffs claim they paid $5,000-$15,000 for features that don't work as advertised. Tesla argues FSD is still in development and customers received value from existing features.
Briggs v. Tesla (Phantom Braking)
Class action certified for Tesla owners experiencing 'phantom braking': sudden, unexpected braking without any visible obstacle. Plaintiffs allege the defect creates dangerous conditions and that Tesla has failed to remedy it despite thousands of complaints. The class includes Model 3 and Model Y owners from 2021-2023.
NHTSA Tesla Investigations#
| Investigation | Subject | Status | Vehicles Affected |
|---|---|---|---|
| PE 22-020 | Autopilot crashes with emergency vehicles | Open | 765,000+ |
| PE 23-001 | Phantom braking | Open | 416,000+ |
| EA 23-003 | FSD Beta sudden steering | Open | 362,000+ |
| PE 21-020 | Autopilot crash data collection | Open | 765,000+ |
| Recall 23V-838 | Autopilot driver-engagement controls inadequate | Completed | 2,000,000+ |
Tesla Recall Orders#
NHTSA Recall: Tesla Autopilot Software (December 2023)
NHTSA pressed Tesla into its largest-ever recall, covering over 2 million vehicles, after finding that Autopilot's driver-engagement controls allowed drivers to become inattentive. (A separate February 2023 recall of roughly 363,000 vehicles addressed FSD Beta behavior at intersections, including rolling through stop signs.) Tesla issued an over-the-air software update, but NHTSA continues monitoring for compliance.
NHTSA Recall Query: Tesla Autopilot Driver Monitoring (April 2024)
NHTSA found that crashes continued after the December 2023 remedy and opened a recall query into whether Tesla's Autopilot driver monitoring actually ensures driver attention. The inquiry examines whether the earlier recall remedy was sufficient and whether additional software updates and monitoring improvements are required.
Cruise: The Rise and Fall#
The October 2023 Pedestrian Incident#
Cruise, GM's autonomous vehicle subsidiary, experienced a catastrophic collapse following an October 2023 incident in which a Cruise robotaxi dragged a pedestrian 20 feet after she was struck by another vehicle and thrown into the robotaxi's path.
What Happened:
- A hit-and-run driver struck a pedestrian, throwing her into the Cruise vehicle’s lane
- The Cruise robotaxi struck the pedestrian and stopped
- Failing to detect that the pedestrian was trapped beneath it, the vehicle's software initiated a "pull over" maneuver to clear the roadway
- The vehicle dragged the pedestrian 20 feet before stopping
- Cruise initially showed regulators an edited video that omitted the dragging
Aftermath:
- California DMV revoked Cruise’s self-driving permit
- NHTSA opened formal investigation
- GM paused Cruise operations nationwide
- CEO Kyle Vogt resigned
- 24% of Cruise workforce laid off
- Criminal investigation by San Francisco DA
Cruise Litigation#
Doe v. Cruise (Pedestrian Dragging)
The pedestrian injured in the October 2023 dragging incident filed suit against Cruise, GM, and the hit-and-run driver. Claims include product liability for the robotaxi's decision to move while a person was beneath it, and negligence for Cruise's safety protocols. The victim suffered severe injuries requiring extensive surgery.
CPUC v. Cruise (Regulatory Enforcement)
CPUC fined Cruise and required compliance monitoring after investigation found the company failed to provide complete information about the pedestrian dragging incident. Cruise agreed to enhanced reporting requirements, third-party safety audits, and operational restrictions as conditions for eventual permit restoration.
Class Action: Cruise Investor Securities Fraud
GM investors allege the company made materially false statements about Cruise's safety record and regulatory compliance, causing a stock price decline when the pedestrian incident and the video editing came to light. Claims include violations of Securities Exchange Act Section 10(b).
Other Cruise Incidents#
| Date | Incident | Outcome |
|---|---|---|
| August 2023 | Cruise vehicle blocked fire truck responding to emergency | Fine, operational restrictions |
| September 2023 | Multiple Cruise vehicles congregated, blocking traffic for hours | Police intervention required |
| June 2023 | Cruise vehicle struck San Francisco bus | Minor injuries, investigation |
| April 2022 | Cruise vehicle pulled over by police, fled traffic stop | Software update required |
Waymo Litigation and Incidents#
Waymo’s Regulatory Position#
Waymo, Alphabet's autonomous vehicle subsidiary, has taken a more conservative approach than Tesla or Cruise, operating true Level 4 robotaxis in geofenced areas (Phoenix, San Francisco, Los Angeles). While it has experienced fewer fatal incidents, Waymo still faces litigation and regulatory scrutiny.
Waymo Lawsuits#
Martinez v. Waymo (Pedestrian Injury)
A pedestrian struck by a Waymo robotaxi in Phoenix filed a personal injury claim. Waymo argued the pedestrian entered the roadway unexpectedly; the plaintiff alleged the vehicle failed to take evasive action a human driver would have attempted. Settled confidentially.
California Cyclist Collision Case
Cyclist injured when Waymo vehicle allegedly failed to yield during turn, striking the cyclist. Case involves questions about how Waymo AI interprets cyclist behavior and whether safety systems are adequate for urban cycling environments.
NHTSA Waymo Investigation#
NHTSA Investigation: Waymo Crash Patterns
NHTSA opened investigation into Waymo vehicles following 22 reported incidents, including 17 involving collisions. Investigation focuses on whether Waymo vehicles are properly detecting and responding to obstacles, particularly in complex traffic scenarios.
Waymo Public Opposition#
Waymo has faced organized community opposition in San Francisco, including:
- Vehicles set on fire during protests
- Cones placed on vehicle hoods to disable sensors
- Petitions demanding operational restrictions
- City supervisor resolution questioning expansion
Other Autonomous Vehicle Companies#
Zoox (Amazon)#
NHTSA Investigation: Zoox Braking
NHTSA is investigating two Zoox robotaxi incidents in which the vehicles braked unexpectedly and were rear-ended by motorcyclists. Both incidents resulted in minor injuries. The investigation examines whether Zoox's AI properly accounts for the behavior of following vehicles.
Aurora Innovation#
Waymo v. Uber (AV Technology IP)
Background: Waymo sued Uber for stealing trade secrets via former engineer Anthony Levandowski. The case settled with Uber paying Waymo $245 million in equity and agreeing not to use Waymo technology. Uber later sold its AV division to Aurora, creating ongoing IP complexity.
Ford/Argo AI#
Status: Ford shut down Argo AI in October 2022, writing off its $2.7 billion investment. No major litigation pending, but demonstrates the financial risk of AV development.
NHTSA Regulatory Enforcement#
Standing General Order (SGO) Reporting#
In June 2021, NHTSA issued a Standing General Order requiring manufacturers of Level 2+ automation systems to report crashes within one day if they involve any of the following (a reportability check is sketched after this list):
- Fatality
- Injury requiring hospitalization
- Vehicle towing
- Airbag deployment
- Pedestrian/cyclist collision
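The SGO's trigger conditions form a simple disjunction: if any one criterion is met, the one-day reporting clock starts. A minimal sketch with hypothetical field names (the actual SGO reporting form defines its own schema):

```python
from dataclasses import dataclass

@dataclass
class CrashReport:
    # Hypothetical fields; the real SGO form defines its own schema.
    fatality: bool = False
    hospitalization: bool = False
    vehicle_towed: bool = False
    airbag_deployed: bool = False
    vulnerable_road_user: bool = False  # pedestrian or cyclist involved

def sgo_one_day_report_required(crash: CrashReport) -> bool:
    """True if any SGO trigger applies, starting the one-day reporting clock."""
    return any([
        crash.fatality,
        crash.hospitalization,
        crash.vehicle_towed,
        crash.airbag_deployed,
        crash.vulnerable_road_user,
    ])

# A crash with an airbag deployment but no injuries is still reportable.
assert sgo_one_day_report_required(CrashReport(airbag_deployed=True))
```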
Reported Crashes (as of December 2024):
| Manufacturer | Reported Crashes | Fatalities | Injuries |
|---|---|---|---|
| Tesla | 956 | 52+ | 200+ |
| Honda | 134 | 5 | 26 |
| Toyota | 52 | 3 | 12 |
| Ford | 35 | 1 | 7 |
| GM | 27 | 2 | 8 |
| Waymo | 22 | 0 | 2 |
| Cruise | 19 | 0 | 5 |
NHTSA Enforcement Actions#
| Year | Action | Target | Result |
|---|---|---|---|
| 2024 | Investigation Upgrade | Tesla FSD | Engineering Analysis phase |
| 2024 | Recall Order | Tesla (2M vehicles) | Software update mandated |
| 2024 | Investigation | Cruise pedestrian incident | Criminal referral to DOJ |
| 2023 | Recall Order | Tesla (363K vehicles) | FSD software update |
| 2023 | Consent Order | Cruise | Enhanced reporting requirements |
| 2022 | Investigation | Tesla emergency vehicle crashes | Ongoing |
NHTSA Authority Gaps#
NHTSA faces significant limitations in regulating autonomous vehicles:
No Pre-Market Approval: Unlike aircraft or pharmaceuticals, vehicles don’t require federal safety certification before sale. Manufacturers “self-certify” compliance with safety standards.
Outdated Standards: Federal Motor Vehicle Safety Standards (FMVSS) were written for human-driven vehicles. Standards for AI decision-making, cybersecurity, and software updates don’t exist.
Enforcement Delays: NHTSA investigations can take years. The agency has limited authority to order immediate operational suspensions.
Proposed Solutions:
- AV START Act (pending): Would establish federal AV safety framework
- NHTSA rulemaking on AV testing requirements (ongoing)
- State-level operational permits (California, Arizona leading)
State-Level AV Regulation and Litigation#
California: The Strictest Approach#
California requires:
- DMV permit for testing autonomous vehicles
- Disengagement reporting (when a human takes control)
- Collision reporting within 10 days
- Financial responsibility demonstrations
- CPUC permit for passenger service
California AV Litigation:
California DMV v. Cruise (Permit Revocation)
California DMV suspended Cruise's autonomous vehicle permit following the pedestrian dragging incident, finding Cruise vehicles were 'not safe for the public's operation' and that the company 'misrepresented' information about the incident. Suspension remains in effect pending enhanced safety demonstrations.
Arizona: Permissive Regulation#
Arizona allows AV testing with minimal requirements, making it a testing hub:
- No permit required for testing
- No disengagement reporting
- Executive Order encouraging AV development
- Limited liability framework
Arizona AV Incidents:
State v. Uber (Elaine Herzberg Death)
The first fatal autonomous vehicle crash involving a pedestrian. Uber's self-driving test vehicle struck and killed Elaine Herzberg in March 2018 while the safety driver was distracted. Prosecutors declined to charge Uber. Safety driver Rafaela Vasquez pleaded guilty to endangerment and received three years of probation. The case highlighted gaps in criminal liability frameworks for AV incidents.
State-by-State AV Regulatory Comparison#
| State | Permit Required | Reporting | Safety Driver | Liability Framework |
|---|---|---|---|---|
| California | Yes | Comprehensive | Required (most cases) | Manufacturer strict liability |
| Arizona | No | Minimal | Not required | Traditional negligence |
| Texas | No | Limited | Not required | AV-specific exemptions |
| Florida | No | Minimal | Not required | Owner/operator liability |
| Nevada | Yes | Moderate | Required initially | Manufacturer liability provisions |
Product Liability Theories for AV Cases#
Design Defect Claims#
Plaintiffs argue autonomous vehicles contain design defects when:
- AI fails to detect obvious obstacles
- Software makes decisions no reasonable human would make
- Safety systems inadequate for foreseeable conditions
- Marketing encourages over-reliance on imperfect automation
Risk-Utility Test: Most jurisdictions apply risk-utility balancing, asking whether the danger of the design outweighs its benefits. AV manufacturers argue that overall safety improvements justify individual failures; plaintiffs argue that specific defects are unacceptable regardless of aggregate statistics.
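In stylized form, risk-utility balancing resembles a cost-benefit inequality. A toy Python sketch (illustrative only; courts do not reduce the test to arithmetic, and factors such as consumer expectations and the feasibility of alternative designs also enter):

```python
def fails_risk_utility(
    expected_harm_current: float,      # expected injury cost under the challenged design
    expected_harm_alternative: float,  # expected injury cost under a reasonable alternative
    alternative_cost: float,           # added cost of adopting the alternative
) -> bool:
    """Toy balancing: the design is defective if a reasonable alternative
    would reduce expected harm by more than it costs to adopt."""
    harm_reduction = expected_harm_current - expected_harm_alternative
    return harm_reduction > alternative_cost

# Example: a $200/vehicle sensor upgrade that removes $900/vehicle of
# expected crash harm would make omitting the upgrade hard to defend.
assert fails_risk_utility(1000.0, 100.0, 200.0)
```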
Failure to Warn Claims#
Warnings Issues:
- Is “Full Self-Driving” an adequate warning that the car isn’t fully self-driving?
- Do in-vehicle alerts sufficiently convey driver responsibility?
- Should vehicles refuse to operate if drivers ignore warnings?
Manufacturing Defect Claims#
Manufacturing defect claims (where an individual vehicle deviates from its intended design) are rare in AV cases because the software is identical across vehicles. However, sensor calibration issues or hardware variations may support such claims.
Negligence Claims#
Beyond strict product liability, plaintiffs bring negligence claims alleging:
- Negligent testing and development
- Negligent marketing creating false safety impressions
- Negligent failure to recall known defects
- Negligent supervision of beta testing programs
Insurance and Autonomous Vehicles#
Coverage Questions#
AV crashes create complex insurance coverage questions:
Who Is Covered?
- Traditional auto policies cover driver negligence
- What if there’s no “driver”?
- Product liability coverage shifts to manufacturers
- Commercial policies for robotaxi operations
Emerging Insurance Products:
- AV-specific liability policies
- Manufacturer-backed insurance programs
- Usage-based coverage for partial automation
- Technology E&O for AV software
Insurance Litigation#
Progressive v. Tesla Owner (Autopilot Coverage Dispute)
An insurance coverage dispute following an Autopilot crash. Progressive argued the policyholder voided coverage by using the vehicle in a manner not contemplated by a standard auto policy. The case settled after Tesla provided data showing Autopilot was engaged. It highlights coverage ambiguity for ADAS-involved crashes.
International AV Litigation#
European Union#
The EU has proposed comprehensive AV liability rules:
- AI Act: AVs classified as “high-risk” AI systems requiring conformity assessment
- Product Liability Directive (Proposed): Strict liability for AI-caused harm with reversed burden of proof
- Motor Insurance Directive: Requires coverage for AV operation
United Kingdom#
The UK Automated Vehicles Act (2024):
- Establishes legal definition of “self-driving”
- Creates authorized self-driving entity (ASDE) framework
- Shifts liability to ASDE for driving decisions
- Maintains human operator liability for handover failures
China#
China has established AV testing zones with:
- Government-supervised testing permits
- Manufacturer liability for testing incidents
- Social credit implications for safety violations
- Insurance requirements for testing and deployment
The Future of AV Litigation#
Expected Developments#
| Area | 2025-2026 Predictions |
|---|---|
| Tesla Trials | First FSD wrongful death verdicts |
| Class Actions | Marketing fraud class settlements |
| Regulatory | NHTSA AV-specific safety standards |
| State Laws | Comprehensive liability frameworks in 5+ states |
| Criminal | More prosecutions involving AV deaths |
| Insurance | Standard AV coverage products emerge |
Unresolved Questions#
Who Is Legally Responsible?
- The vehicle owner?
- The software developer?
- The sensor/hardware manufacturer?
- The mapping data provider?
- The safety driver (if any)?
What Standard Applies?
- Should AVs be safer than average human drivers?
- Should AVs be safer than the best human drivers?
- Should AVs be perfect?
- How do we measure comparative safety?
How Should Damages Be Calculated?
- Traditional wrongful death measures?
- Higher damages for AV deaths (could have been prevented)?
- Punitive damages for knowing deployment of unsafe AI?
Frequently Asked Questions#
General AV Questions#
Q: Are autonomous vehicles actually safer than human drivers?
A: The data is contested. Manufacturers claim better safety records, but critics note AVs primarily operate in favorable conditions (good weather, well-marked roads, limited geographic areas). When comparing equivalent conditions, the safety advantage is less clear. NHTSA data shows significant crash rates for Tesla Autopilot specifically.
Q: Who is liable when an autonomous vehicle causes a crash?
A: Liability depends on the automation level and circumstances. For Level 2 systems (Tesla Autopilot), courts generally hold drivers responsible but manufacturers may face product liability claims. For Level 4+ systems, liability increasingly shifts to manufacturers since there’s no human driver to blame.
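The general pattern in the answer above can be stated as a lookup. An illustrative mapping (a broad generalization for orientation, not legal advice; the names are hypothetical and every real case turns on its facts):

```python
# Presumptive focus of liability by SAE level (illustrative generalization).
PRESUMPTIVE_LIABILITY = {
    0: "driver",
    1: "driver",
    2: "driver (with manufacturer exposure via product liability claims)",
    3: "contested: handover disputes between driver and manufacturer",
    4: "manufacturer / operator",
    5: "manufacturer / operator",
}

def presumptive_focus(sae_level: int) -> str:
    """Return the presumptive liability focus for a given SAE level."""
    return PRESUMPTIVE_LIABILITY[sae_level]

print(presumptive_focus(2))  # the Tesla Autopilot scenario
```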
Q: Can I sue Tesla if Autopilot causes a crash?
A: Yes. Multiple lawsuits have been filed against Tesla for Autopilot and FSD crashes. Claims typically include product liability (design defect, failure to warn) and sometimes fraud (misrepresentation of capabilities). Success depends on proving the defect and causation.
Legal Questions#
Q: Does Tesla’s disclaimer protect them from lawsuits?
A: Tesla requires owners to accept terms acknowledging Autopilot and FSD require constant driver attention. However, these disclaimers may not shield Tesla from:
- Product liability claims (can’t disclaim strict liability for defects)
- Fraud claims (if marketing contradicts disclaimers)
- Negligence claims for known defects
Q: What’s the difference between a recall and an investigation?
A: An investigation is NHTSA examining a potential safety issue. A recall is a mandatory remedy ordered when NHTSA determines a safety defect exists. Investigations may or may not lead to recalls. Tesla has faced both: investigations remain open while recalls have been ordered.
Q: Can the safety driver be criminally charged?
A: Yes. In the Uber/Elaine Herzberg case, safety driver Rafaela Vasquez was charged and convicted. Criminal liability for safety drivers depends on state criminal law and on whether their inattention proximately caused the crash.
Resources and Further Reading#
Key Cases#
- Huang v. Tesla, Fatal Autopilot crash settlement
- Doe v. Cruise, Pedestrian dragging incident
- State v. Vasquez, Safety driver criminal prosecution
- In re Tesla Autopilot Marketing Litigation, Consumer fraud class action
Regulatory Resources#
- NHTSA Standing General Order Data (public database; see the sketch after this list)
- California DMV Autonomous Vehicle Reports
- NHTSA Automated Vehicles Policy
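For readers who want to work with the numbers directly, NHTSA publishes SGO incident reports as downloadable CSV files. A minimal sketch, assuming a locally saved copy and an illustrative column name (consult the actual SGO data dictionary before relying on any field):

```python
import pandas as pd

# Filename and column name are hypothetical; NHTSA's actual CSV schema
# is documented in the SGO data dictionary on nhtsa.gov.
df = pd.read_csv("sgo_incident_reports.csv")

# Count reported crashes per manufacturer, mirroring the table in this tracker.
crashes_by_maker = (
    df.groupby("Reporting Entity")
      .size()
      .sort_values(ascending=False)
)
print(crashes_by_maker.head(10))
```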
Industry Resources#
- SAE J3016: Levels of Driving Automation
- RAND Corporation AV Safety Studies
- Insurance Institute for Highway Safety ADAS Ratings
This tracker is updated regularly as new incidents occur, cases are filed, and regulatory developments unfold. Last updated: January 2025.