Autonomous vehicles aren’t science fiction anymore. They’re on highways in Georgia, testing routes in Atlanta, and showing up in traffic reports. However, when one crashes, the question of autonomous vehicle accident liability becomes complicated quickly.
In 2023, NBC News reported that vehicles equipped with automated driving systems were involved in nearly 400 crashes in just over a year. Most of those crashes occurred in real-world traffic, not controlled tests. That number isn’t small; it’s a warning sign that responsibility in these cases isn’t straightforward.
Who’s at fault? The driver behind the wheel? The manufacturer that built the self-driving system? Or maybe the company that supplied the software update? These are not theoretical debates. They decide who pays medical bills, lost wages, and damages after a collision.
If you or someone you care about has been involved in an accident with an autonomous vehicle, you need answers, not confusion. Understanding how liability works is the first step in protecting your rights.
Autonomous Vehicle Accident Liability for Drivers and Operators
Even if your car is in autonomous mode, you aren’t entirely off the hook. Most states treat the person in the driver’s seat as the operator, even if the vehicle is controlling speed, braking, and steering. That means you can still be found partially liable.
Courts are increasingly asking: Did you pay attention? Did you use the system as directed? Did you step in when the car required human oversight? In multiple Tesla crash investigations, regulators have highlighted that drivers had time to intervene but failed to do so. In Georgia, where distracted driving laws are strict, failure to stay alert could weigh heavily against the operator.
Practical takeaway: If you use an autonomous system, you must remain alert and ready to take control. Failing to fulfill this duty can shift liability back onto you, even if the technology failed first.
When Tech Companies and Manufacturers Are At Fault
Automakers and tech firms face liability when defective design or negligent programming leads to harm. This falls under product liability law, a long-standing legal principle applied to everything from faulty brakes to dangerous airbags.
Tesla has been at the center of multiple lawsuits over Autopilot crashes. In one Florida case decided in 2025, a jury awarded $243 million to the family of a man killed when his Tesla on Autopilot failed to avoid a truck. The jury found that Tesla misled consumers about the system’s capabilities and failed to implement adequate safeguards.
Waymo, Cruise, and other self-driving firms have also faced regulatory probes. In 2023, General Motors’ Cruise suspended operations after a pedestrian was dragged by one of its robotaxis. The company recalled nearly 1,000 vehicles following the incident. These examples show how companies can face liability for deploying tech that isn’t ready for all road scenarios.
For victims, this is crucial: if the system itself was flawed, responsibility may fall squarely on the manufacturer rather than the human driver.
Insurance Complexities in Autonomous Vehicle Crashes
Traditional auto insurance assumes driver fault. Autonomous vehicles complicate this. When technology takes over, insurers may argue the driver isn’t liable at all, or that coverage shifts to the manufacturer or even a third-party software vendor.
Some states are exploring new insurance models. In 2022, California began requiring autonomous vehicle operators to carry at least $5 million in liability insurance for testing fleets. Georgia has not yet adopted specific AV insurance laws, but insurers are closely monitoring crash reports.
For you, the challenge is simple: Which policy applies? Your personal coverage? The automaker’s liability coverage? Or a hybrid of both? Without clear answers, victims often face delays in receiving compensation. That’s where legal help becomes critical. Attorneys can demand data logs, review system performance, and pinpoint which insurer should pay.
Regulatory Rules Shaping Who Pays
Federal oversight is beginning to catch up with the realities of autonomous vehicles. Since June 2021, the National Highway Traffic Safety Administration (NHTSA) has required automakers and operators to report crashes involving vehicles equipped with Level 2 advanced driver assistance systems or higher within 24 hours if a death, injury, or tow-away occurs.
That mandate exposed just how often these crashes happen. In the first year alone, nearly 400 incidents were reported, with the majority concentrated in California and Texas, where autonomous fleets are most active. These reports gave regulators their first clear view of how frequently automated systems fail in real-world traffic.
Federal Reporting and Recalls
Beyond reporting, federal regulators have also pressured companies to issue recalls when software defects are found. Tesla, Cruise, and Waymo have each initiated voluntary recalls tied to system performance, often prompted by NHTSA investigations. While recalls may protect the public, they also serve as evidence that companies knew about flaws in their systems. For victims, a recall history can strengthen claims of corporate liability.
At the state level, rules vary widely. Some states, like California, require companies testing self-driving cars to maintain permits, submit annual disengagement reports, and carry high liability insurance coverage.
Others, such as Arizona and Texas, take a more permissive approach, limiting regulatory burdens to encourage innovation. Georgia, where The Roth Firm represents clients, has enacted legislation that defines a “driver” as the human in physical control of a vehicle.
But the law has not yet been updated to address situations where a car is in full autonomous mode with no driver intervention. That gap creates uncertainty. In a courtroom, lawyers must argue whether liability belongs to the human passenger, the manufacturer, or the software provider.
The trend is clear: regulation is evolving, but unevenly. Federal rules aim at transparency and safety oversight, while states experiment with their own liability structures. For someone injured in a crash, this patchwork matters.
If the company operating or manufacturing the vehicle failed to comply with federal crash-reporting rules, ignored safety guidance, or violated state insurance mandates, those violations can directly support a victim’s case.
Recent Legal Cases That Reshape Liability
Case law is moving fast. A few standouts illustrate how courts are thinking about autonomous vehicle crashes:
- Tesla Autopilot verdict (Florida, 2025): Jury awarded $243 million after a Tesla driver was killed in Key Largo. The case hinged on whether Autopilot was misrepresented as safer than it was. The verdict shows courts will hold companies liable for overstating safety claims.
- Cruise robotaxi pedestrian case (San Francisco, 2023): A pedestrian was dragged 20 feet by a Cruise self-driving taxi. Regulators forced the company to suspend operations. Lawsuits are ongoing, but the recall of 950 vehicles shows companies may act swiftly to avoid broader liability.
- Waymo investigation closure (2025): NHTSA ended its probe into Waymo after software updates addressed 22 reports of unexpected behavior. No penalties were issued. The case shows that proactive recalls may reduce liability exposure for tech companies.
Each case adds a piece to the puzzle. Together, they reveal a legal system adapting quickly to the rise of self-driving cars.
Steps You Should Take If Involved in an AV Accident
- Call law enforcement immediately: An official crash report is vital for liability disputes.
- Document everything: Take photos of the scene, vehicles, injuries, and road conditions.
- Request vehicle data: Autonomous vehicles generate logs that can show whether the system malfunctioned.
- Seek medical care: Even minor injuries matter in personal injury claims.
- Contact an attorney quickly: Autonomous vehicle liability cases require technical and legal expertise. Acting fast protects your right to evidence before companies attempt to shield it.
Don’t Let The Car Drive Your Case
Autonomous vehicles promise convenience, but when accidents happen, liability is anything but simple. You may face finger-pointing between drivers, insurers, and tech companies.
If you or someone you love has been injured in a crash involving an autonomous vehicle, you need more than answers; you need representation. The Roth Firm helps victims hold both negligent drivers and billion-dollar corporations accountable.
Your case matters. Don’t let complexity stand in the way of justice. Contact us today for a free consultation.