I’ve been thinking a lot about autonomous vehicles as I prepare for next Wednesday’s Ride AI Summit in Los Angeles.
I’ll be moderating two panels. One features three companies—Waabi, Bot.auto, and Torc—that are working to automate long-haul trucking. The other features Nuro and Wayve, two of the leading companies developing next-generation driver assistance systems for customer-owned cars.
We still have some tickets available, so if you are involved—or interested—in the AV industry, you’ll want to join us in Los Angeles.
The first-ever fatal crash involving a fully driverless vehicle occurred in San Francisco on January 19. The driverless vehicle belonged to Waymo, but the crash was not Waymo’s fault.
Here’s what happened: a Waymo with no driver or passengers stopped for a red light. Another car stopped behind the Waymo. Then, according to Waymo, a human-driven SUV rear-ended the other vehicles at high speed, causing a six-car pileup that killed one person and injured five others. Someone’s dog also died in the crash.
Another major Waymo crash occurred in October in San Francisco. Once again, a driverless Waymo was stopped for a red light. According to Waymo, a vehicle traveling in the opposite direction crossed the double yellow line and crashed into an SUV that was stopped to the Waymo’s left. The force of the impact shoved the SUV into the Waymo. One person was seriously injured.
These two incidents produced worse injuries than any other Waymo crash in the last nine months. But in other respects they were typical Waymo crashes. Most Waymo crashes involve a Waymo vehicle scrupulously following the rules while a human driver flouts them: speeding, running red lights, careening out of their lanes, and so forth.
Waymo’s service will only grow in the coming months and years. So Waymo will inevitably be involved in more crashes—including some crashes that cause serious injuries and even death.
But as this happens, it’s crucial to keep the denominator in mind. Since 2020, Waymo has reported roughly 60 crashes serious enough to trigger an airbag or cause an injury. But those crashes occurred over more than 50 million miles of driverless operations. If you randomly selected 50 million miles of human driving—that’s roughly 70 lifetimes behind the wheel—you would likely see far more serious crashes than Waymo has experienced to date.
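The “70 lifetimes” figure is a back-of-envelope division; a quick sketch (assuming roughly 700,000 miles driven over a typical lifetime, a common ballpark estimate, not a figure from this article) lands in the same place:

```python
# Back-of-envelope: how many driving "lifetimes" fit into Waymo's mileage?
waymo_miles = 50_000_000      # driverless miles reported since 2020
lifetime_miles = 700_000      # assumed lifetime mileage per driver (rough ballpark)

lifetimes = waymo_miles / lifetime_miles
print(f"about {lifetimes:.0f} driving lifetimes")  # about 71 driving lifetimes
```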
Federal regulations require Waymo to report all significant crashes, whether or not the Waymo vehicle was at fault—indeed, whether or not the Waymo is even moving at the time of the crash. I’ve spent the last few days poring over Waymo’s crash reports from the last nine months. Let’s dig in.
Last September, I analyzed Waymo crashes through June 2024. So this section will focus on crashes between July 2024 and February 2025. During that period, Waymo reported 38 crashes that were serious enough to either cause an (alleged) injury or an airbag deployment.
In my view, only one of these crashes was clearly Waymo’s fault. Waymo may have been responsible for three other crashes—there wasn’t enough information to say for certain. The remaining 34 crashes seemed to be mostly or entirely the fault of others:
- The two serious crashes I mentioned at the start of this article are among 16 crashes where another vehicle crashed into a stationary Waymo (or caused a multi-car pileup involving a stationary Waymo). This included ten rear-end crashes, three side-swipe crashes, and three c
19 Comments
doctorpangloss
Another point of view is, if CVC violations were enforced against human drivers as robustly as they are against Waymos, and if human drivers were held to the same standards of liability as Waymos, human drivers in California would be way safer too. To me, the overall safety of all the driverless programs should be interpreted as a huge victory for regulators.
ikerino
> Using human crash data, Waymo estimated that human drivers on the same roads would get into 78 crashes serious enough to trigger an airbag. By comparison, Waymo’s driverless vehicles only got into 13 airbag crashes. That represents an 83 percent reduction in airbag crashes relative to typical human drivers.
> This is slightly worse than last September, when Waymo estimated an 84 percent reduction in airbag crashes over Waymo’s first 21 million miles.
nitpick: Is it really slightly worse, or is it "effectively unchanged" with such sparse numbers? At a glance, the sentence is misleading even though it might be correct on paper. Could've said: "This improvement holds from last September…"
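The commenter's sparsity point checks out with a quick interval estimate. Using only the numbers from the quoted passage (13 observed airbag crashes against a modeled human benchmark of 78), a normal approximation to a 95% Poisson interval on the count gives a wide range of plausible "reduction" figures:

```python
import math

observed = 13          # Waymo airbag crashes over the period (from the quote)
expected_human = 78.0  # Waymo's modeled human-driver benchmark (from the quote)

# Normal approximation to a 95% Poisson interval on the observed count.
half_width = 1.96 * math.sqrt(observed)
lo, hi = observed - half_width, observed + half_width

# Translate the count interval into a range of "reduction vs. humans".
reduction_point = 1 - observed / expected_human
reduction_range = (1 - hi / expected_human, 1 - lo / expected_human)

print(f"point estimate: {reduction_point:.0%}")  # 83%
print(f"95% interval:   {reduction_range[0]:.0%} to {reduction_range[1]:.0%}")  # 74% to 92%
```

With an interval that wide, 83% vs. 84% is statistically indistinguishable, which supports reading the two periods as "effectively unchanged."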
shadowgovt
One of the more interesting things Waymo discovered early in the project is that the actual incidents of vehicle collision were under-counted by about a factor of 3. This is because NHTSA was using accident reports and insurance data for their tracking state, but only 1/3 of collisions were bad enough for either first responders or insurance to get involved; the rest were "Well, that'll buff out and I don't want my rates to go up, so…" fender-taps.
But Waymo vehicles were recording and tracking all the traffic around them, so they ended up out-of-the-starting-gate with more accurate collision numbers by running a panopticon on drivers on the road.
nukem222
I would hope so! Are these rides actually cheaper? Presumably this is orders of magnitude less expensive than hiring a human, since the driver is most of what you pay for. I don't see myself ever getting in a car if my safety is actually a concern.
I don't see myself using any of these any time soon, I tend to drive and walk everywhere and don't see much point to paying someone else to drive barring extenuating circumstances. But assuming actual cost benefits are delivered to customers this might be pretty exciting.
mjburgess
Waymos choose the routes, right?
The issue with self-driving is (1) how it generalises across novel environments without "highly-available route data" and provider-chosen routes; (2) how failures are correlated across machines.
In safe driving failures are uncorrelated and safety procedures generalise. We do not yet know if, say, using self-driving very widely will lead to conditions in which "in a few incidents" more people are killed in those incidents than were ever hypothetically saved.
Here, without any confidence intervals, we're told we've saved ~70 airbag incidents in 20 mil miles. A bad update to the fleet will easily eclipse that impact.
grakasja
This statistic could be misleading, because not all miles are equally dangerous. Google is very careful about selecting where it deploys and tests Waymo, preferring flat, safe, well-designed areas. Routing is also closely monitored and I would imagine that problematic roadways are avoided. The article says they compared it to human accident rates "on the same roads" but doesn't clarify their methodology for "same"ness. It also doesn't factor in driver experience. A taxi driver who has memorized a particular route is likely going to drive safer than a tourist who has never gone on that same road before. Waymo may be safer than the average driver on X road but that doesn't mean it will have the same comparative performance if you drop it onto a random road it has never driven before with no assistance from human support staff.
bmitc
False. Humans are driving in a much wider range of weather, road conditions, car conditions, passenger conditions, routes, unknown destinations, etc.
floxy
OT, but does anyone know what the shape of the curve is for the number of automobile crashes per human driver? Is it a uniform distribution, where everyone is more or less likely to get into, say, 1.2 fender-benders per lifetime? Or is there a cluster of people who are much more likely to be involved in crashes? I suppose automobile insurance companies would have this type of information.
nickvec
Very excited for the future of AVs. So many lives will be saved using this technology.
JKCalhoun
The idea that everyone has their own "public transportation" (a computer chauffeur) seems lacking in foresight.
I suspect, but don't know, that buses are safer still? (Not aware of any airbags triggering, ha ha.)
labrador
I was initially skeptical about self-driving cars but I've been won over by Waymo's careful and thoughtful approach using visual cues, lidar, safety drivers and geo-fencing. That said I will never trust my life to a Tesla robotaxi that uses visual cues only and will drive into a wall painted to look like the road ahead like Wile E. Coyote. Beep beep.
Man Tests If Tesla Autopilot Will Crash Into Wall Painted to Look Like Road
https://futurism.com/tesla-wall-autopilot
amelius
Yeah, humans would crash less too if they had a Lidar that stopped the car if a crash appeared imminent.
wnissen
Serious crash rates are a hockey stick pattern. 20% of the drivers cause 80% of the crashes, to a rough approximation. For the worst 20% of drivers, the Waymo is almost certainly better already.
Honestly, at this point I am more interested in whether they can operate their service profitably and affordably, because they are clearly nailing the technical side.
For example, see the data from a 100-driver study (table 2.11, p. 29):
https://rosap.ntl.bts.gov/view/dot/37370
Roughly the same number of drivers had 0 or 1 near-crashes as had 13-50+. One of the drivers had 56 near crashes and 4 actual crashes in less than 20K miles! So the average isn't that helpful here.
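The "average isn't helpful" point can be illustrated with a toy simulation. This is a hypothetical sketch, not the study's actual data: per-driver crash rates are drawn from a heavy-tailed (lognormal) distribution with made-up parameters, loosely mimicking the skew described above:

```python
import random
import statistics

random.seed(0)

# Hypothetical per-driver crash rates (arbitrary units) from a heavy-tailed
# lognormal distribution; parameters chosen only to produce strong skew.
rates = [random.lognormvariate(mu=-1.0, sigma=1.5) for _ in range(100)]

mean_rate = statistics.mean(rates)
median_rate = statistics.median(rates)
worst_20_share = sum(sorted(rates)[-20:]) / sum(rates)

# With heavy tails the mean sits well above the median: a handful of
# high-risk drivers dominate the average, so "better than the average
# driver" says little about the typical driver.
print(f"mean:   {mean_rate:.2f}")
print(f"median: {median_rate:.2f}")
print(f"share of risk from worst 20% of drivers: {worst_20_share:.0%}")
```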
ra7
This article neatly summarizes Waymo’s latest safety numbers, but Waymo’s site provides much more detail, including a full breakdown of their comparison methodology: https://waymo.com/safety/impact/
jedbrooke
purely anecdotal: one thing I’ve noticed is that Waymos ALWAYS use their turn signal. that’s already gotta put them above a large portion of human drivers in terms of safety ;)
dangus
You know what has even better safety statistics? Public transit including buses and trains.
Cycling also has better statistics when the infrastructure is given real attention, and it leads to better health outcomes.
paxys
Worth repeating the same comment I've left on every variant of this article for the last 10 years.
Being better than "average" is a laughably low bar for self-driving cars. Average drivers include people who drive while drunk and on drugs. It includes teenagers and those who otherwise have very little experience on the road. It includes people who are too old to be driving safely. It includes people who habitually speed and drive recklessly. It includes cars that are mechanically faulty or otherwise cannot be driven safely. If you compile accident statistics, the vast majority will fall into one of these categories.
For self-driving to be widely adopted, the bare minimum bar needs to be: is it better than the average sensible and experienced driver?
Otherwise, if you replace the 80% of good drivers with Waymos and the remaining 20% stay behind the wheel, accident rates are going to go up, not down.
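This claim follows from simple arithmetic, assuming the 80/20 split mentioned elsewhere in the thread (20% of drivers cause 80% of crashes — an assumed stylized fact, not a figure from the article):

```python
# Stylized model: 20% of drivers cause 80% of crashes.
R = 1.0                      # population-average crash rate (arbitrary units)

good_rate = (0.2 * R) / 0.8  # the good 80% produce 20% of crashes -> 0.25 * R
bad_rate = (0.8 * R) / 0.2   # the bad 20% produce 80% of crashes -> 4 * R

# Replace only the good 80% with AVs that merely MATCH the population average.
new_rate = 0.8 * R + 0.2 * bad_rate
print(f"fleet crash rate: {R:.2f} -> {new_rate:.2f}")  # 1.00 -> 1.60
```

Under these assumptions, an AV that is only "average" and displaces only good drivers raises the fleet-wide crash rate by 60%, which is the commenter's point.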
ein0p
This is very cool if it's not a case of "lying with statistics". I hope we will at some point see independent research into these safety claims. I don't think too highly of my own driving skills, so I also hope that I live to see the day when I can just let my own car drive itself on most or all trips. And no, Tesla FSD ain't that, at least not yet – it does help with long trips, but in difficult situations it disengages at the worst possible moment. I want to tell the car where to go, and then never touch the wheel until I'm there, day or night, rain or sun.
xlbuttplug2
Off topic, but I've always felt that the ideal way to address safety is by limiting the amount of freedom a vehicle has: force it to follow a predefined path, replacing road lanes with something like tram tracks. (I'm imagining two- or four-person pods zipping around.)
This would vastly reduce the number of accident scenarios, be more efficient, and be much easier to automate. And would probably be good enough for 99% of use cases (i.e., work commute).
Obviously I don't seriously see anyone splurging on the infrastructure and bespoke vehicles for that. But I can dream.