Waymo recreated dozens of real fatal crashes that occurred in Arizona over nearly a decade in an effort to demonstrate that its robot drivers are safer than humans. According to research released today by the Google spinoff, replacing either vehicle in a two-car collision with one of its robot-driven minivans would nearly eliminate all fatalities.

The findings support Waymo’s claim that automated vehicles are safer to operate than human-driven ones. With more than a million people dying in car crashes each year, AV operators lean heavily on this safety argument to persuade lawmakers to pass legislation allowing more fully autonomous vehicles on public roads.


Waymo has turned to counterfactuals, or “what if?” scenarios

But with so few autonomous vehicles currently on public streets, that case has been hard to prove. To add statistical weight to its claim, Waymo has turned to counterfactuals, or “what if?” scenarios, intended to show how its robot cars would respond in real-world situations. Last year, the company released 6.1 million miles of driving data from 2019 and 2020, covering 18 crashes and 29 near-collisions. In the incidents where a safety operator took control of the vehicle to prevent a collision, Waymo’s engineers created a counterfactual by simulating what would have happened had the driver not disengaged the vehicle’s self-driving system.

The most recent data release continues that work in counterfactuals. Waymo obtained data from a third party on every fatal collision that occurred between 2008 and 2017 in Chandler, Arizona, a suburb of Phoenix. Focusing only on the crashes that took place inside its operational design domain (ODD), the roughly 100-square-mile region where the company permits its cars to drive, Waymo identified 72 crashes to recreate in simulation in order to see how its autonomous car would react in comparable circumstances. Of 91 simulations, Waymo’s AVs “avoided or mitigated” 88.
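The selection step described above — keeping only the crashes that fall inside the service area before simulating them — can be sketched as follows. This is an illustration only, not Waymo’s actual tooling: the record fields, the rectangular bounds standing in for the ODD, and the coordinates are all hypothetical.

```python
# Hypothetical sketch: filter fatal-crash records down to those inside a
# bounding box standing in for Waymo's operational design domain (ODD).

def in_odd(crash, bounds):
    """True if the crash location falls inside the rectangular bounds."""
    lat_min, lat_max, lon_min, lon_max = bounds
    return lat_min <= crash["lat"] <= lat_max and lon_min <= crash["lon"] <= lon_max

crashes = [
    {"id": 1, "lat": 33.30, "lon": -111.84},  # inside the Chandler-area box
    {"id": 2, "lat": 33.60, "lon": -112.20},  # outside the box
]
odd_bounds = (33.20, 33.40, -111.95, -111.75)  # illustrative rectangle

selected = [c for c in crashes if in_odd(c, odd_bounds)]
print([c["id"] for c in selected])  # → [1]
```

In practice an ODD boundary would be an arbitrary polygon rather than a rectangle, but the filtering logic is the same.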

Waymo’s autonomous vehicles would have “avoided or mitigated” 88 of the 91 scenarios, according to Trent Victor, head of system safety and best practices at Waymo. For the collisions that were mitigated rather than avoided, Victor added, Waymo’s vehicles would have reduced the chance of serious injury by a factor of 1.3 to 15.

Victor explained that even when the vehicle could not avoid the collision entirely, it acted to lessen the impact, making it far less likely that the driver would have been killed.


“The Waymo driver used evasive manoeuvres”

In three of the scenarios, someone was killed after being struck from behind by another car. When Waymo’s vehicle was the one being rear-ended in simulation, it was unable to prevent these collisions. “There’s not much that the responder vehicle can do in rear-end crashes specifically,” said Matthew Schwall, head of field safety at Waymo. “The Waymo Driver faces the same difficulty that humans do in these situations: it’s hard to make predictions early enough to perform an evasive manoeuvre.”

There is no established methodology for assessing AV safety. While US regulators seem content to let private industry decide what is safe, recent research by RAND found that, in the absence of such a framework, consumers are more inclined to trust the government. Waymo hopes that by making this data widely available, policymakers, academics, and even other companies will take up the challenge of creating a comprehensive framework.

To be clear, Waymo did not submit its findings for peer review or publication in a scientific journal, though a spokesperson said the company would be open to doing so in the future. The simulations were not carried out independently of the company, and no outside party verified them before they were made public. The company did, however, present its findings to a select group of academic experts for feedback. Daniel McGehee, director of the National Advanced Driving Simulator laboratory at the University of Iowa, said Waymo is taking safety evaluation and transparency “to a new level.”


This is one of the reasons we ran the simulations with the Waymo Driver as the responder: if the Waymo Driver consistently outperforms human drivers in such collisions, that points to the potential for broader safety gains from our automation technology. Our simulations support this:

  • In 82% of the simulated collisions where the Waymo Driver was the responder, the collision was avoided entirely. In most cases it accomplished this with consistent, smooth driving, without the need for sudden evasive manoeuvres or hard braking.
  • In another 10% of cases — typically when another car turned in front of the simulated Waymo Driver at an intersection — it took action that reduced the severity of the collision.
  • Fewer than 8% of the responder simulations were unchanged. In each of these, the human-driven car struck the simulated Waymo Driver from behind while it was stationary or moving at a constant speed, leaving the Waymo Driver little time to react.
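The outcome breakdown above can be tallied with a short script. This is an illustration only: the per-simulation outcome labels are hypothetical counts chosen to be consistent with the reported percentages across the 91 responder simulations.

```python
from collections import Counter

# Hypothetical outcome labels for the 91 responder simulations, chosen
# to match the reported breakdown: ~82% avoided, ~10% mitigated,
# the remainder (~8%) unchanged rear-end strikes.
outcomes = ["avoided"] * 75 + ["mitigated"] * 9 + ["unchanged"] * 7

def summarize(outcomes):
    """Return each outcome's share of the total, rounded to whole percent."""
    counts = Counter(outcomes)
    total = len(outcomes)
    return {label: round(100 * n / total) for label, n in counts.items()}

print(summarize(outcomes))  # → {'avoided': 82, 'mitigated': 10, 'unchanged': 8}
```

Note that the three rounded shares need not sum to exactly 100; here 75 + 9 + 7 = 91 simulations round to 82%, 10%, and 8% respectively.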


Q1) How many accidents did Waymo have?

Ans: Fully automated driving systems were involved in 130 reported collisions, 62 of which involved Waymo. The shuttle operator Transdev reported 34 collisions, and Cruise, which operates robotaxis for General Motors in San Francisco, reported 23.

Q2) Who is to blame for crashes involving autonomous vehicles?

Ans: Depending on the specifics of an accident, the manufacturer, the software developer, the person who activated the self-driving system, or all three of these parties may be held accountable.


In conclusion, this article has looked at how Waymo used simulations of real-world fatal crashes to argue that its autonomous vehicles could prevent deaths. The simulation findings above form the core of that case.
