Malfunctioning Waymo Trapped Passenger, Drove in Circles for Hours

AI consultant Mike Johns experienced a harrowing ride when a Waymo autonomous vehicle malfunctioned on December 9, trapping him inside as it drove in circles around a parking lot in Scottsdale, Arizona. Johns, founder and CEO of Digital Mind State, was attempting to reach the airport for a flight back to Los Angeles when the self-driving car began looping repeatedly instead of proceeding to his destination.

The incident lasted approximately seven minutes, during which the vehicle completed at least four laps around a parking lot island before correcting itself. “By lap number four, I knew this ain’t a prank — and because the circle was going around this little island, I felt the nausea, the dizziness, start to happen,” Johns told Business Insider. The experience left him feeling trapped in what he described as a “ghost in the machine scenario.”

Waymo, a subsidiary of Alphabet, acknowledged the incident delayed Johns’ trip by just over five minutes and stated the looping issue was addressed through a regularly scheduled software update. However, the company declined to say how frequently such malfunctions occur or to detail its procedures for identifying and fixing problems before they affect passengers. Johns ultimately made his flight on time and was not charged for the ride.

This incident is far from isolated in Waymo’s operational history. Since launching its driverless taxi service to the public in 2020, the company has faced numerous reported malfunctions. In 2021, a Waymo blocked traffic in Arizona in an incident a YouTuber documented from the backseat. A 2023 software glitch caused a dozen Waymo vehicles to create gridlock in Phoenix. Last September, a stalled Waymo even blocked Vice President Kamala Harris’ motorcade in San Francisco. The autonomous vehicles have also confused first responders at emergency scenes, blocked public transit, and refused to pull over for police.

Legal recourse appears limited for affected passengers. Los Angeles personal injury attorney Jordan Peagler explained that Waymo’s terms of service include arbitration agreements, liability limitations, and indemnification clauses protecting the company from lawsuits. The terms explicitly state that Waymo does not warrant services are “free of viruses or other harmful components” and that users “assume the entire risk as to the quality and performance of the Services.” Johns has not indicated plans to pursue legal action, instead advocating for greater transparency about autonomous vehicle limitations and risks.

Key Quotes

“By lap number four, I knew this ain’t a prank — and because the circle was going around this little island, I felt the nausea, the dizziness, start to happen. That’s the part that I really didn’t like.”

Mike Johns, an AI consultant and CEO of Digital Mind State, described his experience trapped in the malfunctioning Waymo. This quote illustrates the physical and psychological distress passengers can experience when autonomous vehicle AI systems fail, highlighting real safety concerns beyond mere inconvenience.

“In that moment, it feels almost like a hijack. That was the longest seven minutes — especially when you’re not expecting it. It’s a really crazy feeling because it’s already when you’re in an autonomous vehicle. It’s this ghost in the machine scenario.”

Johns captured the unsettling nature of losing control to malfunctioning AI technology. His description emphasizes the unique vulnerability passengers face in autonomous vehicles where there’s no human driver to intervene during system failures.

“It’s a new world that we’ve never known before and autonomous vehicles are ushering in a new economy. And I’m super all for it, but the big thing that we have to be aware of is the fact that we’re all a part of the experiment — and we’re paying to be a part of the experiment.”

Despite his negative experience, Johns maintains support for autonomous vehicle technology while advocating for transparency. His statement reflects a critical tension in AI deployment: consumers are effectively beta testers for technology that companies market as ready for public use.

“He likely signed an arbitration agreement as part of the user terms of service so he won’t be able to sue.”

Los Angeles personal injury attorney Jordan Peagler explained the legal barriers facing passengers who experience autonomous vehicle malfunctions. This highlights how current legal frameworks may inadequately protect consumers in the emerging autonomous vehicle industry.

Our Take

This incident reveals a troubling disconnect between the autonomous vehicle industry’s public confidence and the reality of AI system reliability. Waymo’s refusal to disclose malfunction frequency or detail safety protocols suggests a defensive posture that prioritizes corporate protection over consumer transparency. The pattern of recurring incidents, from traffic blockages to interference with emergency responders, indicates these aren’t isolated bugs but potentially systemic failures of AI decision-making.

Most concerning is the legal vacuum surrounding passenger rights. Terms of service that absolve companies of virtually all liability while customers “assume the entire risk” are untenable as autonomous vehicles scale. Johns’ experience underscores that we’re deploying AI systems in safety-critical applications without adequate regulatory oversight, incident reporting requirements, or consumer protections. The autonomous vehicle industry needs mandatory transparency standards, independent safety audits, and legal frameworks that balance innovation with accountability before these technologies become ubiquitous in our transportation infrastructure.

Why This Matters

This incident highlights critical safety and transparency concerns as autonomous vehicle technology rapidly expands into mainstream transportation. Waymo operates one of the most advanced self-driving taxi services globally, and malfunctions like this expose the gap between AI capabilities and public expectations for reliable, safe transportation.

The legal implications are particularly significant. Current terms of service effectively shield autonomous vehicle companies from liability, leaving passengers with minimal recourse even when experiencing distressing or potentially dangerous situations. As self-driving technology becomes more prevalent, this raises urgent questions about consumer protection, regulatory oversight, and corporate accountability in the AI industry.

The pattern of recurring malfunctions, from traffic blockages to emergency response interference, suggests systemic challenges in autonomous vehicle AI that extend beyond isolated software bugs. Johns’ observation that “we’re all a part of the experiment — and we’re paying to be a part of the experiment” captures a broader truth about AI deployment: companies are testing cutting-edge technology on paying customers without fully transparent risk disclosure. This case underscores the need for stronger regulatory frameworks, mandatory incident reporting, and clearer communication about AI limitations as autonomous systems become integrated into daily life.


Source: https://www.businessinsider.com/malfunctioning-waymo-trapped-passenger-inside-drove-in-circles-2025-1