Waymo Passengers Got Trapped for 6 Minutes While a Man Tried to Kill Them — The Car Refused to Move
A self-driving car’s safety protocol became a prison when a stranger attacked. Waymo said the doors were locked and that was enough.
A man punched windows, tried to flip the car, and screamed he wanted to kill everyone inside. Waymo’s response? “The passengers would be OK with the doors locked.” The attack lasted 6 minutes. Bystanders cheered the attacker on.
Doug Fulop, a 37-year-old tech worker in San Francisco, called 911 AND Waymo’s support line. Neither could save him. The car’s software literally does not allow riders to jump into the driver’s seat and take over. Waymo has since tripled its annual trips to 15 million and plans to expand to 20 cities.

🧩 Dumb Mode Dictionary
| Term | What It Actually Means |
|---|---|
| Robotaxi | A self-driving car you ride in like an Uber, except there’s no driver to yell “JUST GO” at |
| Safety Protocol | The rule that says “if a person is nearby, the car stops.” Great for not hitting people. Terrible when people are hitting you. |
| Manual Override | When a human takes control of the car remotely. Waymo won’t do it if someone is standing near the vehicle. |
| Edge Case | Industry-speak for “something weird we didn’t plan for.” Being attacked while trapped inside your ride is apparently an edge case. |
| Sensor Covering | When someone puts their hand or a sticker over the car’s cameras/lidar. Basically blindfolding the car. |
📖 What Actually Happened to Doug Fulop
January 2026. San Francisco. Doug Fulop and two friends are riding in a Waymo. A man crossing the street spots them. He walks up. He starts punching the windows. He tries to lift the car. He screams that he wants to kill them for giving money to a robot.
- The car stopped because its safety system detected a person nearby
- The doors locked automatically; with the attacker right at the windows, getting out wasn't a safe option anyway
- Fulop called 911. He called Waymo’s support line.
- Waymo told him they would not manually direct the car away while someone stood near it
- The software doesn’t let passengers jump into the driver’s seat
- After about 6 minutes, bystanders started cheering the attacker on
- That distracted him enough that he stepped back — and the car finally drove away
Fulop stopped using Waymo at night after the incident.
😤 This Isn't the First Time
| When | Where | What Happened |
|---|---|---|
| Jan 2026 | San Francisco | Fulop attack — 6 minutes, death threats, bystanders cheered |
| 2024 | San Francisco | Man covered sensors while passengers were trapped inside |
| 2024 | San Francisco | Three women screamed as vandals spray-painted their robotaxi |
| 2024 | Los Angeles | Five men on e-bikes surrounded Anders Sorman-Nilsson’s Waymo, banged windows for 5 min |
There’s a pattern here: the car’s own safety feature — stopping when a person is nearby — becomes the weapon. Attackers figured out the cheat code. Stand next to the car. It can’t leave.
📊 Waymo by the Numbers
| Stat | Number |
|---|---|
| Annual trips (2025) | 15 million (tripled from prior year) |
| Planned expansion | 20 cities |
| Serious injury crash reduction vs. humans | 90% lower |
| Rear-end collision rate vs. humans | 2x higher (2023 study) |
| Fulop attack duration | ~6 minutes |
| Support line response | “Doors are locked. You’ll be OK.” |
🗣️ What Waymo Says vs. What Passengers Say
Waymo’s position: They won’t manually drive the car away if a hostile person is near it. Their logic is that moving a vehicle near a person could cause injury — even if that person is actively trying to injure the passengers.
Doug Fulop: “As passengers, we deserve more safety than that if someone is trying to attack us. This can’t be the policy to be trapped there.”
Anders Sorman-Nilsson (tech author, LA incident): Felt safe because exterior cameras were recording everything. The men gave up after 5 minutes.
Another passenger (anonymous): Described feeling “like a sitting duck.”
Here’s the thing nobody talks about: a regular taxi driver would have just driven away. The automation that makes Waymo safer in 99% of situations makes it a trap in the 1% where a human is actively hostile. And Waymo’s current answer is basically “the glass will hold.”
🔍 The Deeper Problem
Anti-robot sentiment isn’t new. People slashed tires on delivery robots. Someone put a traffic cone on a Cruise vehicle. But the attacks are getting more violent and more personal — directed at passengers, not just machines.
And Waymo’s expansion plan (20 cities!) means this is going to happen in places that aren’t San Francisco. Places where people might be even less friendly to a driverless car rolling through their neighborhood.
The bystanders cheering is the detail that sticks. This wasn’t one angry guy. This was a crowd that sided with him.
Cool. Robots can’t protect you from angry humans. Now What the Hell Do We Do? ( ͡ಠ ʖ̯ ͡ಠ)

🛡️ Build a Rider Safety Alert App for Robotaxis
There’s no good third-party safety layer for autonomous vehicle passengers right now. An app that detects aggressive behavior (via phone accelerometer + audio) and auto-dials 911 with GPS coordinates would be genuinely useful. Bonus points if it connects to a live monitoring service.
Example: A solo dev in Lisbon, Portugal built a panic-button app for ride-hailing drivers using Flutter and Twilio. He charges $2.99/month and has 4,200 subscribers across three countries — that’s $12K/month from a weekend project he launched on Product Hunt.
Timeline: MVP with audio detection + auto-dial in one weekend. Add Waymo/Cruise API integration as those become available.
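If you want to prototype the detection piece, here's a minimal client-side sketch in TypeScript using plain web APIs. It's an illustration under assumptions, not Waymo-specific code: the `/api/panic` route is a hypothetical backend that would relay the alert via Twilio or a live monitoring service (mobile OSes won't let an app silently auto-dial 911, so in practice you prompt the dialer or route through a monitoring center), and the thresholds are guesses that would need tuning against real ride vibration.

```typescript
// Sketch: detect repeated acceleration spikes (e.g., someone pounding on the
// vehicle) and post the rider's location to a hypothetical /api/panic endpoint.
// Thresholds below are illustrative assumptions, not validated values.

const SPIKE_THRESHOLD_MS2 = 25; // ~2.5g, assumed to be well above normal ride vibration
const SPIKES_REQUIRED = 5;      // require repeated impacts to avoid false alarms
const WINDOW_MS = 10_000;       // ...within a 10-second window

let spikeTimestamps: number[] = [];
let alarmSent = false;

function onMotion(event: DeviceMotionEvent): void {
  const a = event.accelerationIncludingGravity;
  if (!a || a.x === null || a.y === null || a.z === null) return;

  const magnitude = Math.sqrt(a.x ** 2 + a.y ** 2 + a.z ** 2);
  const now = Date.now();

  if (magnitude > SPIKE_THRESHOLD_MS2) {
    // Keep only spikes inside the rolling window, then record this one.
    spikeTimestamps = spikeTimestamps.filter((t) => now - t < WINDOW_MS);
    spikeTimestamps.push(now);

    if (spikeTimestamps.length >= SPIKES_REQUIRED && !alarmSent) {
      alarmSent = true;
      triggerAlarm();
    }
  }
}

function triggerAlarm(): void {
  navigator.geolocation.getCurrentPosition(async (pos) => {
    // /api/panic is a placeholder route; a real backend would forward this
    // payload to Twilio (SMS/voice) or a monitoring service.
    await fetch("/api/panic", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        lat: pos.coords.latitude,
        lng: pos.coords.longitude,
        triggeredAt: new Date().toISOString(),
      }),
    });
  });
}

window.addEventListener("devicemotion", onMotion);
```

Note: iOS also requires an explicit motion-permission prompt before `devicemotion` events fire, so a shipping app would gate this behind a setup step.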
📱 Create a Robotaxi Incident Tracker & Map
Nobody is aggregating this data publicly. A crowdsourced map showing where robotaxi attacks, vandalism, and harassment happen would be valuable to riders, insurance companies, city planners, and the robotaxi companies themselves. Think Waze but for “someone tried to flip a Waymo here last Tuesday.”
Example: A data journalist in Berlin built a crowdsourced crime-map tool for e-scooter theft hotspots using Mapbox and Supabase. Local scooter rental companies now pay her €800/month for the data feed. She runs it as a side project alongside her day job.
Timeline: Mapbox + Supabase backend, React frontend. Monetize via data licensing to fleet operators and insurers.
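Here's a rough sketch of what that data layer could look like: writing crowdsourced reports into a Supabase table and reading them back as GeoJSON for a Mapbox layer. The table name, columns, and environment variable names are assumptions for illustration, not an existing schema.

```typescript
// Sketch: store robotaxi incident reports in Supabase and expose them as
// GeoJSON that a Mapbox GL source can consume. Schema is hypothetical.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,     // placeholder project URL
  process.env.SUPABASE_ANON_KEY! // placeholder anon key
);

interface IncidentReport {
  lat: number;
  lng: number;
  category: "attack" | "vandalism" | "harassment";
  description: string;
  occurred_at: string; // ISO timestamp
}

// Save one crowdsourced report into an assumed "incidents" table.
async function reportIncident(report: IncidentReport): Promise<void> {
  const { error } = await supabase.from("incidents").insert(report);
  if (error) throw new Error(`Failed to save incident: ${error.message}`);
}

// Convert stored rows into a GeoJSON FeatureCollection for the map frontend.
async function incidentsAsGeoJSON() {
  const { data, error } = await supabase.from("incidents").select("*");
  if (error || !data) throw new Error(error?.message ?? "No data returned");

  return {
    type: "FeatureCollection" as const,
    features: data.map((row) => ({
      type: "Feature" as const,
      geometry: { type: "Point" as const, coordinates: [row.lng, row.lat] },
      properties: { category: row.category, description: row.description },
    })),
  };
}
```

The same `incidents` table, aggregated by neighborhood, is what you'd license to fleet operators and insurers as a data feed.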
📝 Write the 'Autonomous Vehicle Passenger Safety' Blog/Newsletter
There is zero good content explaining what to do if you’re trapped in a robotaxi during an incident. No guides. No comparison of company policies. No “what happens when you call 911 from a car with no driver.” This niche is wide open, and the audience is already taking 15 million robotaxi trips a year.
Example: A cybersecurity writer in Toronto started a newsletter focused exclusively on smart-home device vulnerabilities. Within 8 months he had 11K subscribers and landed a $3K/month sponsorship from a home security brand. Robotaxi safety content has a similarly underserved audience.
Timeline: Substack or Beehiiv. Publish weekly. First sponsor deal at ~5K subscribers.
🔧 Sell a Physical 'Robotaxi Emergency Kit'
WAIT — hear me out. A small kit with a window breaker (already sold for regular cars), a high-decibel personal alarm, a reflective SOS banner, and a laminated card with emergency numbers + robotaxi support lines. Package it for $19.99 on Amazon. Market it to the riders behind 15 million Waymo trips every year.
Example: A couple in Melbourne, Australia started selling “rideshare safety kits” (mini flashlight, whistle, door alarm, USB charger) on Etsy in 2024. They moved 2,300 units in the first quarter at $24 AUD each, roughly $55K AUD in sales from a product that costs about $6 to assemble.
Timeline: Source components from Alibaba, design packaging on Canva, list on Amazon FBA. First sales within 3 weeks of listing.
💼 Consult on Autonomous Vehicle Passenger Experience (AV-PX)
Every robotaxi company is going to need someone whose entire job is “what happens when the passenger is in danger and there’s no driver.” That role doesn’t exist yet. If you have UX research, safety engineering, or crisis management experience, you can position yourself as the person companies call before their next PR disaster.
Example: A UX researcher in São Paulo pivoted from fintech to “delivery robot interaction design” after a local startup couldn’t figure out why people kept kicking their sidewalk robots. She now consults for three robotics companies at $150/hr, fully remote.
Timeline: Write 5 LinkedIn posts analyzing real incidents (like this one). Publish a short whitepaper. Reach out to AV companies’ safety teams directly.
🛠️ Follow-Up Actions
| Step | Action |
|---|---|
| 1 | Read Waymo’s published safety reports — they’re free and full of data |
| 2 | Search Reddit r/waymo and r/selfdrivingcars for firsthand rider incident reports |
| 3 | Check your city’s robotaxi permits — many are public record |
| 4 | Look at Cruise, Zoox, and Aurora’s safety policies for comparison |
| 5 | If building an app: there’s no public rider API yet; Waymo’s open research datasets and tools live at waymo.com/open |
Quick Hits
| Want to… | Do this |
|---|---|
| Ride safer tonight | Sit in the back-center seat, keep phone charged, know the support number before you ride |
| Find incident data | Search r/waymo + local news — nobody’s aggregating this yet (opportunity) |
| Know the policy | Waymo won’t move the car if a person is nearby, period. No exceptions for threats. |
| Size the opportunity | 15M annual trips, zero third-party safety products, and a company that admits “the doors are locked” is the plan |
| Go deeper | Read the NYT full report + Waymo’s own safety white papers (both free) |
The car that’s 90% safer than a human driver still can’t do the one thing any cab driver would — just drive away.