Waymo Self-driving Car Goes on Train Tracks, Passenger Flees
A Waymo self-driving car has accidentally ended up on the tracks of a light-rail line in the city of Phoenix, Arizona.
Waymo is issuing a software recall for its self-driving cars after reports the company's autonomous vehicles failed to stop for school buses. n.pr/4iGUYWf -- NPR (@npr.org) Dec 6, 2025 at 4:08 PM
Admin's note: Participants in this discussion must follow the site's moderation policy. Profanity will be filtered. Abusive conduct is not allowed.
More from the article ...
... News outlet First Alert 6 reported that the railway was constructed just last year, which is why the car confused it with a road. Artificial intelligence (AI) systems are typically programmed to learn from past mistakes, rather than anticipating future changes. And in the case of expanding cities like Phoenix, construction zones and new layouts are known weak points for driverless GPS operating systems. In this case, the passenger, being untrained, would not have been expected to take a measured intervention, especially in such a high-stress, fight-or-flight situation. ...
#1 | Posted by LampLighter at 2026-01-16 02:57 PM | Reply
As the video in the article shows, there was a train approaching the Waymo vehicle.
#2 | Posted by LampLighter at 2026-01-16 02:58 PM | Reply
A self-driving car had a driver?
#3 | Posted by qcp at 2026-01-16 03:10 PM | Reply
@#3
Yeah, that caught my eye as well.
From the video, it looks like the person who left the vehicle left via a rear door, i.e., not in the driver seat.
I'm thinking the article may have misrepresented the passenger who booked the ride as the driver of the ride.
#4 | Posted by LampLighter at 2026-01-16 03:26 PM | Reply
"A Waymo self-driving car has accidentally ended up on the tracks of a light-rail line in the city of Phoenix, Arizona."
A suicidal self-driving car?
Well, it had been acting depressed the last few days.
Down in the pumps.
#5 | Posted by Danforth at 2026-01-16 03:54 PM | Reply | Funny: 2
It spent too much time "interfacing" with GROK.
Apparently.
#6 | Posted by donnerboy at 2026-01-16 03:59 PM | Reply
It is events just like this that tell me self-driving is not ready for prime time. There was another one, among many, that got stuck in a roundabout. It took the passenger calling the company to get the vehicle to stop circling the roundabout. Call me old school, but I am not ready for this. I know that while I'm in control of my vehicle, it's not going to do any such thing.
#7 | Posted by BBQ at 2026-01-16 04:03 PM | Reply
Tangentially related. Uber Eats Delivery Robot Destroyed After Getting Stuck on Train Tracks in Miami
#8 | Posted by et_al at 2026-01-16 05:58 PM | Reply
#8 I really wish there was some sort of Retort for your link, Et_Al, but it kind of stands on its own. :-)
#9 | Posted by A_Friend at 2026-01-16 06:11 PM | Reply
"It spent too much time "interfacing" with GROK."
And Alexa found out.
#10 | Posted by Danforth at 2026-01-16 06:13 PM | Reply | Funny: 2
@#8
So much for the 30-minute delivery guarantee ...
#11 | Posted by LampLighter at 2026-01-16 06:29 PM | Reply
Out in left field ...
Sorry Dave, I'm afraid I can't do that! PCs refuse to shut down after Microsoft patch www.theregister.com
... We're not saying Copilot has become sentient and decided it doesn't want to lose consciousness. But if it did, it would create Microsoft's January Patch Tuesday update, which has made it so that some PCs flat-out refuse to shut down or hibernate, no matter how many times you try. In a notice on its Windows release health dashboard, Microsoft confirmed that some PCs running Windows 11 23H2 might fail to power down properly after installing the latest security updates. Instead of slipping into shutdown or hibernation, affected machines stay stubbornly awake, draining batteries and ignoring shutdown like they have a mind of their own and don't want to experience temporary non-existence. ...
#12 | Posted by LampLighter at 2026-01-16 06:33 PM | Reply
Waymo problems
#13 | Posted by LegallyYourDead at 2026-01-16 06:59 PM | Reply
@#12 ... We're not saying Copilot has become sentient and decided it doesn't want to lose consciousness. But if it did, it would create Microsoft's January Patch Tuesday update, which has made it so that some PCs flat-out refuse to shut down or hibernate, no matter how many times you try. ...
How many AI models are now training on this "bug" for future reference?
#14 | Posted by LampLighter at 2026-01-16 07:12 PM | Reply
@#14 ... How many AI models are now training on this "bug" for future reference? ...
Asked differently ...
How long before Microsoft starts using AI to create its monthly Windows Updates?
#15 | Posted by LampLighter at 2026-01-16 08:41 PM | Reply
These days, I seem to respond to dark humor more than ever before ... and pretending that machines can, at least in our lifetimes, run around the planet without human guidance is just madness. In fifteen years or so, whose job won't be replaced by AI? Lawyers, doctors, teachers, truck drivers, prostitutes ...
#16 | Posted by Hughmass at 2026-01-18 08:16 AM | Reply
"I'm sorry, Dave, I can't let you out of the car."
HAL 9000's progeny
#17 | Posted by AMERICANUNITY at 2026-01-18 11:22 AM | Reply