Tesla FSD ignores school bus lights, hits 'child' dummy
Tesla has been testing self-driving Model Ys on the streets of Austin, Texas. But according to the automaker's bête noire, the Dawn Project, kids should keep clear.
More from the article ...
... The self-proclaimed self-driving gadfly posted a video this week showing a staged event in which a Tesla running the latest version of its self-driving software illegally overtook a school bus with its red lights flashing (indicating it was dropping off passengers) and then slammed into a child-sized mannequin emerging from the sidewalk. The car identified the pedestrian, but didn't stop after hitting it and carried on down the road.

To be fair to Tesla, a human driver would have had trouble stopping in time to avoid the dummy, although one would hope that a human driver would have noticed the lights and stopped as legally required. However, the software running the car -- FSD (Supervised) 13.2.9 -- did not. (FSD originally stood for "full self-driving," but as the parenthetical now indicates, Tesla's own documentation has always said that the driver should be supervising at all times.)

The Dawn Project has its own agenda. It was founded by software entrepreneur Dan O'Dowd to highlight the gap between Tesla's self-driving claims and reality. O'Dowd, who owns several Teslas himself -- including original Roadster models -- has been raising concerns about the software for years. ...
#1 | Posted by LampLighter at 2025-06-01 12:14 AM | Reply
@#1 ... To be fair to Tesla, a human driver would have had problems stopping in time to avoid the dummy ...
Also to be fair, a human driver might have noticed the school bus lights, and stopped.
#2 | Posted by LampLighter at 2025-06-01 12:19 AM | Reply
This self driving s%^* is gonna kill people.
All so that dumb s%^* can act like he's ushering in "the future."
#3 | Posted by jpw at 2025-06-01 01:02 AM | Reply
@#3 ... This self driving s%^* is gonna kill people. ...
I agree, but maybe for a different reason.
How many drivers will turn it on and just sit back, because they believed the sales pitch of full self-driving?
Put it this way: one winter evening a few years ago, back when we still used to get snow here, I was driving home from visiting a friend's family.
Many of the roads I drove on were not visible because snow does not distinguish between road and non-road.
So, I was navigating by my memory of the houses and other structures on the road.
Full self-driving seems to have issues even in states where there is no such thing as snow.
But ... that aside ... how does full self-driving do in snow-bound roads?
Why don't we see information about that?
#4 | Posted by LampLighter at 2025-06-01 01:32 AM | Reply
Well, people die. Kids are people. So what's the fuss?
#5 | Posted by Zed at 2025-06-01 10:58 AM | Reply
#5 - Joni Ernst approved!
#6 | Posted by johnny_hotsauce at 2025-06-01 07:06 PM | Reply