Tesla FSD ignores school bus lights, hits 'child' dummy
Tesla has been testing self-driving Model Ys on the streets of Austin, Texas. But according to the automaker's bête noire, the Dawn Project, kids should keep clear.
More from the article ...
... The self-proclaimed self-driving gadfly posted a video this week showing a staged event in which a Tesla running the latest version of its self-driving software illegally overtook a school bus with its red lights flashing (indicating it was dropping off passengers) and then slammed into a child-sized mannequin emerging from the sidewalk. The car identified the pedestrian but didn't stop after hitting it, and carried on down the road.

To be fair to Tesla, a human driver would have had trouble stopping in time to avoid the dummy, although one would hope that a human driver would have noticed the lights and stopped, as legally required. However, the software running the car -- FSD (Supervised) 13.2.9 -- did not. (FSD originally stood for "full self-driving," but, as the parenthetical now denotes, Tesla's own documentation has always said that the driver should be supervising at all times.)

The Dawn Project has its own agenda. It was founded by software entrepreneur Dan O'Dowd to highlight the dangers of Tesla's self-driving claims when matched against reality. O'Dowd, who owns several Teslas himself -- including original Roadster models -- has been raising concerns about the software for years. ...
#1 | Posted by LampLighter at 2025-06-01 12:14 AM | Reply
@#1 ... To be fair to Tesla, a human driver would have had problems stopping in time to avoid the dummy ...
Also to be fair, a human driver might have noticed the school bus lights, and stopped.
#2 | Posted by LampLighter at 2025-06-01 12:19 AM | Reply
This self driving s%^* is gonna kill people.
All so that dumb s%^* can act like he's ushering in "the future."
#3 | Posted by jpw at 2025-06-01 01:02 AM | Reply
@#3 ... This self driving s%^* is gonna kill people. ...
I agree, but maybe for a different reason.
How many drivers will turn it on and just sit back because they believed the sales pitch of full self-driving?
Asked differently: one winter evening a few years ago, back when we used to get snow here, I was driving home from an evening visit with a friend's family.
Many of the roads I drove on were not visible because snow does not distinguish between road and non-road.
So, I was navigating by my memory of the houses and other structures on the road.
Full self-driving seems to have issues even in states where there is no such thing as snow.
But ... that aside ... how does full self-driving do on snow-covered roads?
Why don't we see information about that?
#4 | Posted by LampLighter at 2025-06-01 01:32 AM | Reply
Well, people die. Kids are people. So what's the fuss?
#5 | Posted by Zed at 2025-06-01 10:58 AM | Reply
#5 - Joni Ernst approved!
#6 | Posted by johnny_hotsauce at 2025-06-01 07:06 PM | Reply | Funny: 1
I follow several accounts of school districts around the nation and have seen more than my share of douchebags who speed past school buses with their lights on, in some cases missing children by inches.
#7 | Posted by Nixon at 2025-06-02 02:48 PM | Reply
Someone should ask Joni Breadbags how she reconciles her stance on living children losing healthcare and dying with her stance on abortion.
They're not consistent.
But that is the entire GOP policy platform.
#8 | Posted by Nixon at 2025-06-02 02:50 PM | Reply
#5 - Joni Ernst approved! #6 | Posted by johnny_hotsauce at 2025-06-01 07:06 PM
She doubled down and posted a sarcastic reply from a graveyard.
She compares believing in the tooth fairy with believing in healthcare, then claims Jesus is guiding her.
Why are Christians so hateful to the needy?
#9 | Posted by redlightrobot at 2025-06-02 03:32 PM | Reply | Newsworthy 2