Drudge Retort: The Other Side of the News
Monday, June 17, 2024

Police attempting to direct traffic following an accident had to take cover when a Tesla plowed into a cruiser.


Comments

Admin's note: Participants in this discussion must follow the site's moderation policy. Profanity will be filtered. Abusive conduct is not allowed.

More from the article...

... Tesla's autonomous driving features have been a key selling point for Elon Musk's electric vehicles, but as owners trust these features more, we're learning just how imperfect they are.

The police in Fullerton, CA say that a Tesla in self-driving mode collided with a police car this week while officers were on the scene of an unrelated traffic collision. If this sounds familiar, that's because Teslas are gaining an undesirable reputation for running into parked emergency vehicles.

On June 13, a police cruiser was parked at the intersection of Orangethorpe and Courtney Avenues in Fullerton after responding to a fatal DUI crash. The vehicle in question was blocking the road while an officer routed traffic away from the original accident. Shortly after midnight, the Tesla Model S slammed into the cruiser, appearing not to brake at all. The officer was able to move out of the way in time, and the driver of the Tesla was not seriously injured.

According to the report, the police car was stationary with its emergency lights on. Police had also placed several flares on the road to alert drivers to the hazard. The Tesla's owner admitted that the car was in "self drive mode" and they were distracted using their phone at the time of the accident. You can see the crash play out in the Instagram post below. ...



#1 | Posted by LampLighter at 2024-06-16 12:54 AM | Reply

@#1

imo, this is not a good thing.

Tesla really seems to need to get its autonomous driving act together.

For starters... does it work, or does it not work?

If the latter, what in the world is it doing on public roads?


#2 | Posted by LampLighter at 2024-06-16 12:56 AM | Reply

If the latter, what in the world is it doing on public roads?

I'm sure in the fine print it says the driver is supposed to be paying attention to what's going on.

#3 | Posted by REDIAL at 2024-06-16 01:04 AM | Reply

@#3

That is likely to be the huge issue here.

The legal department of Tesla proffering fine print, vs. the marketing department of Tesla wanting to move those hoards of unsold Teslas.

Tesla sales tumble nearly 9%, most in 4 years, as competition heats up and demand for EVs slows (April 2024)
apnews.com

... Tesla sales fell sharply last quarter as competition increased worldwide, electric vehicle sales growth slowed, and price cuts failed to lure more buyers.

The Austin, Texas, company said Tuesday that it delivered 386,810 vehicles worldwide from January through March, almost 9% below the 423,000 it sold in the same quarter of last year. It was the first year-over-year quarterly sales decline in nearly four years. ...




#4 | Posted by LampLighter at 2024-06-16 01:33 AM | Reply

It's not binary.

It works 99.9999% of the time.

It can't solve edge cases; that's not within the realm of machine vision, regardless of the training data set.

Users are supposed to stay in control and monitor things, expecting failures.

#5 | Posted by sitzkrieg at 2024-06-17 07:46 AM | Reply | Funny: 3
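A back-of-the-envelope check on that "99.9999%" figure (a minimal sketch only; the rate is read as per mile, and the fleet mileage below is a purely hypothetical number, not Tesla data):

# Hypothetical scale check: even a one-in-a-million per-mile failure rate
# adds up across a large fleet.
FAILURES_PER_MILE = 1 - 0.999999                 # the "99.9999%" figure above, read per mile
ASSUMED_FLEET_MILES_PER_DAY = 30_000_000         # hypothetical daily miles with assistance engaged

expected_failures_per_day = FAILURES_PER_MILE * ASSUMED_FLEET_MILES_PER_DAY
print(f"~{expected_failures_per_day:.0f} expected failures per day at that rate")   # ~30

Even at that reliability, failures would show up daily at fleet scale, which is why the "stay in control and monitor" caveat matters.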

If you can make progress on edge cases, it's a $600k/yr research job.

#6 | Posted by sitzkrieg at 2024-06-17 08:49 AM | Reply

#4 | Posted by LampLighter

By this point we all know it doesn't work. It's the fanbois who use it as if it does work that are the problem just as much as Tesla's name for its glorified driver assist system.

#7 | Posted by GalaxiePete at 2024-06-17 12:46 PM | Reply | Newsworthy 1

#5 | Posted by sitzkrieg

Edge cases like a police vehicle directly in front of it... Gotcha...

#8 | Posted by GalaxiePete at 2024-06-17 12:47 PM | Reply

The Tesla's owner admitted that the car was in "self drive mode" and they were distracted using their phone at the time of the accident.

Why the ---- is "self driving mode" even legal? This is ludicrous.

#9 | Posted by JOE at 2024-06-17 01:04 PM | Reply | Newsworthy 1

Edge cases like a police vehicle directly in front of it...

#8 | POSTED BY GALAXIEPETE AT 2024-06-17 12:47 PM

It is when you're training AI models on extreme amounts of video driving data and not using a LIDAR. Tesla's design choices have been poor.

#10 | Posted by sitzkrieg at 2024-06-17 01:17 PM | Reply | Newsworthy 1

#9 | POSTED BY JOE - The same applies to bicycles and ATV/4-wheelers having the same status as cars and trucks. Been stuck behind both on 65 mph state highways. Self-driving vehicles? Defies common sense.

#11 | Posted by Yodagirl at 2024-06-17 01:46 PM | Reply

call an Uber

#12 | Posted by brerrabbit at 2024-06-17 02:10 PM | Reply

Elmo views us plebs as expendable, so don't expect improvements anytime soon.

#13 | Posted by earthmuse at 2024-06-17 02:16 PM | Reply | Newsworthy 1

#13 POSTED BY EARTHMUSE

he's beta testing his AI

#14 | Posted by brerrabbit at 2024-06-17 02:20 PM | Reply

#10 | Posted by sitzkrieg

My point is the case is not at all fringe. There is a vehicle directly in the car's path. How is that fringe by any stretch?

Poor choice is an understatement. When they killed LIDAR (it was there), critics jumped on it as a poor decision because literally everyone else is using it as part of their systems.

#15 | Posted by GalaxiePete at 2024-06-17 02:42 PM | Reply

#9 | Posted by JOE

A lot of cars have something like it today in the form of Advanced Driver Assist. The naming is horrendously poor and creates a false sense of security.

#16 | Posted by GalaxiePete at 2024-06-17 02:44 PM | Reply

How is that fringe by any stretch?

#15 | POSTED BY GALAXIEPETE AT 2024-06-17 02:42 PM

It's how statistics work. Use linear regression and plot it; it's going to be in the 0.0001 area, since most of the data is just freeway driving.

They never really had LIDAR. They had ultrasonic sensors. Those sensors don't have the range to do anything about high speed vehicle collisions. Real vehicle LIDAR required a large roof apparatus. They've already relied on machine vision, and it had significant limits, like low frequency data.

#17 | Posted by sitzkrieg at 2024-06-17 03:08 PM | Reply

always relied on

#18 | Posted by sitzkrieg at 2024-06-17 03:08 PM | Reply
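A rough sketch of the sensor-range point in #17 (illustrative, assumed numbers: short-range ultrasonic parking sensors reaching roughly 8 m, ~1 s reaction time, ~7 m/s^2 braking on dry pavement; none of these are Tesla specifications):

# Stopping distance vs. an assumed ~8 m ultrasonic sensor range.
def stopping_distance_m(speed_kmh, reaction_s=1.0, decel_mps2=7.0):
    """Reaction distance plus braking distance for a given speed (flat, dry road assumed)."""
    v = speed_kmh / 3.6                          # km/h -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

ULTRASONIC_RANGE_M = 8.0                         # assumed short-range sensor reach

for speed in (30, 50, 80, 110):
    d = stopping_distance_m(speed)
    verdict = "within" if d <= ULTRASONIC_RANGE_M else "well beyond"
    print(f"{speed:>3} km/h: ~{d:.0f} m to stop, {verdict} an ~{ULTRASONIC_RANGE_M:.0f} m ultrasonic range")

Even at 30 km/h the stopping distance exceeds that assumed sensor range, consistent with the point that short-range ultrasonics can't prevent higher-speed collisions; that job falls to the cameras (or to LIDAR/radar on other manufacturers' systems).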

I keep seeing all these Musk fanboy Twitter/X posts about Teslas crashing. Vaulting into the air, rollovers, losing control...

The fanboys praise Musk when the occupants don't die. Musk is saving lives! As if no one has ever survived a major crash until recently.

We know the "full self driving" is a joke, but what happened to stability control, emergency braking, collision warning and lane departure avoidance?

#19 | Posted by Derek_Wildstar at 2024-06-17 03:27 PM | Reply

#19 There are many posts by Cybertruck owners describing how the vehicle broke down on its first road trip, or on the way home from delivery, or left their family with young children stranded in the middle of nowhere, etc. Inevitably the posts end with some variation of "still love the truck and Elon though!"

#20 | Posted by JOE at 2024-06-17 04:00 PM | Reply | Funny: 1

I'd be interested in seeing a comparison of crash incidence (per mile driven) between regular vehicles and "assisted" vehicles.

Stationary police cars get rear-ended every single day. Even when they are on the shoulder, idiots just drive right towards those flashing lights. I got hit 3 times, almost lost both legs the last time. The guy said, "Officer, I thought I was following you, why was you goin so slow?" I was stopped. That is why officers turn their wheels to the right, so if they get hit, they run off the road and not into traffic.

#21 | Posted by Miranda7 at 2024-06-18 02:35 PM | Reply

Comments are closed for this entry.
