this post was submitted on 13 Jun 2024
523 points (98.2% liked)

News

23310 readers
4189 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good-faith argumentation only; this includes not accusing another user of being a bot or paid actor. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts must contain a source (URL) that is as reliable and unbiased as possible, and may contain only one link.


Overtly right- or left-wing sources will be removed at the mods' discretion. We have an actively updated blocklist, which you can see here: https://lemmy.world/post/2246130. If you feel any website is missing, contact the mods. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the headline of the source article.


Posts whose titles don't match the source won't be removed outright, but the AutoMod will notify you, and if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the AutoMod will leave a message. Please remove your post if the AutoMod is correct. If the matching post is very old, see rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, you must provide credible sources.


9. No link shorteners.


The AutoMod will contact you if a link shortener is detected; please delete your post if it's correct.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 1 year ago

A Tesla was in its self-driving mode when it crashed into a parked patrol vehicle responding to a fatal crash in Orange County Thursday morning, police said. 

The officer was on traffic control duty, blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that left a motorcyclist dead around 9 p.m. Wednesday, when his vehicle was struck.

A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.

top 50 comments
[–] FlyingSquid@lemmy.world 118 points 4 months ago (5 children)

It really doesn't help that the media isn't putting "Self-Driving" Mode in quotes since it isn't fucking self-driving.

[–] Glemek@lemmy.world 92 points 4 months ago (1 children)

It is though: self driving into objects

[–] Nougat@fedia.io 41 points 4 months ago

"We never said it was good self-driving."

[–] NoIWontPickAName@kbin.earth 19 points 4 months ago (1 children)

Isn’t that what Tesla called it?

[–] FlyingSquid@lemmy.world 46 points 4 months ago (2 children)

Tesla calls it "Full Self Driving" and it's a lie. So capitalize it and put it in quotes, rather than call it self-drive mode like that's an actual thing.

[–] icy_mal@lemmy.world 20 points 4 months ago

The actual name, "Full Self Driving (Supervised)", is so shady. "Supervised" is just a less crappy-sounding way to indicate that you will have to take over and drive sometimes. So sometimes the car drives itself and sometimes you drive: partial self driving, partial human driving. I'm surprised they didn't call it "Partial Full Self Driving". That would certainly amp up the trolling factor and really separate the true believers, who would come out defending it with Olympic-level mental gymnastics.

[–] disguy_ovahea@lemmy.world 10 points 4 months ago* (last edited 4 months ago)

It is an actual thing, just not on Teslas. It must’ve chapped Musk’s ass something fierce that Mercedes-Benz got the DOT approval before him.

https://www.caranddriver.com/reviews/a45326503/mercedes-benz-drive-pilot-review/

[–] Empricorn@feddit.nl 10 points 4 months ago

It's "self-driving", not "self-stopping". Luckily the police were able to assist with cruiser-based rapid deceleration.

[–] Lileath@lemmy.blahaj.zone 9 points 4 months ago

Technically it is self-driving but just in the sense that it doesn't need any external power sources like horses to pull it.

[–] mcqtom@lemmy.world 66 points 4 months ago (1 children)
[–] Gradually_Adjusting@lemmy.world 38 points 4 months ago

Finally, some real journalism

[–] MeekerThanBeaker@lemmy.world 64 points 4 months ago (3 children)

And again, I thank rich people for being test subjects on new technology.

[–] FireRetardant@lemmy.world 94 points 4 months ago

The victims involved in crashes aren't always rich. People in other cars or pedestrians and cyclists can be injured by these mistakes.

[–] garretble@lemmy.world 52 points 4 months ago* (last edited 4 months ago) (1 children)

If only it were that simple. WE are all the test subjects in this case whether we like it or not.

[–] oxjox@lemmy.ml 9 points 4 months ago (1 children)

No we’re not. They’re the rats, we’re the maze.

[–] Grimy@lemmy.world 13 points 4 months ago (1 children)
[–] grysbok@lemmy.sdf.org 7 points 4 months ago (1 children)
[–] Revan343@lemmy.ca 12 points 4 months ago

I mean, testing new technology wasn't really the issue here; the name of Tesla's 'self driving' mode is just a lie. This is an idiot driver who should have been paying attention and wasn't.

But we already knew he was an idiot, he did buy a Tesla.

[–] Kolanaki@yiffit.net 52 points 4 months ago (2 children)

"ACCAB" - That Tesla

*The extra C is for Cars. All Cop Cars Are Bastards.

[–] Zorsith@lemmy.blahaj.zone 26 points 4 months ago (3 children)

FTP - It's not just a protocol.

[–] BigMacHole@lemm.ee 44 points 4 months ago (2 children)

That must have been SO scary for the cop! He wouldn't know whether to shoot the car or the passenger!

[–] Zip2@feddit.uk 22 points 4 months ago (2 children)

Must have been a white Tesla. If it was a black one, it would have been a no brainer.

[–] FiniteBanjo@lemmy.today 29 points 4 months ago (1 children)

TBH, if this process could work a little faster, then maybe evolution could remove all the AI tech bros from the gene pool.

[–] bitwolf@lemmy.one 29 points 4 months ago* (last edited 4 months ago)

They give so much leniency to Tesla.

Yet Cruise was kicked out of California because someone else hit a pedestrian into the path of the Cruise vehicle and ran, and that was while Cruise was providing dashcam footage to help capture the assailant.

[–] kandoh@reddthat.com 27 points 4 months ago (1 children)

Maybe I've been too harsh on self-driving Teslas...

Tesla... Back (into) the Blue.

[–] Wrench@lemmy.world 26 points 4 months ago (2 children)

Fuck Elon, and to a lesser extent, Tesla and all. But this seems like yet another user error on several counts. I thought "autopilot" was only supposed to be used on freeways. And obviously assisted by a human who should have seen a fucking parked cop car coming and intervened anyway.

But that said, fuck Elon and his deceptive naming of a fucking primitive tech that's really only good at staying in a lane at speed under ideal conditions.

[–] halcyoncmdr@lemmy.world 13 points 4 months ago (7 children)

> I thought "autopilot" was only supposed to be used on freeways. And obviously assisted by a human who should have seen a fucking parked cop car coming and intervened anyway.

It depends on which system they actually had on the vehicle. It's more complicated than random people seem to think. But even with the FSD beta, it specifically tells the driver every time they activate it that they need to pay attention and are still responsible for the vehicle.

Despite what the average internet user seems to think, not all Teslas even have the computer capable of Full Self Driving installed. I'd even say most don't. Most people seem to think that Autopilot and FSD are the same; they're not, and never have been.

There have been 4+ computer systems in use over the years as they've upgraded the hardware and added capabilities in newer software. Autopilot, Enhanced Autopilot, and Full Self Driving BETA are three different systems with different capabilities. Anything bought prior to the very first small public closed beta of FSD a couple years ago would need to be replaced with a new computer to use FSD. Installation cost is included if someone buys FSD outright, or they have to pay for the upgrade if they instead want the subscription. All older Teslas however would be limited to Autopilot and Enhanced Autopilot without that computer upgrade.

The AP and FSD systems are not at all the same, and they use different code. Autopilot is designed and intended for highways and doesn't require the upgraded computer. Autopilot is, and always has been, effectively just Traffic-Aware Cruise Control plus Autosteer. Enhanced Autopilot added extra features like Summon, Auto Lane Change, Navigate on Autopilot (on-ramp to off-ramp navigation), etc., but has never been intended for city streets. Autopilot itself hasn't really been updated in years; almost all the updates have been to the FSD beta.

The FSD beta is what is being designed for city streets, intersections, etc. and needs that upgraded computer to process everything for that in real time. It uses a different codebase to process data.

[–] ShepherdPie@midwest.social 12 points 4 months ago (2 children)

> The spokesperson said that the Tesla was in self-drive mode and **the driver admitted to being on a cellphone at the time of the crash.**

That seems to answer all the questions about this accident.

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 11 points 4 months ago (1 children)

My Subaru with adaptive cruise control is smart enough to not zoom into the back of a parked car. If my car with a potato for a CPU can figure it out, then why can't a Tesla with its significantly more advanced computer?

[–] halcyoncmdr@lemmy.world 7 points 4 months ago* (last edited 4 months ago)

It's simple: it depends on what data the vehicle is actually using to detect the vehicles it's maintaining distance from.

These systems take in a lot of information, and plenty of it is bad data that needs to be cleaned of erroneous readings before it can be acted on. Sensors stream a lot of info, and not all of it is perfectly accurate. The same is true for a Tesla or any other vehicle, and filtering that data accurately means a better experience.

Say your vehicle has a forward-facing radar, and you're driving along the highway when the radar gets a return for a large object 100 feet ahead, while the returns immediately before showed a 300-foot clear zone. Is it more likely that a large object suddenly appeared in front of the car, or that this return is erroneous and the next few returns will show a clear zone again? Overhead signs and overpasses can produce returns similar to a large truck in your lane, for instance. This is one advantage lidar has over radar: more accurate angle measurements at all distances.

So say the vehicle acts on that return and slams on the brakes because the "object" is only 100 feet ahead at highway speed. Then the erroneous return goes away and there's a clear road again. That's the "phantom braking" I'm sure you've seen various people talk about: the system reacting to an erroneous return instead of filtering it out as a bad reading. Now, random braking in the middle of a highway is dangerous as well, so that needs to be minimized. Is it more likely that a massive wall suddenly appeared directly in front of the car, or that it's a couple of bad readings? The car has to determine that to decide what to do. And different types of sensors detect things differently: to some sensors, materials like paper are essentially invisible while metal is clear as day. If the sensor can't detect something, it won't react.
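To make that concrete, here's a minimal sketch in Python of the kind of persistence filter described above. Everything in it is invented for illustration (class name, window size, thresholds); it's not from Tesla's or any other real vehicle stack.

```python
# Hypothetical persistence filter for radar range returns.
# All names and thresholds are illustrative, not from any real vehicle.
from collections import deque

class RangeFilter:
    def __init__(self, window=5, confirm=3, jump_threshold_m=50.0):
        self.history = deque(maxlen=window)  # recently accepted ranges
        self.suspect_count = 0               # consecutive anomalous frames
        self.confirm = confirm               # frames needed to accept a jump
        self.jump = jump_threshold_m         # discontinuity considered suspect

    def update(self, raw_range_m: float) -> float:
        """Return the range the planner should act on this frame."""
        if not self.history:
            self.history.append(raw_range_m)
            return raw_range_m
        last = self.history[-1]
        if abs(raw_range_m - last) > self.jump:
            # Big discontinuity (e.g. an overpass return): don't brake yet.
            self.suspect_count += 1
            if self.suspect_count >= self.confirm:
                # The jump persisted for several frames: treat it as real.
                self.suspect_count = 0
                self.history.append(raw_range_m)
                return raw_range_m
            return last  # hold the previous estimate, ignore the spike
        self.suspect_count = 0
        self.history.append(raw_range_m)
        return raw_range_m

# A ~300 ft (91 m) clear zone, one bogus ~100 ft (30 m) return, then clear again:
f = RangeFilter()
for r in [91.0, 91.0, 30.0, 91.0, 91.0]:
    print(f.update(r))  # the single 30 m spike is filtered out
```

The tension described above is exactly the `confirm` parameter: set it too low and you get phantom braking; set it too high and the car reacts late to a real stopped object.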

Note that these readings do not involve a camera at all. They inherently work differently than a human driver does by looking at the road. So many people online want to point out that sensors are more "reliable" or "trustworthy" than vision since there's little processing, you just get a data point, yet sensors provide bad data often enough that a filter is needed to remove the bad readings. A camera works like a person: it can see everything; you just need to teach it to identify what it needs to pay attention to and what it can ignore, like the sky, or power lines, or trees passing by on the side of the road. But not the human on the side of the road; it needs to see that.

Then we get into the fact that various sensors that existed on older vehicles have been removed from newer ones. Things like radar and ultrasonic sensors have been dropped in favor of computer vision via the cameras directly, like a human driver watching the road: going frame by frame to categorize what it sees (vehicles, people, cones, lanes, etc.) and comparing to previous frames to extrapolate things like motion and relative speed. But with cameras you have issues with things like lights blinding them, just as a bright light blinds a person. Maybe the camera can't see for some reason, like a light shining directly into the lens; it takes a little time to adjust exposure to compensate for that.
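As a toy illustration of that frame-to-frame comparison (hypothetical frame rate and distances; a real pipeline estimates distance with learned models rather than being handed a number):

```python
# Hypothetical sketch: derive closing speed from per-frame distance estimates,
# the way consecutive camera frames are compared to extrapolate motion.
FRAME_DT = 1 / 36  # seconds between frames at an assumed 36 fps

def closing_speed(dist_prev_m: float, dist_curr_m: float, dt: float = FRAME_DT) -> float:
    """Positive result = the object ahead is getting closer (m/s)."""
    return (dist_prev_m - dist_curr_m) / dt

# Approaching a stopped vehicle: the estimated distance shrinks each frame.
print(closing_speed(45.0, 44.2))  # ~28.8 m/s, roughly 64 mph of closure
```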

You might suggest using as many sensors as possible, then, but that makes it nearly impossible to actually make a decision. Sensor integration is a huge data-processing issue: how do you determine what data to accept and what to ignore when you get conflicting results from different types of sensors? This is why Tesla is trying to do it all via vision: one type of sensor, roughly equivalent to a human but with wider visual-spectrum sensitivity. Just classify what's in each frame and act on it. Simple implementation, but it needs A LOT of data to train it in as many situations as possible.
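To see why that disagreement is a policy question rather than a free win, here's a toy Python sketch; the policy names and example readings are invented for illustration:

```python
# Hypothetical sketch of the sensor-disagreement problem described above.
# Radar says the lane is clear; the camera says there's a stopped object.

def fuse(radar_clear: bool, camera_clear: bool, policy: str) -> bool:
    """Return True if the planner should treat the lane as clear."""
    if policy == "trust_radar":
        return radar_clear
    if policy == "trust_camera":   # roughly the vision-only stance
        return camera_clear
    if policy == "either_alarms":  # brake if ANY sensor reports an object
        return radar_clear and camera_clear
    raise ValueError(policy)

readings = dict(radar_clear=True, camera_clear=False)  # conflicting sensors
for policy in ("trust_radar", "trust_camera", "either_alarms"):
    print(policy, "->", "clear" if fuse(**readings, policy=policy) else "brake")
```

"either_alarms" minimizes missed obstacles but maximizes phantom braking; trusting a single sensor does the reverse. That tradeoff is the fusion problem in miniature.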

And that camera is where we get to emergency vehicles specifically. In my opinion, these emergency vehicle accidents are likely the camera being blinded repeatedly by the emergency lights rotating and the camera shifting exposure up and down every second or so to try and maintain an image it can actually process. As a human, at night, those lights make it hard for even me to see the rest of the road.

It's not like regular drivers never crash into emergency vehicles either; those crashes just don't make national news, much like the 33 car fires every hour in the US alone.

It's not a simple thing, and even your "simple" car by comparison is doing a lot to filter the data it gets. It could be using completely different kinds of data than another vehicle for that cruise control, so given the right circumstances it may react differently.

For what it's worth, my Model 3 has rarely had issues with Autopilot acting in any sort of dangerous manner. A few phantom braking issues back when I got it in 2018, but I haven't had a single one of those in maybe 4 years now, even in areas where it would almost always react that way back then. Sometimes a little lane weirdness with old, poorly marked lane lines, or even old lane lines visible in addition to the current ones in some areas. It's pretty easy to tell which situations AP might have issues with once you've used it just a few times.

[–] NotMyOldRedditName@lemmy.world 7 points 4 months ago* (last edited 4 months ago) (2 children)

All cars since mid-2019 have the computer required for FSD.

At this point that includes the majority of all Teslas ever sold. Somewhere between 750k and 800k of 6 million don't have the hardware, and of those, 100-200k are upgradeable; maybe more, but the research time isn't worth it.

That being said, it still could have been AP and not FSD, as the media gets them confused all the time.

[–] Buffalox@lemmy.world 25 points 4 months ago* (last edited 4 months ago) (1 children)

I just heard from Enron Musk that it crashed into the patrol car way more safely than a human would have done.
Also, according to Enron Musk, Full Self Driving has been working since 2017 and is in such a refined state now that you wouldn't believe how gracefully it crashed into that patrol car. It was almost like a car ballet, ending in a small, elegant pirouette.

As Enron Musk recently stated, in a few months we should have Tesla Robotaxis in the streets, and you will be able to observe these beautiful events regularly yourself.

Others say that's ridiculous, that he's just trying to save Enron, but it's too late.

[–] brbposting@sh.itjust.works 12 points 4 months ago (3 children)

All I do at night is open my garage door to let my car out. A few months later, here I am, a millionaire. Thank you, full self driving Roboenron 😍

[–] FanciestPants@lemmy.world 23 points 4 months ago (1 children)

~~Jesus~~ Elon is my co-pilot

[–] j4k3@lemmy.world 7 points 4 months ago

thoughts and prayers

[–] nutsack@lemmy.world 15 points 4 months ago (2 children)
[–] rez_doggie@lemmy.world 15 points 4 months ago

Based robot car

[–] aesthelete@lemmy.world 14 points 4 months ago

Uh oh tesla self drive is woke

[–] AbouBenAdhem@lemmy.world 11 points 4 months ago* (last edited 4 months ago)

Everyone tried to warn Elon not to use Fury Road as training data.

[–] mashbooq@infosec.pub 10 points 4 months ago

First useful thing Elon ever did

[–] Timecircleline@sh.itjust.works 9 points 4 months ago (1 children)

I backed my car into a cop car the other day.

Well, he just drove off, sometimes life's okay.

[–] itsgroundhogdayagain@lemmy.ml 7 points 4 months ago

2 birds 1 crash
