this post was submitted on 16 Dec 2023
571 points (96.3% liked)
[–] samus7070@programming.dev 222 points 11 months ago (6 children)

The real crime is marketing the driver assist capability under the name autopilot when it is anything but that.

[–] TheGrandNagus@lemmy.world 174 points 11 months ago (1 children)

Oh no, it's even worse than that.

It's the CEO and other staff repeatedly speaking of the system as if it's basically fully capable, and that it's only for legal reasons that a driver is even required. He even said the car could drive from one side of the US to the other without driver interaction (only for that to never actually happen, of course).

It's the company never correcting people when they call it a self driving system.

It's the company saying they're ready for autonomous taxis, and that owners' cars will make money for them while they aren't driving them.

It's calling their software subscription Full Self Driving.

It's honestly staggering to me that they're able to get away with this shit.

[–] meleecrits@lemmy.world 85 points 11 months ago (5 children)

I love my Model 3, but everything you said is spot on. Autopilot is a great driver assist, but it is nowhere near autonomous driving. I was using it on the highway and was passing a truck on the left. The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed the brakes. Fortunately, I was able to figure out what went wrong and quickly accelerated myself so as to not become a hazard to the cars behind me.

Using Autopilot as anything more than a nice dynamic cruise control setting is putting your life, and other lives, in danger.

[–] Neato@kbin.social 53 points 11 months ago (3 children)

Holy shit. If my car did that once, I'd be a nervous wreck just thinking about using it again.

[–] Wrench@lemmy.world 29 points 11 months ago

I give Teslas more room because I've been brake-checked by them on empty roads before. These ghost braking problems are prevalent.

[–] snooggums@kbin.social 20 points 11 months ago (1 children)

I have had the adaptive cruise control brake on multiple Hondas and Subarus in similar situations. Not slamming on the brakes, but firm enough to confuse the hell out of me.

Every time it was confusing and now I just don't use it if the road is anything but open and clear.

[–] buran@lemmy.world 23 points 11 months ago* (last edited 10 months ago)

Honda’s sensing system will read shadows from bridges as obstructions in the road that it needs to brake for. It’s easy enough to accelerate out of the slowdown, but I was surprised to find that there is apparently no radar check to see if the obstruction is real.

My current vehicle doesn’t have that issue, so either the programming has been improved or the vendor for the sensing systems is a different one (different vehicle make, so it’s entirely possible).

[–] burliman@lemm.ee 0 points 11 months ago (3 children)

That's the bar automated driving is held to. It messes up once, you never trust it again, and the news spins the failure far and wide.

Your uncle doing the same thing just triggers you to yell at him: the guy behind flips him off, he apologizes, you're nervous for a while, and you continue your road trip. Even if he killed someone, we would blame that one uncle, or at worst some might blame his entire class of driver. But we would not say that no human should drive again until the problem is fixed, like we do with automated cars.

I do get the difference between those, and I do think they should keep making automated drivers better, but we can at least agree on the premise: automated cars are held to a seriously unreasonable bar. Maybe that's fair, and we will never accept anything but perfect, but then we may never have automated cars. As someone who shares the road with human drivers every day, that makes me very sad.

[–] maynarkh@feddit.nl 12 points 11 months ago (1 children)

There is a big difference between Autopilot and that hypothetical uncle. If the uncle causes an accident or breaks something, he or his insurance pays. Autopilot doesn't.

By your analogy, it's like putting a ton of learner drivers on the road with unqualified instructors, and not telling the instructors that they are supposed to be instructors, but rather that they are taking a taxi. Except it's somehow their responsibility. And, of course, pocketing both the instruction and taxi fees.

The bar is not incredibly high for self driving cars to be accepted. The only thing is that they should take the blame if they mess up, like all other drivers.

[–] burliman@lemm.ee 2 points 11 months ago

Yeah, for sure. Like I said, I get the difference. But ultimately we are talking about injury prevention. Even if automated cars caused just one fewer death per mile than human drivers, we would still think they are terrible, despite the lives they saved.

And even if they caused only one death per year, we'd hear about it and might still think they are terrible.

[–] Neato@kbin.social 3 points 11 months ago (1 children)

The difference is that Tesla calls it Autopilot when it's really not. It's also clearly not ready for primetime. And auto regulators have pretty strict requirements for reliability and safety.

While it's true that autonomous cars kill FAR fewer people than human drivers, every human is different. If an autonomous driver is subpar and that AI is rolled out to millions of cars, we've vastly lowered the safety of those cars. We need autonomous cars to be better than the best driver because, frankly, humans are shit drivers.

I'm 100% for autonomous cars taking over entirely. But Tesla isn't really trying to get there. They are trying to sell cars and lying about their capabilities. And because of that, Tesla should be liable for the deaths. We already have them partially liable: this case caused a recall of this feature.

[–] Staiden@lemmy.dbzer0.com 4 points 11 months ago* (last edited 11 months ago)

But the vaporware salesman said fully automatic driving was 1 year away! In 2018, 2019, 2020, 2021... he should be held responsible. The guy once said that to further technology, some people will die, and that's just the price we pay. It was in a comment about going to Mars, but we should take that into account for everything he does. If I owned a business and one of my workers died or killed someone because of gross negligence, I'd be held responsible. Why does he get away with it?

[–] SlopppyEngineer@discuss.tchncs.de 0 points 11 months ago

Except Tesla's uncle has brain damage and doesn't really learn from the situation, so he will do it again, and has clones of himself driving thousands of other cars.

[–] Damage 8 points 11 months ago

Something like that happened to me while using adaptive cruise control on a rental Jeep Renegade. It slammed the brakes twice on the highway for no clear reason; I deactivated it before it could try a third time.

[–] Alchemy@lemmy.world 7 points 11 months ago (1 children)

Your car's actions could kill someone.

[–] Speculater@lemmy.world 6 points 11 months ago

That's only an $11.5k fine though.

[–] LordKitsuna@lemmy.world 1 points 11 months ago

The auto cruise on the Priuses at work does this a lot. If the freeway curves to the left or something, it will panic and think I'm about to hit the cars in the lane next to me that are also going through the curve.

[–] NeoNachtwaechter@lemmy.world -3 points 11 months ago (1 children)

The road veered left and the truck did as well, keeping in its lane the entire time. The car interpreted this as the truck merging over into my lane and slammed the brakes.

Even dynamic cruise control must never make such dangerous mistakes!

You should demand that they fix this under warranty, and they should prove it will never happen again.

[–] LordKitsuna@lemmy.world 1 points 11 months ago

Almost all of them do it. The one freshest in my mind is the Prius, because my work uses them as fleet cars, so I drive them a lot. If the highway curves kind of hard to the left or the right, sometimes it will panic and think you're about to hit the car in the lane next to you, because it's technically in front of you, and it will try to brake.

Thankfully there is an option to turn off the automatic braking; it will just start screaming instead.

[–] dylanmorgan 38 points 11 months ago

I think the real crime is vehicular manslaughter, especially the SECOND one.

[–] Neato@kbin.social 25 points 11 months ago (1 children)

Tesla should be paying wrongful death suits every time Autopilot kills someone. Their excuses don't excuse the blatant marketing that leads people to believe it's a self-driving car.

[–] 800XL@lemmy.world 10 points 11 months ago (1 children)

But you see, that wasn't the vehicle's fault. It was programmed perfectly. What happened was the fault of the pedestrians and the driver for not properly predicting what the car would do.

maybe /s maybe not.

[–] Goferking0@ttrpg.network 8 points 11 months ago

No, you see, the issue is that the autopilot stopped right before the accident, so obviously it was entirely the driver's fault. Please don't check how much time passed between it stopping and the accident.

[–] raptir@lemdro.id 22 points 11 months ago (5 children)

Do we need to go through what autopilot in a plane or boat actually does again?

[–] kool_newt@lemm.ee 24 points 11 months ago (1 children)

It doesn't matter, Tesla cars are marketed to the public which isn't expected to know these things. To probably 90% of people "autopilot" means "drive automatically".

[–] CmdrShepard@lemmy.one -3 points 11 months ago (2 children)

To probably 90% of people "autopilot" means "drive automatically".

Based on what?

[–] kool_newt@lemm.ee 5 points 11 months ago

Based on my usage and understanding of the word being a lay person.

I'm an engineer myself, sometimes there are words that you have to be cognizant of the differences in meaning to other engineers vs lay people or even engineers in other fields. Some words are heavily overloaded, and "autopilot" is kinda one of them (others being "domain", "node", "artificial intelligence", etc.).

[–] poopkins@lemmy.world 4 points 11 months ago (1 children)

Tesla markets this feature as "Full Self-Driving Capability." Maybe I'm poorly informed, but to me that means that the car is fully capable of driving itself without human interaction.

[–] CmdrShepard@lemmy.one 2 points 11 months ago (1 children)

FSD is an entirely separate thing. Autopilot is just an LKAS (lane-keeping assist) system plus adaptive cruise control.

[–] poopkins@lemmy.world 4 points 11 months ago (1 children)

Aha, today I learned that Autopilot is just lane-keeping and adaptive cruise control. It must be a common misunderstanding to conflate the terms "Autopilot" and "Full Self-Driving" in the vernacular.

Many other manufacturers refer to lane-keeping systems as "driver assistance," and I believe Tesla is intentionally misleading consumers with the impression that their system is more capable and allows the driver to pay less attention.

[–] RushingSquirrel@lemm.ee 1 points 11 months ago

Until you drive it. You know the capabilities, you know when you can and cannot activate it, you know how often it tells you to look at the road, and if you don't prove you've got your hands on the wheel, it disables itself for the rest of the drive (you need to park to reactivate it). No Tesla driver thinks Autopilot is more than a lane- and distance-keeping assist.

Autopilot is a marketing name, that's it.

[–] dexa_scantron@lemmy.world 18 points 11 months ago (1 children)

If we do, then they shouldn't have picked a name that most people think does something it doesn't.

[–] RushingSquirrel@lemm.ee 2 points 11 months ago

When you drive a Tesla, it's pretty clear what Autopilot is. The name is a marketing term: you can't engage it everywhere and anytime, you've got to keep your hands on the wheel or it disables itself, it won't stop at stop signs and red lights, won't do lane changes, etc.

[–] fiah@discuss.tchncs.de 7 points 11 months ago (1 children)

Do we need to go through the differences in training, aptitude, and intelligence between pilots, captains, and your neighbor Greg again? Marketing it as "autopilot" to anyone who can sign a car loan is reckless; it has killed people and will continue to kill people until they stop.

[–] CmdrShepard@lemmy.one -3 points 11 months ago

Yep, just like "cruise control" made tons of people drive their car into the ocean thinking they could sail it to popular island destinations.

[–] doublejay1999@lemmy.world 7 points 11 months ago (1 children)

What does Full Self Driving mean?

[–] CmdrShepard@lemmy.one 3 points 11 months ago

Full Self Driving and Autopilot are two totally separate systems.

[–] merc@sh.itjust.works 1 points 11 months ago

Depends entirely on the type of autopilot.

[–] Fox@pawb.social 12 points 11 months ago* (last edited 11 months ago) (1 children)

It's a common misconception that an autopilot system in an airplane does everything, or even a lot of things. The most basic ones keep the wings level and nothing else. Tesla is probably counting on that misconception to sell this feature, but actual pilots using any kind of autopilot are still on the hook to pay attention 100% of the time.

[–] menemen@lemmy.world 3 points 11 months ago

In an airplane that is fine, since pilots are specifically trained on the planes they fly (at least in theory). No one gets a special course in how to drive a specific (non-industrial) car...