In a pivotal moment for the autonomous transportation industry, California chose to expand one of the biggest test cases for the technology.

[–] jeffw@lemmy.world -5 points 1 year ago (2 children)

Good. I’m sick of the fearmongering. “OH NO, THIS ONE CAR GOT IN A CRASH!!!!!!”

Yeah, but humans crash too?

[–] Gsus4@feddit.nl 9 points 1 year ago* (last edited 1 year ago) (1 children)

It hits different when you're the one being crashed into, but if it crashes less often than the monkeys behind the wheel, and liability is properly assigned and punished accordingly, bring it on!

[–] SpaceNoodle@lemmy.world 1 points 1 year ago (1 children)

What's with the obsession with punishment?

[–] Gsus4@feddit.nl 4 points 1 year ago* (last edited 1 year ago)

Because corporations can't be allowed to get away with things that would land any of us in jail. We know they'll cut corners if allowed, so we have to make sure FSD is actually safer and that citizens aren't defrauded when dealing with economic behemoths.

In other words, it's good that they have fewer accidents, but the ones they do have should be treated the same way we treat human drivers, or harsher, so that playing the odds isn't just an economic factor to optimize and cut corners on. Think aviation safety rules: even low-cost airlines have to follow them, instead of the legal Wild West that was created around social media.

With FSD the example is LIDAR: it's more expensive, but it's an evolving technology that is essentially safe. Elon wants to use cameras alone because they're cheaper, and they're much less safe; it's not a solved problem on the cheap. That's why you need to penalize companies for making such choices, or outright forbid them from making them. They are going to be setting the standards here, and there's a risk that the shittier technology wins Elon a few bucks at the cost of lives far into the future. We can't half-ass this forever just because Elon wants his cars to cost half of what it takes to do it right.
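To make the "economic factor to optimize" point concrete, here is a minimal back-of-the-envelope sketch. All numbers (hardware costs, crash rates, penalties) are made-up assumptions for illustration, not real Waymo/Tesla/Cruise figures: when the penalty per crash is trivial, the cheaper and less safe sensor stack minimizes expected cost; once liability looks more like what a human driver would face, the safer stack wins.

```python
# Toy expected-cost comparison (all figures are hypothetical).
def expected_cost(hardware_cost, crashes_per_million_miles, penalty_per_crash, miles=1_000_000):
    """Expected total cost of one vehicle over `miles` driven."""
    expected_crashes = crashes_per_million_miles * miles / 1_000_000
    return hardware_cost + expected_crashes * penalty_per_crash

# Assumed, illustrative sensor stacks -- not real data.
lidar_stack  = dict(hardware_cost=10_000, crashes_per_million_miles=0.5)
camera_stack = dict(hardware_cost=2_000,  crashes_per_million_miles=2.0)

for penalty in (1_000, 100_000):  # a token fine vs. driver-scale liability
    lidar  = expected_cost(penalty_per_crash=penalty, **lidar_stack)
    camera = expected_cost(penalty_per_crash=penalty, **camera_stack)
    winner = "camera-only" if camera < lidar else "LIDAR"
    print(f"penalty=${penalty:,}: LIDAR=${lidar:,.0f}, camera-only=${camera:,.0f} -> {winner} is cheaper")
```

With the token fine, camera-only comes out cheaper; with driver-scale liability, the safer LIDAR stack does. In this toy model the only lever the regulator controls is the penalty, which is exactly the argument above.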

[–] Chozo@kbin.social 9 points 1 year ago (2 children)

When I worked on Google's Waymo project, we only had a small handful of our cars involved in any collision on public roads. And every single one of them was from a human driver running into the SDC. I dunno if that's changed since I left, but even in the early stages, SDCs are remarkably safe compared to human drivers.

[–] sky@codesink.io 1 points 1 year ago

Cruise has hit an oncoming car, smashed into the back of a Muni bus, and its cars constantly stop in emergency zones, making first responders' lives harder.

Seven hours of debate in which the community made it clear how much they don't want this, and how much the city's leaders don't want this, but the state doesn't give a shit.

They may be “safe” because they avoid difficult maneuvers and only drive like 25-30mph, but that doesn’t mean they’re practical or should be welcome in our cities.

[–] NeoNachtwaechter@lemmy.world -1 points 1 year ago

> And every single one of them was from a human driver running into the SDC

Yea, me too. I'm such a good driver, others are crashing into me every day...