this post was submitted on 22 Sep 2023
210 points (96.1% liked)

Cruise CEO says SF ‘should be rolling out the red carpet’ for robotaxis, threatens to maybe leave town::In his first major public interview since the DMV cut their San Francisco fleet in half, Cruise CEO Kyle Vogt said “we cannot expect perfection” from the self-driving cars, and vaguely threatened to leave town if regulators curtail them any further.

[–] chakan2@lemmy.world 96 points 1 year ago (3 children)

You MUST DEMAND perfection from self driving cars. Mistakes cost lives.

Fuck this guy.

[–] GenderNeutralBro@lemmy.sdf.org 44 points 1 year ago (5 children)

I don't know about "perfection", but we should at least aim to be better than most human drivers.

I'd be comfortable holding robot drivers to the same standard as human drivers if there were similar levels of accountability. That said, I think the current standards for licensing human drivers are far too low. Tons of people on the road are simply not capable of driving safely, consistently, and legally. I would support measures to raise the bar for human drivers as well, but since that is extremely unlikely, we can at least establish better standards for the future.

[–] Rolder@reddthat.com 12 points 1 year ago (1 children)

Just hold the CEO directly liable for any deaths or injuries. Like someone gets hit? That’s a reckless driving charge for the CEO. They would get perfect real quick.

[–] PHLAK@lemmy.world 9 points 1 year ago (2 children)

Unfortunately that's not how software development works.

[–] Rolder@reddthat.com 6 points 1 year ago

I’m aware; the idea was more tongue in cheek than anything

[–] LufyCZ@lemmy.world 2 points 1 year ago

Fortunately that's not how software development works

[–] Psythik@lemm.ee 8 points 1 year ago

As a pedestrian, I'd sooner trust a self-driving car to ID and stop for me than I'd trust a human to do the same. Humans make way more mistakes than these cars do. It just doesn't make the news when humans fuck up cause we do it all the damn time. But accidents are so rare for self-driving cars that every time one happens, it makes headlines, and then a bunch of idiots show up in the comments to throw shade at them when they're much worse drivers themselves.

And then more idiots show up and upvote them.

[–] DeadlineX@lemm.ee 6 points 1 year ago

Yeah a lot of people drive selfishly and dangerously. Until we get alternative transportation, however, more stringent licensing will just condemn poorer folks to worse poverty and possibly being cast to the streets.

We need better public transportation before we can cripple people’s ability to get where they need to be. Including work.

[–] guacupado@lemmy.world 3 points 1 year ago

I've always thought that self-driving cars won't be mainstream until local departments of transportation actively help these vehicles recognize their surroundings. Cities will need to maintain their paint much more often so that yellow and white lines are easily recognized by AI. We also need more of those hooded LED street lights so that the colors of the signals stand out better. I'm sure there are better ways to make signs more readable to AI as well, but all of this needs to be done with help from local governments. Autonomous vehicles get better the more other autonomous vehicles are on the road.

[–] themajesticdodo@lemmy.world -2 points 1 year ago (1 children)

How can a robot be as accountable as a human? You going to threaten to send it to jail?

I'd want it to be regulated like other safety features. If they shipped a car with faulty brakes or any other safety defects, it would be a legal issue. Fines, recalls, etc. Ideally it should be enough that half-assing it would put them out of business.

[–] admin@lemmy.my-box.dev 8 points 1 year ago (3 children)

As the other guy said, demanding perfection is insane - we don't demand that from human drivers either. As long as it's better than humans (preferably by a long shot), I'm all in favour.

[–] chakan2@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

We demand perfection in a lot of fields, and we are a hell of a lot closer to it than the wild west of AI alphas we have driving around.

Aviation, Medical, Space Travel...etc...

We can get to extreme levels of quality when lives are at risk. Driverless cars put lives at risk.

Humans are a terribly low bar to use for a quality measure. Also, a human will (usually) do its best to mitigate damage in an accident.

In the case of Tesla...fuck it...I'm going through that parked semi at 80mph.

[–] admin@lemmy.my-box.dev 0 points 1 year ago (1 children)

None of those fields have achieved perfection. Airplanes crash, people die in hospitals and space shuttles. If anything, computer assistance has managed to make those safer than before.

If (when) robot cars are safer than human drivers, fewer people will die in traffic accidents. It's not a perfect bar to settle on, but it's better than the current standard.

Again, denying improvements because they're less than perfect is just insane.

[–] chakan2@lemmy.world 0 points 1 year ago (1 children)

Denying "improvements" that cost innocent bystanders their life is the only responsible choice.

I was game for the great experiment 10 years ago. But the tech just hasn't gotten better, and arguably is worse today.

It's time to say enough is enough and restrict driverless tech to controlled areas.

Being simply better than the average human isn't enough here.

[–] admin@lemmy.my-box.dev 1 points 1 year ago (1 children)

I never said better than the average driver, I said better than human drivers (preferably by a long shot).

So let's say that means... Better than 90% of all drivers. That isn't going to cost lives, it's going to save them. Not to mention improve traffic flow.

[–] chakan2@lemmy.world -1 points 1 year ago

Unlikely...making an AI car safer than 90% of human drivers means it will respect the speed limit.

That alone causes traffic jams and unsafe conditions around the car as people try to get around it.

A human driver will somewhat go with the flow of traffic.

An AI vehicle just won't work until it's a nearly perfect driver that can make human decisions.

That's not going to happen for a long time. Musk, with his revolving door of low cost engineers is actually making it all worse.

Pull the plug on this experiment and put it back on the test track.

[–] supercriticalcheese@feddit.it 3 points 1 year ago (1 children)

We don't even know if they are better than humans in more challenging driving environments: higher-speed roads, etc...

It is insane to think the slow-speed tests are representative of all possible scenarios. They might fail at things like roundabouts or merging onto motorways much more often than humans, or in who knows what edge cases.

[–] admin@lemmy.my-box.dev 2 points 1 year ago (1 children)

I agree. That will need to be proven. But when they are better than, say, 90% of all drivers, it would make sense to switch. Waiting until they're "perfect" (which is the requirement I object to) just needlessly wastes lives.

[–] supercriticalcheese@feddit.it 1 points 1 year ago (1 children)

Depends on what happens when they make errors. Are their errors comparable to human errors, or are they prone to making worse mistakes than humans on average, in terms of the consequences?

They might be 99.99% perfect but in 0.01% of cases cause massive car pileups in motorways (for example) due to reasons.

A proper risk analysis based on a controlled transition should be done first.
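That kind of risk analysis ultimately compares expected harm: event frequency times average severity. A minimal back-of-envelope sketch in Python — every number below is purely hypothetical, for illustration only, not a real Cruise or crash statistic:

```python
# Expected-harm comparison: frequent small crashes vs. rare large pileups.
# All rates and severities are hypothetical illustration values.

def expected_fatalities_per_year(events_per_year: float,
                                 deaths_per_event: float) -> float:
    """Expected deaths per year = event frequency * average severity."""
    return events_per_year * deaths_per_event

# Scenario A: many small human-driver crashes, rarely fatal.
human = expected_fatalities_per_year(events_per_year=1_000_000,
                                     deaths_per_event=0.001)

# Scenario B: a rare mass pileup from a correlated software failure.
robot = expected_fatalities_per_year(events_per_year=0.5,
                                     deaths_per_event=50)

print(f"human: {human}/yr, robot: {robot}/yr")
```

The point is only that the comparison hinges entirely on the assumed frequencies and severities — which is exactly why measured data from a controlled transition matters.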

[–] admin@lemmy.my-box.dev 1 points 1 year ago

Yups, fully agreed.

When it comes down to it, I'd much rather have the mass pileup you describe once every few years (which can then be analysed and remedied thanks to the telemetry involved) than the over 3,000 traffic deaths a day we have now.

[–] Son_of_dad@lemmy.world 8 points 1 year ago

Perfection right out of the gate is impossible, but I think SF is too big for these kinds of tests. Use smaller towns if anything