this post was submitted on 25 Oct 2023
81 points (96.6% liked)
Technology
Driverless cars will have an impossible standard to live up to. California has 48.5 injuries per 100 million miles driven (and 1.4 deaths). Unless that is zero with driverless cars, the public will see an unreasonable risk. Any single accident gets tons of press… I found it very difficult to find an objective injury rate for driverless cars, probably because there are five levels of automation and many of them still allow human error to come into play. Also, the figures are self-reported by the companies themselves.
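For a sense of how that per-100-million-mile baseline would actually be applied, here's a rough back-of-the-envelope sketch. The fleet mileage and injury counts below are made up purely for illustration, not real data:

```python
# Rough sketch: comparing a driverless fleet's incident rate against the
# human baseline cited above (California: ~48.5 injuries and ~1.4 deaths
# per 100 million miles driven). Fleet numbers below are hypothetical.

HUMAN_INJURIES_PER_100M_MILES = 48.5
HUMAN_DEATHS_PER_100M_MILES = 1.4

def per_100m_miles(events: int, miles_driven: float) -> float:
    """Normalize a raw event count to events per 100 million miles."""
    return events / miles_driven * 100_000_000

# Hypothetical, self-reported fleet figures (not real data):
fleet_miles = 5_000_000      # miles driven by the fleet
fleet_injuries = 3           # injuries reported over those miles

fleet_rate = per_100m_miles(fleet_injuries, fleet_miles)
print(f"Fleet injury rate: {fleet_rate:.1f} per 100M miles "
      f"(human baseline: {HUMAN_INJURIES_PER_100M_MILES})")
```

The point being: with so few fleet miles driven relative to human drivers, even a handful of incidents swings the computed rate wildly, which is part of why an objective comparison is hard to pin down.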
Really sounds like it was more that the company tried to hide that their car started driving again with someone trapped underneath.
Sure, but who HASN'T done that?
Yeah, this incident and the response make more sense. But it's another case in point for the difficulties driverless companies will have. I drive a lot and I see the stupidest things; I'm sure we all have stories. With this story it's very easy to imagine a clueless driver doing the same.
But the best way to avoid crashes is to be predictable, and it doesn't get much more predictable than a bunch of self-driving cars with no emotions.
True. But if a clueless driver tried to hide that they started driving again with someone trapped underneath, we'd view that as a criminal act.
I could totally see and even understand not knowing they were under the car and so trying to clear the scene of the accident.
It's the specific attempt to obscure that it happened. If a human did that, losing their license is basically the bare minimum I'd expect.
This isn't an issue with the technology, but with a company that can't be relied upon to develop the technology in public in a safe fashion.
Agreed, hiding it was a terrible idea and should be punished.
This is an important point, but I think you're interpreting it backwards. The current system relies on companies with a profit motive to do the testing internally, and on the rest of us to trust their honesty and openness in working with regulatory authorities to make the rollout safe. They violated that trust.
Also, fwiw, companies used to publish their injury-rate data from internal testing, and by and large they were way worse than humans. In the last couple of years they've mostly stopped reporting them. Afaik there doesn't exist a single shred of actual empirical evidence that we can make self-driving cars better than humans, outside of faith in technological improvement. Maybe that faith is warranted, maybe it's not (I think it's not), but either way, safety must be the number one priority. If these companies can't be trusted to work collaboratively with safety authorities, then we should pull the plug hard and fast.
How many free kills are you asking for your car?
Somewhere below the rate for human drivers...
As a human driver, you know you have zero free kills. Your car, with a rate below zero, must then be able to resurrect at least one dead person for free.
If the total number of road deaths decreases, then it's a net benefit, regardless of whether some were caused by automation or by human error. I just want the number to decrease. It will never be zero. Don't make perfect the enemy of good.
No, I wasn't asking about any anonymous 'total number'. Just you, specifically, and your car, specifically.
You're arguing for no self-driving until it's perfect, which is insane, so I'm not going to bother responding further. I don't want this to descend into a waste of time.