this post was submitted on 24 Aug 2023
564 points (94.3% liked)

Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists::Researchers call for tighter regulations following major age- and race-based discrepancies in AI autonomous systems.

[–] angelsomething@lemmy.one 105 points 1 year ago (1 children)

An easy solution is to enforce a buddy system: every black person walking alone at night must be accompanied by a white person. /s

[–] RanchOnPancakes@lemmy.world 34 points 1 year ago* (last edited 1 year ago) (3 children)

Loved that show.

But you have to hire equally, so some may get darker-skinned buddies. Who will then need buddies.

[–] Vengefu1Tuna@lemm.ee 6 points 1 year ago (1 children)

I think this is one of my favorite TV episodes ever. Better Off Ted deserved better.

[–] reddig33@lemmy.world 63 points 1 year ago (6 children)

LiDAR doesn’t see skin color or age. Radar doesn’t either. Infra-red doesn’t either.

[–] drz@lemmy.ca 57 points 1 year ago (4 children)

LiDAR, radar, and infra-red may still perform worse on children, because children are smaller and therefore produce fewer return points in the LiDAR reflection.

I work in a self driving R&D lab.
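
Back-of-the-envelope, it's easy to see why. Here's a minimal sketch (Python, with made-up angular-resolution numbers, not any real sensor's spec) of how many returns a flat target gets at a given range:

```python
import math

def lidar_returns(height_m, width_m, range_m,
                  vert_res_deg=0.33, horiz_res_deg=0.1):
    """Rough count of LiDAR returns on a flat target at a given range.

    Assumes a spinning LiDAR with uniform angular resolution; the
    resolution numbers are illustrative, not a real sensor's spec.
    """
    # Angular size of the target as seen from the sensor
    vert_deg = math.degrees(2 * math.atan(height_m / (2 * range_m)))
    horiz_deg = math.degrees(2 * math.atan(width_m / (2 * range_m)))
    # Scan rows/columns that intersect the target
    return int(vert_deg / vert_res_deg) * int(horiz_deg / horiz_res_deg)

print(lidar_returns(1.75, 0.5, 30))  # adult at 30 m: ~90 returns
print(lidar_returns(0.9, 0.3, 30))   # small child at 30 m: ~25 returns
```

Roughly a quarter of the data for the classifier to work with, before skin tone even enters the picture.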

[–] quirk_eclair78@lemmy.world 5 points 1 year ago (1 children)

That's a fair observation! LiDAR, radar, and infra-red systems might not directly detect skin color or age, but the point being made in the article is that there are challenges when it comes to accurately detecting darker-skinned pedestrians and children. It seems that the bias could stem from the data used to train these AI systems, which may not have enough diverse representation.

[–] bassomitron@lemmy.world 17 points 1 year ago

The main issue, as someone else pointed out as well, is in image detection systems only, which is what this article is primarily discussing. Lidar does have its own drawbacks, however. I wouldn't be surprised if those systems would still not detect children as reliably. Skin color definitely wouldn't be a consideration for it, though, as that's not really how that tech works.

[–] Rinox@feddit.it 58 points 1 year ago (2 children)

Isn't that true for humans as well? I know I find it harder to see children due to their small size, and dark-skinned people at night due to, you know, low contrast (especially if they're wearing dark clothes).

Human vision be racist and ageist

Ps: but yes, please do improve the algorithms

[–] tony@lemmy.hoyle.me.uk 7 points 1 year ago

Part of the children problem is distinguishing between 'small' and 'far away'. Humans seem reasonably good at it, but from what I've seen AIs aren't there yet.
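
It's basic geometry: to a single camera, image height only pins down the ratio of size to distance. A toy sketch (Python, hypothetical heights and distances):

```python
import math

def angular_height_deg(height_m, distance_m):
    """Angle an object of a given height subtends at a given distance."""
    return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

# A 0.9 m toddler at 10 m and a 1.8 m adult at 20 m subtend the
# same angle, so they have the same height in the image:
print(angular_height_deg(0.9, 10))  # ~5.15 degrees
print(angular_height_deg(1.8, 20))  # ~5.15 degrees
```

Humans disambiguate with context and stereo vision; a monocular image model has to learn that from data.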

[–] tonytins@pawb.social 37 points 1 year ago (13 children)

Maybe if we just, I dunno, funded more mass transit and made it more accessible? Hell, trains are way better at being automated than any single car.

[–] OrdinaryAlien@lemm.ee 29 points 1 year ago* (last edited 1 year ago)

DRIVERLESS CARS: We killed them. We killed them all. They're dead, every single one of them. And not just the pedestmen, but the pedestwomen and the pedestchildren, too. We slaughtered them like animals. We hate them!

[–] 666dollarfootlong@lemmy.world 24 points 1 year ago (6 children)

Wouldn't good driverless cars use radar or LiDAR or whatever? Seems like the biggest issue here is that darker skin tones are harder for cameras to see.

[–] MSids@lemmy.sdf.org 17 points 1 year ago* (last edited 1 year ago) (2 children)

Tesla removed the LiDAR from their cars, a step backwards if you ask me.

Edit: Sorry RADAR not LiDAR.

[–] skyspydude1@lemmy.world 13 points 1 year ago

They removed the radars; they've never used LiDAR, as Elon considered it "a fool's errand", which translates to "too expensive to put in my penny-pinched economy cars". Also worth noting that they took the radars out purely to keep production and the stock price up, despite knowing well in advance that performance would take a massive hit without them. They just don't give a shit, and a few pedestrian deaths are 100% worth it to Elon with all the money he made from the insane spike in the stock's value during COVID. They were the one automaker who maintained production, because instead of sucking it up for a bad quarter or two like everyone else, they just swapped in whatever random parts they could find rather than anything properly tested or validated.

[–] dx1@lemmy.world 5 points 1 year ago (3 children)

Seems like Tesla is really not going to be the market leader on this. IDK if anyone else caught those videos by the self-driving tech expert going through all the ways Tesla is bullshitting about it.

[–] Blackmist@feddit.uk 6 points 1 year ago

Hey, they'll have full self driving tech next year!

Source: Elon Musk, every year, for like the last ten years.

[–] dangblingus@lemmy.dbzer0.com 24 points 1 year ago (1 children)

I'm sick of the implication that computer programmers are intentionally or unintentionally adding racial bias to AI systems. As if a massive percentage of software developers in NA aren't people of color. When can we have the discussion about how photosensitive technology and contrast ratios actually work?

[–] eager_eagle@lemmy.world 24 points 1 year ago* (last edited 1 year ago) (2 children)

I hate all this bias bullshit because it makes the problem bigger than it actually is and passes the wrong idea to the general public.

A pedestrian detection system shouldn't have as its goal to detect skin tones and different pedestrian sizes equally. There's no benefit in that. It should do the best it can to reduce the false negative rates of pedestrian detection regardless, and hopefully do better than human drivers in the majority of scenarios. The error rates will be different due to the very nature of the task, and that's ok.

This is what actually happens in research for the most part, but the media loves to stir up polarization and the public gives it their clicks. Pushing for a "reduced bias" model is actually detrimental to overall performance, because it incentivizes the development of models that perform worse in scenarios where they could have an edge, just to serve an artificial demand for reduced bias.
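
To make the distinction concrete, here's a minimal sketch (Python, with made-up detection outcomes) of overall versus per-group false-negative rates; you can lower the overall rate while the per-group rates stay unequal:

```python
# Hypothetical detection outcomes: (group, detected) pairs.
results = [
    ("adult", True), ("adult", True), ("adult", True), ("adult", False),
    ("child", True), ("child", False), ("child", False),
]

def false_negative_rate(outcomes):
    """Fraction of pedestrians the detector missed."""
    misses = sum(1 for _, detected in outcomes if not detected)
    return misses / len(outcomes)

overall = false_negative_rate(results)
per_group = {
    group: false_negative_rate([r for r in results if r[0] == group])
    for group in {g for g, _ in results}
}
print(overall)    # ~0.43
print(per_group)  # {'adult': 0.25, 'child': ~0.67}
```

Minimizing the overall number is what saves the most lives; equalizing the per-group numbers is a different objective and can pull against it.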

[–] zabadoh@lemmy.ml 12 points 1 year ago (2 children)

I think you're misunderstanding what the article is saying.

You're correct that it isn't the job of a system to detect someone's skin color, and judge those people by it.

But the fact that AVs detect dark-skinned people and short people less effectively is a reflection of the lack of diversity in the tech staff designing and testing these systems as a whole.

The staff are designing the AVs to safely navigate a world of people like them, but when the staff are overwhelmingly male, light-skinned, young, single, urban, and in the United States, a lot of considerations don't even cross their minds.

Will the AVs recognize female pedestrians?

Do the sensors sense light spectrum wide enough to detect dark skinned people?

Will the AVs recognize someone with a walker or in a wheelchair, or some other mobility device?

Toddlers are small and unpredictable.

Bicyclists can fall over at any moment.

Are these AVs, tested in cities, being exposed to all the animals they might encounter in rural areas, like sheep, llamas, otters, alligators, and other animals that might be in the road?

How well will AVs tested in urban areas fare on mountain roads that suddenly change from multi-lane asphalt to narrow, twisty dirt?

Will they recognize tractors and other farm or industrial vehicles on the road?

Will they recognize something you only encounter in a foreign country like an elephant or an orangutan or a rickshaw? Or what's it going to do if it comes across that tomato festival in Spain?

Engineering isn't magical: It's the result of centuries of experimentation and recorded knowledge of what works and doesn't work.

Releasing AVs on the entire world without testing them on every little thing they might encounter is just asking for trouble.

What's required for safe driving without human intelligence is more mind-boggling the more you think about it.

[–] rDrDr@lemmy.world 20 points 1 year ago (4 children)

But the fact that AVs detect dark skinned people and short people at a lower effectiveness is a reflection of the lack of diversity in the tech staff designing and testing these systems as a whole.

No, it isn't. It's a product of the fact that dark people are darker and children are smaller. Human drivers have a harder time seeing these individuals too. They literally send less data to the camera sensor. This is why people wear reflective vests for safety at night, and why ninjas dress in black.
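
The "less data" point is just contrast. A toy illustration (Python, with made-up luminance numbers, not measured values):

```python
def weber_contrast(subject_luminance, background_luminance):
    """Weber contrast: how strongly a subject stands out from its background."""
    return (subject_luminance - background_luminance) / background_luminance

# Hypothetical luminances (arbitrary units) under dim street lighting:
background = 10.0
lighter_subject = 25.0  # reflects more of the available light
darker_subject = 13.0   # reflects less, sits closer to the background

print(weber_contrast(lighter_subject, background))  # 1.5
print(weber_contrast(darker_subject, background))   # 0.3
```

Lower contrast means fewer pixels that differ meaningfully from the background, which hurts any vision system, human or machine.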

[–] lud@lemm.ee 4 points 1 year ago

That doesn't make it better.

It doesn't matter why they are bad at detecting X, it should be improved regardless.

Also, maybe LiDAR would be a better idea.

[–] Darkassassin07@lemmy.ca 16 points 1 year ago (5 children)

This has been the case with pretty much every single piece of computer-vision software to ever exist....

Darker individuals blend into dark backgrounds better than lighter-skinned individuals, and dark backgrounds are more common than light ones, i.e. the absence of sufficient light is more common than 24/7 well-lit environments.

Obviously computer vision will struggle more with darker individuals.

[–] RobotToaster@infosec.pub 16 points 1 year ago* (last edited 1 year ago)

The study only used images and the image-recognition system, so this will only be accurate for self-driving systems that operate purely on image recognition. The only one that currently does that is Tesla, AFAIK.

[–] AllonzeeLV@lemmy.world 15 points 1 year ago* (last edited 1 year ago) (7 children)

Worse than humans?!

I find that very hard to believe.

We consider it the cost of doing business, but self-driving cars have an obscenely low bar to clear to surpass us in terms of safety. The biggest hurdle they have to climb is accounting for irrational human drivers, and for other irrational humans diving into traffic, which even the rare decent human driver can't always account for.

American human drivers kill more than ten 9/11s' worth of people every year. I'd rather modernizing and automating our roadways were a moonshot national endeavor, but we don't do that here anymore, so instead we complain when the incompetent, narcissistic asshole who claimed the project for private profit turns out to be an incompetent, narcissistic asshole.

The tech is inevitable; there are no physics or computational-power limitations standing in our way. We just lack the will to act as a society (that means funding things together through taxation) and do it.

Let's just trust another billionaire to do it for us and act in the best interests of society, though; that's been working just gangbusters, hasn't it?

[–] macrocephalic@lemmy.world 11 points 1 year ago (3 children)

Self-driving cars are Republicans?

[–] Dave@lemmy.nz 9 points 1 year ago* (last edited 1 year ago) (6 children)

Weird question, but why does a car need to know if it's a person or not? Like regardless of if it's a person or a car or a pole, maybe don't drive into it?

Is it about predicting whether it's going to move into your path? Well, can't you just use LIDAR to detect a moving object and predict its path? Why does it matter if it's a person?

Is it about trolley probleming situations so it picks a pole instead of a person if it can't avoid a crash?

[–] almar_quigley@lemmy.world 8 points 1 year ago (1 children)

I'm guessing it can't detect them as objects at all, not that it can't classify them as humans.

[–] Dave@lemmy.nz 11 points 1 year ago (2 children)

That seems like the car is relying way too much on video to detect surroundings...

[–] fresh@sh.itjust.works 6 points 1 year ago (2 children)

Conant and Ashby’s good regulator theorem in cybernetics says, “Every good regulator of a system must be a model of that system.”

The AI needs an accurate model of a human to predict how humans move. Predicting the path of a human is different than predicting the path of other objects. Humans can stand totally motionless, pivot, run across the street at a red light, suddenly stop, fall over from a heart attack, be curled up or splayed out drunk, slip backwards on some ice, etc. And it would be computationally costly, inaccurate, and pointless to model non-humans in these ways.

I also think trolley-problem considerations come into play, but more like normativity in general. The consequences of driving quickly amongst humans are higher than amongst human-height trees. I don't mind if a car drives at a normal speed on a tree-lined street, but it should slow down on a street lined with playing children who could jump out at any time.
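
As a toy illustration of why humans need their own model (Python, made-up numbers): a constant-velocity predictor, which is fine for most objects, badly mispredicts a pedestrian who suddenly stops.

```python
def constant_velocity_predict(position_m, velocity_mps, dt_s, steps):
    """Predict future positions assuming the velocity never changes."""
    return [position_m + velocity_mps * dt_s * i for i in range(1, steps + 1)]

# Pedestrian walks at 1.5 m/s for 2 s, then stops dead:
actual = [1.5, 3.0, 3.0, 3.0]  # observed position at t = 1, 2, 3, 4 s
predicted = constant_velocity_predict(0.0, 1.5, 1.0, 4)  # [1.5, 3.0, 4.5, 6.0]

errors = [abs(a - p) for a, p in zip(actual, predicted)]
print(errors)  # [0.0, 0.0, 1.5, 3.0] -- error grows after the sudden stop
```

A model that knows "this is a human, and humans can stop or pivot instantly" can hedge its predictions in a way a generic object tracker can't.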

[–] RobotToaster@infosec.pub 5 points 1 year ago

Cameras and image recognition are cheaper than LIDAR/RADAR, so Tesla uses them exclusively.

[–] camillaSinensis@reddthat.com 7 points 1 year ago (2 children)

I'd assume that's either due to bias in the training set or poor design choices. The former is already a big problem in facial recognition, and can't really be fixed unless we update the datasets. With the latter, this could be using things like visible light for classification, where the contrast between target and background won't necessarily be the same for all skin tones and times of day. Cars aren't limited by DNA to grow only a specific type of eye, and you can still create training data from things like infrared or LIDAR. In either case, it goes to show how important it is to test for bias in datasets and deal with it before actually deploying anything...

In this case it's likely partly a signal-to-noise problem that can't be mitigated easily. Both children and dark-skinned people produce less signal for a camera: children because they're smaller, and dark-skinned people because their skin tones reflect less light. This causes issues in the stereo vision algorithms that find objects and estimate distance to them. LiDAR would solve the issue, but companies don't want to use it because LiDARs with a fast enough update rate and high enough resolution for safe highway driving are prohibitively expensive for a passenger vehicle ($60k+ for just the sensor).
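
For the stereo part, the sensitivity is easy to show with the standard pinhole-stereo relation, depth = focal length × baseline / disparity. A quick sketch (Python, hypothetical rig numbers):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: depth = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000 px focal length, 0.3 m baseline.
# A low-contrast subject makes the stereo matcher's disparity noisier,
# and at long range a 1 px matching error shifts the depth estimate a lot:
print(depth_from_disparity(1000, 0.3, 10))  # 30.0 m
print(depth_from_disparity(1000, 0.3, 9))   # ~33.3 m -- 1 px off => >3 m error
```

So a subject that's harder to match doesn't just get detected less often; its distance estimate also gets noisier.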

[–] Aopen@discuss.tchncs.de 7 points 1 year ago (1 children)

I'm no expert, but perhaps a thermal camera + LiDAR sensor could help.

[–] jhoward@lemmy.sdf.org 6 points 1 year ago

Probably could have stopped that headline at the third word.

[–] ChromeSkull@lemmy.world 4 points 1 year ago (1 children)

A single FLIR camera would help massively. They don't care about colour or height, only temperature.

[–] Fedizen@lemmy.world 4 points 1 year ago

Cars should be tested for safety in collisions with children, and it should affect their safety rating and taxes. Driverless equipment shouldn't be allowed on the road until these sorts of issues are resolved.

[–] ma11en@lemmy.world 3 points 1 year ago (3 children)

They need Google Pixel cameras.
