this post was submitted on 07 Aug 2023
504 points (97.7% liked)

Was this AI trained on an unbalanced data set? (Only black folks?) Or has it only been used to identify photos of black people? I have so many questions: some technical, some on media sensationalism

[–] DavidGarcia@feddit.nl 32 points 1 year ago (6 children)

Putting any other issues aside for a moment (I'm not saying they aren't real too): cameras need light to make photos, and the more light they get, the better the image quality. Just look at astronomy: we don't find the dark asteroids/planets/stars first, we find the brightest ones, and we know more about them than about bodies with lower albedo/light intensity. So it is literally, physically harder to collect information about anything black, and that includes black people. If you compare a person with a skin albedo of 0.2 to one with 0.6, you collect 3x less light in the same amount of time, all else being equal.
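To make the 0.2 vs 0.6 comparison concrete, here's a minimal back-of-the-envelope sketch. It assumes shot-noise-limited imaging (where SNR scales with the square root of the photons collected) and a made-up photon budget; the numbers are purely illustrative, not from any real sensor spec.

```python
import math

photons_at_albedo_1 = 10_000  # assumed photon budget for a perfectly reflective patch (made up)

for albedo in (0.6, 0.2):
    photons = photons_at_albedo_1 * albedo  # collected signal scales linearly with albedo
    snr = math.sqrt(photons)                # Poisson (shot) noise: SNR ~ sqrt(N)
    print(f"albedo {albedo}: {photons:.0f} photons, SNR ~ {snr:.0f}")

# albedo 0.6: 6000 photons, SNR ~ 77
# albedo 0.2: 2000 photons, SNR ~ 45  -> 3x less signal, roughly 1.7x worse SNR
```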

And also consider that cameras have a limited dynamic range, and white skin is often much closer in brightness to the objects around us than black skin is. So the facial features of a black person may fall outside the camera's dynamic range and be lost.
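A hypothetical illustration of the dynamic-range point, assuming a linear sensor response and exposure set for the brightest element in the scene (real cameras apply gamma and tone curves, so treat this as a sketch, not a measurement):

```python
def usable_levels(reflectance, scene_max_reflectance=0.9, bit_depth=8):
    """Approximate number of distinct code values covering the subject,
    assuming a linear response exposed for the brightest scene element."""
    full_scale = 2 ** bit_depth - 1
    return int(full_scale * reflectance / scene_max_reflectance)

for r in (0.6, 0.2):
    print(f"reflectance {r}: ~{usable_levels(r)} of 255 code values")

# reflectance 0.6: ~170 of 255 code values
# reflectance 0.2: ~56 of 255 code values -> far fewer levels left to encode facial detail
```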

The real issue with these AIs is that they aren't well calibrated: the output confidence should mirror how often the predictions are correct. If the model reports a confidence of 0.3, then out of 100 such predictions about 30 should be correct. Any prediction below 90% or so should then be illegal for the police to act on, or something like that. Basically, the model should be able to tell you when it doesn't have enough information, and the police should act on that accordingly.
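A minimal sketch of what "well calibrated" means in practice: bucket predictions by their reported confidence and compare each bucket's average confidence against its actual accuracy (the standard expected calibration error). The data below is made up for illustration; this isn't any particular face-recognition system's API.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Average gap between reported confidence and observed accuracy, weighted by bin size."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(confidences[mask].mean() - correct[mask].mean())
            ece += mask.mean() * gap
    return ece

# Fake example: the model claims 0.9 confidence but is only right half the time there.
conf = [0.9, 0.9, 0.9, 0.9, 0.3, 0.3]
hits = [1,   0,   0,   1,   0,   1]
print(f"ECE ~ {expected_calibration_error(conf, hits):.2f}")  # ~0.33 for this made-up data
```

A well-calibrated model would score near zero here; a large gap at high confidence is exactly the failure mode that makes "the computer is 99% sure" dangerous for police use.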

I mean, really, facial recognition should be illegal for the police to use at all, but that's beside the point.

[–] isthismanas_droid@lemdro.id 3 points 1 year ago (3 children)

Exactly! I don't think any programmer would intentionally go out of their way to make it so that only people with dark skin tones are matched against the database. It has more to do with how it is harder to detect facial features on darker skin tones: the image carries noisier information per pixel, and the pixel intensities end up similar across some patches of the image because of the darker skin tone. But that's just my unbiased programmer's way of thinking. Let's hope the world is still beautiful! We are all humans after all.
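A toy sketch of the "similar intensities in some patches" idea, using a synthetic texture as a stand-in for facial detail (the brightness values, noise level, and patch size are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
detail = rng.random((64, 64))  # synthetic stand-in for facial texture, values in 0..1

def mean_patch_contrast(image, patch=8):
    """Average standard deviation over non-overlapping patches (a rough contrast proxy)."""
    h, w = image.shape
    stds = [image[i:i + patch, j:j + patch].std()
            for i in range(0, h, patch) for j in range(0, w, patch)]
    return float(np.mean(stds))

for brightness in (0.6, 0.2):
    # scale the texture by overall brightness, add a little sensor noise, quantize to 8 bits
    img = np.clip(detail * brightness * 255 + rng.normal(0, 2, detail.shape), 0, 255)
    img = np.round(img)
    print(f"brightness {brightness}: mean patch contrast {mean_patch_contrast(img):.1f}")
```

The darker version of the exact same texture comes out with proportionally lower measured contrast after quantization, which is the kind of signal a feature extractor has less to work with.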
