In every reported case where police mistakenly arrested someone using facial recognition, that person has been Black
(www.businessinsider.com)
IMO, the fact that the models aren't accurate for people of color, yet the AI is being deployed on them anyway, is the systemic racism. If the AI were bad at identifying white people, do we really think it would be in active use for arresting people?
It's not just that the technology is much, much worse at identifying people of color; the issue is that it's being used anyway despite that.
And if you say "oh, they're just being stupid and didn't realize it's doing that," then it's egregious that they didn't even check.
That part I can agree with. These issues should have been fixed before the technology was rolled out. The fact that they don't care is very telling.