this post was submitted on 07 Aug 2023
372 points (97.0% liked)

Science


Was this AI trained on an unbalanced data set? (Only black folks?) Or has it only been used to identify photos of black people? I have so many questions: some technical, some on media sensationalism
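On the unbalanced-data question: one concrete way a skewed training or calibration set causes disproportionate false positives is that a single decision threshold gets tuned almost entirely on the majority group. A minimal sketch with purely hypothetical, synthetic score distributions (the shift for group B is an assumption for illustration, not a figure from any real system):

```python
import random

random.seed(0)

# Hypothetical similarity scores for NON-matching face pairs.
# Assumption (illustrative only): the model produces slightly higher
# non-match scores for group B, e.g. because group B was
# under-represented when the model was trained.
group_a = [random.gauss(0.30, 0.10) for _ in range(100_000)]
group_b = [random.gauss(0.40, 0.10) for _ in range(100_000)]

# Calibrate ONE decision threshold for a 1% overall false-positive
# rate on data that is 95% group A / 5% group B (imbalanced).
calib = group_a[:95_000] + group_b[:5_000]
threshold = sorted(calib)[int(0.99 * len(calib))]

def fpr(scores, t):
    """Fraction of non-match pairs wrongly flagged as matches."""
    return sum(s > t for s in scores) / len(scores)

print(f"group A FPR: {fpr(group_a, threshold):.2%}")
print(f"group B FPR: {fpr(group_b, threshold):.2%}")
```

With these made-up numbers, the overall calibration hits 1%, but almost all of the false positives land on group B: its per-group rate comes out several times higher than group A's, even though one "fair-looking" global threshold was used.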

[email protected] 1 points 1 year ago

@nieceandtows The fact that there have been issues with sensors (which is true) does not disprove systemic racism (which exists). That's like saying that because I put vinegar in the dressing, the lemon juice wasn't sour. It doesn't follow.

[email protected] 0 points 1 year ago

Putting the same thing the other way around: the fact that there have been issues with systemic racism (which is true) does not disprove technical malfunction (which exists). That’s like saying that because the lemon juice is sour, it must have vinegar in it. It doesn’t follow: lemon juice can be sour simply because it has lemons in it, with no need for any vinegar at all.

[email protected] 3 points 1 year ago

@nieceandtows But we know that there is systemic racism in the police. There *is* vinegar in it.

[email protected] 1 points 1 year ago

@fishidwardrobe As far as the UK is concerned (re facial recognition), I recall the latest study found false-positive rates that were disproportionately higher for Black people, and that the difference was statistically significant.

The UK police thought this acceptable and have continued the roll-out of this tech: a judgement call that bakes a little more systemic racism into UK policing, with little to no accountability.

https://science.police.uk/site/assets/files/3396/frt-equitability-study_mar2023.pdf
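A disparity in false-positive rates like the one described above is typically checked for statistical significance with a two-proportion z-test. A minimal sketch with hypothetical counts (these are NOT the figures from the linked study, just illustrative numbers):

```python
import math

# Hypothetical counts (illustrative only, not the study's data):
# group 1: 5 false positives out of 1,000 non-match trials
# group 2: 25 false positives out of 1,000 non-match trials
fp1, n1 = 5, 1000
fp2, n2 = 25, 1000

p1, p2 = fp1 / n1, fp2 / n2
p_pool = (fp1 + fp2) / (n1 + n2)                  # pooled proportion
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se                                # two-proportion z statistic

# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"FPR group 1: {p1:.1%}, FPR group 2: {p2:.1%}")
print(f"z = {z:.2f}, p = {p_value:.4f}")
```

With these made-up counts the p-value comes out well below 0.05, i.e. a gap that size on samples that size would be very unlikely under equal underlying false-positive rates.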

PS. I'm not academically qualified to comment on the paper, but take an interest in these things.

@nieceandtows