this post was submitted on 14 Aug 2023
1044 points (100.0% liked)
196
That's capitalism's fault. Poor white kids would face the same issues.
That's just a lack of data points, not a system constructed by anyone. The data points should increase naturally over time.
It'd be awesome if we could just solve language barriers in general. Until we can do that, having a single official language in working situations seems unavoidable for productivity.
Not related to racism.
Is this actually happening? I think it's flat-out wrong to predict criminality with AI trained on past data.
All in all, I agree that many of the existing systems suck, but I don't think it's helpful to link every problem to racism. Disclaimer: I'm not black or white.
But why is the data lacking in the first place?
Lack of public black faces, as the previous comment suggested? That's not a "system" tho, which would imply something like a policy to reject black faces as training data.
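To be fair, you don't need an explicit policy for underrepresentation to skew outcomes. A toy sketch (made-up numbers, not based on any real system): if one group supplies 90% of the training examples, a model that minimizes *overall* error can end up perfect on the majority group while failing badly on the minority group, because the minority's errors barely move the total.

```python
import numpy as np

def best_threshold(x, y):
    """Pick the single cutoff (predict 1 if x > t) minimizing overall error."""
    u = np.unique(x)                      # sorted unique feature values
    candidates = (u[:-1] + u[1:]) / 2     # midpoints between neighboring values
    errors = [np.mean((x > t).astype(int) != y) for t in candidates]
    return candidates[int(np.argmin(errors))]

# Majority group A: 900 samples, classes cleanly separated around 1.0
x_a = np.concatenate([np.full(450, 0.0), np.full(450, 2.0)])
y_a = np.concatenate([np.zeros(450), np.ones(450)])

# Minority group B: only 100 samples, classes separated around 2.75 instead
x_b = np.concatenate([np.full(50, 2.5), np.full(50, 3.0)])
y_b = np.concatenate([np.zeros(50), np.ones(50)])

# "Training" on the pooled data picks the threshold that suits group A,
# since group A contributes 9x as many examples to the overall error.
t = best_threshold(np.concatenate([x_a, x_b]), np.concatenate([y_a, y_b]))

err_a = np.mean((x_a > t).astype(int) != y_a)   # 0% error on the majority
err_b = np.mean((x_b > t).astype(int) != y_b)   # 50% error on the minority
print(f"threshold={t}, group A error={err_a:.0%}, group B error={err_b:.0%}")
```

No one "rejected" group B's data here; the skew falls out of ordinary error minimization on an imbalanced dataset. Whether you call that a "system" is the whole disagreement, I guess.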
You're way too smart for this site, I thought reddit echo chambers were bad
True, but that's not what the discussion was about. Black kids are disproportionately poorer than white kids, and that's because they inherited inequalities from the past, which came about because of slavery and systematic racism.
What you are suggesting is a lingua franca, which is already the norm in multilingual countries. That's quite different from having a colonial language dominating over the others.
A lingua franca is a language that two groups speaking different languages both understand; English is a lingua franca.
It makes sense to teach in two languages when big percentages of the population speak them. In the case of America, teaching a Native American language would be pointless for the children, aside from preventing the language from dying.
There are a bunch of articles about it. Here's one about it in general:
https://www.nytimes.com/2021/03/15/technology/artificial-intelligence-google-bias.html
And one with a few examples of AI used in criminal justice:
https://www.geeksforgeeks.org/5-algorithms-that-demonstrate-artificial-intelligence-bias/
https://lemmy.world/post/3768607
More badly trained AI.