Conversation
Notices
feld (feld@friedcheese.us)'s status on Wednesday, 29-Jan-2025 04:50:08 JST
@rachel @quixoticgeek Cameras themselves have, since time immemorial, been unable to capture black skin tones properly.
Rachel (rachel@transitory.social)'s status on Wednesday, 29-Jan-2025 04:50:09 JST
@quixoticgeek@social.v.st always has been
There was the AI tool that made Asian women look more white when "beauty" was adjusted.
There were wrist heart-rate sensors that didn't work on some skin tones.
There was face detection that could barely detect brown people.
And those are just the recent ones that bubbled up to my news feeds.
Quixoticgeek (quixoticgeek@social.v.st)'s status on Wednesday, 29-Jan-2025 04:50:11 JST
DeepSeek is at least making it very, very clear that these models merely encode the biases of the cultures that make them.
DeepSeek doesn't want to know about Tiananmen Square or Taiwan.
GitHub's Copilot won't let you talk about trans people, or gender (or transactions...)
The software systems we use implement the biases of those who make them.
Think about that and remember the techbros making the software we all use and rely on.