An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.
Meanwhile every trained model on Civit.ai produces 12/10 Asian women…
Joking aside, what you feed the model is what you get. A model is only as good as its training data: train it on white people and it's going to create white people; train it on big titty anime girls and it's not going to produce WWII images either.
Then there's a study cited that claims DALL-E has a bias toward producing images of CEOs or directors as cis white males. Think of the CEOs you know. Better yet, google them. It's shit, but it's the world we live in. I think the focus should be on not having so many privileged white people in the real world, not on telling the AI to discard the data.
Yeah, there are a lot of claims of AI "bias" that are in fact just a reflection of the real world (which the model was trained on). Forcing AI to fake equal representation doesn't fix a damn thing in the real world.
Why should you focus on tearing others down, especially when you’re simply looking at them as a statistic rather than individuals?
I recall there being a study of the typical CEO: 6+ feet tall, white male.
But yeah, the output she was getting really depends heavily on the data that whatever model she used was trained on. For someone who is a computer science major, I'm surprised she simply cried "racial bias" rather than investigating why it happened and how to get the desired results, like cranking down the denoising strength.
To me it just seems like she tried messing around with those easy-to-use, baity websites without really understanding the technology.
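For anyone wondering what "cranking down the denoising strength" actually does: in typical img2img pipelines (e.g. Hugging Face diffusers' StableDiffusionImg2ImgPipeline), the strength parameter controls how far along the noise schedule the input photo is pushed before being denoised again. A minimal sketch of that scheduling logic, assuming the common `steps * strength` rule; the function name here is illustrative, not a real library API:

```python
def img2img_steps(num_inference_steps: int, strength: float) -> int:
    """Approximate number of denoising steps that actually run in img2img.

    The input image is noised up to `strength` of the schedule, then
    denoised from there. Lower strength = fewer steps = the output stays
    closer to the original photo (so the subject's features survive).
    """
    if not 0.0 <= strength <= 1.0:
        raise ValueError("strength must be in [0, 1]")
    # Common rule: run only the last `strength` fraction of the schedule.
    return min(int(num_inference_steps * strength), num_inference_steps)

for s in (1.0, 0.75, 0.3):
    print(f"strength={s}: {img2img_steps(50, s)} of 50 steps")
```

At strength 1.0 the photo is fully re-noised and effectively ignored, which is exactly the regime where the model falls back on whatever its training data says a "professional headshot" looks like; at 0.3 or so, most of the original face is preserved.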