An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white, with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

  • CoderKat@lemm.ee · 1 year ago

    If it’s Stable Diffusion img2img, then totally, this is a misunderstanding of how that works. The source image usually only contributes rough structure, like outlines or depth; the text-based prompt that the user provides is otherwise everything.
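
    For anyone curious, here’s roughly what img2img looks like with the Hugging Face diffusers library. This is a minimal sketch: the checkpoint name, file names, and parameter values are just illustrative, not what Playground AI actually runs.

    ```python
    # Minimal img2img sketch with diffusers; checkpoint and values are assumptions.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionImg2ImgPipeline

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # assumed checkpoint, any SD 1.x works
        torch_dtype=torch.float16,
    ).to("cuda")

    init_image = Image.open("headshot.jpg").convert("RGB").resize((512, 512))

    # strength controls how much of the source image survives: low values keep
    # the original mostly intact, high values let the text prompt dominate.
    result = pipe(
        prompt="professional LinkedIn headshot photo",
        image=init_image,
        strength=0.75,
        guidance_scale=7.5,
    ).images[0]
    result.save("headshot_out.png")
    ```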

    That said, these kinds of AI are absolutely still biased. If you tell the AI to generate a photo of a professor, it will likely generate an old white dude 90% of the time. The models are heavily biased by their training data, which often reflects society’s biases (though really, the biases of whichever subset of society produced whatever training data the model used).

    Some AI actually does try to counter bias a bit by injecting details into your prompt if you don’t mention them. E.g., if you just say “photo of a professor”, it might randomly change your prompt to “photo of a female professor” or “photo of a black professor”, which I think is a great way to tackle this bias. I’m not sure how widespread this approach is or how effective the prompt manipulation is.
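
    The rewriting itself could be as simple as something like this. Pure toy sketch: the attribute list, the trigger words, and the injection rule are all invented for illustration, not any specific vendor’s implementation.

    ```python
    # Toy prompt-diversification sketch; word lists and rule are made up.
    import random

    DIVERSITY_TERMS = ["female", "male", "black", "Asian", "Hispanic", "white"]
    PERSON_WORDS = ["professor", "doctor", "engineer", "nurse"]

    def maybe_diversify(prompt: str) -> str:
        """Randomly inject a demographic detail when the prompt mentions a
        person but the user didn't specify any demographics themselves."""
        lowered = prompt.lower()
        # Naive substring check: if the user already chose, leave the prompt alone.
        if any(term.lower() in lowered for term in DIVERSITY_TERMS):
            return prompt
        for word in PERSON_WORDS:
            idx = lowered.find(word)
            if idx != -1:
                attr = random.choice(DIVERSITY_TERMS)
                return prompt[:idx] + attr + " " + prompt[idx:]
        return prompt

    print(maybe_diversify("photo of a professor"))
    # e.g. "photo of a female professor" or "photo of a black professor"
    ```

    The appealing part of doing it that way is that the injection only kicks in when the user left demographics unspecified, so explicit prompts pass through untouched.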