20 years after Mark Zuckerberg's infamous 'hot-or-not' website, developers have learned absolutely nothing.


Two decades after Mark Zuckerberg created FaceMash, the infamously sexist "hot-or-not" website that served as the precursor to Facebook, a developer has had the bright idea to do the exact same thing, this time with all the women generated by AI.

A new website, smashorpass.ai, feels like a sick parody of Zuckerberg's shameful beginnings, but is apparently meant as an earnest experiment exploring the capabilities of AI image recommendation. Just like Zuck's original site, "Smash or Pass" shows images of women and invites users to rate them with a positive or negative response. The only difference is that all the "women" are actually AI-generated images, and they exhibit many of the telltale signs of the sexist bias common to image-based machine learning systems.

For starters, nearly all of the imaginary women generated by the site have cartoonishly large breasts, and their faces have an unsettling airbrushed quality typical of AI generators. Their figures are also often heavily outlined and contrasted against their backgrounds, another dead giveaway for AI-generated images of people. Even more disturbing, some of the images omit faces altogether, depicting headless feminine figures with enormous breasts.

According to the site's novice developer, Emmet Halm, the site is a "generative AI party game" that requires "no further explanation."

"You know what to do, boys," Halm tweeted while introducing the project, inviting men to objectify the female form in a fun and novel way. His tweet debuting the website garnered over 500 retweets and 1,500 likes. In a follow-up tweet, he claimed that the top three images on the site each had roughly 16,000 "smashes."

Understandably, AI experts find the project simultaneously horrifying and hilariously tone-deaf. "It's truly disheartening that in the 20 years since FaceMash was launched, technology is still seen as an acceptable way to objectify and gather clicks," Sasha Luccioni, an AI researcher at HuggingFace, told Motherboard after using the Smash or Pass website.

One developer, Rona Wang, responded by making a nearly identical parody website that rates men, not on their looks, but on how likely they are to be dangerous predators of women.

The sexist and racist biases exhibited by AI systems have been thoroughly documented, but that hasn't stopped many AI developers from deploying apps that inherit those biases in new and often harmful ways. In some cases, developers espousing "anti-woke" beliefs have treated bias against women and marginalized people as a feature of AI, and not a bug. With virtually no evidence, some conservative outrage jockeys have claimed the opposite: that AI is "woke" because popular tools like ChatGPT won't say racial slurs.

The developer's initial claims about the site's capabilities appear to be exaggerated. In a series of tweets, Halm claimed the project is a "recursively self-improving" image recommendation engine that uses the data collected from your clicks to determine your preference in AI-generated women. But the current version of the site doesn't actually self-improve: use it long enough and many of the images start repeating, and Halm says the recursive capability will be added in a future version.

The site also hasn't gone over well with everyone on social media. One blue-check user responded, "Bro wtf is this. The concept of finetuning your aesthetic GenAI image tool is cool but you definitely could have done it with literally any other category to prove the concept, like food, interior design, landscapes, etc."

Halm could not be reached for comment.

"I'm in the arena trying stuff," Halm tweeted. "Some ideas just need to exist."

Luccioni points out that no, they absolutely do not.

"There are huge amounts of nonhuman data that is available and this tool could have been used to generate images of cars, kittens, or plants, and yet we see machine-generated images of women with big breasts," said Luccioni. "As a woman working in the male-dominated field of AI, this really saddens me."


  • alwaysconfused@lemmy.ca · 10 months ago

    I originally posted the following comment as a reply to another comment that has now been removed. I'm reposting it as I think it still has value to the current conversation under this post.

    This type of "party game" is still, at its core, objectifying women. They may be generated images, but the whole project is aimed at passing judgement on women you would rate as fuckable or not. It's encouraging behaviour that makes women feel uncomfortable or unsafe.

    This type of objectifying isn't exclusive to this project. Groups of men will rate and objectify women casually and frequently. I've worked in the trades and have been surrounded by such talk from men. The more normalized this type of behaviour is, the easier it is to consider women as less than human. Feeling like a replaceable tool with no sense of self or sense of worth is dehumanizing.

    They could have chosen to base this project on just about anything else in our world. We have animals, nature, technology and so much more to try this kind of thing out on. Yet what seems like another "tech bro" idea was focused on hypersexualizing and objectifying women, as if they were just another thing for men's entertainment.

    Simply, it's gross behaviour. Just because they are generated images does not make it any less gross or acceptable. People are not objects for another person's amusement and we should not encourage such behaviour.