LEAVE BAD ANDROID REVIEWS HERE

LEAVE BAD IOS REVIEWS HERE

:crab-party: :crab-party: :crab-party:

↓↓↓ Read theory :LIB: ↓↓↓

Trans women are not allowed to use Giggle

It claims to enforce this by requiring users to “verify they are a female” by submitting an image to be analysed by AI

Giggle has a community called ‘Gender Identity’ that is only for women who are de-transitioning

The CEO/founder of this flaming pile of garbage is a notorious TERF who made a name for herself by having a hissy fit about swimming pools

She’s currently trying to get a journalist fired (for calling her a TERF)

There are many more examples of why this is a terrible thing run by a terrible person, but I’m sure you all get the point.

EDIT: Just found an article where she doubles down on EVERYTHING

  • MichelLouise [he/him]@hexbear.net · 4 years ago

    New reminder (after the police’s use of facial recognition during the protests, of course, but also after the more recent Twitter cropping thing) that most current AI computer-vision software has racist and sexist biases. Joy Buolamwini and the now-famous Timnit Gebru showed in 2018 that systems claiming >99% accuracy for gender classification had mostly been evaluated on (and developed by?) white men, and that accuracy dropped by 35 percentage points when evaluated on a dataset of black women. Basically, if you’re a black woman, there is a >1/3 chance that the AI will classify you as a man.
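    The point about evaluation is that a headline accuracy number hides subgroup gaps unless you compute accuracy per group. A minimal sketch of that disaggregated evaluation in Python, with made-up group names and counts (illustrative only, not the Gender Shades data):

```python
# Hypothetical illustration of disaggregated accuracy evaluation.
# The groups and error rates below are made up; they only mimic the
# kind of gap described above (near-perfect on one group, poor on another).
from collections import defaultdict

def accuracy_by_group(records):
    """records: list of (group, true_label, predicted_label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, truth, pred in records:
        total[group] += 1
        if truth == pred:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Toy dataset: 100 samples per group.
records = (
    [("lighter-skinned men", "M", "M")] * 99
    + [("lighter-skinned men", "M", "F")] * 1
    + [("darker-skinned women", "F", "F")] * 65
    + [("darker-skinned women", "F", "M")] * 35
)

per_group = accuracy_by_group(records)
overall = sum(1 for _, t, p in records if t == p) / len(records)
# overall comes out high (0.82) even though one group sees a
# 35-percentage-point worse accuracy than the other (0.65 vs 0.99).
```

    The aggregate number looks respectable precisely because the badly-served group is averaged away, which is why the audit reported accuracy broken out by subgroup.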

    (They re-evaluated the same software in a later paper, showing that, compared to a control group of products not in the initial study, the fairness of the exposed systems improved over time. So it seems that even when it’s done through academic publications, bullying works.)

    But with this app there is an additional problem: the system misgendering someone will not even be considered a bug, but precisely a feature.