MichelLouise [he/him]

  • 1 Post
  • 1 Comment
Joined 4 years ago
Cake day: July 26th, 2020

  • New reminder (after the police’s use of facial recognition during the protests, of course, but also after the more recent Twitter image-cropping incident) that most current AI computer vision software has racist and sexist biases. Joy Buolamwini and the now-famous Timnit Gebru showed in 2018 that systems reporting >99% accuracy for gender classification were actually mostly evaluated on (and developed by?) white men, and that accuracy dropped by 35 points when evaluated on a dataset of black women. Basically, if you’re a black woman, there is a >1/3 chance that the AI will classify you as a man.

    (They re-evaluated the same software in a later paper showing that, compared to a control group of products that were not in the initial study, the fairness of the systems exposed improved over time. So it seems that even when it’s through academic publications, bullying works.)

    But with this app there is an additional problem: the system misgendering someone will not even be considered a bug, but precisely a feature.
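The core methodological point of the 2018 study — that a high aggregate accuracy can hide a large per-subgroup disparity — can be sketched with a few lines of Python. The numbers below are invented for illustration (only the ~99% vs. ~65% orders of magnitude echo the figures mentioned above); they are not the Gender Shades data.

```python
# Disaggregated evaluation sketch: aggregate accuracy vs. per-subgroup accuracy.
# All figures here are hypothetical, for illustration only.

def accuracy(results):
    """results: list of booleans, True where the classifier was correct."""
    return sum(results) / len(results)

# One correctness flag per hypothetical test image, grouped by subgroup.
by_group = {
    "lighter-skinned men":  [True] * 99 + [False] * 1,   # ~99% accurate
    "darker-skinned women": [True] * 65 + [False] * 35,  # ~65% accurate
}

# Aggregate accuracy looks reasonable and hides the 35-point gap.
all_results = [r for group in by_group.values() for r in group]
print(f"overall: {accuracy(all_results):.0%}")
for name, results in by_group.items():
    print(f"{name}: {accuracy(results):.0%}")
```

Reporting only the first number is how a classifier that misgenders over a third of one subgroup can still be advertised as ">99% accurate".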