The context is that earlier image-generating AI took some flak for producing images with little diversity, even when that lack of diversity reflected the training data. For instance, if you asked one to generate images of a doctor treating a child in a doctor’s office, it would produce almost all male doctors, maybe even all white male doctors.
So in response, some of them were programmed to generate a diverse range of subjects when creating images of people, with lots of racial and gender variation. But the developers didn’t stop to think that some historical groups didn’t have much diversity, and that it’s a mistake to artificially create it, as in the example of Nazi soldiers.
So yes, a “thumb on the scale” of sorts.