- cross-posted to:
- legalnews
cross-posted from: https://lemmy.zip/post/15863526
Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison
The basis for making CSAM illegal was that minors are harmed in the production of the material. Prior to computer-generated imagery, the only way to produce pornographic images involving minors was to use real, flesh-and-blood minors. But if no minors are harmed in creating CSAM, then what is the basis for making that CSAM illegal?
Think of it this way: if I make a pencil drawing of a minor being sexually abused, should that be treated as a criminal act? What if it's just stick figures, and I've labeled one as a minor and the others as adults? What if I produce real pornography using real adults, but cast actors who appear to be underage, and I tell everyone that the actors were all underage so that people believe it's CSAM?
It seems to me that, rationally, things like this should only be illegal when real people are being harmed, and that when there is no harm, they should not be illegal. You can make an entirely reasonable argument that pornographic images created using a real person as the basis do cause harm to the person so depicted. But what if it's not any real person?
This seems like a very bad path to head down.
For precedent, see the Simpsons CSAM case in Australia (2008):
https://www.sydneycriminallawyers.com.au/blog/bizarre-australian-criminal-cases-the-simpsons-porn-case/