- cross-posted to:
- legalnews
cross-posted from: https://lemmy.zip/post/15863526
Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison
It isn’t CSAM if there was no abuse.
It’s not child sexual assault if there was no abuse. However, the legal definition of CSAM is any visual depiction, including computer or computer-generated images of sexually explicit conduct, where […] (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.
You may not agree with that definition, but even simulated images that look like kids engaging in sexual activity meet the threshold for CSAM.
Yes it is
CSAM is child pornography
Do you not know that CSAM is an acronym that stands for child sexual abuse material?
True, but CSAM is anything that involves minors. It’s really up to the court to decide a lot of it, but in the case above I’d imagine the images were quite disturbing.
In this instance, no human children or minors of any kind were involved.
I think the court looked at the psychological aspects of it. When you look at that kind of material, you are training your brain and body to be attracted to that stuff in real life.
Prove that any “training” is involved, please.