cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

  • laughterlaughter@lemmy.world · 1 month ago

    I mean… regardless of your moral point of view, you should be able to answer that yourself. Here’s an analogy: suppose I draw a picture of a man murdering a dog. It’s an animal abuse image, even though no actual animal abuse took place.

      • laughterlaughter@lemmy.world · 1 month ago

        Except that it is an animal abuse image, drawing, painting, doodle, whatever you want to call it. It’s still a depiction of animal abuse.

        Same with child abuse, rape, torture, killing or beating.

        Now, I know what you mean by your question. You’re trying to establish that the image/drawing/painting/scribble is harmless because no actual living being suffered. But that doesn’t mean they don’t depict it.

        Again, I’m seeing this from a very practical point of view. How you judge these images through the lens of your own morals is a totally different thing.

          • laughterlaughter@lemmy.world · 1 month ago

            No, they’re violent films.

            Snuff is a different thing, because it’s supposed to be real. Snuff films depict violence in a very real sense, so they’re violent. Fiction films also depict violence, so they’re violent too. It’s just that they’re not about real violence.

            I guess what you’re really trying to say is that “Generated abuse images are not real abuse images.” I agree with that.

            But at face value, “Generated abuse images are not abuse images” is incorrect.