• JackGreenEarth@lemm.ee

    Why does it matter? If they share them it’s obviously bad, but if they keep them to themselves it harms no one.

    • oldGregg@lemm.ee

It wouldn’t be news if they didn’t share them.

      If they didn’t, nobody would know and nobody could care.

    • Dudewitbow@lemmy.ml

      Outside of the invasion of privacy, they’re at a high school. There’s a high chance that the material is CP, even if it’s fake.

      • NightAuthor@lemmy.world

        Well, apparently it’s not CSAM according to the law.

        Also, putting your public face on an AI-generated body has nothing to do with privacy.

        If anything, it’s more akin to trademark or copyright on your own likeness.

        Idk, it’s all weird and fucked up, but CSAM and privacy-violating it’s not.

        • sik0fewl@kbin.social

          It’s possible that other New Jersey laws, like those prohibiting harassment or the distribution of child sexual abuse materials, could apply in this case. In April, New York sentenced a 22-year-old man, Patrick Carey, to six months in jail and 10 years of probation “for sharing sexually explicit ‘deepfaked’ images of more than a dozen underage women on a pornographic website and posting personal identifying information of many of the women, encouraging website users to harass and threaten them with sexual violence.” Carey was found to have violated several laws prohibiting harassment, stalking, child endangerment, and “promotion of a child sexual performance,” but at the time, the county district attorney, Anne T. Donnelly, recognized that laws were still lacking to truly protect victims of deepfake porn.

    • NightAuthor@lemmy.world

      Like the other commenter said, obviously they weren’t just keeping them to themselves, because people found out.

      Additionally, condoning actions like this has implications for the objectification of women, so it should be socially condemned.