Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • priapus@sh.itjust.works
    1 year ago

That Wikipedia article is about CP, a broader topic. Practically no authorities include illustrated and simulated forms of CP in their definitions of CSAM

    • balls_expert@lemmy.blahaj.zone
      1 year ago

      I assumed it was the same thing, but if you’re placing the bar of acceptable content below child porn, I don’t know what to tell you.

      • priapus@sh.itjust.works
        1 year ago

        That’s not what I was debating. I was debating whether or not it should be reported to authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.

        • balls_expert@lemmy.blahaj.zone
          1 year ago

Ah. It depends on the jurisdiction the instance is in.

Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason.

Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries.