Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • mindbleach@lemmy.world
    1 year ago

    We’re not JUST talking about exclusion from a social network.

    Do you speak English?

    The subject matter is the part that’s a felony - so the glib inclusion of the part you just don’t like is dangerous misinformation.

I am calling out how this study falsely equates child rape and gross drawings, and your never-ending hot take is ‘well, I don’t care for either.’ There’s not enough ‘who asked’ in the world. One of these things is tacitly legal and has sites listed on Google. The other means you die in prison, anywhere in the world.

    And here you are, still calling both of them “child porn.” In the same post insisting you’re not equating them. Thanks for keeping this simple, I guess.

    • balls_expert@lemmy.blahaj.zone
      1 year ago

They’re studying the prevalence of CSAM under the definition used in the country they’re in. It’d be arbitrary to separate the two and draw two different conclusions.

      • mindbleach@lemmy.world
        1 year ago

        No possible definition of child sexual abuse can include drawings.

        Tell me otherwise in the same breath as insisting you’re not making that false equivalence. Apparently my patience is limitless when the lie is that fucking obvious.

edit: Hang on, the obvious lie disguised a stupid lie. What country do you think Stanford is in? Drawing Bart Simpson’s dick is not illegal in America. You could do it right now, in MS Paint, and e-mail it to the FBI, and they’d just formally tell you to go fuck yourself. That would obviously not be the case with ACTUAL “child sexual abuse materials,” which are evidence of abusing an actual god-damn child.