cross-posted from: https://beehaw.org/post/6795142

Mastodon, an alternative social network to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down temporarily due to CSAM being posted. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • drdiddlybadger@pawb.social · 1 year ago

    Isn’t this bound to happen without built-in automated tools for flagging and moderation? I’m not quite sure how the federation handles this sort of thing beyond community modding: saying something if you see something.

    • debounced@kbin.run · 1 year ago

      yep, i use Cloudflare’s CSAM scanning tool to help with this… it scans all object storage and cached items against known CSAM hashes. i don’t think most people hosting instances realize what a massive liability this is if it’s open to the web for all to see… the feds (only talking about the USA here) will shut you down, or worse, threaten charges if nothing is done about it.
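
      for anyone curious what the hash-matching part looks like conceptually, here’s a minimal sketch in Python. it is not how Cloudflare’s tool actually works internally (that goes through their API and uses fuzzy/perceptual matching rather than plain digests); the hash-list file, media path, and SHA-256 comparison below are just illustrative assumptions.

      ```python
      # Simplified illustration only: real deployments use vetted services
      # (e.g. Cloudflare's CSAM scanning, PhotoDNA) with perceptual hashing and
      # hash lists distributed by organizations like NCMEC, not a self-maintained
      # file of plain SHA-256 digests.
      import hashlib
      from pathlib import Path

      # Hypothetical inputs for the sketch.
      KNOWN_HASHES_FILE = Path("known_bad_sha256.txt")          # one hex digest per line
      MEDIA_ROOT = Path("/var/lib/mastodon/public/system")      # instance media dir, varies per setup

      def load_known_hashes(path: Path) -> set[str]:
          """Load the line-delimited list of known-bad digests."""
          return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}

      def sha256_of_file(path: Path) -> str:
          """Hash a file in chunks so large media doesn't sit in memory."""
          digest = hashlib.sha256()
          with path.open("rb") as fh:
              for chunk in iter(lambda: fh.read(1 << 20), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      def scan(media_root: Path, known: set[str]) -> list[Path]:
          """Return paths of stored media whose digest matches a known-bad hash."""
          flagged = []
          for path in media_root.rglob("*"):
              if path.is_file() and sha256_of_file(path) in known:
                  flagged.append(path)  # queue for takedown and the legally required report
          return flagged

      if __name__ == "__main__":
          for match in scan(MEDIA_ROOT, load_known_hashes(KNOWN_HASHES_FILE)):
              print(f"match: {match}")
      ```

      in practice you’d hook something like this into the upload path or a periodic job, and a match would trigger removal plus the mandatory report rather than just a printout.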

    • Efwis · 1 year ago

      The problem is most of the people that see this crap want to see it. They take perverts to a whole new level, sick bastards 🤮🤮🤮