I don’t know if you need this info, but I was pretty disturbed to come across unexpected child pornography in a casual community. Thankfully it didn’t take place on SLRPNK.net directly, but if anyone has any advice besides leaving the community in question, let me know. I also wanted to sound the alarm to make sure we have measures in place to guard against this.

  • poVoq@slrpnk.net (Mod) · 5 months ago

    The best tool currently available is lemmy-safety, an AI image scanner that can be configured to check images on upload or to regularly scan the storage and remove likely CSAM images.

    It’s a bit tricky to set up, as it requires a GPU in the server and works best with object storage, but I plan to complete the setup for SLRPNK sometime this year.
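
    For anyone curious what the regular-scan mode looks like in practice, here is a minimal sketch, assuming a pict-rs media bucket on S3-compatible object storage and a local GPU classifier behind an HTTP endpoint. The bucket name, endpoint URL, response field, and threshold are all placeholders; lemmy-safety has its own configuration and interface, so treat this as an outline of the idea, not its actual code.

    ```python
    # Sketch: periodically scan an object-storage bucket and delete
    # images a local classifier flags as likely CSAM.
    # BUCKET, CLASSIFIER_URL, the "score" field, and THRESHOLD are
    # hypothetical placeholders, not lemmy-safety's real interface.
    import boto3
    import requests

    BUCKET = "pictrs-media"                            # placeholder bucket
    CLASSIFIER_URL = "http://localhost:8000/classify"  # placeholder GPU service
    THRESHOLD = 0.9                                    # placeholder cutoff

    s3 = boto3.client("s3")

    def scan_bucket() -> None:
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=BUCKET):
            for obj in page.get("Contents", []):
                key = obj["Key"]
                body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
                # Send the raw image to the local classifier on the GPU box.
                resp = requests.post(CLASSIFIER_URL, files={"image": (key, body)})
                score = resp.json().get("score", 0.0)
                if score >= THRESHOLD:
                    # Remove the flagged object and log the key for the admin.
                    s3.delete_object(Bucket=BUCKET, Key=key)
                    print(f"removed {key} (score {score:.2f})")

    if __name__ == "__main__":
        scan_bucket()
    ```

    Run on a schedule (cron or a systemd timer), this gives you the "regularly scan the storage" mode; the on-upload mode would instead hook the same classifier call into the upload path before the image is accepted.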

    • silence7@slrpnk.net · 5 months ago

      This is probably the best option; in a world where people use ML tools to generate CSAM, you can’t depend on perceptual hashes of known-problematic images anymore.
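
      To make that limitation concrete, here is a toy sketch (using exact hashing for simplicity; real deployments use perceptual hashes, which tolerate re-encoding but still can’t match an image that was never catalogued in the first place). All data here is placeholder:

      ```python
      # Toy illustration: a hash database only matches material it has
      # already catalogued, so a novel (e.g. ML-generated) image always
      # slips through. The byte strings are stand-ins, not real data.
      import hashlib

      image_a = bytes(1000)            # stand-in for a known, catalogued image
      image_b = bytes(999) + b"\x01"   # stand-in for a novel image

      known_hashes = {hashlib.sha256(image_a).hexdigest()}

      print(hashlib.sha256(image_a).hexdigest() in known_hashes)  # True: catalogued
      print(hashlib.sha256(image_b).hexdigest() in known_hashes)  # False: never seen
      ```

      A classifier scores the image content itself rather than looking it up, which is why the scanning approach above can still flag images that no hash database has ever seen.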