cross-posted from: https://jamie.moe/post/113630
There have been users spamming CSAM content in [email protected] causing it to federate to other instances. If your instance is subscribed to this community, you should take action to rectify it immediately. I recommend performing a hard delete via command line on the server.
I deleted every image from the past 24 hours personally, using the following command:
sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred {} \;
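Note that without the -u flag, GNU shred only overwrites the file contents in place and leaves the scrambled files on disk. If you also want them removed afterwards, a variant of the same command (same example path, adjust to your own pictrs volume) would be:

sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred -u {} \;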
Note: Your local jurisdiction may impose a duty to report or other obligations. Check what applies to you, but always prioritize ensuring that the content does not continue to be served.
Update
Apparently the Lemmy Shitpost community has been shut down for now.
That’s what we’re pushing the Lemmy devs to do. Honestly, even if they want to use proprietary tools for this, I’m okay with it; I’ll happily register an Azure account and plop an API key into the UI so it can start scanning. Lemmy should have the guardrails to prevent this from ever hitting our servers.
In the meantime, services like Cloudflare can handle recognizing and blocking access to images like that, but the problem still comes down to the federation of images. Most small hosters do not want the risk of hosting images from the whole of the internet, and it sounds like there is code in the works to disable that. Larger hosters who allow open registrations can do what they please and host what they please, but we individual hosters really need tools to block this.
Proprietary software isn't necessary; there are plenty of projects that detect CSAM.
I’m saying that when it comes to this, I don’t care whether it is or isn’t proprietary; frankly, I’d be down if we used multiple ones. I’m all for my morals, but when it comes to CSAM, the most important thing is that it works. And yes, I’d probably use multiple tools.