An in-depth report reveals an ugly truth about isolated, unmoderated parts of the Fediverse. It’s a solvable problem, but one with real challenges.

  • Sean Tilley@lemmy.ml (OP) · 1 year ago

    I agree that the problem isn’t with the Fediverse itself, any more than it is with email, Usenet, encrypted messengers, etc.

    The thing is, it’s a problem that affects the whole network. While “block and move on” is a reasonable strategy for keeping that crap out of your own instance’s feeds, the real meat and potatoes of the issue has to do with legal and legislative repercussions. In most jurisdictions, an admin who comes across this stuff has a legal obligation to report it. In fact, the EARN IT and STOP CSAM acts that politicians are trying to push through Congress are likely to make companies overreact to any potential liability that could come from accidental cross-pollination of CSAM between servers.
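
    For what it’s worth, here’s a minimal sketch of what “block and move on” looks like at the federation layer, assuming a made-up `handle_inbox()` entry point and `BLOCKED_DOMAINS` set (real servers like Mastodon and Lemmy keep their domain blocks in the database and handle this internally):

    ```python
    from urllib.parse import urlparse

    # Hypothetical blocklist; illustration only.
    BLOCKED_DOMAINS = {"bad-instance.example"}

    def is_blocked(actor_uri: str) -> bool:
        """True if the activity's actor is hosted on a blocked domain."""
        host = urlparse(actor_uri).hostname or ""
        return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

    def handle_inbox(activity: dict) -> None:
        # Drop the activity before any media is fetched or cached, so the
        # blocked server's content never lands on local disk.
        if is_blocked(activity.get("actor", "")):
            return
        print("accepted:", activity.get("id"))  # stand-in for normal processing

    handle_inbox({"id": "https://bad-instance.example/note/1",
                  "actor": "https://bad-instance.example/users/x"})
    ```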

    Unfortunately, this whole thing becomes a lot messier when an instance discovers cached CSAM after the fact. A Mastodon instance was recently taken down without the admin being given any turnaround time to look into it; the hosting company was simply ordered to comply with a CSAM takedown notice that basically said “this server has child porn on it.”

    Also, regardless of whether you report it or block it and pretend you never saw anything, it’s still happening out there. At the very least, tooling that makes reporting easier would probably be a big boon to knocking those servers off the network.
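
    As a rough illustration of what that tooling could look like, here’s a minimal sketch that scans an instance’s media cache against a list of known-bad SHA-256 hashes and emits a machine-readable report an admin could forward. The cache path, hash-list format, and report shape are all invented for the example; real deployments would use vetted hash databases and matching services (e.g. NCMEC’s lists, or PhotoDNA for perceptual matching) rather than a plain text file:

    ```python
    import hashlib
    import json
    from pathlib import Path

    def load_bad_hashes(path: str = "known_bad_sha256.txt") -> set[str]:
        # One lowercase hex SHA-256 digest per line (hypothetical format).
        return set(Path(path).read_text().split())

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def scan_media_cache(cache_dir: str, bad_hashes: set[str]) -> list[dict]:
        """Flag cached files whose exact hash matches the known-bad list."""
        hits = []
        for p in Path(cache_dir).rglob("*"):
            if p.is_file():
                digest = sha256_of(p)
                if digest in bad_hashes:
                    hits.append({"file": str(p), "sha256": digest})
        return hits

    if __name__ == "__main__":
        report = scan_media_cache("media_cache/", load_bad_hashes())
        # An admin could forward this JSON to the relevant authority
        # (NCMEC in the US) instead of triaging everything by hand.
        print(json.dumps(report, indent=2))
    ```

    Exact-hash matching only catches already-known material; perceptual hashing is what catches re-encoded copies, which is why shared, vetted hash databases matter more than any per-instance list.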