Sammeeeeeee@lemmy.world to Technology@lemmy.world · English · 11 months ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
38 comments
whenigrowup356@lemmy.world · 11 months ago
Shouldn’t it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?
ozymandias117@lemmy.world · 11 months ago
Those databases are highly regulated, as they are, themselves, CSAM.
Apple tried using fuzzy hashes so the matching could run on-device without downloading the originals, and it wasn’t able to reliably identify things at all.
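For context on the fuzzy-hashing idea discussed above: unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce similar hashes, and matching is done by counting differing bits (Hamming distance) rather than by exact equality. The snippet below is a toy sketch of one simple variant (an "average hash") on a raw grayscale pixel grid; real systems such as PhotoDNA or Apple's NeuralHash are far more sophisticated, and the function names here are illustrative, not any real API.

```python
# Toy sketch of perceptual ("fuzzy") hashing, assuming a tiny grayscale
# image represented as a 2D list of 0-255 brightness values.
# Names (average_hash, hamming) are illustrative, not a real library API.

def average_hash(pixels):
    """One bit per pixel: is the pixel brighter than the image's mean?"""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count of differing bits; small distance = likely the same image."""
    return sum(a != b for a, b in zip(h1, h2))

img       = [[10, 200], [220, 15]]
noisy     = [[12, 198], [225, 14]]   # same image with slight noise
different = [[200, 10], [15, 220]]   # different image

# Slight noise leaves the hash unchanged; a different image does not.
assert hamming(average_hash(img), average_hash(noisy)) == 0
assert hamming(average_hash(img), average_hash(different)) > 0
```

The brittleness the comment describes follows from this design: the distance threshold trades false negatives against false positives, which is part of why reliable on-device matching proved hard.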