corb3t@lemmy.world to Technology@lemmy.ml · edited 1 year ago
Stanford researchers find Mastodon has a massive child abuse material problem
www.theverge.com
cross-posted to: [email protected]
redcalcium@lemmy.institute · 1 year ago
If you run your instance behind Cloudflare, you can enable their CSAM scanning tool, which can automatically block known CSAM and report it to the authorities if it's uploaded to your server. This should reduce your risk as the instance operator.
https://developers.cloudflare.com/cache/reference/csam-scanning/