Mastodon, an alternative social network to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because of CSAM being posted. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.
https://blog.cloudflare.com/the-csam-scanning-tool/
https://developers.cloudflare.com/cache/reference/csam-scanning/
(From a quick survey, the smaller open-source CSAM-scanning projects mostly seem to have stalled, and the government-hosted ones are in perpetual design and planning.)
Spitballing here, but standing up a corresponding images.lemmy.tld alongside each Lemmy instance and putting that site behind Cloudflare could be a workable approach.
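For concreteness, here's a minimal sketch of the URL side of that idea, assuming a Lemmy-style setup where pict-rs serves media under /pictrs/; the hostnames are placeholders, not anything a real instance uses. The actual wiring would be in DNS and the reverse proxy (pointing the images host at pict-rs and proxying it through Cloudflare), plus enabling the CSAM Scanning Tool for that zone in the Cloudflare dashboard; this only shows how an instance might rewrite media URLs so uploads are served from the proxied subdomain:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical hostnames for illustration only.
MAIN_HOST = "lemmy.example"          # host serving the Lemmy UI/API
IMAGE_HOST = "images.lemmy.example"  # Cloudflare-proxied host serving pict-rs media

def rewrite_image_url(url: str) -> str:
    """Point pict-rs media URLs at the Cloudflare-proxied subdomain.

    Any URL under /pictrs/ on the main host is rewritten to the images
    host, so uploads are only ever served through Cloudflare, where its
    CSAM scanning can inspect the cached responses.
    """
    parts = urlsplit(url)
    if parts.hostname == MAIN_HOST and parts.path.startswith("/pictrs/"):
        parts = parts._replace(netloc=IMAGE_HOST)
    return urlunsplit(parts)

if __name__ == "__main__":
    print(rewrite_image_url("https://lemmy.example/pictrs/image/abc123.webp"))
    # -> https://images.lemmy.example/pictrs/image/abc123.webp
```

The appeal of splitting media onto its own hostname is that the main instance domain doesn't have to sit behind Cloudflare at all; only the image traffic does, which limits how much of the site is exposed to a third-party proxy while still getting scanning coverage on uploads.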
While I agree that CSAM needs to be addressed, it's also worth pointing out that Cloudflare has some privacy issues.