Though this kind of content could take root in pockets of the fediverse, the scary scenario of child sexual abuse material being prevalent there hasn't played out. There are plenty of moderation tools, including shared blocklists, that keep it in check. However, the idea that the fediverse is full of harmful content was what Elon Musk used to justify his anti-competitive decision to block links from X to Mastodon.
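For anyone wondering what "shared blocklists" means in practice: a lot of admins publish a CSV of known-bad domains and others import it into their own instances. Here's a rough sketch of how that import could be scripted against Mastodon's admin API. The instance URL, access token, and blocklist file are placeholders, and the token would need the admin:write:domain_blocks scope; treat it as an illustration, not a recommended setup.

```python
#!/usr/bin/env python3
"""Sketch: import a shared domain blocklist into a Mastodon instance.

Assumes a CSV with one domain per line (lines starting with '#' are skipped)
and an admin access token with the admin:write:domain_blocks scope.
"""
import csv
import requests

INSTANCE = "https://example.social"   # placeholder: your instance URL
TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # placeholder: admin API token
BLOCKLIST = "blocklist.csv"           # placeholder: the shared list

def import_blocklist():
    headers = {"Authorization": f"Bearer {TOKEN}"}
    with open(BLOCKLIST, newline="") as f:
        for row in csv.reader(f):
            if not row or row[0].startswith("#"):
                continue
            domain = row[0].strip()
            # Suspend the domain and reject its media outright.
            resp = requests.post(
                f"{INSTANCE}/api/v1/admin/domain_blocks",
                headers=headers,
                data={
                    "domain": domain,
                    "severity": "suspend",
                    "reject_media": "true",
                    "public_comment": "Imported from shared blocklist",
                },
            )
            # Already-blocked domains will come back as errors; that's fine here.
            print(domain, resp.status_code)

if __name__ == "__main__":
    import_blocklist()
```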
Didn’t he unban someone who posted one of the worst CSAM videos known?
There are some public numbers on how many occurrences are found each year on the major platforms.
IIRC, Facebook deals with around 75 million reports per year, and Twitter, Reddit, and the others are somewhere around 20 million reports per year.
I don’t know how many are dealt with on Mastodon or Lemmy (or how you’d even get reliable numbers for that), but something tells me it’s a lot less than the bigger platforms these days.
We have heard of some here and there. The biggest problem is instances with open signups; they're the ones that tend to get CSAM. That, and instances that see nothing wrong with 'lolicon'.
In the four years that I’ve been an admin here I’ve only seen one CSAM case. I don’t want to see another one. It was very difficult dealing with it on a personal level.
I’m very sorry for you. People might not realize how traumatizing having to deal with it can be. It definitely shouldn’t be the responsibility of people without proper support or training.
CSAM is a risk on any platform. I guarantee there’s private subreddits with it.