Researchers found CSAM within five minutes of searching Mastodon.

Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers at Stanford University. In just two days, the researchers found more than 100 instances of known CSAM across more than 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a period of time after CSAM was posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.
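As a rough illustration of what such tooling could look like, the sketch below screens every incoming attachment against a list of known-bad hashes before storing it. The hash list, function names, and exact-match SHA-256 scheme are all assumptions for illustration; production systems use perceptual hashing (e.g. Microsoft's PhotoDNA), which survives re-encoding, and this is not Mastodon's actual code.

```python
import hashlib

# Hypothetical hash list of known material, as distributed by a
# clearinghouse such as NCMEC. Real tooling uses perceptual hashes
# (e.g. Microsoft's PhotoDNA) that survive re-encoding; exact SHA-256
# matching stands in here only to keep the sketch self-contained.
KNOWN_HASHES: set[str] = set()  # populated from the hash-list provider

def matches_known_material(media: bytes) -> bool:
    """Hash the media and look it up in the known-bad list."""
    return hashlib.sha256(media).hexdigest() in KNOWN_HASHES

def handle_incoming_media(media: bytes, origin: str) -> str:
    """Screen every local upload and federated attachment before storage."""
    if matches_known_material(media):
        # A real instance would quarantine the file and file a report
        # with the relevant authority rather than silently dropping it.
        return f"rejected media from {origin}"
    return "stored"

if __name__ == "__main__":
    print(handle_incoming_media(b"example image bytes", "other.instance"))
```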

Sphere

So instances that actively support CSAM can and should be dealt with by law enforcement. That much is simple (and I’m surprised it hasn’t been done with certain … instances, to be honest). But the issues that seem less clearly solved have known, working solutions that apply to other parts of the web as well. No content moderation is perfect, but as long as admins act in good faith, I don’t think there should be too much of a problem:

  • For when federation inadvertently spreads some of this material into other instances’ databases: isn’t this the same situation as when ISPs used to cache web traffic to save on bandwidth costs? There, too, browsed web pages would end up in the ISP’s cache, which could then harbour whatever material the user was looking at. As I recall, the ISPs simply banned CSAM and other illegal material in their terms of service, removed anyone reported as violating the rule, and that sufficed.
  • As for “bad” instances/users: it’s impossible to pre-emptively block every instance and every user that might disseminate this material, as that would require a “block everything, then allow known entities” rule, which would break the Fediverse model (see the sketch after this list). Instead, users or instance admins found to be acting in bad faith should be blocked and reported, either automatically or manually. Some may slip through the net, but as long as admins are seen to be doing the best they can, that should be enough.
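A minimal sketch of that deny-by-exception model, assuming a hypothetical admin-maintained blocklist (the domain names and function names are illustrative, not Mastodon's actual implementation):

```python
from urllib.parse import urlparse

# Hypothetical admin-maintained denylist: accept federation by default
# and block only instances reported as acting in bad faith. The inverse
# (deny everything, allowlist known peers) would also work, but it
# breaks the open-federation model, as noted above.
BLOCKED_INSTANCES: set[str] = {"bad-actor.example"}

def should_accept_activity(actor_url: str) -> bool:
    """Allow by default; reject activities from denylisted instances."""
    domain = urlparse(actor_url).hostname or ""
    return domain not in BLOCKED_INSTANCES

if __name__ == "__main__":
    print(should_accept_activity("https://mastodon.social/users/alice"))      # True
    print(should_accept_activity("https://bad-actor.example/users/mallory"))  # False
```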

There seem to be concerns about “surveillance” of material on Mastodon, which strikes me as a bit odd. Mastodon isn’t a private platform. People who want private messaging should use an end-to-end encrypted (E2EE) messaging app like Signal, not a social networking platform like Mastodon (or Twitter, Threads, etc.). Mastodon data is already public and is likely already being surveilled, and it will be regardless of what anyone involved with the network wants, because there’s no access control on it anyway. Having Mastodon itself contain code to keep the network clean, even if it only applies to part of the network, lets the admins running that code take some of the responsibility on themselves, reducing the temptation for third parties to do it for them.

