An investigation by the security publication Secjuice has found that the online social media platform Mastodon is home to a considerable volume of child pornography.
Mastodon, a decentralised social media platform, was set up in 2016 but saw a sudden surge in popularity in late 2022 after Elon Musk bought Twitter. That purchase, alongside comments made by Musk, caused an exodus of left-wing activists, academics, and journalists from Twitter to Mastodon. However, it has now emerged that Mastodon is home to significant amounts of child pornography.
The Secjuice investigation found that what it described as “child porn communities” were the second and third largest communities on Mastodon. These are, according to Secjuice, the most active communities on the platform, with roughly 60 million posts each.
It’s important to note that these communities appear to be Japanese, and that the child pornography found on them consists of drawn images rather than video or photographs of real children. Japan has traditionally had a very different relationship with the possession of child pornography than most Western countries, with the possession of child pornography only becoming a crime in Japan in 2014/15 – the sale, distribution, and/or production of child pornography was banned in Japan in 1999.
Secjuice stated that these communities were “the visible tip of the Mastodon child porn iceberg,” and that there were thousands of smaller groups which contained more extreme child porn.
Whilst we could not confirm all aspects of Secjuice’s reporting, due to the clear legal issues with accessing this type of content, we were able to confirm that the second most popular Mastodon community contains a considerable amount of drawn images which would be classed as child pornography in Ireland. For the same legal reasons, we could not confirm how much such material is available in the community, or how popular it is. The dissemination of child pornography is not the sole purpose of that community, but the sharing of such material appears to be common within it.
The Department of Justice told Gript that “in the context of child sexual abuse and exploitation, computer-generated or “virtual” child sexual abuse, refers to wholly or partly drawn, artificially or digitally created and/or altered sexualised images of children, and that this is illegal under Irish law. This can include, for example, cartoons, drawings, computer-generated animations or imagery, pseudophotographs (e.g., where the computer-generated image is almost indistinguishable from that of a real living child), stories, etc.”
Both the Department and An Garda Síochána said that they would encourage anyone who encounters this material, or any material that they are concerned may be explicit images of children, to confidentially and securely report the material to Hotline.ie.
We reached out to Mastodon for comment, asking them to detail the steps they had taken to push back against the posting of child pornography on their platform, or the technical limitations to doing so, but we received no response. We will update this story should we receive a response after publication.