Meta-owned photo and video sharing platform Instagram “helps connect and promote” accounts dedicated to the commission and purchase of underage-sex content, according to a joint investigation by the Wall Street Journal and academics at Stanford University and the University of Massachusetts Amherst.
The vast network of accounts was used to help paedophiles find child pornography and arrange meet-ups with children, with the investigation laying bare the extent to which the platform’s recommendation system “connects pedophiles and guides them to content sellers.”
Researchers found that searching explicit hashtags, including #preteensex, #pedowhore, and #pedobait, connected users to accounts using those terms to advertise child-sex material for sale.
Chilling discoveries included “menus” of content that users could buy or commission, including images of bestiality and self-harm.
Researchers said that once they set up a test account and viewed content shared by such networks, they were immediately met with “suggested for you” recommendations of more accounts to follow, as well as accounts which linked to off-platform content trading sites.
They added that “following just a handful of these recommendations was enough to flood a test account with content that sexualises children”.
As the exposé detailed: “Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them.
“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”
The WSJ’s report added that while Instagram accounts that offer to sell illicit sex material “generally don’t publish it openly,” these accounts often link to “off-platform content trading sites.”
“At the right price, children are available for in-person ‘meet ups,’” the WSJ reported.
According to the report, other social media platforms such as Snapchat and TikTok do not appear to facilitate networks of paedophiles seeking out child abuse content in the way Instagram has done. On Twitter, researchers found 128 accounts offering to sell such content, but said that the platform did not recommend such accounts in the same way Instagram did and was additionally found to remove them “far more quickly”.
Responding to a viral Twitter thread detailing the WSJ report, Twitter owner Elon Musk described the revelations as “terrible”.
Wow, this is terrible!
— Elon Musk (@elonmusk) June 7, 2023
In a statement, Meta addressed the findings of the report, describing child exploitation as a “horrific crime.” The company, which also owns Facebook and WhatsApp, said that it was working “aggressively” to fight it across its platforms.
A spokesperson said Meta was “continuously exploring ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them”.
The spokesperson also cited a software error which prevented Meta from receiving reports of child sexual abuse and acting on them.
Additionally, the company said it “provided updated guidance to our content reviewers to more easily identify and remove predatory accounts”.
“Child exploitation is a horrific crime,” the statement continued. “We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.
“Predators constantly change their tactics in their pursuit to harm children, and that’s why we have strict policies and technology to prevent them from finding or interacting with teens on our apps and hire specialist teams who focus on understanding their evolving behaviours so we can eliminate abusive networks.”