A coroner in the UK has listed Instagram’s algorithm as a contributing factor to a teenage girl’s death.
Fourteen-year-old Molly Russell took her own life in 2017 after viewing thousands of posts promoting self-harm on platforms such as Instagram and Pinterest.
According to Bloomberg, during a court hearing in London last week, coroner Andrew Walker was tasked with determining what role, if any, social media algorithms played in Russell’s death, and how such algorithms affect the mental health of adolescent users more broadly.
Walker said that some of the content that Russell’s account had liked and saved was so disturbing it was “almost impossible to watch.” However, he argued that the girl’s death could not truly be ruled a suicide, and instead described it as “an act of self-harm whilst suffering from depression and the negative effects of online content.”
He said he reached this conclusion based on her “prolific” use of Instagram, where she liked, shared or saved 16,300 posts in the six months before her death. She also saved 5,739 pins on Pinterest over the same period.
“The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” said Walker. These posts “romanticised acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help.”
Russell’s family issued a statement following the ruling.
“This past fortnight has been particularly painful for our family,” they said, speaking to Ars Technica.
“We’re missing Molly more agonisingly than usual, but we hope that the scrutiny this case has received will help prevent similar deaths encouraged by the disturbing content that is still to this day available on social media platforms including those run by Meta.”