Blame the algorithm —

Coroner lists Instagram algorithm as contributing cause of UK teen’s death [Updated]

Meta called content "safe" that a UK coroner found "impossible to watch."


In a London court this week, coroner Andrew Walker had the difficult task of assessing a question that child safety advocates have been asking for years: How responsible is social media for the content its algorithms feed to minors? The case before Walker involved 14-year-old Molly Russell, who took her own life in 2017 after viewing thousands of posts on platforms like Instagram and Pinterest that promoted self-harm. At one point during the inquest, Walker described the content Russell liked or saved in the days before her death as so disturbing that he found it "almost impossible to watch."

Today, Walker concluded that Russell's death couldn't be ruled a suicide, Bloomberg reports. Instead, he described her cause of death as "an act of self-harm whilst suffering from depression and the negative effects of online content."

Bloomberg reported that Walker based this decision on Russell's "prolific" use of Instagram (liking, sharing, or saving 16,300 posts in the six months before her death) and Pinterest (5,793 pins over the same period), combined with how the platforms served her content that contributed to her depressive state.

"The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text," which "romanticized acts of self-harm" and "sought to isolate and discourage discussion with those who may have been able to help," Walker said.

Following Walker's ruling, Russell's family issued a statement, provided to Ars, calling it a landmark decision and noting that the court never reviewed the most disturbing content Molly encountered.

"This past fortnight has been particularly painful for our family," the Russell family's statement reads. "We're missing Molly more agonizingly than usual, but we hope that the scrutiny this case has received will help prevent similar deaths encouraged by the disturbing content that is still to this day available on social media platforms including those run by Meta."

Bloomberg reports that the family's lawyer, Oliver Sanders, has requested that Walker "send instructions on how to prevent this happening again to Pinterest, Meta, the UK government, and the communications regulator." In their statement, the family pushed UK regulators to quickly pass and enforce the UK Online Safety Bill, which The New York Times reported could institute "new safeguards for younger users worldwide."

Defenses from Pinterest and Meta took different tacks

During the inquest, Pinterest and Meta took different approaches to defending their policies. Pinterest apologized, saying that at the time it lacked the technology it now uses to moderate the content Molly was exposed to more effectively. Meta's head of health and well-being, Elizabeth Lagone, meanwhile, frustrated the family by telling the court that the content Molly viewed was considered "safe" by Meta's standards.

"We have heard a senior Meta executive describe this deadly stream of content the platform's algorithms pushed to Molly, as 'SAFE' and not contravening the platform's policies," the Russell family wrote in their statement. "If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive."

A Meta spokesperson told Bloomberg that the company is "committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers," promising to "carefully consider the Coroner's full report when he provides it."

Molly's family made a point of praising Pinterest for its transparency during the inquest, urging other social media companies to treat Pinterest as a model when their content policy decisions are challenged.

"For the first time today, tech platforms have been formally held responsible for the death of a child," the Russells' statement said. "In the future, we as a family hope that any other social media companies called upon to assist an inquest follow the example of Pinterest, who have taken steps to learn lessons and have engaged sincerely and respectfully with the inquest process."

Bloomberg reported that Pinterest has said that "Molly's story has reinforced our commitment to creating a safe and positive space for our pinners." In response to the ruling, Pinterest said it has "continued to strengthen" its "policies around self-harm content."

Neither Pinterest nor Meta immediately responded to Ars' request for comment. [Update: Pinterest told Ars that its thoughts are with the Russell family, saying it has listened carefully to the court and the family throughout the inquest. According to Pinterest, it is "committed to making ongoing improvements to help ensure that the platform is safe for everyone" and, internally, "the Coroner's report will be considered with care." Since Molly's death, Pinterest said, it has taken steps to improve content moderation, including blocking more than 25,000 self-harm-related search terms and, since 2019, combining "human moderation with automated machine learning technologies to reduce policy-violating content on the platform."]
