An inquest into the death of London teenager Molly Russell has seen an Instagram executive defend the sharing of suicide-related content on social media, claiming that it helps people to “share feelings and express themselves”.
Per The Telegraph, Elizabeth Lagone, head of health and wellbeing at Meta – Instagram’s parent company – gave evidence on 23 September, stating that Instagram allows certain content because it is being “posted in order to create awareness”, for people to “come together for support” or for someone to “talk about their own experience”.
It came after representatives from both Pinterest and Meta flew to the UK to give evidence in the inquest – both issued a formal apology to Molly’s family.
Molly, who was just 14 when she took her own life, had viewed thousands of disturbing posts via social media in the months leading up to her death.
Oliver Sanders KC, representing the Russell family, challenged Lagone repeatedly on whether a child would be able to tell the difference between “content that encourages or raises awareness” of suicide and self-harm, according to The Telegraph.
Lagone replied: “I really can’t answer that question because we don’t allow content that encourages self-injury.”
She added that it was important for Meta to consider “the broad and unbelievable harm” that silencing a poster might cause “when talking about their troubles”.
The court was shown Instagram’s guidelines at the time of Molly’s death, which said that users were allowed to post content about suicide and self-harm to “facilitate the coming together to support” other users but not if it “encouraged or promoted” it.
The inquest also saw some of the disturbing video content that Molly consumed before her death – which depicted incidents of self-harm and suicide – as well as the ‘recommended’ accounts she was encouraged to follow. Seven per cent of these accounts were either “sad or depressive related”.
A Meta spokesperson told GLAMOUR: “Our deepest sympathies remain with Molly’s family and we will continue to assist the Coroner in this inquest. These are incredibly complex issues. We’ve never allowed content that promotes or glorifies suicide and self harm and, since 2019 alone, we’ve updated our policies, deployed new technology to remove more violating content, shown more expert resources when someone searches for, or posts, content related to suicide or self-harm, and introduced controls designed to limit the types of content teens see.
“We continue to improve the technology we use, and between April and June 2022, we found and took action on 98.4% of suicide or self-harm content identified on Instagram before it was reported to us, up from 93.8% two years ago. We’ll continue to work closely with independent experts, as well as teens and parents, to help ensure our apps offer the best possible experience and support for teens.”