Instagram Launches 'Sensitivity Filters' After Death Of 14-Year-Old Molly Russell
WARNING: This article discusses self-harm and suicide.
Instagram is introducing 'sensitivity filters' following the suicide of 14-year-old Molly Russell, whose parents believe she was fed graphic images of suicide and self-harm by algorithms on Instagram and Pinterest.
The Facebook-owned photo-sharing platform announced a number of safety measures following a comprehensive review last week, which included seeking advice from UK-based suicide prevention organisations Papyrus and Samaritans.
These include updates to the algorithm to prevent self-harm images from appearing in recommended related images, hashtags, accounts and typeahead suggestions.
New CEO Adam Mosseri, who moved into the role after the sudden resignation of co-founders Kevin Systrom and Mike Krieger in September last year, said he was "deeply moved" by Molly's story.
"We are not yet where we need to be on the issues of suicide and self-harm," he admitted in an op-ed for The Telegraph.
"This is a difficult but important balance to get right. These issues will take time, but it’s critical we take big steps forward now. To that end, we have started to make changes."
Based on specialist advice, Mosseri said Instagram would not be deleting images of cutting, for example, to avoid further stigmatising mental health.
However, these images will no longer be recommended by Instagram's algorithms and will be behind sensitivity filters.
Molly was found dead in her bedroom in November 2017, having shown "no obvious signs" of mental health issues, according to UK media reports.
But her family later found she'd been viewing material on social media linked to self-harm and suicide.
Her father, Ian Russell, believes these social media platforms "helped kill my daughter", and has written to Facebook (which owns Instagram), Snapchat, Pinterest, Apple and Google imploring the tech giants to act.
"It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people," he wrote.
"It is time for internet and social media providers to step up and purge this content once and for all."
If you need help in a crisis, call Lifeline on 13 11 14. For further information about depression, anxiety, suicidal thoughts or self-harm, contact beyondBlue on 1300 22 4636 or talk to your GP, local health professional or someone you trust.
Contact the author: email@example.com