Changes We’re Making to Do More to Support and Protect the Most Vulnerable People who Use Instagram

By Adam Mosseri

February 07, 2019

At Instagram, nothing is more important to us than the safety of the people in our community. Over the past month we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe.

That’s why today, following a comprehensive review with global experts and academics on youth, mental health and suicide prevention, we’re announcing further changes to our approach on self-harm content:

What’s changing

  1. We will not allow any graphic images of self-harm, such as cutting, on Instagram – even if they would previously have been allowed as an admission. We have never allowed posts that promote or encourage suicide or self-harm, and we will continue to remove such content when it is reported.
  2. We will not show non-graphic, self-harm related content – such as healed scars – in search, hashtags and the explore tab, and we won’t be recommending it. We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help.
  3. We want to support people in their time of need – so we are also focused on getting more resources to people posting and searching for self-harm related content and directing them to organizations that can help.
  4. We’re continuing to consult with experts to find out what more we can do. This may include blurring any non-graphic, self-harm related content with a sensitivity screen so that images are not immediately visible.

Finding the right balance

Self-harm and suicide are complex issues and we rely on the input of experts in these fields to help shape our approach. Up until now, we’ve focused most of our approach on trying to help the individual who is sharing their experiences around self-harm. We have allowed content that shows contemplation or admission of self-harm because experts have told us it can help people get the support they need. But we need to do more to consider the effect of these images on other people who might see them. This is a difficult but important balance to get right.

During this comprehensive review, the experts, including the Centre for Mental Health and Save.org, reaffirmed that creating safe spaces for young people to talk about their experiences – including self-harm – online is essential. They advised that sharing this type of content often helps people connect with support and resources that can save lives.

However, the collective advice was that graphic images of self-harm – even when shared by someone admitting their own struggles – have the potential to unintentionally promote self-harm, which is why we are no longer allowing them.

It will take time and we have a responsibility to get this right

Our aim is to have no graphic self-harm or graphic suicide related content on Instagram and to significantly reduce – with the goal of removing – all self-harm and suicide imagery from hashtags, search, the explore tab and recommendations, while still ensuring we support those who use Instagram to connect with communities of support.

We need to create a safe and supportive community for everyone – but this is not as simple as flipping a switch. We will not be able to remove these images immediately, and we must make sure that people posting self-harm related content do not lose their ability to express themselves and connect with help in their time of need. We will get better, and we are committed to finding and removing this content at scale.

We know there’s more that we can do to support the most vulnerable people who use Instagram; that’s why we’ll continue to work with experts and the wider industry to find ways to support people when they’re most in need. You can find out more about our consultation with experts here:

about.fb.com/news/2019/02/protecting-people-from-self-harm

Adam Mosseri, Head of Instagram