Exploring safe moderation of online suicide and self-harm content

Theme: Mental health

Workstream: Psychological interventions

Status: This project is ongoing

People who access content about self-harm online can find that it affects their mental health. Policymakers are therefore particularly concerned about how to manage online content relating to suicide or encouraging users to self-harm.

We must be able to moderate information about these topics online safely and appropriately, so we can protect vulnerable users from its effects. However, anecdotal evidence suggests that moderation can have unintended consequences for blocked users, leaving them feeling isolated and stigmatised, and there has been little research into the best ways to moderate content and reach out to users whose content is blocked.

Project aims

The aims of this project are to:

  • Assess the impact of blocking online self-harm and suicide content
  • Look at how social networking sites manage mental health content and user wellbeing through content moderation and safety resources

The project has two parts:

  • Exploring users’ experiences of content moderation
  • Exploring how social networking sites manage mental health content and user wellbeing

Users’ experiences of content moderation

What we did

During our project we analysed data collected for the DELVE study. DELVE collected information from a group of individuals who accessed and created self-harm and/or suicide content. Our analysis focused on their experiences of moderation, including their experiences of having content blocked and fulfilling moderator roles within online groups and communities.

What we found and what this means

The individuals who took part in the study reported that:

  • It was difficult to report inappropriate content, particularly when their mental health was poor
  • Inconsistent moderation and unclear communication left them feeling confused and frustrated when their own content was moderated
  • They felt it was important to have moderators with personal experience of self-harm or attempted suicide, but performing this role put these individuals at risk

We have used this information to produce guidance for online industry leaders and policymakers on moderating online self-harm and suicide content.

We worked on this project with Samaritans, who have gained deeper insights into users’ engagement with self-harm and suicide content online. These insights are informing their ongoing policy and safeguarding work.

How social networking sites manage mental health content and user wellbeing

What we did

We looked systematically at how social networking sites manage mental health content and user wellbeing through content moderation and safety resources. This part of the study looked at Instagram, TikTok, Tumblr, and Tellmi – a mental health support app.

We gathered this information by joining each platform as observers and working through checklists to systematically evaluate each platform’s approach to moderating mental health content and supporting user wellbeing.

What we found and what this means

We identified several issues relating to the mental health resources available on social networking sites:

  • Out-of-date information and broken links
  • Lack of information for users outside the US
  • Little evidence that self-moderation tools had been tested by users and lack of information on what the tools do and how to use them
  • Lack of clarity about what content is and is not allowed
  • No source of support for users who report harmful content, or whose content is reported

We also found no evidence that platforms had evaluated the effectiveness of their self-moderation tools or their guidance relating to users’ mental health and wellbeing.

Based on these findings, we recommend that social networking sites:

  • Empower users by educating them about safety mechanisms early, giving them agency over their decisions
  • Provide clear information regarding content moderation
  • Involve users in designing their safety mechanisms

This project also highlights the need to improve the scope and global relevance of government regulations so they better address the needs of users on these platforms.

Publications

  • Experiences of Moderation, Moderators, and Moderating by Online Users Who Engage with Self-Harm and Suicide Content
  • Exploring Mental Health Content Moderation and Well-Being Tools on Social Media Platforms: Walkthrough Analysis