Social networking sites need to improve safety, study finds

4 June 2025

Social networking sites need to improve their safety and moderation features, according to a study published in JMIR Human Factors. The study was led by researchers at the NIHR Bristol BRC and the University of Bristol.

The researchers looked at Instagram, TikTok, Tumblr and Tellmi – a mental health support app. They found that, whilst platforms employ some successful strategies, they are not effectively meeting all users’ needs for mental health moderation and wellbeing support.

Based on these findings, the researchers have produced a set of preliminary recommendations for social networking platforms. These propose ways to moderate content relating to mental health, and procedures and tools to ensure users’ wellbeing.

The study also stresses that users need to be involved in designing these features.

Helpful or harmful?

Around 1 in 8 people globally live with a mental health condition.

People with mental health difficulties use social networking sites for many reasons. They may want to disclose their condition, share experiences or seek mental health-related content. This content could be helpful or harmful.

Past research has shown that using social networking sites can cause or worsen mental health problems, including depression, anxiety, psychosis, eating disorders and self-harm.

This gives sites a responsibility to support and guide users in managing their wellbeing.

Safety issues

The study identified several issues relating to the mental health resources available on social networking sites. The researchers found:

  • Out-of-date information and broken links
  • Lack of information for users outside the US
  • Little evidence that self-moderation tools had been tested with users, and a lack of information on what these tools do and how to use them
  • Lack of clarity about what content is and is not allowed
  • No source of support for users who report harmful content, or whose content is reported

They also found no evidence that platforms had evaluated the effectiveness of their self-moderation tools or their guidance relating to users’ mental health and wellbeing.

Recommendations

Based on these findings, the researchers recommend that social networking sites:

  • Empower users by educating them about safety mechanisms early, giving them agency over their decisions
  • Provide clear information regarding content moderation
  • Involve users in designing their safety mechanisms

The study also highlights the need to improve the scope and global relevance of government regulation so that it better addresses the needs of users across platforms.

The researchers gathered the information for this study by joining the platforms and observing them as users. They worked through checklists to systematically evaluate each platform’s approach to moderating mental health content and supporting user wellbeing.

Dr Zoë Haime, Senior Research Associate at the Bristol BRC, said:

“Using these methods, we have been able to explore the realities of online use. This has highlighted how some platforms are failing to adequately protect their users.

“Also, where platforms give the impression of providing supportive measures, these may not be as effective or helpful as they initially seem.

“Our recommendations should be adopted by platforms to ensure safer and better experiences, especially for vulnerable users.”