
The steps that young people and suicide prevention professionals think the social media industry and policymakers should take to improve online safety. A nested cross-sectional study within a Delphi consensus approach

Journal contribution
Posted on 2024-08-27, authored by Jo Robinson, Pinar Thorn, Samuel McKay, Hannah Richards, Rikki Battersby-Coulter, Michelle Lamblin, Laura Hemming, Louise La Sala

Introduction: Concerns exist about the relationship between social media and youth self-harm and suicide. The study aims were to examine the extent to which young people and suicide prevention professionals agreed on: (1) the utility of actions that social media companies currently take in response to self-harm and suicide-related content; and (2) further steps that the social media industry and policymakers could take to improve online safety.

Methods: This was a cross-sectional survey study nested within a larger Delphi expert consensus study. A systematic search of peer-reviewed and grey literature, together with roundtables with social media companies, policymakers, and young people, informed the questionnaire development. Two expert panels were developed to participate in the overarching Delphi study, one of young people and one of suicide prevention experts; of these, 43 young people and 23 professionals participated in the current study. The proportion of participants “strongly agreeing”, “somewhat agreeing”, “neither agreeing nor disagreeing”, and “somewhat disagreeing” or “strongly disagreeing” with each item was calculated; items that achieved ≥80% agreement from both panels were considered strongly endorsed.

Results: There was limited consensus across the two groups regarding the utility of the safety strategies currently employed by companies. However, both groups largely agreed that self-harm and suicide-related content should be restricted. Both groups also agreed that companies should have clear policies covering content promoting self-harm or suicide, graphic depictions of self-harm or suicide, and games, pacts and hoaxes. There was moderate agreement that companies should use artificial intelligence to send resources to users at risk. Just over half of professionals and just under half of young people agreed that social media companies should be regulated by government. There was strong support for governments to require schools to educate students on safe online communication, and also strong support for international collaboration to better coordinate efforts.

Discussion: Study findings reflect the complexity associated with trying to minimise the risks of communicating online about self-harm or suicide whilst capitalising on the benefits. However, a clear message was the need for better collaboration between policymakers and the social media industry, and between government and its international counterparts.

Funding

The #chatsafe project receives funding from the Commonwealth Department of Health under the National Suicide Prevention Leadership and Support Program.

History

Publication Date

2023-12-15

Journal

Frontiers in Child and Adolescent Psychiatry

Volume

2

Article Number

1274263

Pagination

21p.

Publisher

Frontiers Media S.A.

ISSN

2813-4540

Rights Statement

© 2023 Robinson, Thorn, Mckay, Richards, Battersby-Coulter, Lamblin, Hemming and La Sala. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
