In a bold move to combat the spread of suicide and self-harm content online, the nonprofit Mental Health Coalition (MHC) has launched Thrive, a groundbreaking initiative designed to foster collaboration among major social media platforms.
Thrive, which has snagged big names like Meta, Snap, and TikTok as founding members, aims to share “signals” of harmful content across platforms. By distributing hashes (unique digital fingerprints of flagged material), Thrive keeps the focus on the content itself without exposing personal account information.
Meta is at the helm, lending the technical infrastructure that also supports the Tech Coalition’s Lantern child safety program. This means members of Thrive can compile data on distressing content and receive alerts for prompt action, while relying on their own policies to decide how to respond.
Dan Reidenberg, Thrive’s director and managing director at the National Council for Suicide Prevention, will spearhead the program and coordinate member activities. The participating companies will be tasked with uploading, reviewing, and acting on flagged content, and will contribute to an annual report highlighting Thrive’s impact.
Kenneth Cole, the founder of MHC, expressed enthusiasm about this collaboration: “We at the MHC are thrilled to work with Thrive, a unique collective of leading social media platforms united to tackle suicide and self-harm content. Meta, Snap, and TikTok’s involvement signifies a promising step toward saving lives.”
Interestingly, X, previously known as Twitter, is not part of Thrive. X’s moderation capabilities have come under scrutiny, especially after owner Elon Musk significantly reduced the company’s trust and safety team. Despite promises to bolster moderation with a new center in Austin, Texas, the actual hires fell short of expectations.
Google, the parent company of YouTube, is another notable absentee. The video platform has faced criticism for failing to shield users from self-harm content. A 2024 study by the Institute for Strategic Dialogue even pointed out that YouTube’s algorithms often recommend harmful videos to teens.
While Thrive’s inception marks a commendable effort, platforms like Meta, Snap, and TikTok are not without their controversies. Numerous lawsuits, including one filed by New York City, accuse these tech giants of exacerbating mental health issues. Notably, a British coroner ruled that self-harm content on Instagram contributed to a teenager’s suicide.
Research increasingly shows a link between heavy social media usage and declines in mental health, including depression and anxiety. Heavy use frequently correlates with lower well-being and self-esteem, particularly around concerns about physical appearance.
As Thrive takes flight, it holds promise for cultivating safer online spaces and mitigating the mental health risks tied to social media. It remains to be seen whether the new venture can make a tangible difference.





