In an effort to stem the spread of suicide and self-harm content online, the nonprofit Mental Health Coalition (MHC) today announced Thrive, a new program that encourages online platforms to share “signals” of potentially harmful content with one another.
Thrive, whose founding members include Meta, Snap, and TikTok, gives platforms a way to share hashes — essentially unique fingerprints — of content that depicts suicide or self-harm, or that depicts or encourages viral self-harm challenges. The hashes are associated only with the content itself and contain no identifying information about accounts or individuals, MHC said.
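The announcement doesn't specify which hashing scheme Thrive uses, but the basic idea of a content fingerprint can be illustrated with an ordinary cryptographic hash: identical files produce identical digests, and the digest reveals nothing about who posted the file. The sketch below uses SHA-256 purely for illustration; signal-sharing programs typically rely on perceptual hashes (such as PDQ) so that re-encoded near-duplicates also match, which an exact cryptographic hash cannot do.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Return a digest that identifies the content itself, not the poster.

    The hash is derived only from the file's bytes, so it carries no
    account or user information: two platforms hashing the same file
    arrive at the same fingerprint. Illustrative only -- not Thrive's
    actual (unpublished) hashing scheme.
    """
    return hashlib.sha256(media_bytes).hexdigest()

# Hypothetical example: the same clip uploaded to two different
# platforms yields the same fingerprint on both.
clip = b"...raw media bytes..."
print(fingerprint(clip))
```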
Meta provided Thrive's technical infrastructure, the same technology the company contributed to the Tech Coalition's Lantern child safety program last November.
According to MHC, Thrive members will be able to pool information about self-harm content and receive alerts about material that raises concerns or violates policies, with each member deciding independently whether to take action.
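To make that workflow concrete, here is a minimal sketch of what such a lookup could look like, assuming a simple exact-match hash set. All names here (SHARED_HASHES, review_queue, check_upload) are hypothetical, invented for illustration; the article describes no Thrive API.

```python
import hashlib

# Hypothetical: fingerprints contributed by other member platforms.
SHARED_HASHES: set[str] = set()
review_queue: list[str] = []

def check_upload(media_bytes: bytes) -> None:
    """Compare a new upload against the pooled fingerprints."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    if digest in SHARED_HASHES:
        # A match is only a signal. Per MHC, each platform decides
        # independently whether to remove, restrict, or keep the post.
        review_queue.append(digest)
```

The key design point the program emphasizes is visible in the last branch: a shared hash triggers review, not automatic removal, so moderation decisions stay with each individual platform.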
Thrive Director Dan Reidenberg, who also serves as managing director of the National Council for Suicide Prevention, will oversee Thrive's operations and promote and monitor its activities. Participating companies will be responsible for uploading, reviewing, and responding to content shared through Thrive, as well as contributing to an annual report that offers insight into the program's impact.
“We at MHC are thrilled to work with Thrive, a unique collaboration of the most influential social media platforms coming together to address suicide and self-harm content,” MHC founder Kenneth Cole said in a statement. “Meta, Snap and TikTok are some of the first partners to join an 'exchange' committed to making a greater impact and saving lives.”
Conspicuously absent from Thrive is X, the platform formerly known as Twitter, which doesn't necessarily have the best track record when it comes to moderation.
Data suggests that X has significantly fewer moderators than other social platforms of its size, in part because owner Elon Musk cut roughly 80% of the company's engineers dedicated to trust and safety. Earlier this year, X promised to open a new trust and safety center of excellence in Austin, Texas, but the company reportedly hired far fewer moderators for the center than originally planned.
Google, which owns YouTube, is also not a member of Thrive. YouTube has come under scrutiny for failing to protect users from content that encourages self-harm: A summer 2024 study by the Institute for Strategic Dialogue found that YouTube actively promotes videos to teenagers that encourage or normalize suicide.
We've reached out to Google and X and will update this article if we hear back.
That's not to say Meta, Snap, and TikTok have spotless records themselves. Hundreds of lawsuits, including one recently filed by New York City, accuse the tech giants of contributing to a youth mental health crisis. And in a landmark ruling two years ago, a British coroner found that Meta-owned Instagram contributed to the suicide of a 14-year-old girl who was exposed to self-harm content on the platform.
Research is beginning to point to a causal link between excessive social media use and diminished well-being or mood disorders, primarily depression and anxiety. Most studies suggest that heavy social media users are far more likely to suffer from depression than less frequent users, and tend to view themselves less favorably, particularly with regard to their appearance.