Since 2011, Facebook has had a system in place for reporting suicidal content. Facebook's advice if you encounter a direct threat on the site is, of course, to contact law enforcement, but you can also flag suicidal posts to Facebook itself, which will investigate them.
As it stands, that reporting process is a bit clunky.
Facebook, in a new initiative with organizations including Forefront, Now Matters Now, the National Suicide Prevention Lifeline, and Save.org, is improving both the tools it offers for reporting suicidal content and the way it handles that content after it's been reported.
Now, if you see a friend suggest that they may be considering harming themselves, you can click the "report post" button and flag the post as suicidal in nature. From there, you'll be given the option to privately message your friend, ask another Facebook friend to help, or get in touch with a suicide helpline.
Once you report the post, Facebook will investigate. If it determines the person is indeed in distress, Facebook will trigger a new protocol the next time that person logs in.
“Hi _____, a friend thinks you might be going through something difficult and asked us to look at your recent post,” says the message.
Facebook will then ask whether the person wants to talk to someone, such as a helpline, or get tips and support, among other options.
“Keeping you safe is our most important responsibility on Facebook,” says the company. “We worked with mental health organizations Forefront, Now Matters Now, the National Suicide Prevention Lifeline, Save.org and others on these updates, in addition to consulting with people who had lived experience with self-injury or suicide. One of the first things these organizations discussed with us was how much connecting with people who care can help those in distress.”
Facebook says the new reporting features are now available to about half of US users and will roll out to the rest over the next few months.
Image via Facebook Safety