Almost everyone on social media has seen it—that post about self-harm or depression. Normally, we brush it off as a joke and keep scrolling. Or we do not know how to intervene and as a result, we do nothing. To combat this dilemma, the social media site Instagram has released a new feature called the Suicide Prevention Tool.
This tool allows users to anonymously flag posts about self-harm. Instagram then sends the poster messages of support, suicide hotline numbers and links to online resources in an effort to prevent suicide. According to an Oct. 19 article on TechCrunch.com, Instagram also sends support messages to users who have searched for certain hashtags related to self-harm—such as #thinspo, short for thinspiration, which is associated with eating disorders.
More than 30,000 suicides occur in the U.S. every year and with instant communication, many young people are reaching out via social media, according to a May 2012 article in the American Journal of Public Health.
Assistant Director of Outreach at CAPS, Dr. Priti Shah, agreed that technological advances are constantly changing the way we communicate.
“Today, we communicate more online, chatting, texting, public blogs, or through social media,” Shah said via an email interview. “It is a means of expression. Therefore, it might not be unusual for one to discuss what they are struggling with.”
Ignoring a depressing post is easy because we are not directly confronted with the poster or the consequences of our inaction. However, these posts may be legitimate warnings or threats of self-harm which, without intervention, have serious ramifications. Instagram, like its parent company Facebook, is using the Suicide Prevention Tool to provide assistance to users who express harmful ideas in their posts.
Questions have been raised about the tool’s effectiveness, including how to judge the legitimacy of suicidal posts and whether reporting someone who isn’t suicidal could have negative consequences.
Angel Demodna, a freshman psychology and philosophy major, agreed about the possibility of misinterpreting someone’s post, but said the feature is still extremely helpful to those who may need it.
“It might be difficult to determine what is and what isn’t joking,” Demodna said. “I feel like our generation says ‘oh I’m going to kill myself’ a lot. Or ‘this paper was so difficult I’m going to kill myself.’ So people might receive them [support messages] and not actually need them, but that is better than not receiving them at all.”
Pamela Valer, a sophomore majoring in biology, said she wished she knew about this new feature earlier, as she has seen questionable posts and has not known how to reach out. However, she also noted that social media has become a huge form of expression and sometimes people are not serious in their internet posts.
“It could be a cry for help but most of the time it’s not all that serious,” Valer added.
Valer also raised questions regarding why Instagram, a photo sharing site, has this feature before a more ‘verbal’ social media site such as Twitter.
Regardless of whether or not a message of support is needed, Shah said that it’s important for people to know that someone, even an anonymous blogger, is concerned for their well-being.
“We know that when people feel that others care or feel connected to a community, it can serve as a protective factor for serious depression or suicide,” Shah said. “Therefore, I think it’s helpful to have hotline or support information available when someone is in distress.”
Madison Feser can be reached at email@example.com.