
Pro-Suicide vs. Suicide Prevention Communities: An IT Perspective on Public Health

Emergent Research Forum (ERF)

Morgan Anne Wood
Utah State University
morgan.wood23@aggiemail.usu.edu

Yong Seog Kim
Utah State University
yong.kim@usu.edu
Abstract
In this work we acknowledge that suicide is a global issue that could be labeled as an epidemic. With
ever-increasing access to connective technologies and forums, online communities and
support groups are becoming increasingly common and accepted. Online communities offer a level of
anonymity that enables individuals to feel a sense of empowerment and social bonding. When applied to
suicide, online communities can form as pro-suicide (suicide plans are encouraged), claimed neutral (the
forum or community claims it does not advocate for any single position on suicide completion), or
preventative (an individual can seek help for suicidal thoughts or tendencies). We seek to further
understand lexical and sentiment differences between these communities. In addition, we aim to create a
network of users in the communities to analyze the influence of individuals, and to determine whether
there are any lexical differences between influencers and non-influencers within a community.

Keywords
Suicide, Suicide Prevention, Online Communities, Text Mining, Natural Language Processing, Sentiment
Analysis, Online Forums.

Introduction
Suicide prevention is at the forefront of many countries' minds, considering that a death due to suicide occurs
every 40 seconds, resulting in 800,000 deaths annually. Even more staggering, attempted suicides are twenty
times more frequent than completed suicides (WHO 2019). Within the United States, suicide is
the 10th leading cause of death (second among 15–29-year-olds) and has increased by 25% over the past
decade (WHO 2019; CDC 2018). With almost global epidemic numbers, much research has been done to
further understand suicide and its etiology, as well as to address prevention and intervention
techniques, by experts in psychology, sociology, epidemiology, public health, and the medical community.
Readers are referred to the comprehensive review by Nock et al. (2008).
Many studies focus on specific populations such as adolescents, those struggling with specific
mental health diagnoses (such as schizophrenia), military personnel, and so on.
In this study, we seek to further explore suicide within the context of information and communication
technology (ICT), given the increasing integration of ICT in our daily lives. In particular, we pay attention
to the trend that individuals in recent years are more likely to share their suicidal ideation (thoughts of
suicide) and behavior through distressed online posts on various social media. In this respect, we first seek
to identify any relationships between metrics that measure ICT infrastructure (e.g., fixed-broadband and
mobile-cellular subscriptions) and suicide rates across countries. We also plan to analyze the contents of
several representative pro-suicide and suicide prevention communities and online forums to address their
positive or negative impact on suicide ideation and behavior. Ultimately, we aim to develop suicide
intervention strategies using knowledge gained from network, content, and sentiment analysis of these
resources.

Twenty-fifth Americas Conference on Information Systems, Cancun, 2019 1



Research Interest and Research Plan


It is important to understand the current scholarship centered on technology, online communities, social
media, and suicide. Social media allows users to create and exchange user-generated content instantaneously
and interactively through various platforms, such as blogging (e.g., Reddit or personal blogs), video
and image sharing (e.g., YouTube), social networking sites (e.g., Facebook, Snapchat, Twitter), digital
forums, text messaging, and video chat (Luxton et al. 2012). Therefore, social media is regarded as a massive
source that can be used for identifying individuals at risk. For instance, Pourmand et al. (2018) reported
that individuals were more likely to disclose their suicide ideation on social media sites than to disclose key
risk factors to their physicians. In another study, Coppersmith et al. (2018) also recognize the ubiquity of
social media, especially regarding self-disclosure of suicide ideation. Coppersmith et al. sought to identify
individuals at risk utilizing deep learning and natural language processing. They also discuss the ethical
issues involved in mining data and identifying at-risk individuals without their consent or willingness to
participate in the health care system (2018). Rosen, Kreiner, and Levi-Belz (2019) explored the rise in
suicidal behaviors after a celebrity suicide report by examining psychological impacts through text analysis
of online reader comments on those reports.

We acknowledge that much work has also been completed in regard to online social communities as a peer
support system. For instance, Eysenbach et al. (2004) conducted a systematic review of the effects of online
peer-to-peer interactions on forums used to discuss health-related issues in a virtual environment. Barak,
Boniel-Nissim, and Suler (2008) lay a conceptual foundation for why online support groups work, highlighting
aspects such as anonymity, invisibility, personal empowerment, and social bonding, to name a few.
This is key to understanding the differences between social media, like Twitter, which may contain more
personal identifiers, and an online forum where users interact via aliases. Specifically, they conclude that online
communities "have a direct effect on well-being and personal empowerment" and an indirect effect "serving
as a buffer against negative effects of distressful conditions."

Ultimately, strong online groups elevate "participants' sense of power and control and combat feelings of
powerlessness typical of people like them." In another work, Barak (2007) explores emotional support and
prevention through an Israeli project known as SAHAR. Interestingly, SAHAR enables prevention
through anonymous peer support but also has the backing of "skilled helpers." While this is
different from the resources we are currently evaluating for data collection, this work provides the
foundation upon which we may build our own questions and evaluations. This is also seen in Greidanus and
Everall's research, a content analysis of a community they identified as an anonymous help-
seeking, help-providing community. Interestingly, help seekers would often become part of the
community and, over time, become help providers in addition to the trained crisis-intervention
moderators (2010).

These and many other studies lay a strong foundation to further explore suicide and technology. We
recognize that online communities, both pro-suicide and suicide prevention, are becoming prevalent not only
on standalone sites, but also as threads on social media sites like Twitter and Reddit, as well as in Facebook
groups. While sites that contain more 'pro-suicide' discussions, including a resource book to help carry out
suicide planning (such as alt.suicide.holiday, better known as "ASH"), are allowed to exist, search
engines such as Google have restructured where they appear. For instance, if one searches "I want to kill
myself" or "How do I kill myself?", none of the top results are pro-suicide; instead, the national hotline phone
number is prominently displayed. One must be very specific in their Google searches to find the HTML-based
ASH sites. As such, we seek to study two prime areas and populations. These communities
will be explored through text mining, including sentiment analysis. Data will be collected via web scraping,
and text information will be parsed using Python and Beautiful Soup. The primary populations are:

Community 1: Pro-suicide websites and social media feeds will be explored to understand the
sentiment of communication within the identified sites and the interactions. While research has strived to
understand the language surrounding suicide on social media, focusing on self-disclosure and other topics,
this study will seek to understand the network of individuals within these communities through text
analysis. We will target select communities or threads that a user can interact with regularly. To this end,
text analysis will be performed to identify key influencers in these communities and the polarity of their

language. We do not condone or support the use of these pro-suicide sites. Rather, we seek to understand
the language that may draw an individual to these communities to potentially identify methods that would
encourage users to seek help or transition to a preventative support community.

Community 2: Suicide prevention websites and social media feeds. This community will
encompass a wider variety of online communities, including helpline websites. Helpline websites and online
forum-based communities, such as https://saneforums.org or threads found on Reddit, will be evaluated
using text mining and web scraping to understand the network of peer-to-peer communities, striving to
understand the people involved in the communities, effective language, and the underlying support that
comes from these types of sites.
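The scraping step described above relies on Python and Beautiful Soup. As a minimal self-contained sketch, the text-extraction part can be approximated with the standard library's html.parser module standing in for BeautifulSoup; the sample page below is invented for illustration:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script/style blocks."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        # Keep only non-empty text outside skipped blocks
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def extract_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

page = "<html><body><script>x=1;</script><p>Need to talk? Call the helpline.</p></body></html>"
print(extract_text(page))  # -> "Need to talk? Call the helpline."
```

In practice BeautifulSoup's `get_text()` would replace this hand-rolled parser, but the structure of the pipeline (fetch, strip markup, keep text) is the same.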

It is important to note that individuals struggling with suicide ideation may have different approaches and
reasons for using each community. We do not support the use of pro-suicide communities but
strive to understand the language used, to potentially develop methods that better assist in effective
communication with individuals struggling with suicide ideation, in a way that is more effective for the
individual. In both communities, self-disclosure is requisite. In addition, we note that an individual
can be a part of these communities without suicidal ideation or tendencies. Indeed, we expect many who have
been impacted by suicide to also be active participants in these communities. Especially when
dealing with social media, we also acknowledge that disingenuous individuals may publish posts similar
to those with real intent. This will need to be addressed during data collection to maintain the integrity
of the data.
Community support, where individuals are trained to provide effective prevention techniques, including
discussions surrounding suicide, decreases suicide gestures and often suicide completions by up to 70% in
very targeted groups, but can fail to reach those who may need the intervention the most (Fountoulakis,
Gonda, and Rihmer 2011). For those who complete suicide, much research has been done on the notes
left to loved ones, and language is highlighted as an important indicator of those bordering on suicide ideation
(thoughts) and suicide planning, gestures, and ultimately completion. Desmet and Hoste (2013) suggest
that emotion classification based on natural language processing can effectively predict types of suicide
notes, highlighting emotions such as thankfulness, guilt, love, information, hopelessness, and instructions,
all of which are found in suicide notes. The most common lexicon classifications were found to be instructions,
hopelessness, information, love, and guilt, in descending frequency. We would expect
potentially similar results as an individual becomes closer to completion. For those who are struggling prior
to completion or voice their feelings on Twitter, a classifier was able to detect those who are at risk for
suicide, suicide ideation, and even 'flippant references' to suicide (Burnap, Colombo, and Scourfield 2015).
The following hypotheses were developed based on prior scholarship within the literature review.
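As a toy illustration of the kind of lexicon-based emotion classification Desmet and Hoste describe, one might count keyword hits per note category. The keyword lists below are invented placeholders, not the authors' validated lexicons:

```python
from collections import Counter

# Toy lexicons for the note categories reported by Desmet and Hoste (2013);
# the keyword lists here are illustrative placeholders, not validated lexicons.
CATEGORY_LEXICONS = {
    "instructions": {"please", "give", "take", "tell", "arrange"},
    "hopelessness": {"hopeless", "pointless", "cannot", "never", "tired"},
    "information": {"account", "bank", "password", "address", "documents"},
    "love": {"love", "dear", "heart", "always", "forever"},
    "guilt": {"sorry", "fault", "forgive", "blame", "failed"},
}

def classify_emotions(text: str) -> Counter:
    """Count keyword hits per emotion category for a piece of text."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    hits = Counter()
    for category, lexicon in CATEGORY_LEXICONS.items():
        hits[category] = sum(1 for t in tokens if t in lexicon)
    return hits

scores = classify_emotions("Please forgive me, I love you, tell them I am sorry.")
```

A real classifier would use supervised NLP models rather than keyword counts, but the output shape (a score per category) is the same.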
H1: Actionable, aggressive, non-passive language will be found in pro-suicide communities,
whereas preventative sites will also have actionable language, but will be more passive in
nature. Coppersmith et al. affirmed that language is critical, and their models performed best when looking at
language over many months rather than at a single tweet. We expect actionable language in both
communities; however, we expect the actions associated with pro-suicide communities to indicate more
physical, and potentially aggressive, actions than those of prevention communities, where more actionable
health language will be used. According to the Suicide Prevention Lifeline, warning signs of suicide are
extremely action heavy, such as 'talking about wanting to die or kill themselves,' 'talking about being a
burden,' 'rage or revenge seeking,' and 'extreme mood swings.' We expect the language used in these
communities to reflect these and other signs.
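A minimal polarity scorer of the sort the sentiment analysis in H1 calls for might look like the following; the word lists are illustrative assumptions, not a validated clinical lexicon:

```python
# Minimal lexicon-based polarity scorer (illustrative word lists only);
# returns a score in [-1, 1], where -1 is fully negative.
POSITIVE = {"help", "hope", "support", "better", "safe"}
NEGATIVE = {"burden", "rage", "revenge", "die", "kill"}

def polarity(text: str) -> float:
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(polarity("There is hope, please seek support"))  # -> 1.0
```

Production sentiment tools (e.g., VADER-style analyzers) weight intensity and negation as well, but this captures the core idea of scoring polarity per post.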
H2: Both communities will include those who do not personally have suicide ideation or
gestures. However, those in prevention communities will argue against suicide; those in
pro-suicide communities will argue for it. Due to the nature of suicide, not just the individual struggling
with suicide ideation is impacted. Mothers, fathers, siblings, friends, family, and associates are all impacted
when a suicide is completed. Indeed, suicide prevention is a community affair, and we expect this to be
reflected in online communities as well. The Suicide Prevention Lifeline suggests that everyone can take
action: 'talking about suicide, providing support services, and following up with loved ones' can all be
critical. As such, we expect these kinds of behaviors on online prevention sites. Again, Barak,
Boniel-Nissim, and Suler note that the role
of invisibility, anonymity, and not personally seeing the effects of one's posts could escalate on social media
and forum communities, especially if they are not peer or site regulated.
H3: Key influencers in the communities will utilize more actionable, possessive language.
The greatest impact, anecdotally, occurs when an individual poses the issue in terms of their own
experiences and admonishes others to do something. We expect to see this in both communities. Key
influencers will later be formally defined, based on the number of online interactions, tags in posts,
and similar metrics.
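One simple operationalization of "key influencer" is in-degree in the interaction network, i.e., the number of replies a user receives. The usernames and edges below are hypothetical stand-ins for scraped interaction data:

```python
from collections import Counter

# Hypothetical interaction edges (commenter -> original poster), as they might
# be parsed from scraped threads; usernames are invented placeholders.
interactions = [
    ("user_b", "user_a"), ("user_c", "user_a"),
    ("user_d", "user_a"), ("user_a", "user_c"),
    ("user_d", "user_c"),
]

def top_influencers(edges, k=2):
    """Rank users by in-degree (number of replies received)."""
    in_degree = Counter(target for _, target in edges)
    return in_degree.most_common(k)

print(top_influencers(interactions))  # -> [('user_a', 3), ('user_c', 2)]
```

Richer centrality measures (betweenness, PageRank) could replace raw in-degree once the formal influencer definition is fixed.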

Data Sources, Foundational Trends, and Work Moving Forward


Data to explore initial trends was sourced from Kaggle (based on the World Health Organization, World Bank,
and United Nations Development Program) and supplemented with author-sourced
data points from the International Telecommunication Union (ITU) of the United Nations
(https://www.itu.int/en/ITU-D/Statistics/Documents/facts/ICTFactsFigures2017.pdf). These data sets
include country-level data on the prevalence of communicative technologies, such as the percent of the
population that uses the Internet, the number of mobile phone subscriptions, and broadband subscriptions.
Data compiled by the ITU is country-level, annual data beginning in 2000. The two datasets were then
joined on country name and year. Through this data, we reaffirm that suicide is a global issue, and access to
online forums and information in both communities is rising as connective technology is steadily embraced.
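The country-year join described above can be sketched as a keyed dictionary merge; the field names and values below are illustrative placeholders, not actual WHO or ITU figures:

```python
# Joining suicide-rate records with ITU technology indicators on the composite
# key (country, year); values here are invented for illustration only.
suicide_rates = {("Japan", 2015): 15.4, ("Japan", 2016): 14.9}
itu_internet_pct = {("Japan", 2015): 91.1, ("Japan", 2016): 92.0}

# Inner join: keep only (country, year) keys present in both datasets
joined = {
    key: {"suicide_rate": rate, "internet_pct": itu_internet_pct[key]}
    for key, rate in suicide_rates.items()
    if key in itu_internet_pct
}
print(joined[("Japan", 2016)])
```

In practice a pandas merge on country and year columns would do the same inner join at scale; mismatched country-name spellings between sources would need normalizing first.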

Figure 1. Top Five Countries Comparing Suicide (black line) and Technology Trends.
Figure 2. Lowest Five Countries Comparing Suicide (black line) and Technology Trends.
Furthermore, we identify and collect data surrounding the two key communities. The first method and
dataset used key phrases that an individual may search for if they have suicide ideation or may be starting to
create a suicide plan; the first 10 pages returned were collected, scraped, and cleaned. This scraping
did not exclude HTML and other coding tags, to better assess the underlying IT infrastructure. In addition,
once ASH communities were identified (through a preventative website's mention of them), several pages of
this community were also scraped and included. The end result was 40 main pages.
Through this process, we identified that most of these sites were informative, such as Wikipedia, helplines,
or blog posts. While some user interaction appears on the blog posts, this data did not capture the peer-
supported communities we sought. To capture community interactions, we identified two Reddit threads, one
targeting those dealing with suicide bereavement, the other discussing struggles with suicide ideation
and potential plans. Unlike an ASH forum, these are regulated by moderators and each has its own
community rules. The forum dealing with personal suicidal thoughts or tendencies is a 'neutral' community
in that blatant advocacy for prevention is not allowed; however, personal support of the original poster's
thoughts, and even slight advocacy or instruction for completing suicide, is allowed. These topics
have 7,600 and 130,000 community members, respectively.
The top 50 posts from the last 12 months were collected for each topic. Top posts are filtered by a
combination of metrics, primarily the number of community votes a post receives. Posts are
identified by the alias usernames of the original poster and commenters. Many top posts also contain
community interactions in the form of comments. Here, the original poster (OP) can relate additional
details or all members of the community can participate in discussion based on the thread.
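Collecting poster and commenter aliases, contents, and order from such threads amounts to flattening a comment tree into ordered records; the thread structure below is a hypothetical stand-in for parsed Reddit data:

```python
# A hypothetical thread: the OP plus nested replies, as it might look after
# scraping and parsing; all authors and bodies are invented placeholders.
thread = {
    "author": "op_alias", "body": "original post", "replies": [
        {"author": "user_1", "body": "first comment", "replies": [
            {"author": "op_alias", "body": "OP follow-up", "replies": []},
        ]},
        {"author": "user_2", "body": "second comment", "replies": []},
    ],
}

def flatten(post, depth=0):
    """Depth-first walk yielding (depth, author, body) records in thread order."""
    yield (depth, post["author"], post["body"])
    for reply in post["replies"]:
        yield from flatten(reply, depth + 1)

records = list(flatten(thread))
```

Each record preserves who spoke, what they said, and where in the thread they said it, which is the input the planned network and text analysis needs.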
Poster and commenter aliases, contents, order, time, and more were collected.

Suicide is a prevalent and global epidemic. We seek to further this body of research by identifying key
characteristics of online helplines and forums, to evaluate them for accessibility and to understand the
online communities surrounding suicide via text analysis. Online communities can be identified as
pro-suicide or preventative on social media platforms and online forums. Three hypotheses were identified
with the end goal of furthering understanding of online communities and the role their support provides
for those with suicide ideation.

References
Barak, A. 2007. “Emotional Support and Suicide Prevention Through the Internet: A Field Project
Report,” Computers in Human Behavior (23:2), pp.971-984.
Barak, A., Boniel-Nissim, M., Suler, J. 2008. “Fostering Empowerment in Online Support Groups,”
Computers in Human Behavior (24:5), pp. 1867-1883.
Bakst, S. S., Berchenko, Y., Braun, T., and Shohat, T. 2018. “The Effects of Publicized Suicide Deaths on
Subsequent Suicide Counts in Israel,” Archives of Suicide Research.
Burnap, P., Colombo, W., and Scourfield, J. 2015. "Classification and Analysis of Suicide-Related
Communication on Twitter," Proceedings of the 26th ACM Conference on Hypertext & Social Media, pp.
75-84.
Coppersmith, G., Leary, R., Crutchley, P., Fine, A. 2018. “Natural Language Processing of Social Media as
Screening for Suicide Risk,” Biomedical Informatics Insights.
Centers for Disease Control and Prevention. 2018. "Suicide Rates Rising Across the U.S.," CDC
Newsroom. https://www.cdc.gov/media/releases/2018/p0607-suicide-prevention.html
Kaggle. https://www.kaggle.com/russellyates88/suicide-rates-overview-1985-to-2016
Desmet, B., Hoste, V. 2013. “Emotion Detection in Suicide Notes,” Expert Systems with Applications.
(40:16), pp. 6351-6358.
Eysenbach, G., Powell, J., Eglesakis, M., Rizo, C., and Stern, A. 2004. “Health Related Virtual Communities
and Electronic Support Groups: Systematic Review of the Effects of Online Peer to Peer Interactions,”
The BMJ.
Fountoulakis, K. N., Gonda, X., Rihmer, Z. 2011. “Suicide Prevention Programs Through Community
Intervention,” Journal of Affective Disorders. (120:1-2), pp. 10-16.
Greidanus, E., Everall, R. D. 2010. "Helper Therapy in an Online Suicide Prevention Community," British
Journal of Guidance & Counselling (38:2), pp. 191-204.
Luxton, D. D., June, J. D., and Fairall, J. M. 2012. “Social Media and Suicide: A Public Health Perspective,”
American Journal of Public Health (102:S2), pp. S195–S200.
Marchant, A., Hawton, K., Stewart, A., Montgomery, P., Singaravelu, V., Lloyd, K., Purdy, N., Daine, K.,
and John, A. 2017. “A Systematic Review of the Relationship between Internet Use, Self-harm and
Suicidal Behaviour in Young People: The Good, the Bad and the Unknown,” PLoS ONE (12:8)
Nock, M. K., Borges, G., Bromet, E. J., Cha, C. B., Kessler, R. C., and Lee, S. 2008. “Suicide and Suicidal
Behavior,” Epidemiologic Reviews (30:1), pp. 133–154.
Pourmand, A., Roberson, J., Caggiula, A., Monsalve, N., Rahimi, M., and Torres-Llenza, V. 2018. “Social
Media and Suicide: A Review of Technology-Based Epidemiology and Risk Assessment,” Mary Ann
Liebert, Inc: Telemedicine and e-Health.
Rosen, G., Kreiner, H., and Levi-Belz, Y. 2019. "Public Response to Suicide News Reports as Reflected in
Computerized Text Analysis of Online Reader Comments," Archives of Suicide Research, pp. 1-31.
Sindahl, T.N., Cote, L., Dargis, L., Mishara, B. L., Jensen, T.B. 2018. “Texting for Help: Processes and
Impact of Text Counseling with Children and Youth with Suicide Ideation.” Suicide and Life
Threatening Behavior 7(1), e28.
Suicide Prevention Lifeline. Federal Substance Abuse and Mental Health Services Administration.
https://suicidepreventionlifeline.org/how-we-can-all-prevent-suicide/
United Nations: The International Telecommunication Union (ITU). 2017.
https://www.itu.int/en/ITUD/Statistics/Documents/facts/ICTFactsFigures2017.pdf.
World Health Organization. 2019. “Mental Health: Suicide Data.”
https://www.who.int/mental_health/prevention/suicide/suicideprevent/en/.
