An Interdisciplinary Journal
Copyright © 2020 ASCR Publishing House. All rights reserved.
ISSN: Print 2247-9228, Online 2601-226X
Volume XXIV, No. 2 (June), 75-91
doi:10.24193/cbb.2020.24.05
Abstract
Proceeding from the insights of truth-default theory, the article examines the extent
to which the human psyche's default assumption of truth contributes to our
susceptibility to believing false online content. This study attempts to trace the
cognitive roots of people's online susceptibility to disinformation. The article also
investigates the role of fact-checking behavior and belief perseverance in
vulnerability to online disinformation. Data were gathered from 234 survey
respondents and 16 participants in two focus group discussions (FGDs).
All subjects were college students from Manila, Philippines. Regression analysis
shows that our assumption of truthfulness and our fact-checking behavior are
statistically significant predictors of susceptibility, as measured by Facebook
disinformation experiences. Notably, the study also found that susceptibility
to online disinformation is strong among Facebook users when the false online
content is favorable to them. The article aims to contribute to the
understanding of the susceptibility of the human mind to the different forms of
falsehood proliferating online.
*Corresponding author:
E-mail: zaldy_collado@dlsu.edu.ph
76 Z. C. Collado, A. J. M. Basco, A. A. Sison
2018; Bradshaw & Howard, 2018), tense political issues (Krug, 2017; Mejias &
Vokuev, 2017), and consumer-related forums (Guilbeault, 2018; Vaque, 2018).
The fact that people can challenge, verify, or cross-check whatever
information appears online is what makes disinformation a little less successful at
face value (Graves, 2017). Nonetheless, this article intends to map out which
particular set of tendencies or behaviors shapes our susceptibility to
disinformation. For those who fight fake online content, the task is not only to
respond with correct and accurate knowledge, but also to figure out the underlying
reasons (tendencies or behaviors) why susceptibility to disinformation is possible
to begin with (Ciampaglia, 2018; Pennycook & Rand, 2019). This is important in
order to understand and address the problem at its roots.
Unlike other studies which suggest that lazy thinking (Pennycook & Rand,
2019), delusionality (Bronstein, Pennycook, Bear, Rand, & Cannon, 2019), or
prejudice (Ray & George, 2019) accounts for our susceptibility to fake news or
false content, this article starts by looking at a more primordial reason why people
fall for online deception. We contend that the susceptibility problem lies deeper,
in the natural disposition of the human cognition process. Our argument is that
neither lazy thinking nor lazy online investigation is what fundamentally pushes
people to believe fake news. We propose to test the hypothesis that it is the natural
embeddedness of our mental disposition to assume truthfulness rather than falsity
that makes us easy prey to online deception. We would like to examine whether
there is some truth to the claim that people are biologically or psychologically
wired to be unsuspecting of lies, leading to our susceptibility to online deception.
Thus, the paper argues along the lines of truth-default theory, which states that
people assume others usually communicate with truthfulness
(Kalbfleisch & Docan-Morgan, 2019) or honesty (Levine, 2014).
Specifically, the article intends to examine the role of the Assumption of
Truthfulness (AOT), a construct built on the idea of truth-default theory, in the
human mind's receptivity to online deception, specifically as manifested in
Facebook Disinformation Experiences (FDE). The study also looks at how Fact-
Checking Behavior (FCB), conceptualized by the American Press Institute as an
investigation of verifiable facts free from advocacy, partisanship, and rhetoric
(Amazeen, 2015), and Belief Perseverance (BP), the tendency to maintain a belief
even in the face of evidential discrediting (Beaulieu & Reinstein, 2010), contribute
to a person's conscious acceptance of what he or she thinks is true. The present
research hypothesized that AOT, FCB, and BP are significant predictors of FDE.
In addition, we also hint at the idea that while this seemingly natural
default disposition to assume truthfulness rather than falsity may explain our
susceptibility, the proliferation of false online content nowadays seems to be
reshaping this disposition towards the opposite end. Whether this shift means that
the human psyche is starting to develop a skeptical disposition to adapt itself to the
online world is not for the present paper to prove, though future similar studies
may suggest that the rise of false online content is paving the way for an
evolutionary change in human thinking.
METHOD
Survey
The quantitative component used a survey questionnaire containing measures
specifically constructed for this study. It comprises five sections: a
socio-demographic profile (age, sex, last semester's average grade, monthly
income, level of FB usage, year level) and the questionnaires for the four main
variables, namely, assumption of truthfulness (AOT), fact-checking behavior
(FCB), belief perseverance (BP), and Facebook disinformation experiences (FDE).
Respondents were asked to describe their attitudes and dispositions
regarding AOT, FCB, and BP on a five-point frequency Likert scale. For the FDE,
respondents were asked to check which items they had experienced in relation to
occasions when they fell victim to online disinformation. Only confirmed
Facebook users were given the survey tools.
Descriptive, correlation, and regression analyses were run in the
Statistical Package for the Social Sciences (SPSS) to produce the results of the
quantitative component.
Participants
We gathered data from college students studying in Manila, Philippines. There
were a total of 234 survey respondents. All respondents were college students, of
legal age, and able to give their consent. We specifically chose Political Science
students as our respondents/participants on the assumption that their inclination to
political discourse makes them more likely to encounter political content online
(content often marred by integrity issues as to its accuracy). The research took
place at a university in the city of Manila, with students from different cities in the
Greater Manila Area or the National Capital Region (NCR) and nearby provinces.
The university is a private, non-sectarian Catholic university. The same university
provided the ethics approval for this study.
Instruments
Assumption of truthfulness. This construct is measured using a 10-item self-
constructed scale intended to gauge the level of students' assumption of the
truthfulness of Facebook posts (news-related articles, government announcements,
quotations from personalities, historical anecdotes, infographics, video
presentations with voice-over, etc.). Using a 5-point Likert scale, respondents are
asked to rate the items (for example, “I presume Facebook contents are truthful”).
The scale points are: 1 = never, 2 = rarely, 3 = sometimes, 4 = often, and
5 = always. The overall internal consistency of this scale is α = 0.73.
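The internal consistency coefficients reported for these scales can be reproduced from raw item scores with Cronbach's alpha. A minimal sketch follows; the response matrix below is purely illustrative (it is not the study's data, which are not published with the article):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items (10 per scale here)
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative 1-5 Likert responses: 6 respondents x 4 items
scores = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [5, 4, 4, 5],
    [1, 2, 1, 2],
    [4, 4, 5, 4],
    [3, 3, 2, 3],
]
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

With perfectly parallel items alpha reaches 1.0; values around .7, as reported for the AOT, FCB, and BP scales, indicate acceptable consistency for self-constructed instruments.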
Fact-checking behavior. This construct is measured using a 10-item self-
constructed scale intended to gauge the level of students' engagement in
fact-checking behavior vis-à-vis Facebook posts (news-related articles,
government announcements, quotations from personalities, historical anecdotes,
infographics, video presentations with voice-over, etc.). Using a 5-point Likert
scale, respondents are asked to rate the items (for example, “I try to fact-check
related Facebook contents by examining URLs and writers with questionable
reliability and reputation”). The scale points are: 1 = never, 2 = rarely,
3 = sometimes, 4 = often, and 5 = always. The overall internal consistency of this
scale is α = 0.73.
Belief perseverance. This construct is measured using a 10-item self-constructed
scale intended to gauge the extent to which students allow their personal biases or
prejudices to affect their views on the veracity of information in Facebook posts
(news-related articles, government announcements, quotations from personalities,
historical anecdotes, infographics, video presentations with voice-over, etc.).
Using a 5-point Likert scale, respondents are asked to rate the items (for example,
“I tend to dismiss or ignore Facebook contents that do not reflect my personal
belief even if I did not investigate the truthfulness of the contents”). The scale
points are: 1 = never, 2 = rarely, 3 = sometimes, 4 = often, and 5 = always. The
overall internal consistency of this scale is α = 0.73.
Facebook disinformation experiences. This construct is measured using an 8-item
self-constructed scale intended to gauge the extent to which young adults have
experienced believing fake, non-credible, or false Facebook content (news-related
articles, government announcements, quotations from personalities, historical
anecdotes, infographics, video presentations with voice-over, etc.). Using a
5-point Likert scale, respondents are asked to rate the items (for example, “There
was an instance when I honestly believed a not-so-credible or false Facebook
content”). The scale points are: 1 = never, 2 = rarely, 3 = sometimes, 4 = often,
and 5 = always. The overall internal consistency of this scale is α = 0.64.
Procedure
The researchers randomly chose classrooms with ongoing political science
classes. We introduced our study to the faculty members handling the classes, who
then allowed us to proceed with the data gathering. The participants were oriented
about the nature of the study before being given the survey questionnaire. It took
them 10-15 minutes to complete the form. After the survey tools were collected,
participants were thanked and debriefed.
RESULTS
Table 2.
Descriptive statistics for the other main variables

Variable                                   N     α     M      SD     Min.   Max.
Assumption of truthfulness (AOT)           234   .73   2.49   .61    1.33   4.67
Fact-checking behavior (FCB)               234   .73   2.32   .61    1.00   4.25
Belief perseverance (BP)                   234   .73   2.43   .61    1.10   4.50
Facebook disinformation experience (FDE)   234   .64   4.32   2.03   0      8
Table 3.
Correlation matrix of study variables

Variable                                 1        2       3        4        5
1 Age                                    -
2 Sex                                    -.048    -
3 Assumption of truthfulness             -.134*   -.025   -
4 Fact-checking behavior                 .099     .050    -.279**  -
5 Belief perseverance                    -.194**  -.083   .283**   -.263**  -
6 Facebook disinformation experiences    .099     .085    .172**   .078     .079
*. Correlation is significant at the 0.05 level (2-tailed).
**. Correlation is significant at the 0.01 level (2-tailed).
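The entries and significance stars in Table 3 are ordinary Pearson correlations with two-tailed p-values. As a minimal sketch of how one such entry is produced (the variables below are randomly generated stand-ins with the study's reported means and SDs, not the actual data):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 234  # sample size, as in the study

# Randomly generated stand-ins for three study variables (illustrative only)
aot = rng.normal(2.49, 0.61, n)
fcb = rng.normal(2.32, 0.61, n)
bp = rng.normal(2.43, 0.61, n)

for name_a, name_b, a, b in [("AOT", "FCB", aot, fcb),
                             ("AOT", "BP", aot, bp),
                             ("FCB", "BP", fcb, bp)]:
    r, p = pearsonr(a, b)  # Pearson r with two-tailed p-value
    stars = "**" if p < .01 else "*" if p < .05 else ""
    print(f"{name_a} vs {name_b}: r = {r:.3f}{stars} (p = {p:.3f})")
```

Because the stand-ins are independent draws, the printed r values hover near zero; the study's data instead yielded, for example, r = -.279** between AOT and FCB.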
Using our self-constructed scales, we specifically asked about the instances in
which the respondents fell victim to Facebook disinformation. The top three were:
believing false Facebook content to be true (80.34%), feeling happy upon
reading/seeing content which was later proved false (75.21%), and feeling sad
upon reading/seeing content which was later proved false (75.21%). Forty-seven
percent admitted sharing (re-posting) false Facebook content (the fourth most
common experience).
While it is clear that people (respondents) may be accomplices to the
spread of fake news by re-posting such content, respondents can also be
instruments in stopping its proliferation (Table 5 – Theme 12).
Table 5.
Themes from the Focus-Group Discussion

Theme 1. Facebook Contents
Description: Facebook contents are diverse, ranging from entertaining posts to
political propaganda and false or fake news.
Examples:
“Basically, my timeline features diverse contents. There are fake and valid news.
There are a lot of fake news there because they are used for political agenda”
“Facebook contents are seasonal. If it is during election, campaign contents are
common. It is during those times that fake news are rampant”
“Usually, I see memes in my feed and then politics-related content being shared
by my friends”

Theme 2. Default Disposition
Description: The proliferation of fake news online leads to a skeptical, or even
outright unbelieving, disposition.
Examples:
“Usually, I no longer believe in the contents I see in my Facebook. False contents
were really common. There were times I clicked on contents which I found out
later as misleading”
“It is normal to be skeptical about these Facebook contents. To be a social media
literate one has to be skeptical about these contents”
Inferential analysis
The main purpose of this present study is to determine whether assumption of
truthfulness (AOT), fact-checking behavior (FCB), and belief perseverance (BP)
are significant predictors of Facebook disinformation experiences (FDE). A
standard multiple regression was run to test this hypothesis. All correlations
among the variables were weak to moderate, indicating that multicollinearity is
unlikely. The model is statistically significant, F(3, 230) = 4.064, p = .008,
confirming our hypothesis that the variables are predictors of FDE. However, the
three independent variables combined explain only 5% of the variance in FDE.
Among these variables, only assumption of truthfulness (β = .196, p < .05) and
fact-checking behavior (β = .149, p < .05) were found to be significant predictors.
The regression results are summarized in Table 4.
Table 4.
Predicting Facebook Disinformation Experience from AOT, FCB, and BP

Predictor                     b       95% CI for b (Lower, Upper)   β
Assumption of truthfulness    .686    .213, 1.158                   .196*
Fact-checking behavior        .525    .051, .999                    .149*
Belief perseverance           .207    -.240, .653                   .062
Note. R² = .050, Adjusted R² = .038, F(3, 230) = 4.064, p = .008.
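A standard multiple regression of the kind reported in Table 4 can be sketched with ordinary least squares. The data below are synthetic, generated so that AOT and FCB genuinely predict the outcome (they are not the study's data); the sketch also shows how standardized betas are derived from the unstandardized b coefficients:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 234
# Synthetic stand-ins for AOT, FCB, and BP (illustrative only)
aot = rng.normal(2.49, 0.61, n)
fcb = rng.normal(2.32, 0.61, n)
bp = rng.normal(2.43, 0.61, n)
# Outcome generated with coefficients resembling Table 4, plus noise
fde = 0.686 * aot + 0.525 * fcb + 0.207 * bp + rng.normal(0, 2.0, n)

X = np.column_stack([np.ones(n), aot, fcb, bp])  # design matrix with intercept
b, *_ = np.linalg.lstsq(X, fde, rcond=None)      # unstandardized coefficients

# Standardized beta_j = b_j * sd(x_j) / sd(y)
sds = np.array([aot.std(ddof=1), fcb.std(ddof=1), bp.std(ddof=1)])
betas = b[1:] * sds / fde.std(ddof=1)

for name, coef, beta in zip(["AOT", "FCB", "BP"], b[1:], betas):
    print(f"{name}: b = {coef:.3f}, beta = {beta:.3f}")
```

The recovered b values approximate the generating coefficients up to sampling noise, which is the same logic SPSS applies to the study's actual responses.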
FGD Data
Our qualitative data reveal ten (10) major themes (as shown in Table 5): a)
diversity of Facebook contents, b) skeptical default disposition, c) a content's
credibility lies in its source, d) multiple strategies of fact-checking behavior,
e) the importance of fact-checking practices, f) the consequences of false contents,
g) accepting counter-evidence does not mean rejecting one's existing belief,
h) accepting counter-evidence means rejecting one's existing belief, i) being
objective is difficult, and j) there are belief-influencers.
DISCUSSION
Online encounter with fake news may have helped develop counter-strategies
and default belief-formation
Eighty percent of respondents reported being victims of false online entries by
outright believing them (as shown in Figure 1). These experiences (evidenced by
the range of contents encountered on Facebook, including fake news, shown in
Table 5 – Theme 1) may have led respondents to develop skills in determining
which contents are accurate. Figure 2 shows the particular counter-strategies they
employ to spot online items carrying inaccurate content. These strategies are
numerous, but examining dates, images, and authors, and cross-checking through
other credible sources were the most commonly reported; Theme 4 in Table 5
exemplifies these in the FGD narratives. Against this background, it is
understandable why skeptical and outright unbelieving dispositions have
developed or emerged among the participants of the study (as shown in Table 5 –
Theme 2). It may, therefore, be said that without these online deceptions,
respondents would not have developed skeptical thinking when reading online
content; this lends some evidence to the claim of truth-default theory that people's
default disposition is the belief that others communicate truthfully. Perhaps the
fact that people have fallen victim to online disinformation influences them to
develop practices adapted to the internet world, for instance, the ability to
determine the credibility of online sources (as shown in Table 5 – Theme 3).
Various fact-checking organizations have even been established to fight online
disinformation (Brandtzaeg & Følstad, 2017; Graves, 2018). Participants attested
to the significance of these practices while online (as shown in Table 5 – Themes
5 and 6).
scrutinizing the details (understandably, the survey shows that examining date
details ranks first among the fact-checking strategies of student respondents, as
shown in Figure 2). Thus, it shows how our personal interests shape our
belief-formation, and therefore the chances that we are led into Facebook
disinformation experiences (FDE). Insofar as truth-default theory is concerned,
this implies that our assumption of a content's truthfulness increases when such
content is personally appealing and important to us.
Belief perseverance (BP) may help clarify why AOT persists despite
proliferation of fake news online
Respondents' AOT level (as shown in Table 2), given their extensive experience
of false online content, seems contrary to common human experience: after many
encounters with deceptive content, respondents would be expected to assume less
about these contents' veracity or accuracy. Yet despite the proliferation of such
content, respondents still report an average level of AOT (shown in Table 2).
Table 3, however, reveals a positive correlation between AOT and BP, which
might explain this seeming contradiction. It suggests that while respondents are
conscious of the fact that the internet is no longer a truth-only space, their level of
belief perseverance allows them to still assume the truthfulness of these contents
because of the persistence of strongly held beliefs (Anglin, 2019; Thorson, 2015).
In fact, FGD participants admitted that being objective and entertaining emerging
evidence against a presently held belief is a difficult feat, saying further that they
would accept presented credible evidence but such acceptance would not mean
changing personally existing views (Table 5 – Themes 7 & 9). Thus, it could be
said that our susceptibility to believing these false online items is due not to the
intensity of our assumption of their truthfulness but to the fact that those items are
the very items we already deem true beforehand. In truth-default theoretical
discourse, this means that BP may serve to reinforce the predictive power of AOT
on our susceptibility to false contents. Surprisingly, however, BP does not predict
Facebook disinformation experiences (FDE), as shown in the regression analysis
in Table 4. Moreover, though some student participants reported that they would
not change their beliefs despite being challenged by credible pieces of evidence,
others are equally ready to set aside their existing beliefs if presented with more
truthful and convincing evidence (Table 5 – Theme 8).
Limitations of the study. Since the study included only college students, its
inability to survey youth outside formal education is one of its methodological
limitations. Another limitation is that respondents' Facebook engagement is the
only basis for analyzing their susceptibility to disinformation. Other online
activities and platforms that may also carry disinformation were excluded from
the study; taking them into account might yield a different outcome.
REFERENCES
Alaphilippe, A., Ceccarelli, C., Charlet, L., & Mycielski, M. (2018, June 1). Disinformation
detection system: 2018 Italian elections Case report. EU Disinfolab. Retrieved from
https://disinfoportal.org/wp-content/uploads/ReportPDF/Disinformation-Detection-
System-2018-Italian-Elections.pdf
Amazeen, M. (2015). Revisiting the epistemology of fact-checking. A Journal of Politics
and Society, 27(1), 1-22. doi:10.1080/08913811.2014.993890
Anglin, S. M. (2019). Do beliefs yield to evidence? Examining belief perseverance vs.
change in response to congruent empirical findings. Journal of Experimental Social
Psychology, 82, 176-199.
Beaulieu, P., & Reinstein, A. (2010). Belief perseverance among accounting
practitioners regarding the effect of non-audit services on auditor
independence. Journal of Accounting and Public Policy, 29(4), 353-373.
doi:10.1016/j.jaccpubpol.2010.06.005
Bradshaw, S., & Howard, P. (2017). Troops, trolls and troublemakers: A global
inventory of organized social media manipulation. Computational Propaganda
Research Project, Oxford. Working paper no. 2017.12. Retrieved from
https://ora.ox.ac.uk
Bradshaw, S., & Howard, P. (2018, September 17). The global organization of
social media disinformation campaigns. Journal of International Affairs, Special
Issue, 71(1.5). Retrieved from https://ora.ox.ac.uk
Brandtzaeg, P. B., & Følstad, A. (2017, November). Why people use chatbots. In
International Conference on Internet Science (pp. 377-392). Springer, Cham.
Bronstein, M., Pennycook, G., Bear, A., Rand, D. G., & Cannon, T. D. (2019). Belief in fake
news is associated with delusionality, dogmatism, religious fundamentalism, and
Tsikerdekis, M., & Zeadally, S. (2014). Online deception in social media.
Communications of the ACM, 57(9), 72-80. doi:10.1145/2629612
Vaque, L. (2018). Fake news in the food sector: Consumer distrust and unfair
competition. European Food and Feed Law Review, 3(5), 411-420.