
Cognition, Brain, Behavior.

An Interdisciplinary Journal
Copyright © 2020 ASCR Publishing House. All rights reserved.
ISSN: Print 2247-9228, Online 2601-226X
Volume XXIV, No. 2 (June), 75-91
doi:10.24193/cbb.2020.24.05

Falling victims to online disinformation among young Filipino people: Is human mind to blame?

Zaldy C. Collado1*, Angelica Joyce M. Basco2, Albin A. Sison2
1 Behavioral Sciences Department, De La Salle University, Manila, Philippines
2 Communication Department, Adamson University, Manila, Philippines
* Corresponding author. E-mail: zaldy_collado@dlsu.edu.ph

Abstract

Proceeding from the insights of truth-default theory, the article examines the extent to which the human psyche's default assumption of truth contributes to our susceptibility to believing false online content. The study attempts to trace the cognitive roots of people's susceptibility to online disinformation. The article also investigates how fact-checking behavior and belief perseverance relate to vulnerability to online disinformation. Data were gathered from 234 survey respondents and 16 participants in two focus group discussions (FGDs). All subjects were college students from Manila, Philippines. Regression analysis shows that the assumption of truthfulness and fact-checking behavior are statistically significant predictors of Facebook disinformation experiences, our measure of susceptibility. Among other findings, the study also observed that susceptibility to online disinformation is strong among Facebook users when the false content is favorable to them. The article aims to contribute to the understanding of the susceptibility of the human mind to the different forms of falsehood proliferating online.

Keywords: truth-default theory, assumption of truthfulness, fact-checking behavior, belief perseverance, susceptibility to online disinformation

The proliferation of false information on the internet negatively influences how people see other online content (Bradshaw & Howard, 2018; Brandtzaeg & Følstad, 2017; Krishna Kumar & Geethakumari, 2014). While truthfulness is valued in societies, some online content creators deliberately produce false information aimed at manipulating online readers or consumers. The literature assigns the term "disinformation" to such behavior. Disinformation refers to the act of deceiving people using, among other means, fake web entries; it is a conscious effort to mislead (Fallis, 2014). Disinformation is a common tactic in the online world for soliciting desired reactions from a targeted audience. The tactic has been observed during election periods (Alaphilippe, Ceccarelli, Charlet, & Mycielski, 2018; Bradshaw & Howard, 2018), around tense political issues (Krug, 2017; Mejias & Vokuev, 2017), and in consumer-related forums (Guilbeault, 2018; Vaque, 2018).
The fact that people can challenge, verify, or cross-check whatever information appears online is what makes disinformation a little less successful at face value (Graves, 2017). Nonetheless, the article intends to map out which particular set of tendencies or behaviors shapes our susceptibility to disinformation. For those who fight fake online content, it is not only about being able to respond with correct and accurate knowledge, but also about being able to identify the underlying reasons (tendencies or behaviors) why susceptibility to disinformation is possible to begin with (Ciampaglia, 2018; Pennycook & Rand, 2019). This is important in order to understand and address the problem at its roots.
Unlike other studies which suggest that lazy thinking (Pennycook & Rand, 2019), delusionality (Bronstein, Pennycook, Bear, Rand, & Cannon, 2019), or prejudice (Ray & George, 2019) accounts for our susceptibility to fake news or false content, this article starts by looking at a more primordial reason why people fall for online deception. We contend that the susceptibility problem lies deeper, in the natural disposition of the human cognitive process. Our argument is that it is neither laziness in thinking nor laziness in online investigation that fundamentally pushes people to believe fake news. We propose to test the hypothesis that it is our naturally embedded mental disposition to assume truthfulness rather than falsity that makes us easy prey to online deception. We would like to examine whether there is some degree of truth to the idea that people are biologically or psychologically wired to be unsuspecting of lies, leading to our susceptibility to online deception. Thus, the paper argues along the lines of truth-default theory, which states that people assume others usually communicate truthfully (Kalbfleisch & Docan-Morgan, 2019) or honestly (Levine, 2014).
Specifically, the article examines the role of Assumption of Truthfulness (AOT), a construct built on the idea of truth-default theory, in the human mind's receptivity to online deception as manifested in Facebook Disinformation Experiences (FDE). Likewise, the study looks at how Fact-Checking Behavior (FCB), conceptualized following the American Press Institute as an investigation of verifiable facts free from advocacy, partisanship, and rhetoric (Amazeen, 2015), and Belief Perseverance (BP), the tendency to maintain a belief even in the face of evidential discrediting (Beaulieu & Reinstein, 2010), contribute to a person's conscious acceptance of what he or she believes to be true. The present research hypothesized that AOT, FCB, and BP are significant predictors of FDE.
In addition, we hint at the idea that while this seemingly natural default disposition to assume truthfulness rather than falsity may explain our susceptibility, the proliferation of false online content nowadays seems to be reshaping this disposition toward the opposite end. Whether this shift means that the human psyche is starting to develop a skeptical disposition in order to adapt to the online world is not for the present paper to prove, though future studies may suggest that the rise of false online content is paving the way for an evolutionary change in human thinking.

METHOD

Survey
The quantitative component used a survey questionnaire. The survey contains measures for the concepts specifically constructed for this study. It comprises five sections: a socio-demographic profile (age, sex, last-semester average grade, monthly income, level of Facebook usage, year level) and questionnaires for the main variables, namely assumption of truthfulness (AOT), fact-checking behavior (FCB), belief perseverance (BP), and Facebook disinformation experiences (FDE).
Respondents were asked to describe their attitudes and dispositions regarding AOT, FCB, and BP on a five-point frequency Likert scale. For FDE, respondents were asked to check which items they had experienced on occasions when they fell victim to online disinformation. Only confirmed Facebook users were given the survey tools.
Descriptive, correlation, and regression analyses were run in the Statistical Package for the Social Sciences (SPSS) to obtain the results of the quantitative component.
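To make the analytic steps concrete, the same descriptive and correlational summaries could be reproduced outside SPSS. The minimal Python sketch below assumes a data file ("survey_responses.csv") and lowercase column names (age, sex, aot, fcb, bp, fde); these names are illustrative only and not the study's actual materials.

```python
import pandas as pd

# Hypothetical file and column names; the study ran these analyses in SPSS.
df = pd.read_csv("survey_responses.csv")  # one row per respondent, sex coded numerically

scales = ["aot", "fcb", "bp", "fde"]

# Descriptive statistics (M, SD, Min, Max) for the main variables (cf. Table 2)
print(df[scales].describe().T[["mean", "std", "min", "max"]])

# Pearson correlation matrix among the study variables (cf. Table 3)
print(df[["age", "sex"] + scales].corr(method="pearson").round(3))
```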

Participants
We gathered data from college students studying in Manila, Philippines. There were a total of 234 survey respondents. All respondents were college students, of legal age, and able to give their consent. We specifically chose political science students as our respondents/participants on the assumption that their inclination toward political discourse makes them more likely to encounter political content online (content that is often marred by questions about its accuracy). The research took place at a university in the city of Manila, with students coming from different cities in the Greater Manila Area, or the National Capital Region (NCR), and other nearby provinces. The university is a private, non-sectarian Catholic university. The same university provided the ethics approval for this study.

Instruments
Assumption of truthfulness. This construct is measured using a 10-item self-constructed scale. The construct gauges the level of students' assumption of the truthfulness of Facebook posts (news-related articles, government announcements, quotations from personalities, historical anecdotes, infographics, video presentations with voice-over, etc.). Using a 5-point Likert scale, respondents are asked to rate the items (for example, "I presume Facebook contents are truthful"). The scale points are: 1 = never, 2 = rarely, 3 = sometimes, 4 = often, 5 = always. The internal consistency of this scale is α = .73.
Fact-checking behavior. This construct is measured using a 10-item self-constructed scale. The construct gauges the level of students' engagement in fact-checking behavior vis-à-vis Facebook posts (news-related articles, government announcements, quotations from personalities, historical anecdotes, infographics, video presentations with voice-over, etc.). Using a 5-point Likert scale, respondents are asked to rate the items (for example, "I try to fact-check related Facebook contents by examining URLs and writers with questionable reliability and reputation"). The scale points are: 1 = never, 2 = rarely, 3 = sometimes, 4 = often, 5 = always. The internal consistency of this scale is α = .73.
Belief perseverance. This construct is measured using a 10-item self-constructed scale. The construct gauges the extent to which students allow their personal biases or prejudices to affect their views on the veracity of Facebook posts (news-related articles, government announcements, quotations from personalities, historical anecdotes, infographics, video presentations with voice-over, etc.). Using a 5-point Likert scale, respondents are asked to rate the items (for example, "I tend to dismiss or ignore Facebook contents that do not reflect my personal belief even if I did not investigate the truthfulness of the contents"). The scale points are: 1 = never, 2 = rarely, 3 = sometimes, 4 = often, 5 = always. The internal consistency of this scale is α = .73.
Facebook disinformation experiences. This construct is measured using an 8-item self-constructed scale. The construct gauges the extent to which young adults have experienced being susceptible to believing fake, non-credible, or false Facebook content (news-related articles, government announcements, quotations from personalities, historical anecdotes, infographics, video presentations with voice-over, etc.). Using a 5-point Likert scale, respondents are asked to rate the items (for example, "There was an instance when I honestly believed a not-so-credible or false Facebook content"). The scale points are: 1 = never, 2 = rarely, 3 = sometimes, 4 = often, 5 = always. The internal consistency of this scale is α = .64.
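Because all four scales are self-constructed, a reader working with comparable item-level data may want to verify internal consistency in the same way. The snippet below is a minimal sketch of Cronbach's α; the item column names (aot_1 … aot_10) are hypothetical and do not come from the authors' actual instrument.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are the items of one scale."""
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: 10 AOT items scored 1 (never) to 5 (always)
# df = pd.read_csv("survey_responses.csv")
# aot_items = df[[f"aot_{i}" for i in range(1, 11)]]
# print(round(cronbach_alpha(aot_items), 2))  # the paper reports alpha = .73 for AOT
```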

Procedure
The researchers randomly chose classrooms with ongoing classes of political science students. We introduced our study to the faculty members handling the classes, who afterwards allowed us to proceed with data gathering. The participants were oriented about the nature of the study before the survey questionnaire was given. It took them 10-15 minutes to complete the form. After the survey tools were collected, participants were thanked and debriefed.

Focus group discussion


The qualitative data were gathered through two sets of focus group discussions (FGDs). The FGD guide is composed of three questions inquiring about the assumption of truthfulness, three questions on fact-checking behavior, and four questions on belief perseverance. We sought to understand our cohorts' online attitudes regarding these variables. The researchers asked participants' permission to use a voice-recording device. A thematic analysis was conducted on the two sets of FGD transcripts.

Participants and sampling


The FGD participants were purposively selected. Only those who answered "all the time" to the survey question on level of Facebook usage were considered as participants, to ensure meaningful and relevant answers to the FGD questions, as the topic deals essentially with online engagement. The FGDs took place in a separate room right after the surveys were conducted. We gathered a total of 16 FGD participants, with 8 participants in each FGD. All participants were college students, of legal age, and able to give their consent.

RESULTS

Description of survey sample


Study respondents are generally second-year college students, whereas third-year college students are the least represented (see Table 1).
Table 1.
Descriptive Statistics for the Socio-demographic Characteristics
Variable                     Frequency (%)    Mean     SD      Min    Max
Sex
  Male                       108 (45.8)
  Female                     126 (53.4)
Year level
  First year                 59 (25.0)
  Second year                82 (34.7)
  Third year                 4 (1.7)
  Fourth year                62 (26.3)
  Fifth year                 27 (11.4)
Monthly income (Php)
  Under 40,000               72 (30.5)
  40,000 – 59,999            75 (31.8)
  60,000 – 99,999            41 (17.4)
  100,000 – 249,999          33 (14.0)
  250,000 and above          13 (5.5)
GWA
  1.00 – 1.25                10 (4.2)
  1.50 – 1.75                56 (23.7)
  2.00 – 2.25                36 (15.3)
  2.50 – 2.75                107 (45.3)
  3.00 – 3.25                21 (8.9)
  3.50 – 3.75                4 (1.7)
Facebook use                                  2.10     .93     1.0    5.0
Age                                           20.43    2.00    18     27
Note. GWA = general weighted average.

Most respondents are 20 years old and female, with females outnumbering males by at least 8%. In general, respondents belong to average-income families earning less than Php 60,000.00 per month (about US $1,200). Based on the latest reported general weighted average (GWA), most respondents are relatively academically underperforming. In addition, the majority of the respondents are active in their online engagement. Table 2 shows the descriptive statistics of the main variables; the correlations among variables are summarized in Table 3.

Table 2.
Descriptive Statistics for the Other Main Variables
Variable                                      N     α     M      SD     Min    Max
Assumption of truthfulness (AOT)              234   .73   2.49   .61    1.33   4.67
Fact-checking behavior (FCB)                  234   .73   2.32   .61    1.00   4.25
Belief perseverance (BP)                      234   .73   2.43   .61    1.10   4.50
Facebook disinformation experiences (FDE)     234   .64   4.32   2.03   0      8


Table 3.
Correlation matrix of study variables
1 2 3 4 5
1 Age -
2 Sex -.048 -
3 Assumption of truthfulness -.134* -.025 -
4 Fact-checking behavior .099 .050 -.279** -
5 Belief perseverance -.194** -.083 .283** -.263** -
6 Facebook disinformation experiences .099 .085 .172** .078 .079
*. Correlation is significant at the 0.05 level (2-tailed).
**. Correlation is significant at the 0.01 level (2-tailed).

Using our self-constructed scales, we specifically asked about the instances in which respondents fell victim to Facebook disinformation. The top three were: believing a false Facebook content to be true (80.34%), feeling happy upon reading/seeing content which was later proved to be false (75.21%), and feeling sad upon reading/seeing content which was later proved to be false (75.21%). Forty-seven percent admitted sharing (re-posting) false Facebook content, the fourth most common experience.
While it is clear that respondents may be accomplices in the spread of fake news by re-posting such content, respondents can also be instrumental in stopping fake news from proliferating further (Table 5 – Theme 12).
Table 5.
Themes from the Focus-Group Discussions

1. Facebook contents
Description: Facebook contents are diverse, ranging from entertaining posts to political propaganda and false or fake news.
Examples:
"Basically, my timeline features diverse contents. There are fake and valid news. There are a lot of fake news there because they are used for political agenda."
"Facebook contents are seasonal. If it is during election, campaign contents are common. It is during those times that fake news are rampant."
"Usually, I see memes in my feed and then politics-related content being shared by my friends."

2. Default disposition
Description: The proliferation of fake news online leads to a skeptical, even outright unbelieving, disposition.
Examples:
"Usually, I no longer believe in the contents I see in my Facebook. False contents were really common. There were times I clicked on contents which I found out later as misleading."
"It is normal to be skeptical about these Facebook contents. To be a social media literate one has to be skeptical about these contents."
"It's like 50/50. I don't believe immediately. Though I doubt the veracity, I don't jump into conclusion. I investigate."
"When I see certain news, I immediately thought that these news are false since fake news are rampant in the social media."
"My default position is that I see these contents automatically as false. That is because fake news is so common in Facebook. It is not the same as broadsheet or broadcasting media that I can consult credibly."

3. Content credibility
Description: The credibility of Facebook content lies in the credibility of its source.
Examples:
"For me, I know that the content is credible if the source comes from sites known to post accurate news."
"Usually, you would know immediately if the information is truthful by determining whether they came from known media outlet."
"Just look at the URL. Legitimate news usually came from sites ending with .org, .net, or .gov."

4. Fact-checking practices
Description: Fact-checking comes in many forms, such as cross-validation, grammar checks, and finding supporting evidence like photos and videos.
Examples:
"There are a lot of hoaxes, for example, the death of this person or that person. So, to verify such news I search in Google or YouTube to confirm."
"I examine the grammar of the content. Usually, fake news carries grammatically wrong contents."
"When I read a particular content, I search similar contents. If I found that there are similar stories from other sources, such content is legitimate. If you see news agencies telling the same story, chances are, that is true."

5. Fact-checking – importance
Description: Fact-checking is an important online practice owing to the proliferation of fake contents.
Examples:
"Fact-checking is important. There are a lot of liars in the world, in different aspects of life. In love, in government, in news. One must not be stupid. One has to really fact-check."
"It's important to fact-check so that we will not be misled. So that we will not believe what's wrong."
"Yes, it's important. It's important to be critical of what one reads. Because others can be affected of what you share in Facebook."

6. False content – consequences
Description: Sharing fake contents has the potential to destroy one's integrity and wrongly influence others' course of action.
Examples:
"When you upload or share false content, you are the one to be primarily bashed. If you cite a fake news, you will be the one to be criticized, so you will be the fake news."
"If they see your posted false content, the tendency is it will spread and be believed by others. It will have a domino effect. By that time, it's difficult to retract."

7. Acceptance of evidence – not a rejection of belief
Description: One may accept credible evidence against his/her belief, but this is not tantamount to rejecting the existing belief.
Examples:
"It's very difficult for me to change my beliefs. I think it's alright to accept a credible evidence against my belief. But my belief is still my belief. I will not change it."
"Let's say this current administration. I hate this administration. Perhaps, I will accept whatever credible evidence is presented to me saying that it is doing well. But I will not change my belief."
"Yes, I may accept the credible evidence against my views, but it will not change my mind. If the former president is still alive, I think the country will not be poor nowadays. I will stick to my belief."

8. Acceptance of evidence – basis for rejecting an existing belief
Description: Standing always for truth encourages changing an existing belief in the face of credible evidence.
Examples:
"Even if I have an existing belief but which is contrary to what a credible evidence suggests, I will change my mind. I stand for truth. If I idolize somebody and he violated something, that's a different story."
"For me, I stand with the truth. So, I will change my perspective if there's a credible evidence."
"I will accept the facts presented. So, I will stand with the truth by changing my views."

9. Being objective – difficult
Description: It is hard for people to be objective when confronted by contents that they do not believe.
Examples:
"Yes, there is judgment already against the article. Even if I haven't read anything, I judge based on the title alone. If the accompanying photos seem to be untrustworthy too, I am no longer interested to read further."
"Yes, I am not objective when it comes to a particular topic or content. Anything that says about that topic, I judge it."
"I don't think I can really be objective. I hate how this government runs the country. So, whatever I read in the news, I already have judgement."

10. Belief formation – influencers
Description: Teachers serve as a credible source of knowledge.
Examples:
"I consider my professors as source of credible knowledge because they are the ones knowledgeable. I have this particular teacher who really shapes my point-of-view."
"The credible ones are the professors since they are knowledgeable. They are credible since they do not just base on experience but on facts."
"The teachers are the ones credible since they tell based on facts not on experiences. That is their field of expertise, so they are the ones believable."

11. Fake news – susceptibility
Description: People seem to be increasingly susceptible to fake contents when, at face value, these contents work in their favor.
Examples:
"I have shared fake news about a class suspension announcement before."
"That time I was so lazy, I didn't want to go to school so when I saw a post regarding a suspension of class, I immediately shared it, which I learned later on was fake news."
"There was a national activity in December that time, so I was very excited since I intended also to attend the wedding of my parents. So, when I saw that announcement of class suspension, I was very happy without realizing that it was actually an announcement for last year."

12. Fake news – redeeming the self
Description: The victim of fake news may also be the key to discontinuing its proliferation.
Examples:
"I tell that what I posted was fake news. So, I admit it then I tell them not to believe it."
"When I discovered that what I shared was fake content, I told them that it was a joke. I even tweeted that people should not believe the rumors."
"After I learned that it was fake news, I told my friends about it. I also reminded them not to share it further and I said sorry."

Eighty percent of respondents reported being victims of false online entries by believing them outright (as shown in Figure 1).

Figure 1. Respondents' Facebook Disinformation Experiences

Figure 2 shows which specific fact-checking practices are observed by respondents. The most common are fact-checking by looking at the date (34.96%), examining the images (32.05%), cross-checking through other credible online sources (32.05%), and checking the credibility of authors or writers (31.62%). Despite being pronounced an important fact-checking strategy in the FGDs (Table 5 – Theme 10), consulting teachers or experts in the field ranks as the least used fact-checking method in the survey results (as shown in Figure 2).

Figure 2. Respondents' Fact-Checking Practices


Inferential analysis
The main purpose of the present study is to determine whether assumption of truthfulness (AOT), fact-checking behavior (FCB), and belief perseverance (BP) are significant predictors of Facebook disinformation experiences (FDE). A standard multiple regression was run to test this hypothesis. All correlations among the variables were weak to moderate, indicating that multicollinearity is unlikely to be an issue. The model is statistically significant, F(3, 230) = 4.064, p = .008, supporting our hypothesis that the variables predict FDE. However, the three independent variables combined explain only 5% of the variance in FDE. Among these variables, only assumption of truthfulness (β = .196, p < .05) and fact-checking behavior (β = .149, p < .05) were found to be significant predictors. The regression results are summarized in Table 4.
Table 4.
Predicting Facebook Disinformation Experiences from AOT, FCB, and BP
Predictor                        b       95% CI for b [Lower, Upper]      β
Assumption of truthfulness       .686    [.213, 1.158]                    .196*
Fact-checking behavior           .525    [.051, .999]                     .149*
Belief perseverance              .207    [-.240, .653]                    .062
Note. R2 = .050, Adjusted R2 = .038, F(3, 230) = 4.064, p = .008. * p < .05.
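The model in Table 4 corresponds to a simultaneous-entry regression of the form FDE = b0 + b1·AOT + b2·FCB + b3·BP + e. The sketch below shows one way such a model could be fitted with statsmodels to obtain unstandardized coefficients, confidence intervals, and standardized betas; the data file and lowercase variable names are illustrative assumptions, not the authors' materials.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset with one row per respondent and the scored scale variables
df = pd.read_csv("survey_responses.csv")

cols = ["fde", "aot", "fcb", "bp"]

# Standard (simultaneous-entry) multiple regression of FDE on AOT, FCB, and BP
model = smf.ols("fde ~ aot + fcb + bp", data=df).fit()
print(model.summary())  # unstandardized b, 95% CIs, R^2, and the overall F-test

# Standardized coefficients (betas): z-score all variables, then refit
z = (df[cols] - df[cols].mean()) / df[cols].std()
print(smf.ols("fde ~ aot + fcb + bp", data=z).fit().params.round(3))
```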

FGD Data
Our qualitative data reveal ten (10) major themes (as shown in Table 5): a) the diversity of Facebook contents; b) a skeptical default disposition; c) a content's credibility lies in its source; d) multiple strategies of fact-checking behavior; e) the importance of fact-checking practices; f) the consequences of false contents; g) acceptance of counter-evidence does not mean one is rejecting an existing belief; h) acceptance of counter-evidence means one is rejecting an existing belief; i) being objective is difficult; and j) there are belief influencers.

DISCUSSION

Online encounters with fake news may have helped develop counter-strategies and default belief formation
Eighty percent of respondents reported being victims of false online entries by believing them outright (as shown in Figure 1). Perhaps these experiences (evidenced by the range of content encountered on Facebook, including fake news, shown in Table 5 – Theme 1) have led respondents to develop skills in determining which contents are accurate and which are not. Figure 2 shows the particular counter-strategies they employ to spot online items carrying inaccurate content; these strategies are numerous, but examining dates, images, and authors, and cross-checking through other credible sources are the most commonly reported ones, and Theme 4 in Table 5 exemplifies these in the FGD narratives. Against this background, it is understandable why skeptical and outright unbelieving dispositions have developed or emerged among participants of the study (as shown in Table 5 – Theme 2). It may therefore be said that, were it not for these online deceptions, respondents would not have developed skeptical thinking when reading online content, which lends some support to the claim of truth-default theory that people's default disposition is the belief that others communicate truthfully. Perhaps the fact that people became victims of online disinformation pushed them to develop practices adapted to the internet world, for instance, the ability to determine the credibility of online sources (as shown in Table 5 – Theme 3). Various fact-checking organizations have even been established in order to fight online disinformation (Brandtzaeg & Følstad, 2017; Graves, 2018). Participants attest to the significance of having these practices while online (as shown in Table 5 – Themes 5 and 6).

AOT and FCB as predictors of susceptibility to false online contents (FDE)


The regression analysis found that assumption of truthfulness (AOT) and fact-checking behavior (FCB) are both predictive of Facebook disinformation experiences (FDE) (shown in Table 4). Assuming at the onset that Facebook contents are true makes it more likely that one falls victim to fake news. Perhaps this is not a surprising finding, since it is tempting to simply accept things as they are if we think they are true at the onset, which is evidenced by the negative correlation between assumption of truthfulness and fact-checking behavior (as shown in Table 3). Logically, this also tells us that our fact-checking behavior influences the likelihood of becoming victims of Facebook disinformation (FDE). However, the predictive power of these variables is minimal, at only 5% (as shown in Table 4). This suggests that other variables may further explain susceptibility to online disinformation. For example, other studies have found that higher levels of social media use (Nichols, McKinnon, & Geary, 2016), heightened emotionality (Martel, Pennycook, & Rand, 2019), and ideological preference (Guess, Nagler, & Tucker, 2019) are correlated with increased susceptibility. Another factor to consider, based on the FGD narratives, is that people become more susceptible when the inaccurate or deceptive online content they read is in line with their interests (Table 5 – Theme 11). In this case, student respondents shared that there were instances when they believed fake news regarding the suspension of classes (announcements resurfacing from the past). Since this kind of announcement was well within their interest, they acted on it immediately by sharing it both offline and online without scrutinizing the details (understandably, examining dates ranks as the top fact-checking strategy among student respondents, as shown in Figure 2). This shows how our personal interests shape our belief formation and, therefore, the chances that we are led into Facebook disinformation (FDE). In so far as truth-default theory is concerned, this implies that our assumption of a content's truthfulness increases when such content is personally appealing and important to us.

Belief perseverance (BP) may help clarify why AOT persists despite the proliferation of fake news online
Respondents' level of AOT (shown in Table 2), given their extensive experience of false online content, seems contrary to common human experience: after many encounters with deceptive content, respondents would be expected to assume less about the veracity or accuracy of such content. However, despite the proliferation of this kind of content, respondents still report an average level of AOT (shown in Table 2). Table 3, however, reveals a positive correlation between AOT and BP, which might explain this seeming contradiction. It suggests that while respondents are conscious of the fact that the internet is no longer a truth-only space, their level of belief perseverance allows them to keep assuming the truthfulness of such content because strongly held beliefs persist (Anglin, 2019; Thorson, 2015). In fact, FGD participants admitted that being objective and entertaining emerging evidence against a presently held belief is a difficult feat, adding that they would accept credible evidence presented to them but that such acceptance would not mean changing their existing views (Table 5 – Themes 7 and 9). Thus, it could be said that our susceptibility to believing these false online items is not so much a function of the intensity of our assumption of their truthfulness as of the fact that those items are the very items we already deem true beforehand. In truth-default terms, this means that BP may serve to reinforce the predictive power of AOT for our susceptibility to false content. Surprisingly, however, BP does not predict Facebook disinformation experiences (FDE) in the regression analysis in Table 4. Moreover, although some student participants reported that they would not change their beliefs despite being challenged by credible evidence, others are equally ready to set aside their existing beliefs if presented with more truthful and convincing evidence (Table 5 – Theme 8).
Limitations of the study. Since the study only included college students, one of its methodological limitations is that it was not able to survey young people outside the college population. Another limitation is that only respondents' Facebook engagement serves as the basis for analyzing their susceptibility to disinformation. Other online activities and platforms that may also carry disinformation are excluded from the study and might, if taken into account, yield a different outcome.


Conclusion. The article attempts to provide an understanding of our susceptibility to false content on Facebook. This is a particularly important task, since online deception may result in unforeseen consequences (Tsikerdekis & Zeadally, 2014). The study suggests that the disposition of the human psyche to assume the truthfulness of online content, without a thorough examination of its veracity, makes us more vulnerable to online deception. This means that the claim of truth-default theory that we assume people tell the truth most of the time may partly explain why people believe fake news or false content. Aside from our assumption of truthfulness, this study also suggests that our belief perseverance contributes to our inclination to believe (false) online content. While online deception is rampant, people seem not to care about accuracy as long as the content is in line with their beliefs, and thus they fall victim to online deception. Even so, the quantitative and qualitative evidence do not provide an extremely strong case for blaming the human psyche for our susceptibility to disinformation. Future studies that may contribute to a more integral and complete understanding of human susceptibility to online disinformation should be encouraged. One such study would be to investigate people's susceptibility to fake news that appears to validate their existing beliefs, as our findings suggest.

REFERENCES
Alaphilippe, A., Ceccarelli, C., Charlet, L., & Mycielski, M. (2018, June 1). Disinformation
detection system: 2018 Italian elections Case report. EU Disinfolab. Retrieved from
https://disinfoportal.org/wp-content/uploads/ReportPDF/Disinformation-Detection-
System-2018-Italian-Elections.pdf
Amazeen, M. (2015). Revisiting the epistemology of fact-checking. A Journal of Politics
and Society, 27(1), 1-22. doi:10.1080/08913811.2014.993890
Anglin, S. M. (2019). Do beliefs yield to evidence? Examining belief perseverance vs.
change in response to congruent empirical findings. Journal of Experimental Social
Psychology, 82, 176-199.
Beaulieu, P., & Reinstein, A. (2010). Belief perseverance among accounting
practitioners regarding the effect of non-audit services on auditor
independence. Journal of Accounting and Public Policy, 29(4), 353-373.
doi:10.1016/j.jaccpubpol.2010.06.005
Bradshaw, S., & Howard, P. (2017). Troops, trolls and troublemakers: A global Inventory of
organized Social Media manipulation. Computational Propaganda Research
Project. Oxford. Working paper no. 2017.12. Retrieved from https://ora.ox.ac.uk
Bradshaw, S., & Howard, P. (2018, September 17). The global organization of Social Media
disinformation campaigns. Journal of International Affairs, Special Issue, 71(1.5).
Retrieved from https://ora.ox.ac.uk
Brandtzaeg, P. B., & Følstad, A. (2017, November). Why people use chatbots. In
International Conference on Internet Science (pp. 377-392). Springer, Cham.
Bronstein, M., Pennycook, G., Bear, A., Rand, D. G., & Cannon, T. D. (2019). Belief in fake
news is associated with delusionality, dogmatism, religious fundamentalism, and
reduced analytic thinking. Journal of Applied Research in Memory and Cognition,
8(1), 108-117. doi:10.1016/j.jarmac.2018.09.005
Ciampaglia, G. L. (2018). Fighting fake news: A role for computational social science in the
fight against digital misinformation. Journal of Computational Social Science, 1,
147-153. doi:10.1007/s42001-017-0005-6
Fallis, D. (2014). A functional analysis of disinformation. iConference 2014 Proceedings,
621-627, doi:10.9776/14278.
Graves, L. (2018). Boundaries not drawn: Mapping the institutional roots of the
global fact-checking movement. Journalism Studies, 19(5), 613-631.
doi:10.1080/1461670X.2016.1196602
Graves, L. (2017). Anatomy of a fact check: Objective practice and the contested
epistemology of fact checking. Communication, Culture and Critique, 10(3), 518–
537. doi:10.1111/cccr.12163
Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of
fake news dissemination on Facebook. Science Advances, 5(1), eaau4586. doi:
10.1126/sciadv.aau4586
Guilbeault, D. (2018, September 17). Digital marketing in the disinformation age.
Journal of International Affairs, Special Issue, 71(1.5). Retrieved from
https://jia.sipa.columbia.edu
Kalbfleisch, P., & Docan-Morgan, T. (2019). Defining truthfulness, deception, and related
concepts. In T. Docan-Morgan (Ed.), The Palgrave handbook of deceptive
communication (pp. 29-39). doi:10.1007/978-3-319-96334-1_2
Krishna Kumar, K., & Geethakumari, G. (2014). Detecting misinformation in online social
networks using cognitive psychology. Human-centric Computing and Information
Sciences, 4, 1-22. doi:10.1186/s13673-014-0014-x
Krug, G. (2017). Destabilizations of news. Russian Journal of Communication, 9(2),
201-203. doi:10.1080/19409419.2017.1323175
Levine, T. (2014). Truth-Default Theory (TDT): A theory of human deception and deception
detection. Journal of Language and Social Psychology, 33(4), 378-392. doi:
10.1177/0261927X14535916
Martel, C., Pennycook, G., & Rand, D. G. (2019, September 9). Reliance on emotion
promotes belief in fake news. doi:10.31234/osf.io/a2ydw
Mejias, U., & Vokuev, N. (2017). Disinformation and the media: the case of
Russia and Ukraine. Media, Culture and Society, 39(7), 1027–1042.
doi:10.1177/0163443716686672
Nichols, C., McKinnon, L., & Geary, A. (2016). Rumor has it: Examining the effects of
Facebook addiction on political knowledge gullibility. The Journal of Social Media
in Society, 5(1), 229-264. Retrieved from https://www.thejsms.org
Pennycook, G., & Rand, D. (2019). Lazy, not biased: Susceptibility to partisan fake news is
better explained by lack of reasoning than by motivated reasoning. Cognition, 188,
39-50. doi:10.1016/j.cognition.2018.06.011
Ray, A., & George, J. (2019). Online disinformation and the psychological bases of
prejudice and political conservatism. Proceedings of the 52nd Hawaii International
Conference on System Sciences, 2742-2752. doi:10.24251/HICSS.2019.330
Thorson, E. (2015). Belief echoes: The persistent effects of corrected misinformation.
Political Communication, 33(3), 460-480. doi:10.1080/10584609.2015.1102187


Tsikerdekis, M., & Zeadally, S. (2014). Online deception in Social Media. Communications
of the ACM, 57(9), 72-80. doi:10.1145/2629612
Vaque, L. (2018). Fake news in the food sector: Consumer distrust and unfair competition.
European Food and Feed Law Review, 3(5), 411-420.
