
Critical Thinking and Its Role in Reducing Cognitive Bias

Wesley H.W. Tong

Introduction
“Education then, beyond all other devices of human origin, is the great equalizer of the
conditions of men...” ― Horace Mann

“The mind can only see what it is prepared to see.”


― Edward de Bono, Serious Creativity

In the long and storied history of critical thinking, the digital world of information,
misinformation, and disinformation has existed barely long enough to qualify as a
cameo, registering as little more than a blip on the historical radar by chronological
measurements. Yet its impact, during an era in which the dissemination of facts and
fiction compels audiences toward actions of global consequence, is unmistakable.

The semblances of this impact are multiple and complex; they reveal human flaws that
may be considered timeless, built into the seams of our nature, and yet their
manifestations vary in accordance with the times. In our current age, as we face a
deluge of information and stimulation brought upon by technological achievement, how
well we are equipped to weather this particular storm is the matter at hand.

Definitions of and Perspectives on Critical Thinking


The triumvirate of ancient Greek thinkers began with Socrates (c. 470-399 B.C.E.),
whose method of inquiry and examination vigilantly challenged people to support and
justify their own beliefs, and whose approach was rooted in the concept of evidence.
Using dialogue as his main platform, he relied on a continuous questioning and
answering of views between participants engaged in discussion as the foundation of the
Socratic method.

Although he left behind no writings of his own, his teachings and philosophies were
recorded by others, primarily his students Plato and Xenophon. Although the portraits
those two disciples wrote of their teacher do not always reconcile, they agree that
Socrates used his method to define beliefs, investigate suppositions, and eliminate
hypotheses whenever contradictions and failures in logic within them could be brought
to light. Through dialectic, he established these as fundamentals of critical thinking,
continually working to distinguish the demonstrably reasonable and logical from that
which lacked evidence or rational foundation, however much unsupported beliefs might
appeal to “our native egocentrism, however much they serve our vested interests,
however comfortable or comforting they may be.” He also propounded the challenging
of authority whenever evidence and deduction showed that doing so was grounded in
truth (Bartel et al., 1997).

The second of the Greek triumvirate, Plato, left his mark in large part through dramatic
depictions of his teacher's teachings. Since Socrates left no written legacy of his own,
the task of recording his ideas fell to Plato, whose contributions to the Socratic dialogues
remain among the most prominent and enduring within the genre. Featuring Socrates as
protagonist of the dialogues, Plato illustrates his mentor's spectrum of conceptions and
precepts, as well as his processes in thinking and discourse. More than one scholar has
posited that Plato included his own philosophies in these dialogues, incorporating them
into the colloquies of his protagonist and one-time mentor. For his many surviving
works that continue informing to this day, his role in the evolution of critical thinking
can probably not be overstated.

The third of the trio, Aristotle, credited with the statement “It is the mark of an educated
mind to be able to entertain a thought without accepting it,” also authored a supremely
important perspective on the nature of human persuasion, which applies to this day:
ethos, pathos, and logos. An acutely brief summary of these three modes of persuasion
would define ethos as the credibility of information (or of the source of information),
pathos as an appeal to emotions to be felt by an audience, and logos as an appeal to an
audience's sense of logic and reason (Aristotle, 4th c. B.C.E.). Arguably, awareness of
these modes has never been more significant than it is now, amid the constant stream of
information dispersed onto the world's populace.

Since the Greek triumvirate, different ages have seen their own contributions. A handful
of prominent names from centuries past should include (but not be confined to) Thomas
Aquinas, Abu Nasr Al-Farabi, Desiderius Erasmus, John Locke, Isaac Newton, Niccolò
Machiavelli, and Thomas Hobbes. Two other names can be credited with the first texts
regarding critical thinking: Francis Bacon, whose book The Advancement of Learning
(1605) argued the case for empirical science, rather than the natural tendencies of the
human mind, as the basis for discovery and fact; and René Descartes, whose Rules for
the Direction of the Mind (published 1684, 1701) detailed his method of critical thought
based on a “principle of systematic doubt,” in which every thought is questioned,
doubted, and tested (Bartel et al., 1997).

For modern definitions of critical thinking, especially those involving formal schooling,
we jump ahead to John Dewey, the educational pioneer. Dewey commonly used the
term “reflective thinking” to denote the concept which he actively injected into his
vision of curriculum. The ideas of problem-solving and proof were key to
differentiating thinking from thinking well, and finally to mastering the art of thinking.
Dewey examined the human tendency to form views based on insufficient knowledge,
and peered into the flaws of stereotype and prejudice (Dewey, 1910).

He set the stage for critical thinking in modern American education. Jumping ahead to
the 1980s, a movement to implement such skills in American education revealed an
elusiveness in precise definition. Was critical thinking to be treated as a “one-size-fits-all”
skill or as “domain specific,” i.e., a set of proficiencies particular to individual fields
(Clemens, 2014)? Could history, biology, literature, and economics be examined by a
single analytical process, or must they be treated disparately in the course of this
educational reform? Various interpretations of critical thinking as an educational concept
have both strengthened and divided the movement, but the seriousness with which it has
been undertaken is an unequivocally hopeful sign.

The Current Age of Information


As this writing examines the increasing need for critical thinking skills in the face of an
inundation of data and opinion, it offers the following outline of three factors shaping
informational landscapes around the globe.

The first factor is the increasing affordability and therefore accessibility of the internet to
more and more of the world's population. A Pew Research study published in 2019
found that in countries with advanced economies, 94% of adults own a mobile phone,
76% own smartphones, 90% use the internet, and 67% participate in social media. In
emerging economies, the corresponding figures are these: 83% own mobile phones, 45%
own smartphones, 60% use the internet, and 49% use social media (Silver, 2019).

A separate study done the same year by the Pew Research Center shows that of the ten
developing countries with the highest usage of smartphones, Vietnam ranked first, with
97% of the adult population having sole usage of a device and 1% having shared access
to one, while in India, at No. 10, 70% of adults had sole usage of a device and 13% had
shared access (Anderson et al., 2019). Both studies make evident that, as technological
efficiencies progress, technology is becoming cheaper, easier to purchase, and more
widespread in use.

The impact of this technology's distribution was dramatically demonstrated during the
Arab Spring of nearly a decade ago. As smartphones had become more available to
people in Egypt by that time, social networking services (SNS) played a significant role
in the ousting of President Mubarak (Sifry, 2011). Protesters used different platforms
toward that end: Facebook to schedule rallies, Twitter to coordinate in real time, and
YouTube to show the rest of the world what was happening.

The second factor shaping the face of information is the ease with which it can now be
posted. Templates offered with free blogs can make them appear as professional as the
websites of established news outlets, a level of polish that would have been unattainable
without professional help or hires at the turn of the century. For less than USD $100 per
year, low-cost websites can be acquired with domain names that obscure their identity as
personal blogs, imbuing them with a higher level of credibility.

Included in the backdrop of this second factor is the technological ease with which
information can be dispensed on massively popular platforms. Content has become not
only easy to distribute; thanks to digital advancements, it is now a simple matter even for
amateurs to portray and present it in convincing form. A technological blink ago, highly
trained web designers and coders were at the center of developing the internet's content,
whereas today's influencers, some in their teens, have access to phone applications that
construct videos within minutes.

The third factor is, objectively, a more disconcerting and potentially harmful reality.
There is much less accountability for veracity of published information than there was in
the pre-internet world. This ties in to the second factor, as previously writers and
publishers (of paper-based reading materials!) used to face greater costs to print and sell
their products than a blogger or website host would currently face. Similarly, published
misinformation and disinformation were vulnerable in many cultures to financial
penalties, i.e. lawsuits. Television and radio stations and networks, magazines, and
newspapers can be sued for publishing and airing material that they know to be false and
that may be adjudicated as defamatory. Social media companies, however, have a
special protection under law, specifically Section 230 of the Communications Decency
Act of 1996 (Grant, 2021). Individuals posting libelous content can be sued, but not the
host companies themselves. Section 230 is part of U.S. law and applies specifically to
American companies; European nations, Canada, Japan, and the majority of other
countries have no such statutes (EFF, Section 230 of the Communications Decency Act).
Notably, Google (which acquired YouTube in 2006), Facebook (which acquired
Instagram in 2012), and Twitter are all based in the United States.

On top of the legal difficulties involved with holding publishers of information
accountable to the truth is the sheer number of websites and posts now in existence,
some of which stand for the long term, others evaporating after 24 hours.

Cognitive Bias

“...being right all the time acquires a huge importance in education, and there is this terror of
being wrong. The ego is so tied to being right that later on in life you are reluctant to accept that
you are ever wrong, because you are defending not the idea but your self-esteem.”
― Edward de Bono, PO: Beyond Yes and No

The term cognitive bias was introduced by Amos Tversky and Daniel Kahneman in the
1970s and defined as “people's systematic but purportedly flawed patterns of responses
to judgment and decision problems” (Mata and Wilke, 2012). It can be paraphrased as a
subjective reality which possibly leads to perceptual distortion. Astrophysicist Neil
deGrasse Tyson defines it as “a strong, preconceived notion of someone or something,
based on information we have, perceive to have, or lack,” and “systematic errors in a
person’s subjective way of thinking.” He further explains that such biases serve as
“mental shortcuts the human brain produces...to quickly help it make sense of what it is
seeing” (Tyson, Masterclass, 2020).

The cognitive biases catalogued and explained online number in the hundreds—at the
time of this writing, Wikipedia's list includes 197—and the list seems to grow and evolve
with the zeitgeist. From commonplace vocabulary (e.g. stereotyping, false memory, gender
bias) to popularized and widespread psychological terms (e.g. contrast effect,
groupthink, Pygmalion effect) to references contemporaneous to our present day (IKEA
effect, Google effect), conceptions and perceptions of cognitive bias are developing
organically.

Of the many biases impacting our collective reality, confirmation bias is especially
relevant to our current era of information. Our human capacity to alleviate, if not detach
ourselves from, the degree to which this bias holds sway may be the crux of ensuring
that this internet age serves rather than ruins us. That is perhaps a bit heavy-handed...but
I feel it no exaggeration to say that access to a potentially unlimited plethora of data,
interpretation, and exchange should have shown us the great virtue of technology;
instead, it has brought to light a formerly hidden decay in our educational fabric.

Tyson's general definition of cognitive biases is that they are beliefs we hold which can
be disproven, which can be demonstrated as untrue, and which derive from an array of
underlying causes, one of which is a human desire to feel special. One aspect of this
manifestation of human ego is, as stated in the de Bono quotation at the start of this
section, the urgency to be right, the flip side of which is the urgency not to be proven
wrong.

Confirmation bias can be defined as “the effect that leads us to look for evidence
confirming what we already think or suspect, to view facts and ideas we encounter as
further confirmation, and to discount or ignore any piece of evidence that seems to
support an alternate view” (Yagoda, 2018), and it feeds directly into our ego's fragilities.
Tyson portrays it fairly succinctly, and adroitly:

There's something that I want to be true, or that I think is true, and I look at a hundred things
and I find the three things that are true and I say “See, I'm right!” Well, how about the nine out
of these ten that don't agree with you? You don't even notice them. You have devalued them.
They don't even show up on your radar. You remember the hits and forget the misses...You
went right to the thing that agreed with you and you sat back and you said “I'm even more right
than I thought I was because I found something that agrees.” This is the problem with searching
on the internet. Search engines on the internet are the epitome of confirmation bias.
(Tyson, Masterclass, 2020)

In The Social Dilemma, a 2020 docudrama, pioneers of the tech companies that sculpted
the internet into the likeness we now know voice their agreement with Tyson's sentiment
about search engines. Tristan Harris, Joe Toscano (both formerly of Google), Guillaume
Chaslot (formerly of YouTube), Tim Kendall, Justin Rosenstein, Sean Parker, Sandy
Parakilas (all formerly of Facebook), Jeff Seibert (formerly of Twitter), and Bailey
Richardson (formerly of Instagram) detail their use, and the effects, of algorithms and
artificial intelligence in the effort to keep users of their sites engaged and returning for
more—essentially, addicting people to their products. Put simply, algorithms developed
by a handful of engineers became highly effective in predicting the online actions of
their consumers, as well as in gaining and maintaining their attention. Developed
originally to maximize advertising revenue, the composite technology born from the
works of these engineers permeated the larger scheme of things, including the dissemination of
facts, misinformation, and disinformation. Search engines are able not only to direct
consumers to online shopping sites, but also to sources of news, science, editorials, and
any other content circulated by like-minded people. (The Social Dilemma, 2020)

“The difference that social media has made is the scale and the ability to find others who share
your world view. In the past it was harder for relatively fringe opinions to get their views
reinforced.”
― Will Moy, director of Full Fact, independent fact-checking organisation based in the UK

Public Trust and Distrust of Media

tribalism— behaviour, attitudes, etc. that are based on supporting and being loyal to a tribe or
other social group (Oxford dictionary)

There is probably no better example of distrust in the media and mainstream information
than the rise of the flat-earth movement. The year 2017 saw the start of flat-earth
conferences, where people from around the world who believe that the Earth is flat, not
a sphere, gather to share their views.

Although there seem to be no comprehensive studies regarding how many flat-earthers
exist, organizers of the 2019 conference in Dallas, Texas, reported approximately 600
attendees. David Weiss, one of the participants, explained his reason for going: "We've
all been communicating online (but) this brings us together so we can shake hands and
give each other hugs...We can collaborate, we can make new friends. Because guess
what, our old friends... we lost a lot of friends" (Picheta, 2019).

Flat-earth conventions have been described as a kind of perfect storm, where believers in
a litany of conspiracy theories have the chance to meet like-minded people with whom
to share world views. One of the premises of the flat-earth theory is that NASA and the
U.S. government have been lying to the public about the moon landing for more than 50
years. Mark Sargent, described as one of the founders of the flat-earth movement, has
credited YouTube for the movement's proliferation in recent years. " 'Flat Earth was a binge watch
on YouTube,' he adds, aided by algorithms and personalized recommendations that
turned flat earth research into a never-ending rabbit hole” (Picheta, 2019).

The binge-watch nature of websites employing algorithms has been the great game-
changer: an automatic and (depending on our user settings) continuous leading by the
hand to more stories and videos likely to further confirm our belief systems.

In America's situation, the sharp divide in diverging belief systems has been well
documented from the political perspective. A survey conducted in January 2020 by the
Pew Research Center examined public perception of thirty mainstream media outlets,
including network news (NBC, ABC, and CBS), cable news (e.g. Fox News, CNN,
MSNBC), periodicals (e.g. The New York Times, Wall Street Journal, Washington Post,
and Time magazine), radio shows (e.g. Rush Limbaugh, Sean Hannity), non-profit
organizations (e.g. the Public Broadcasting System, National Public Radio), and internet
media companies (e.g. Buzzfeed, Vox, Business Insider). Of the more than 12,000 U.S.
adults polled, trust and distrust ran largely along political lines. For example, of those
who identified themselves as liberal Democrats, 70% believed CNN (one of the outlets
dubbed “fake news” by President Donald Trump) to be a reliable news source, while
only 16% of self-described conservative Republicans had similar views of the cable news
outlet. Conversely, 75% of conservative Republicans relied on Fox News, which has
until recently been in favor with President Trump, while only 12% of liberal Democrats
called it a credible source. Among Democrats and Democratic-leaning independents,
more trusted than distrusted most of the thirty media outlets, the reverse being true
among Republicans and GOP-leaning participants in the poll.

Overall, the survey illustrates a continuing trend among American news consumers, the
tendency to gravitate toward sources of information that affirm their own political and
social views, which is in turn leading to a sharper divide in the media landscape. Amy
Mitchell, director of journalism research at the Center, qualifies this interpretation of the
study by noting that the findings “don't reveal completely separate media bubbles.
There are some news sources that both Democrats and Republicans turn to...”
(Gramlich, 2020, https://pewrsr.ch/2O8F7Tp), but results mentioned above nonetheless
quantify a widening gulf between people on opposite ends of political and social
spectrum, not only in their world views but also in the wellsprings from which they
derive their facts and data.

A later Pew Research study during the same year, concerning skepticism among the
American public toward the news media in general, explores some underlying factors in
its distrust:

One area in which this plays out is in perceptions of why errors occur in news stories.
Republicans overall are more likely to think that mistakes happen because of ill will.
Six-in-ten Republicans and Republican-leaning independents cite a desire to mislead
audiences as a major reason why significant mistakes make their way into news stories,
compared with about a third of Democrats (32%) who feel this way. (Gottfried,
Mitchell, and Walker, 2020).

The publication notes that “within the GOP, this view is especially prevalent among
Republicans who strongly approve of the job that Trump is doing as president.” The
study recognizes support for President Trump as “another dividing line,” since he has
repeatedly made reference to (most of) the U.S. media as “enemy of the people”
(Newsweek, 2020).

Although the United States, by its visibility on the world stage, may be considered one
of the starker examples of a widening chasm between belief systems among different
segments of its population, a Gallup poll conducted in 2018 showed similar occurrences
unfolding worldwide, as it concludes that “the more politically polarized a country is,
the less trust the public tends to have in journalists” (Ritter, 2019). These results are not
to be compared directly with those of either 2020 Pew Research study previously cited,
as trust in journalists, specific news outlets, and the news media in general are three
disparate matters. The survey findings of all three studies are mentioned here simply to
speak to a general growing skepticism and even cynicism toward mass media, or at least
toward chosen parts of the mass media.

Ritter goes on to conclude:

Trust in journalists is complicated. A high level of trust may mean the media and journalists are
doing a good job, or it may indicate an acceptance of false narratives by society. In contrast, a
trust deficit may mean society is "woke" or it may indicate excessive cynicism that portends an
acceptance of a post-truth reality. Regardless, a low level of trust can pose a danger because it
erodes the media's and journalists' ability to operate as the fourth estate that holds power
accountable and promotes civic discussion...As political division grows, the news media and
journalists willingly or unwillingly become participants in the political fray. Reporting on
contentious topics and attempts to hold powerful interests accountable can lead to
accusations of media bias. (Ritter, 2019)

As polarized societies become ever more tribal, logical and critical thinking become
greater necessities for deciphering the onslaught of information that can more and more
easily be tailored to fit any seeker.

Reliance on Social Networking Services

“Social media amplifies exponential gossip and exponential hearsay to the point that we don't
know what's true, no matter what issue we care about.” ― Tristan Harris, The Social Dilemma

“Over time, you have the false sense that everyone agrees with you, because everyone in your
news feed sounds just like you. And that once you're in that state, it turns out you're easily
manipulated, the same way you would be manipulated by a magician.”
― Roger McNamee, early Facebook investor and venture capitalist, The Social Dilemma

What Roger McNamee goes on to explain in his analogy between Facebook's operation
and a magician's manipulation is the mechanism of the set-up. Just as the magician
furtively chooses the card that the audience member will draw, so does Facebook select
the friends and links that it knows, through algorithms and A.I., will appeal to each user.
More subtle than forceful, the power of this manipulation through suggestion lies in its
lack of coercion; users feel in control due to the freedom of choice presented to them,
which, in fairness to Facebook and algorithms, is a factor built into the very framework
of SNS.

An inadvertent consequence materialized when this power of suggestion began to create
different realities for different users. Rashida Richardson, director of policy research at
A.I. Now Institute, follows McNamee's assertions with her own discernment that, as
users are pulled by algorithms and A.I. in directions pleasing to them, the set of facts
visible to each consumer is tailored to that individual, thereby denying a common frame
of reference that might be seen by a collective populace. “We are simply all operating
on a different set of facts. When that happens at scale, you're no longer able to reckon
with or even consume information that contradicts with that world view that you've
created” (Richardson, The Social Dilemma, 2020).

A counter view of SNS as innocuous recreation could certainly maintain that we are
simply living through a phase of ultra-effective advertising, made possible by
technology. And, short of a dystopian Matrix-like revolution executed by artificial
intelligence, this technology strips us of no freedoms and leaves us to decide how we use
it. Instead, one might contend, human judgment and its potential failings are a greater
reason to be cautious of the World Wide Web.

Following are two anecdotal examples of how misinformation on SNS, specifically
Facebook-owned platforms, led to real and grave results. The first took place in India, in
2018. A group of five Indian men were lynched in the Dhule district of the western state
of Maharashtra due to rumors spread through WhatsApp messages about child
abductors; the men were mistakenly believed to be traveling for that purpose. It was not
an isolated event, but rather one in a handful of such tragedies resulting from
misinformation spread through SNS over the course of several weeks in India (Gupta
and Wilkinson, 2018).

A second example was one of far more widespread and intentional harm. Myanmar, a
country with more than 18 million users of Facebook, which for many is the only
platform by which to acquire and share news, witnessed events that the United Nations
has deemed acts of genocide. Users targeting Rohingya Muslims incited through
Facebook acts of murder, rape, and the destruction of villages, ultimately displacing over
700,000 Rohingya. Facebook has publicly admitted a failure to monitor and shut down
bad actors and hate speech (British Broadcasting Corporation, 2018).

Distinctions Between Misinformation and Disinformation, and Why the Differences Matter

I think the problem is that in our school systems, and to some degree...with school boards
around the country that are mandating curriculum and textbooks, you start seeing this weird
watering down of scientific fact so that our kids are growing up in an environment—and this
connects to what I was saying earlier about the media—where everything's contested, that
nothing is true, because if it's on Facebook, it all looks the same. And if you're reading
something from a Nobel Prize-winning physicist next to some guy in his underwear writing in
his basement...On text it looks like it's equally plausible. And part of what we have to do a
better job of, if our democracy's to function in a complicated, diverse society such as this, is to
teach our kids enough critical thinking to be able to sort out what is true and what is false,
what is contestable and what is incontestable... And we seem to have trouble with that.
(Obama, 2016)

In the same interview, former President Obama posits that the erosion of “a common
baseline of facts” is at the crux of the problem, following with the entreaty, “...we can
have a conversation about how to deal with climate change, but if we have a big chunk
of the country that just discounts what 99% of scientists say completely, it's very hard to
think how we move the democracy forward.”

That same year, following the U.K.'s Brexit vote and the U.S. Presidential election,
researchers conducted a study analyzing 376 million Facebook users' interaction with
over 900 news outlets and found that “new information platforms feed the ancient
instinct people have to find information that syncs with their perspectives” as “people
tend to seek information that aligns with their views.” Results were documented in “The
Future of Truth and Misinformation Online” (Anderson and Rainie, 2017).

The publication references a BBC Future Now interview of 50 experts in internet
technology, summarized in “Lies, Propaganda and Fake News: A Challenge for Our
Age” (Gray, 2017). In it, Gray quotes Kevin Kelly, co-founder of Wired magazine:
“Truth is no longer dictated by authorities, but is networked by peers. For every fact
there is a counterfact and all these counterfacts and facts look identical online, which is
confusing to most people,” the latter portion of which echoes President Obama's
comment about fabricated news shared on SNS holding a deceptive sway over many
viewers.

One somewhat promising outcome of Anderson and Rainie's research surfaced in
another of their referenced studies, “Many Americans Believe Fake News Is Sowing
Confusion,” conducted immediately after the 2016 U.S. election. In it, nearly two-thirds
(64%) of Americans surveyed expressed concern that false news stories were
problematic in the current environment (Barthel et al., 2016), reflecting at least a
majority awareness among the public of this issue at a time directly following an event
of enormous political upheaval. Furthermore, those surveyed placed
responsibility fairly equally among three bodies: government, SNS sites/search engines,
and the general public. 43% of respondents felt that the general public bore great
responsibility in stopping the spread of fabricated news stories, while 45% placed such
onus on government and elected officials, and 42% on SNS sites and search engines.

The previously mentioned “The Future of Truth and Misinformation Online” takes an
extensive look into research conducted by Pew Research Center along with Elon
University’s Imagining the Internet Center, in which they surveyed technologists,
scholars, and strategic thinkers regarding the spreading of “fake news” by human agents
and online bots. The main question posed:

“In the next 10 years, will trusted methods emerge to block false narratives and allow the most
accurate information to prevail in the overall information ecosystem? Or will the quality and
veracity of information online deteriorate due to the spread of unreliable, sometimes even
dangerous, socially destabilizing ideas?”

Responses were almost evenly split: 49% predicted that yes, trusted methods would
emerge, while 51% thought no. Among the 51%, two key reasons cited were that bad
actors who produce fake news are able to appeal to the worst human instincts, and that
our brains are not wired to cope with the pace of today's technological change. The
more hopeful 49% cited faith that future technology could fix the problem of
mis/disinformation, as well as faith in human nature to assemble and collaborate when
faced with grave situations.

Of course, accidental misinformation and intentional disinformation are disparate
threats. But in the hopes of the hopefuls, technology and better inner voices could be
equally vigilant toward either peril; after all, wrong information is equally wrong. On
the other hand, intentional disinformation can be more carefully crafted to fit the
confirmation biases of website users, especially with the data mined and amassed
through algorithms. Furthermore, the A.I. employed by bad actors would go unreined
on those actors' own websites.

Perhaps unsurprisingly, a poll eliciting such a balanced response of optimism and
pessimism did not yield a conclusive resolution. However, both sides agreed in the end
on two things: support for an ethical and credible public press is integral to having an
informed populace, and information literacy will be a necessary component in future
education. (Anderson and Rainie, 2017)

“It’s not about the technology being the existential threat, it’s the technology’s ability to bring
out the worst in society. And the worst in society being the existential threat.”
--Tristan Harris, The Social Dilemma

Critical Thinking Education and Its Role In Defusing Bias

“We are allowing technologists to frame this as a problem that they're equipped to solve. That
is...That's a lie. People talk about A.I. as if it will know truth. A.I.'s not going to solve these
problems. A.I. cannot solve the problem of fake news.”
--Cathy O'Neil, data scientist and author of Weapons of Math Destruction

“There's the objective truth, which the methods and tools of science are invented and designed
to establish. Those are true whether or not you believe in them...You can keep your 6,000-year
universe, but understand that it's a personal truth that you get from your personal religion. If
you rise to power and have control over laws and legislation in a pluralistic land, it is a recipe
for disaster if you're going to take your personal truths and create laws that have to then apply to
everyone.” (Tyson, Real Time, 2019)

Before defining his conception of objective truth, Tyson prefaced it with two other truths
in his lexicon. The first was personal truths, or truths that no one can take away;
religious and theological beliefs would fall under this umbrella. The second was
political truths, or things that become accepted after constant repetition has set them
onto firm footing.

As a scientist, he can be expected to favor the last of the trio, objective truth. And, in
the context of his three definitions, it sounds to me as if he doesn't quite view the first
two as truths at all. Possibly, his characterization of personal truths could allow for
convictions that he himself might not necessarily share; for example, a belief in a
supernatural power that created the universe, while not proven by science, is not
disproven either. Political truths, likewise, could be either true or false: the main
criterion he cites is their acceptance through repetition, but he doesn't disallow the
chance that they might actually be fact.

The essential notions that I gleaned from his commentary were the expression of
tolerance and the suspension of judgment. As he confirms toward the end of his
narrative, this tolerance and suspension can only go so far; he goes on to explain that if
a creationist convinced that the world is 6,000 years old ever tried to write that history
into a Board of Education curriculum, he would draw the line. His allowance of these
three truths, while not absolute, is a way of acknowledging other perspectives and
showing them respect. It may also be an act of inclusion.

Observers and recorders of bias have put forth measures ranging from individual
cognitive activities to codified systems of thought in the effort to teach critical
thinking in schools and on an individual basis. One example of a designed program,
Philosophy in the Classroom (also called Philosophy for Children), was established more
than fifty years ago. Conducted with students from kindergarten onward, the
program encourages discourse among children at whatever level they are capable and
comfortable. My own experience with the program took place in public elementary
schools in Hawai'i.

A significant part of the program is exploring questions and topics through a series of
questions called WRAITEC, or the “good thinker's tool kit.” WRAITEC, an acronym,
presents the following concepts in simple language: [W] asks “What do you mean
by...?”, [R] asks for reasons to support thoughts, opinions, and feelings, [A] asks if
assumptions are being made, [I] seeks to recognize inferences, or the “If...then...” of our
thoughts, [T] asks if our statements are true, [E] asks for examples and evidence for our
claims, and [C] asks for counterexamples. The idea is for children to refer to this set of
questions during discussion and contemplation, through which they might recognize
insights that uncover flaws in their reasoning (Lipman et al, 1980). A wide array of
topics can be explored through this lens of discussion and conversation, from simple and
visceral feelings (e.g. What is beautiful?) to problem-solving and conflict-resolution.

Individual cognitive activities may also come singly. An example is Edward de
Bono's “PMI,” or Plus, Minus, Interesting. It is not an analytical tool for determining
the accuracy of information or the veracity of statements, but rather a means by which
one may reach decisions. I mention it here as part of de Bono's overall picture which, if I
may attempt to summarize, is one where a thinker considers multiple points of view.
Just as his Six Thinking Hats process does, PMI allows human beings to escape what he
terms “the intelligence trap,” a cognitive tendency to use thinking skills solely to support
our currently held positions. Broadening perspectives through PMI is in a sense
irrevocable, as we cannot unthink our thoughts, only deny them if we so wish (de Bono,
1995).

Another mental exercise that has been advocated by de Bono and others is a thought
process termed “slow-thinking.” Daniel Kahneman, half of the duo that coined the term
cognitive bias, sees slow-thinking as one of the most effective checks on bias. He
labels the quick-thinking part of the brain, the part consequently more liable to make
mistakes out of haste, as System 1. “Slow-thinking organizations” that institute policies
to monitor predictions and decisions and to require checklists could mitigate impulses
born from intuition, engaging a sort of System 2, a form of thinking in which, “with very
long-term training, lots of talk, and exposure to behavioral economics, what you can do
is cue reasoning, so you can engage System 2 to follow rules” (Yagoda, The Atlantic).

What Kahneman seems to be saying is that System 2 thinking will more likely step
back and digest information derived from a full (or at least fuller) picture, rather than
rely simply on initial and visceral impressions. Neil deGrasse Tyson's statements about
the importance of seeking and acknowledging a broad range of data run a similar
thread. In his Masterclass video, “Cognitive Bias,” he states, “The power of a single
person's testimony has way more influence than it deserves on our thoughts and on our
behaviors.” He goes on to illustrate with an anecdote in which a consumer seeking to
buy a car, having done ample preparatory research, witnesses a highly dissatisfied
customer at the car lot. The customer's vehement complaints would sway most people,
even if their research found that statistics, ratings, and reviews contradict the witnessed
charges. “What it means is, we don't trust data as much as we trust eyewitness
testimony. Singular testimony. You almost have to train yourself away from the
passions of people you know to then recognize the significance and value of the cold
statistics that actually contain access to the truth that you seek. That takes practice”
(Tyson, 2020).

Critical thinking can also move us away from cognitive bias through indirect paths.
When Tyson renders his portrait of a “cosmic perspective,” what compels him has
nothing to do with a conscious effort to be fair and accurate and objective (although
these are good things to be), but rather an overlying view of reality that inspires a
broader conception of himself and his surroundings. In his words, “Cosmic perspective
teaches you that you are special not for being different from everyone else, but for being
the same...has the power to reset what you think is important in life...has the power to
humble you, but in a good way...” He recounts a key moment in biology class when he
learned that “more bacteria live and work in one centimeter of my colon than the
number of people who have ever existed in the world...From that day on, I began to
think of people not as masters of space and time but as participants in a great cosmic
chain of being” (Tyson, “The Cosmic Perspective,” 2007). Whether the perspective be
rooted in cosmos or art or any field that transports, becoming free from biases
sometimes results from having forgotten them.

Information Literacy and Fluency

Finally, there is the strain of critical thinking most directly helpful in deciphering the
internet universe. The Association of College and Research Libraries defines
information literacy as a “set of integrated abilities encompassing the reflective
discovery of information, the understanding of how information is produced and valued,
and the use of information in creating new knowledge and participating ethically in
communities of learning” (ACRL, 2016).

UNESCO defines information literacy as a human right which “empowers people in all
walks of life to seek, evaluate, use and create information effectively to achieve their
personal, social, occupational and educational goals” (UNESCO, 2005).

The web-based multimedia resource, S.O.S. for Information Literacy, defines
information literacy as “the ability to locate, organize, evaluate, manage and use
information,” the long-term objective being to “lay the groundwork for success in every
phase of a student's life both in and out of school” (S.O.S. for Information Literacy).

There is a wide array of conceptions in the field of information literacy, some common
principles of which include knowing the content and nature of information to seek, being
able to effectively find information, being able to evaluate any found information, and
using information effectively toward a purpose or objective.

"Information Fluency is the optimal outcome when critical thinking skills are combined
with information literacy and relevant computer skills."
(Associated Colleges of the South, 2002)

Small and simple steps toward this literacy may require mere minutes. The non-profit
National Public Radio suggests keeping a handful of factors in mind: note the
domain name and URL, read “About Us” sections on websites, cross-reference
quotations, especially ones that may seem incendiary, check comments, and reverse
image search (Davis, 2016).

Conclusion
Distrust of government and institutions, the spinning of conspiracy theories, and
general disagreements over what is factually true are hardly new. Arguably, they are
necessary components of a functional society that endeavors to be open and truthful.
Since the time of ancient civilizations, questioning authority and asking questions about
our questions have been pillars of western thought and education.

What has changed is the ease with which information, misinformation, and
disinformation can now be spread. The internet has played an explosive role in how the
questioning pulls us, in what directions on the informational landscape we choose to go.
In the past, falsehoods were more difficult for an ordinary citizen to spread, for reasons
mentioned earlier—higher costs and fewer means to publish, less advanced technology
for amateurs wishing to produce high-quality and convincing circulations—and were
therefore less impactful.

SNS has allowed views held by small minorities to disseminate on a global scale, and
for people who hold such convictions to find like-minded thinkers more readily and
easily. As mentioned earlier, flat-earthers are a prime example of how the abundance of
online repetition has called into question, for multitudes of viewers, a scientific fact (that
the Earth is a sphere) whose contradiction was virtually unheard of in the era of modern
science, until recently.

Another very public rejection of conventional science arose during the past year in the
5G coronavirus theory. Whereas scientists and health officials caution that covid-19 is
spread through airborne respiratory droplets, believers in the 5G theory maintain that
radiation, via mobile phone signals, is responsible for infections (Carmichael and
Goodman, 2020). This theory gained traction around the world through Facebook posts
dating from January of 2020 (Lawrie and Schraer, 2020). The natural consequence was a
complication of society's effort to contain the virus, as believers in the 5G theory would
not accept masks as an effective measure. This divide in perspectives, unlike the
flat-earth vs. globe one, most likely had real and tangible effects on public health and
mortality.

The informational landscape in politics and social issues, too, has seen great shifts away
from conventional norms, also a result of the constant permutations that come with the
capacity to post materials almost instantaneously.

Jonathan Greenblatt, Chief of the Anti-Defamation League (ADL), the non-governmental
organization that tracks and studies hate crimes, has said, “Facebook is
the most sophisticated advertising platform in the history of capitalism” (Levine, 2020),
its algorithms leading viewers from content to similar content, affirming belief systems.
In that interview, Greenblatt referred to the site as “the frontline of fighting hate” due to
its accessibility to extremists. George Selim, a senior advisor to the ADL, sounds a note
of optimism in that article in light of efforts by companies such as Twitter, whose
approach to misleading content includes labeling tweets as “Misleading
Information,” warranting removal; “Disputed Claim,” warranting a warning; and
“Unverified Claim,” which warrants no action (Twitter, 2020). Facebook, as well, has
updated its policies to more vigorously report hate speech, implementing a third-party
audit of its content moderation system (Facebook, 2020), reducing the amount of
hate speech on its platforms, and banning over 250 white supremacist groups (Levine,
2020).

More recently, following the January 6, 2021 insurrection in which participants
overtook the Capitol Building in Washington, D.C., vocally supporting Donald Trump's
claims that he lost the November 2020 Presidential election due to fraud and
ballot-tampering, Facebook and Instagram suspended Trump's account indefinitely (but
for a minimum of two weeks); Twitter has closed his account permanently (Hoffman,
2021).

Organizations evidently have available options for mitigating the spread of
misinformation, disinformation, and propaganda, but the efficacy of their measures is
limited. After Twitter banned former President Trump, his base supporters left the
platform en masse in favor of websites friendlier and more in line with their views, such
as Parler, whose servers crashed due to an overwhelming multitude of new users, and in
subsequent days went dark following suspension by Google and Apple (Levine, January
2021). A similar exodus took place from Fox News after it called the election for Joe
Biden in November 2020; displeased and betrayed by their chosen news source, exiting
Fox viewers found alternate platforms that supplied confirmation to their beliefs.

And so, it seems, systems alone cannot stem the tide of inaccuracies and falsehoods, and
we circle back to the individual. How to learn introspection and self-examination is one
of the primary burdens in an age of information. How to see outside our own spheres, to
step out of our own egos.

One recent reading has comforted me over the past few months, and it had nothing to do
with research for this writing. In Leadership In Turbulent Times, the presidential
historian Doris Kearns Goodwin outlines the lives and trials of four past U.S. Presidents,
the first of them Abraham Lincoln. This was a man discouraged from attending school
by his father, who deemed it a waste of days that would better be put to farming. As a
boy, Lincoln borrowed as many books as he could lay hands on, that they might
accompany him out in the fields; he would walk fifteen miles to the nearest courthouse
to absorb whatever experiences he might witness; as a young man living in New Salem,
he would hike twenty miles to Springfield in order to borrow books about the law. (Yes,
it seems those stories of long walks and borrowed books were true.) Besides law, he
taught himself and explored a myriad of other subjects and practices, among which were
prose, poetry, grammar, speech-writing, geometry, trigonometry, history, government
and civics and, of course, politics. The obstacles which he overcame during his self-
education—his father, the distance he needed to traverse, the scarcity of resources
available to him in the countryside—were a wellspring of both melancholy and
life-affirming humor (Goodwin, 2018). Imagining what he, as a young man and an older
one, might have discovered and nurtured in his self-education with the benefit of a
resource as massive as today's internet is stunning for its possibilities, just as it is
somewhat poignant for its mistiming; he and the worldwide web missed each other by
about 150 years. In reading his biography, letters, and speeches, I have no doubt that his
was a mind equipped to filter misinformation and disinformation, and that intelligence
in general (though his seems to have been rare and perhaps even singular) spans all
generations, is not beholden to technology, and depends not on the era into which one is
born. The potential among us continually renews, adaptation ever survives; we can
decide on any day to use knowledge for the better and, by expanding its reach through
new inventions, use it ever for the advancement of our being.

References
Anderson, Janna and Rainie, Lee. “The Future of Truth and Misinformation Online.”
Pew Research Center, October 2017.

Anderson, Monica, et al. “Use of smartphones and social media is common across most
emerging economies,” Pew Research Center, 2019.

Aristotle. The Rhetoric, 4th century B.C.E.

Associated Colleges of the South. ACS Information Fluency - Definition, 2002.
Retrieved from http://www.colleges.org/_oldsite/techcenter/if/if_definition.html

Association of College & Research Libraries. Framework for Information Literacy,
2016, p. 8. Retrieved from
http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/framework1.pdf

Bacon, Francis. The Advancement of Learning, 1605.

Bartel, Ted; Elder, Linda; Paul, Richard. “A Brief History of the Idea of Critical
Thinking.” State of California, California Commission on Teacher Credentialing, March
1997.

Barthel, Michael, Holcomb, Jesse, and Mitchell, Amy. “Many Americans Believe Fake
News Is Sowing Confusion.” Pew Research Center, December 2016.

The British Broadcasting Corporation. “Facebook admits it was used to 'incite offline
violence' in Myanmar.” BBC News, November 2018.

Carmichael, Flora and Goodman, Jack. “Coronavirus: 5G and Microchip Theories
Around the World.” BBC Reality Check, June 2020.

Clemens, David. “Here is How Academics Ruined Critical Thinking.” The James
Martin Center for Academic Renewal, 2014.

Conger, Kate and Isaac, Mike. “Inside Twitter's Decision to Cut Off Trump.” The New
York Times, January 2021.

Davis, Winnie. “Fake Or Real? How To Self-Check The News And Get The Facts.”
NPR, December 2016.

De Bono, Edward. Tactics: the Art and Science of Success. Gardners Books, February
1995.

Dewey, John. How We Think, 1910.

Domonoske, Camila. “Man Fires Rifle Inside D.C. Pizzeria, Cites Fictitious Conspiracy
Theories.” NPR, December 2016.

Electronic Frontier Foundation (EFF). “Section 230 of the Communications Decency
Act.”

Ellis, Jonathan and Hovagimian, Francesca. “Are School Debate Competitions Bad for
Our Political Discourse?” New York Times, October 2019.

Goodwin, Doris Kearns. Leadership in Turbulent Times. Simon & Schuster, 2018.

Gottfried, Jeffrey; Mitchell, Amy; Walker, Mason. “Americans See Skepticism of News
Media as Healthy, Say Public Trust in the Institution Can Improve.” Pew Research
Center, August 2020.

Gramlich, John. “How Pew Research Center evaluated Americans’ trust in 30 news
sources.” Pew Research Center, January 2020.

Grant, Rebecca. “Trump vs. Twitter and Facebook: What's Really Behind Feud That
Banned Trump From Sites?” Fox News, January 2021.

Gray, Richard. “Lies, propaganda and fake news: A challenge for our age.” BBC Future
Now, March 2017.

Gupta, Swati and Wilkinson, Bard. “WhatsApp India: Five lynched after online child
kidnap rumors.” CNN, July 2018.

Hoffman, Adonis. “Twitter, Facebook Right to Block Trump—Big Tech Must Self-
Regulate to Protect Public Safety.” Fox News, 2021.

InformationLiteracy.org. (2006). S.O.S. for Information Literacy. Retrieved from
informationliteracy.org.

Komando, Kim. “Twitter Rival Parler is in the Post-Election Spotlight: Here's Why.”
The Kim Komando Show, Fox News, November 2020.

Lawrie, Eleanor and Schraer, Rachel. “Coronavirus: Scientists Brand 5G Claims
'Complete Rubbish.'” BBC Reality Check, April 2020.

Levine, Jon. “Trump supporters ditch Twitter en masse after president's suspension.”
Fox News, January 2021.

Levine, Mike. “Domestic terrorism and hate exploded in 2020,” ABC news, December
2020.

Lemon, Jason. “On Fox News, Trump Says Most of the Media is the Enemy of the
People, but Fox News Isn't.” Newsweek, 2020.

Lipman, Matthew, Oscanyan, Frederick S., Sharp, Ann Margaret. Philosophy in the
Classroom. Philadelphia: Temple University Press, 1980.

Mata, R. and Wilke, A. “Cognitive Bias.” Encyclopedia of Human Behavior (Second
Edition), 2012.

Obama, Barack. 2014 State of the Union address.

Obama, Barack. Interview on Real Time with Bill Maher, November 2016.

Paul, Richard. “What Is Critical Thinking?” (abstract) National Forum, 1985.

Picheta, Rob. “The flat-Earth conspiracy is spreading around the globe. Does it hide a
darker core?” CNN, November 2019.

Ritter, Zacc. “How Much Does the World Trust Journalists?” Gallup, 2019.

Sifry, Micah L. “Did Facebook Bring Down Mubarak?” CNN News, February 2011.

Silver, Laura. “Smartphone Ownership Is Growing Rapidly Around the World, but Not
Always Equally.” Pew Research Center, February 2019.

Sher, Robert. “How to Find the Millennials Who Will Lead Your Company.” Forbes
magazine, 2014.

Tyson, Neil deGrasse. “The Cosmic Perspective.” Natural History Magazine, April
2007.

Tyson, Neil deGrasse. Masterclass: Scientific Thinking and Communication.

UNESCO. “Information Literacy.” United Nations Educational, Scientific and Cultural
Organization, 2005.

Yagoda, Ben. “The Cognitive Biases Tricking Your Brain.” The Atlantic Magazine,
September 2018.