The Critical Thinking Effect: Uncover The Secrets (Critical Thinking & Logic Mastery)
THINKNETIC
Did You Know That 93% Of CEOs Agree That This Skill Is More
Important Than Your College Degree?
Why your thinking is flawed and what it takes to fix it (the solutions
are included)
Tried and true hacks to elevate your rationality and change your life for
the better
Enlightening principles to guide your thoughts and actions (gathered
from the wisest men of all time)
Introduction
Afterword
One Final Word From Us
Continuing Your Journey
References
Disclaimer
INTRODUCTION
Facts
What exactly is a fact? More importantly, how do you know
something is a fact? What if it is merely an opinion or a claim?
A fact is a piece of information that we can verify. We can observe it,
or we can find out it is true from a reliable source. For example, the
atomic number of carbon is six; the US Civil War took place between
1861 and 1865; Armstrong was the first to walk on the Moon. These
are all facts.
Without diving deep into philosophy, we can note here that ‘truth’ is
an abstract idea. Nobody can say for certain what is real (or fake,
come to that). Later, when we look at scientific skepticism, we will
examine this in more detail. For now, let’s assume that truth can
exist and that critical thinking at least brings us closer to it.
As a critical thinker, you should inspect any so-called fact you
encounter. How do you know it is a fact and not an assumption? Try
to verify the ‘fact’ yourself; it may be an assumption if it cannot be
verified. On investigating, you may find the assumption is incorrect.
In some cases, assumptions are the best we can do. For example, if
you were designing a novel product, at first, you might assume that it
would appeal to customers who buy related products. Later, you
could gain evidence using market research.
When you investigate a given fact, you might find that it is outdated
or even a total misconception. Scrutinize facts, and learn to
recognize the good facts and reject the bad ones.
Opinions
Opinions may resemble facts, but they are subjective judgments.
People often misrepresent opinions as facts, perhaps because
strongly held opinions may even feel factual. Opinions are always
evaluative or comparative, even if they use the same form as a fact
by stating that something ‘is’ something. Saying that something is
the best must, therefore, be an opinion.
Take this statement:
“Joseph Bloggs is the best downhill skier because they have won the
most gold medals.”
This sentence is an opinion based on a fact. You can verify or falsify
the fact that they won the most medals by consulting medal tables. The
opinion that such a fact makes Joseph Bloggs the best downhill skier
cannot be verified: it is somebody’s perspective.
A new skier may be the best, even when they have not won anything
yet. They might be able to beat Joseph Bloggs in every race, but if
medal count is taken as the measure of skiing ability, the new skier
cannot be said to be the best.
Our motivations, attitudes, and emotional states have huge effects
on our opinions [2,4]. This renders opinions vulnerable to all sorts of
biases; not surprisingly, two people with identical information can
very easily hold opposite opinions. Of course, opinions can change
completely over time and need not be based on facts at all.
Claims
Like opinions, claims are often wrongly presented as facts. Claims
may be factual, but the definition of claim is an assertion presented
without proof. Therefore, distinguishing claims from facts is easy;
you just need to check whether the source supplies any evidence for
the claim.
Claims can be implied rather than stated. ‘Before and after’ photos in
beauty adverts are a good example. The adverts may or may not
overtly claim that the treatment improves the skin, but the skin
certainly looks healthier in the ‘after’ photo.
Companies produce adverts to make viewers spend money rather
than to show them the truth, which leads advertisers to present
claims as facts. But claims crop up in the wild, too.
Conspiracists claim that mankind did not land on the Moon in 1969
and that NASA faked the mission using camera tricks in a television
studio. We can call this a claim because there is no evidence of
the proposed fakery.
A fake Moon landing would entail faking a lot of evidence. Fake
technical data and fake filmed footage are only the beginning. NASA
would have had to persuade their entire staff to give fake
testimony, not to mention produce fake paperwork.
Evidence
It is not just conspiracy nuts who persist even when faced with
overwhelming evidence against their beliefs [2]. We all do it. At times,
we are all guilty of ignoring or misunderstanding evidence. This
leads us to an important question: what exactly is evidence, and how
should we use it?
Evidence is an everyday term, but as critical thinkers, we need a
more technical definition. Evidence refers to a body of information
that supports a given position.
We typically use evidence to prove or disprove a belief or decide
whether a judgment or opinion is valid. Of course, you need
evidence from different sources.
A good body of evidence comes from multiple reliable sources.
Imagine overhearing a conversation at a party. Somebody claims
that ‘investments are a great way to make money.’ A successful
investor is listening; he nods enthusiastically and starts bragging
about the huge profits he has made. Wouldn’t you want to hear the
other side?
The more evidence supports a conclusion, the more likely that
conclusion is to be true. You might collect evidence from pre-existing
sources or decide to gather your own.
Picture a range of experts who are interested in why people fall into
problem gambling. A medic might not agree that sociology surveys
are the best way to research this, whereas a sociology professor
might think they are the only way that makes sense.
However, the two researchers would examine different aspects of
addiction. The medic in this example decides to look at physical
differences in the bodies and brains of addicts and non-addicts;
perhaps pre-existing variation predicts who can gamble casually
without becoming addicted. In contrast, the sociologist wants to look
at socioeconomic factors like gamblers’ family situations, housing
issues, and poverty.
The gambling study could involve neuroscience, interviews with
gamblers, big data science, and more, in addition to surveys and
clinical studies. All these approaches are helpful because they look
at the problem at different levels. The resulting body of evidence,
taken together and processed according to good logic, could
generate more robust data than the medic or the professor alone.
The group can investigate all potential causes of gambling and
compare how well all the different factors predict who becomes a
problem gambler.
In conclusion, uncertainty is a good thing because it drives us to
examine problems in more depth. You can never gather all the facts
or examine all the evidence. The best you can do is test your ideas
and beliefs and improve them as you go along, based on a wide
range of evidence.
Source Of Message
Firstly, find out about the source. Sources are individuals or
organizations, and the following advice applies to both.
A source may be an expert on some topics and naive about others.
Sources may be biased, have special interests in certain topics, or
pet theories. They may be more or less reliable, more or less
trustworthy. Think about the following aspects of the source:
Is it an academic or government publication? We have to assume
these are more trustworthy than commentators. This is because their
vested interest lies in providing accurate information for the
population, whereas commentators’ motivation is more variable.
Is the source paid (or rewarded in some other way) for conveying the
message? Publishers can and do pay experts to communicate
specific information.
Where do they get their information? Is it a primary or secondary
source? Secondary sources can misquote primary sources. They
might even treat other secondary sources as if they were primary
sources. This magnifies errors and misconceptions. Find the original
information if you want to assess it fairly.
What does your expertise tell you? If the source is somebody you
know, perhaps you know that they make outlandish claims quite
often. This could factor into your assessment.
When analyzing messages, especially from people you know,
remember that people’s reasoning skills vary. The source may not be
aware of all the aspects just described, and they may feel that they
have made a very good case. Perhaps with a good debate, you can
help them to improve.
At times, we all forget our deep-seated assumptions and
motivations. Do not forget that critical thinking takes practice.
Purpose Of Message
Next, examine why the source composed the message. Knowing a
message’s purpose may alert you to possible distortions and half-
truths. What was their real motivation?
Here, you need to view the message’s fine details. If it is on a
website, what kind of website? For example, somebody’s private
blog has a different purpose from a government website. See
whether they have declared any interest in the products or topics
they discuss; like influencers, blog writers are often given ‘freebies’
in exchange for promoting the product.
A message might not be an obvious advert, but still be a promotional
text. For example, companies often feature blogs about their
products and services; you would not necessarily take these texts at
face value. Instead, think about what interest the company might
have in the topic: web traffic, affiliate links, direct purchases, or
simply to get you reading more about pet insurance.
People make persuasive messages for many reasons, and they can
be subtle. Analyze the language to detect whether the message
might be covert persuasion rather than unbiased information.
Persuasive texts may feature many adjectives and adverbs, chatty
language, and high-school rhetorical devices like alliteration and the
‘rule of three.’
Word choices also reveal the author or speaker’s biases and
opinions. Say you are reading reviews of a scientific book about
climate change. One reviewer refers to the ‘climate scare,’ whereas
the other calls it the ‘climate emergency.’ They have a different
opinion, but in the context, both phrases mean the same thing.
Another aspect of purpose is that the source may prefer one
conclusion or decision from the outset. They might then filter out and
distort the evidence to support the position they have already
chosen. You can tackle this issue by using alternative sources to
research the topic and filling in those gaps yourself.
Field Of Work
As well as the source’s motivation and the message’s purpose, you
must understand at least something about their field of work. This is
even more important if it is not your specialty. You need to get to
grips with the basics.
Firstly, what are the fundamental goals? Imagine a hospital where
radiographers and nurses work together to produce and analyze
magnetic resonance images. Radiographers aim to produce the best
images possible, whereas nurses aim to keep patients comfortable
and well-informed about the procedure. Sometimes these goals
might clash since the scanning procedure is uncomfortable and
noisy. Specialist staff at all workplaces need to work together in this
way to be effective.
Similarly, to assess the truth or falsehood of a message, you must
understand the sphere the source of the message works in. This
contextual information enables you to judge the message on its own
merits. After all, there is no point judging the quality of a
radiographer’s work in the same way you would judge nursing care.
Secondly, what basic concepts or assumptions does the source
employ? Individuals may not even be aware of their basic
assumptions, but you, as a critical thinker, should be able to discern
them.
In everyday life, a basic assumption might be that when you enter a
table service restaurant, you wait in line, and then somebody shows
you to a table. You do not have to ask somebody what to do; you just
know. Similarly, physicists assume that light’s speed is a universal
constant; they do not attempt to measure it in every experiment.
Finally, what kinds of data do they use to expand their knowledge
and inform their decisions? Whether you agree with the specific
methods or not, try to assess them fairly rather than from a
prejudiced position. Be flexible yet rigorous, like a scientist.
Research the message behind the message you receive, and put
your critical thinking skills to good use.
Action Steps
Now that we have explored the features of critical thinking and how
to interpret messages better, it is time to put some of these ideas into
action. Try these suggested exercises.
1. The Fact Check
Identify a purported fact, either from your work or the media. This
can be anything, as long as you can read it in context and research it
to analyze it.
Suggestions:
Feel free to ask any other relevant questions you can think of, based
on what we have looked at in this chapter. Now that you have done
this once, you have a framework for assessing messages you
receive using critical thinking.
2. Observational Study
Firstly, visualize a person you think has good critical thinking skills.
Write a few notes about them using the questions below, or make a
mind map.
What kind of person are they? What have they said or done that
makes you think they are great at critical thinking? What outcomes
do they produce?
Examine the evidence you have written down, and conclude whether
this person is a good critical thinker. Perhaps bring this exercise to
mind next time you speak to them or witness their critical thinking,
and make a few more observations.
Now repeat the same process for somebody you think has poor
critical thinking skills, including what makes you think they are bad at
critical thinking. Put your notes or mind maps side by side and
compare them.
This exercise will help you focus on the good (or bad) critical
thinkers’ traits and behaviors. It also starts you thinking about the
real-world applications of critical thinking.
Summary
In the story at the beginning, the University relied on its staff to
disclose conflicts of interest, and they trusted the market data that
the company reported. However, multiple factors, including
misplaced trust in Keller, led them to invest in a failing company.
A poor decision cost the University more than just money. Why?
Could this have been prevented if the Finance Committee had
applied what we have learned in this chapter? Perhaps.
Emotions played a role in the investment: the desire for success,
trust in Keller. They appraised the company’s success incorrectly
due to inadequate evidence (they relied on the market data). Keller,
the investment recommendation source, turned out to be unreliable
due to having a personal interest in the company.
In the story, the University did not have all the information needed to
make the correct decision. No doubt, you will have been in similar
situations yourself. Hopefully, the techniques covered so far have
equipped you with more tools to deal with information you encounter
in the future.
Apart from features of the information we receive, what else keeps
us from getting to the truth? The answer is complex, and we will
delve into it very soon in the next chapter.
Takeaways
1. Critical thinkers must distinguish between facts, opinions, claims,
and evidence.
2. You should be realistic and even humble about your knowledge.
However, pairing logic with your own experience is a key part of
thinking critically.
3. Remember to assess the author and their motivation, as well as
the message.
4. Use multiple reliable sources, including other people, to help you
reason towards better conclusions and decisions.
2
“Mom! Dad! I need to speak to you!” the kid yelled. He had just got
back from his first day at grade school, and he had serious beef
with his parents.
“What is it?” asked the concerned parents.
“The other kids all laughed at me.”
A sad tale of juvenile bullying, you might think. Yes, but there was
more to it. The kid had started school with something fairly crucial
missing from his social life.
His parents were overjoyed when he was born. As high achievers
themselves, they wanted their children to do well in life.
The kid’s father had heard about an interesting research study. He
spoke with his spouse, and they both agreed it could not harm their
child.
The study demonstrated the famous Mozart Effect. First published in the early
1990s, this experiment indicated that students who listened to
Mozart did better on certain cognitive tests than those who did not
listen to Mozart [8]. The students performed as though their IQ was
8-9 points higher than those who listened to a relaxation tape or
silence. Furthermore, a prestigious scientific journal published the
study.
This got parents, as well as scientists, very excited. Everybody
wanted to grab those extra IQ points for their child. There may even
have been a boom in sales of baby headphones and Best Of Mozart
CDs (this was the 1990s, remember).
Our family took this to an extreme, however. The kid had passed
unnoticed through kindergarten, but by grade school, his deficit was
apparent. Shockingly, he had never listened to anything other than
Mozart.
What’s more, his test scores were average at best, and he was the victim of
several bullying incidents within the first few weeks of school.
That was when his Mom decided to investigate further.
Scientists found the Mozart Effect very hard to replicate, but they
kept trying. More often than not, Mozart listeners performed about as
well as those who listened to different music or silence [9,10].
The kid's Mom also found out that the cognitive enhancement effect
was small and probably only lasted a while after the music finished
— anything mildly stimulating made people do a bit better on the
tests.
What she regretted, though, was naming her son Wolfgang.
With the Mozart effect, one experimental study became so well-
known that people did not even notice the subsequent studies. Other
studies were less dramatic and therefore did not grab the parents’
attention.
Is Mozart special? In a musical sense, of course. But there is
probably not a special Mozart module in the brain that switches on
superior learning processes.
The failure to replicate the Mozart Effect suggests that the original
effect was due to general characteristics of the music, like
complexity or interestingness. Aspects of the experimental situation
might also have led to these seemingly impressive results [9,10].
Recent analysis suggests that scientists published more ‘Mozart-
positive’ results due to publication bias. This is similar to confirmation
bias, which we will look at in detail in this chapter.
Beliefs
Beliefs are an important part of human life. We all hold prior beliefs
about things, people, and ideas, and one generation passes them on
to the next via social learning. Sometimes, we believe what we want
to believe despite evidence against it; we can refer to this as wishful
thinking [12].
So, where do erroneous beliefs come from? Our brains do not intend
to deceive us, but knowing the truth is not always their main concern.
Erroneous beliefs are a byproduct of the psychologically adaptive
process of social learning [2]. Social learning supports many useful
tasks, such as learning our native language. As social creatures, we
need social cohesion and shared experiences, and we start paying
attention to other humans (and potentially learning from them) as
infants [13]. So, it is only natural that we are so open to acquiring
ideas directly from others, especially those we trust.
Second-hand information has great potential to lead to false or
distorted beliefs. Humans love to tell good stories, and the storyteller
may highlight certain aspects and ignore others, either to make the
story more entertaining or to emphasize certain parts of it [2].
In turn, prior beliefs can lead to biased perceptions of people,
objects, and events, thereby affecting future perceptions and
experiences. People can then pass these biased beliefs onto others.
This may remind you of the children’s game Telephone or Chinese
Whispers, in which each person whispers a message to the next
along a line. The original message is unrecognizable by the end
of the game.
Another aspect of our beliefs is that we tend to believe what we want
to believe [2], and this includes our beliefs about ourselves. We may
adopt socially acceptable beliefs to avoid being rejected by others [1].
Like many of our psychological tendencies, there is nothing wrong
with this, but at times it could obstruct our critical thinking.
Emotions
Social emotions such as trust and the desire for acceptance can
affect what we believe, but emotions have huge effects on cognition.
Psychologists have documented mood congruent effects in memory
and attention [14,15].
This means that people tend to notice and remember information
that fits with their current mood; you may observe this phenomenon
casually in everyday life now that you are looking for it. For example,
when somebody feels joyful, they might notice beautiful scenery or
enjoy a good meal more than when they are in a neutral mood. Our
emotions, therefore, influence not only what information goes in but
also how our minds process it.
In controlled experiments, a scared or sad person is more likely to
perceive others’ faces as threatening or negative. Someone
experiencing a happy, exuberant mood is more likely to label faces
as friendly. The first person might be more likely to recall unpleasant
events from their own life, whereas the second would recall more
happy and joyful experiences [14,15].
This example illustrates that memory retrieval is an active process;
your memory is not like a library issuing you the same memory every
time. Instead, the cognitive system reconstructs the memory each
time [1].
Fallacies
The term fallacy often refers to commonly held false beliefs,
including some examples of folk wisdom. For example, many people
believe that more babies are born during the full moon [2]. In fact
(verifiable, reliable fact, that is!), no more babies are born on the full
moon than during any other phase of the moon.
False belief fallacies can affect our reasoning processes if we
assume that pieces of received wisdom are true without examining
them in more detail.
The term also covers logical fallacies: errors of reasoning
commonly known as non-sequiturs. To reason properly, we must
make sure that our conclusions follow logically from our arguments’
premises. The study of logical fallacies has a lengthy history, and
there are many of them [1].
1. Ad Hominem Fallacy
Ad hominem is Latin for “to the person.” It means attacking the
person rather than their point or conclusion [1,20]. You might
witness this fallacy in a political debate.
For example, one politician argues passionately against a new
shopping mall in the town, but their opponent points out that they live
in that town and the new mall would bring a lot of extra noise and
traffic to the area. The opponent argues that the first politician is
therefore concerned for themselves, not necessarily for the
residents.
Here, the first politician described a concept, but the other
proceeded to attack the first as a person, ignoring the debate’s topic.
Attacking the opponent is not an effective way to argue against their
idea, so we describe ad hominem as a fallacy. Like the other factors
described here, this fallacy can lead to divergence from important
topics. People sometimes use it deliberately to divert attention and
discussion away from certain topics.
There are two types of ad hominem [21]. The circumstantial variety is
when a source is speaking hypocritically, and somebody else points
it out. This type of ad hominem may constitute a legitimate
argument, but it is still a logical fallacy. The second variety is abusive
ad hominem, where somebody uses another’s personal traits to
attack their idea, where the traits are unrelated to the idea.
In practice, ad hominem rebuttals are not always irrelevant. Let us
think about a political debate. One politician attacks the other’s
personality or life choices. But what if these are relevant to the
argument?
This example illustrates circumstantial ad hominem: the opponent
points out the first politician’s hypocrisy. Suppose the first politician
had no obvious self-interest in canceling the new mall. In that case,
the opponent could still attack them to convince the populace that
they were not trustworthy and discredit their opinion. This is abusive
ad hominem, a fallacy we should certainly try to avoid.
2. Hasty Generalization
Hasty generalization is another important fallacy that we need to
understand. It means jumping to a conclusion based on too little
evidence. A more technical definition is generalizing from a sample
or single instance to a whole population. However, the sample may
be too small or not representative of the general case.
Imagine a friend saying:
“My Grandpa lived to be ninety-six years old, and he drank a bottle
of whisky every day of his life!”
Unfortunately, Grandpa does not prove that alcohol is a recipe for a
long and healthy life. This anecdote, a single example, does not
outweigh decades of medical evidence.
Generations of thinkers have described this fallacy. Aristotle
discussed it first, followed by many more scientists and philosophers.
Alternative names for hasty generalization include faulty
generalization, the fallacy of accident, the fallacy of neglecting
qualifications, and many others [22].
Hasty generalization is easy to commit. People under pressure in
busy jobs, seen as authorities on the topic at hand, might draw
conclusions too early. Hasty generalization can also lead to wrongly
assuming that every instance is the same, based on one or two
examples. It can also lead to people ignoring situations where their
conclusion is false. In the example of Grandpa and his whisky, the
speaker focuses on the single example at the general case’s
expense.
You can see how hasty generalization could become a serious
problem and prevent us from getting to the truth.
3. Bandwagon Fallacy
The bandwagon fallacy means falling into the trap of thinking that the
majority is always right. People commit this fallacy when they agree
with the majority without seeking further information [23].
A classic psychological study revealed that many people would
agree with the majority opinion even when they can see that the
majority is wrong [24]. This experiment’s task was shockingly simple:
participants had to judge which of three lines matched a reference
line, and the differences were obvious. The experimenters put
individual participants in groups with fake participants, and all the
fake ones chose the wrong line.
Asch’s study showed that many people agreed with the majority but
then expressed concern and confusion because the majority gave
the wrong answer. The experiment put people into an unnatural
situation, but we can also see the bandwagon effect in real-life
scenarios.
In real life, the majority opinion is often fine, and we can choose to
follow it without dire consequences [25]. For example, most people
would agree that dogs make good pets and rhinoceroses do not.
Choosing a pet is a relatively benign decision, though.
In contrast, turbulent environments lead to more copying; the correct
path is harder to discern in more ambiguous situations [27]. Think
about how this relates to a high-pressure business environment,
where the situation may be highly complex, to begin with, and
changes rapidly. In these situations, organizations follow each
others’ business decisions more than in a calm and stable business
environment [26].
People and organizations jump on bandwagons for many reasons.
They may genuinely believe it is the best option, or they may see
others they admire jumping on the same bandwagon, which gives
that choice more credence [26]. However, the bandwagon effect is a
failure to apply logic and our own experience. Information about the
majority’s opinions and choices is easy to obtain and quick to
process, but the majority is not always right. Even the majority opinion of
a group of experts is not always correct.
5. Confirmation Bias
Confirmation bias is a bias towards information that confirms what
we think we already know. Take this example:
Jayshree firmly believes that all Hollywood actors over 30 years old
have had cosmetic surgery. Every time she sees somebody whose
face looks smoother than last year, she points it out to her friends.
What do you think Jayshree says when she watches a movie and
the actors look no different? Nothing, of course. It is unremarkable
that the actors have aged normally. Jayshree notices evidence that
supports her belief, but she is oblivious to the evidence against it.
Confirmation bias is extremely common, affecting what information
we notice and what information we seek out [28]. People have a
strong tendency to seek out information that confirms their beliefs
due to a strong desire to maintain those beliefs [12]. Returning to our
example, Jayshree might search the internet for ‘celebrity plastic
surgery’ information, but she would not be looking for information on
who has not had plastic surgery.
When faced with a message, beware of confirmation bias. It is
similar to wishful thinking: sometimes we believe what we want to
believe, and evidence supporting what we believe grabs our
attention.
6. Anchoring
Anchoring occurs when we over-rely on the most prominent feature
of a situation, person, or object. This may be the first piece of
information we encounter or the information that we feel is most
important. Anchoring strongly affects our judgment and estimation [11].
Anchors are mainly numerical. For example, someone taking out car
finance might choose to focus on the interest rate, displayed in large
figures on the website, rather than processing additional information.
Anchoring biases not only our judgments but also our estimates. If
you go to a car showroom, you may have room to negotiate.
Nonetheless, your mind anchors your initial offer around the price
quoted on the window. This is known as anchoring and adjustment:
the first number we see biases our subsequent thinking [1,11].
Psychology experiments show that different anchor points can lead
to vastly different decisions. Furthermore, the anchor does not even
need to be related to the question to influence a person’s answer [17,
29]. This shows that anchoring is pervasive and, to some extent,
automatic.
Anchoring is sometimes also known as a heuristic, and it does
enable our minds to take a shortcut and stop processing more
information. However, it is sometimes automatic and, at other times,
more conscious [11]. Automatic anchoring is more like a suggestion:
the anchor primes somebody’s estimate or choice by activating
similar numbers or ideas in the mind, and the person experiencing this
may not be aware of it.
On the other hand, deliberate anchoring is when you consciously
adjust your initial estimate to get closer to the real answer. This
process is more controlled, but people typically stop adjusting too
early, meaning the anchor still biases their final response. We are
more likely to stop adjusting too early if we are under time pressure
or are multi-tasking [11,17].
7. False Consensus
This bias comes from social psychology, the study of personality and
social interaction. False consensus focuses on how we see
ourselves relative to other people. Like the arsonist who might have
once said, 'Well, everyone loves to set fires, don't they?', we
overestimate how common our actions or traits are in the general
population.
This bias emerges when people hear about other people's
responses [30,31]. Whether we read others’ answers to a set of
questions or hear about decisions made in a scenario, we see other
people's responses as more common and typical when they match
our own. Conversely, we see others' responses as strange and
uncommon when they diverge from our own.
False consensus effects are larger when the question is more
ambiguous. One study asked people specific questions like ‘are you
the eldest child?’ and more general questions like ‘are you
competitive?’ The study reported a much more pronounced false
consensus effect with more generic questions [32]. This provides
more evidence for the effect and suggests that when people have
more room to interpret the question in their way, they perceive others
as more similar.
8. Halo Effect
The halo effect is not about angels; think about the type of halo you
see around a streetlamp in the mist. This bias occurs when
something is seen positively because of an association with
something positive, like the light from the streetlamp spreading out
as it scatters through the mist. You could call this the ‘glory
by association’ bias.
We all know that first impressions matter in our relationships. This
bias is part of that. Our initial impressions of people and things can
create a halo, overly influencing what we think of them.
When people have to rate others on positive criteria like competence
or intelligence, their ratings are influenced by how warm and friendly
they seem to be [33]. The halo effect even occurs for traits we know
are unrelated, such as height and intelligence.
As you can imagine, the same applies to objects and ideas.
Companies like to use beautiful people and scenery in their adverts
and promotions because this gives potential customers a positive
impression of the company and the product.
9. Availability Heuristic
The availability heuristic affects us when we have to judge probability
or frequency [12]. We assume things we can imagine or recall easily
are more common or more likely. Another way to conceptualize this
is to assume that the first things we think of are the most important
[1,11].
You can see how the availability heuristic can be useful. When
deciding where to take a vacation, your first thought is more likely to
be somewhere you want to visit rather than an obscure destination
you have barely heard of. The desired destination is more available
in your memory, as well as more vivid.
This heuristic draws on several characteristics of human memory
[17,34,35]. Firstly, the recency effect: we have better memories for
recent events or things we have seen or heard recently. Secondly,
we remember things that make us feel emotional. Finally, we
recollect personally relevant and vivid information far better than dry,
boring stuff. Any of these or all of them together can create high
availability.
The opposite is also true. If you cannot think of many instances of
something, you will think it is less common or less probable. When
researchers asked participants for a very large number of
advantages of something, such as their college course, they found it
hard to think of enough. These students rated their course as worse
than others who had to think of fewer advantages [11].
This example seems paradoxical at first, but not when you think of it
in terms of availability. The course’s positive aspects felt less
common to those who were asked for more because they could not
think of the full set of advantages requested. This illustrates how the
availability heuristic could be a problem, depending on questioning
techniques.
If we can call examples to mind easily, we think events are more
likely to have happened before or to happen again in the future. For
instance, people worry that terrorist attacks are possible or even
probable. A young graduate’s family warns them against moving to
New York, Paris, or London because of 'all the terrorists.' These
attacks are readily available to people's minds, so they feel that
attacks are more likely than they are.
Availability is a useful heuristic because it allows us to make rapid
judgments and decisions. People are more influenced by availability
when they process information quickly and automatically, for
example, when feeling happy or distracted [11].
Pattern Recognition
Our brains are incredibly good at recognizing patterns. People often
perceive faces in facelike configurations, like the Man in the Moon,
known as visual pareidolia. A large area of our visual brain is
dedicated to face processing, so it is not surprising that we perceive
them even when they are not there [12].
Pareidolia is automatic: people do not try to see these patterns; they
just do [2]. You have almost certainly had this experience. Countless
internet memes show objects like houses and cars that look like
faces. Sometimes it can take a few moments for the pattern to
resolve itself into the image. At other times, it strikes you straight
away, and it is difficult or impossible to go back to see the image as a
less meaningful pattern.
Pareidolia can occur in other senses: hearing Satanic messages in
music played backward or ghostly voices in radio static.
Automatic pattern perception illustrates similar tendencies to optical
illusions, like flat images that appear three-dimensional. These are
not just fun and games. Both pattern recognition and false
perceptions could lead to false beliefs, and people can and do seek
information to support them.
In summary, our brains are incredibly good at recognizing patterns
yet poor at statistics [12]. We regularly perceive meaning in random
stimuli.
Action Steps
Our brains do a great deal of information processing that we are not
always aware of. We are quite fortunate to have all these short-cuts
making processing more efficient. Try these suggested exercises to
explore these ideas further before we move on.
1. Fantastic Fallacies, And Where To Find Them
Find a list of fallacies, biases, and heuristics in an online
encyclopedia or psychology website. Notice how many there are.
Read some of them and make a note of your thoughts. You could
look at things like:
Summary
At the beginning of this chapter, the story illustrates that sometimes
we get it wrong; sometimes, this applies even when we exercise
good critical thinking skills. Our cognitive processes may be
sophisticated, but they are also economical. In the story, the parents
believed they were benefiting their son by playing Mozart because
they believed the high-profile research paper suggesting that Mozart
made people more intelligent.
The parents only read the initial research study on the Mozart Effect.
They did not follow it up: hasty generalization. They did not realize
that other scientists had found it so hard to replicate the Mozart
Effect. They fell into confirmation bias by only noticing media reports
praising (and confirming) the Mozart Effect.
The halo effect may have operated too because Mozart is generally
accepted as one of the best classical composers. If it had been an
obscure composer, would the paper have gained such a high profile?
The population found it easy to fall in love with the idea that Mozart's
music was special in yet another way.
Nor were the parents skeptical; if they had been, they would have
researched the effect for themselves rather than taking it at face
value. Scientists aim to be skeptical at all stages of their workflow,
from ideas to analyzing the data from completed research. The next
chapter elucidates scientific skepticism in greater detail.
Takeaways
1. Our minds abound with fallacies, beliefs, emotions, biases, and
heuristics, all of which impact our perceptions and how we process
information.
2. These can have massive effects, so we need to counteract them
if we want to reach solid conclusions and make good
decisions.
3. It may not be possible to overcome these biases induced by our
minds completely, but critical thinking can help.
3
Fifteen-year-old Alanna Thomas burst into tears and buried her face
in her hands.
Paul and Elder [57] describe nine intellectual standards that should
help us think both lucidly and metacognitively about ideas. These are
standards that scientists strive to meet in their communications, and
they give you a helpful framework whether you are composing an
argument or receiving one from another source:
Clarity: to reason about a claim, we must be clear about what it
means. Therefore, when you are communicating, you need to aim for
maximum clarity as well. This standard is a prerequisite for all the
other standards.
Accuracy: you may not have access to resources to check the
accuracy of all points made, but you can assess it by thinking about
whether the claim is verifiable and whether the source is trustworthy.
Precision: information should be appropriately precise for the point
under discussion. A claim could be accurate but imprecise; for
example, ‘the company’s profits fell last year’ is less precise than
saying they fell by 18% last financial year.
Relevance: we might reason clearly, accurately, and precisely, but
this is pointless if we deviate from the core topic.
Depth: this means dealing with the complexities and relationships of
the concept under discussion rather than over-simplifying it.
Breadth: this means taking in multiple (relevant) points of view and
recognizing alternative perspectives on the issue. For example,
business strategies often look at environmental, ethical, and social
concerns, as well as economic factors.
Logic: this means ensuring that the arguments work logically: does
the evidence lead to the conclusion, and does the argument have
internal consistency?
Significance: this is related to relevance, but sometimes relevant
points are trivial. We need to ensure that our reasoning focuses on
the important aspects of the problem.
Fairness: our reasoning should be bias-free and honest. We should
aim not to argue only for our own interests. Others may interpret
unfair arguments as attempts to manipulate and deceive them.
Hopefully, you can see how these standards relate to scientific
skepticism and communication. All of these standards apply not only
to science but also to our everyday lives, both work-related and
personal. Therefore, they are useful to remember when composing
or reading claims and other communications.
Scientific Revolutions
Thomas Kuhn was a philosopher and scientist who wrote about how
science moves forward. His work heavily influenced the postmodern
view, but he did not argue that science is anti-progress. Instead, he
said that we do ‘normal science,' and knowledge moves forward in
jumps, which he called paradigm shifts. ‘Paradigm’ refers to the
prevailing world view or scientific approach of its time. For example,
history saw a great paradigm shift away from classical Newtonian
physics when Einstein advanced his theory of relativity [1,61].
Normal science is an incremental process. Small advances taken
together, debated by scientists in journals and conferences,
gradually increase knowledge. Scientists predict many discoveries in
advance during normal science, based on theories that they believe
have solid foundations. Education imparts received wisdom to
budding scientists, and they become fluent in its specific methods
and language and continue research along the established lines.
A paradigm shift results from a crisis in science. The existing theory
can no longer explain observations, or a radical new theory gets
proposed that explains things better than the old one.
Examples of paradigm shifts in science:
1. Copernicus’ proposal that the Sun, rather than the Earth, lay at the
center of the Solar System.
2. Lavoisier’s discovery that chemical elements combined to make
molecules with various properties, superseding alchemical views of
chemistry.
3. In the 1880s, the ‘germ theory’ that tiny organisms (rather than
bad air) caused diseases.
A paradigm shift means a change in what scientists study, how they
study it, how society views that topic, and what conclusions are
acceptable. These are
huge shifts, hence the alternative term: scientific revolution.
So what fuels paradigm shifts? There are three major influences.
Firstly, anomalies. Scientific anomalies happen when scientists find
things they cannot explain. If enough of these happen, a new idea
could gain momentum and lead to fundamental changes (a paradigm
shift). Small anomalies may occur in science all the time, but if
nobody is looking for them, they may go unperceived or unrecorded.
Secondly, new technology (new ways of measuring things) can fuel
paradigm shifts. For example, applying medical imaging techniques
to the psychological sciences led to the new field of functional brain
imaging around the turn of the 21st century.
Finally, when a new paradigm appears, scientists need to compare
the new and old paradigms with each other and with observations.
Some may be looking to verify the new paradigm and falsify the old
one; others will do the opposite. Everybody works to find out which
theory fits the facts better. They do 'extraordinary science' to see
what is going on and rewrite the textbooks. Extraordinary science
helps to complete the paradigm shift from old to new.
So what happens afterward? We might casually call outdated
science incorrect, but it was fine in its own time. Outdated science
took steps toward the truth, and the new science grew out of the old.
The discoveries made might still stand but get interpreted differently
under the new paradigm.
Science keeps going between paradigm shifts because people like
to solve problems. Even if the progress is slow and piecemeal, new
research is important. Scientists may work within the established
boundaries or continually try to push things forward slightly. As long
as their approach is similar enough to their contemporaries, their
results comprise mainstream science.
Kuhn’s critics proposed that science does progress between the
paradigm shifts. For example, Einstein’s theory of general relativity
began as a theoretical description. Later, other scientists found
empirical evidence and general relativity led to a wealth of
knowledge and technology that we would not have had otherwise.
Where postmodernism gets interesting is in its applications to real-
world settings like management and education. A postmodern
approach in these areas fosters an open-minded attitude: if the
establishment is no more correct than anybody else, everybody's
ideas are potentially valuable. If there is no objective truth, a new
business process or teaching technique is never guaranteed to
succeed, nor is it guaranteed to fail [58]. That is quite liberating.
Action Steps
We have examined scientific skepticism in detail, with the aim of
helping us get to the truth. Why not have a go at these optional
exercises and apply some of the ideas we have discussed?
1. Opening The Mind
Write a skeptical and open-minded proposition or theory of your own.
It may be helpful to use something trivial for this practice exercise. It
can be as simple as ‘Why I should get my driveway resurfaced this
summer,’ or ‘An explanation of why I choose not to dye my hair.’ Use
the following helpful habits of mind [2]:
a. Gather as much evidence as possible. For instance, what is the
current state of your driveway, and what are the risks of not getting it
resurfaced?
b. Beware of false positives and false negatives in the evidence. For
example, you might read that driveway surfaces typically fail after
five years, but check who wrote this and what they base it on, and
see what other sources say.
c. Think broadly: consider everything that might possibly impact the
proposal or theory. This might include personal finances, the broader
economy, environmental concerns - whatever factors are most
relevant to your proposal.
d. Consider what somebody with the opposite opinion to yours would
write: how they would explain it and/or what they might decide. This
will help you maintain an objective perspective.
2. Metacognition Exercise
It is normal and natural to be resistant to changing our minds, but we
learned here that reflecting on our own cognitive habits can help
enhance them. Use this quick questionnaire as a self-reflection
exercise, or rate somebody who you know well. Adapted from
Snelson [53].
a. How would you rate your ability to accept any new minor idea with
a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over minor ideas
b. How would you rate your ability to accept any new major idea with
a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over major ideas
c. How would you rate your ability to accept any new revolutionary
idea with a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over revolutionary ideas
3. Standard Process
Analyze an article to check whether it meets the intellectual
standards suggested by Paul & Elder [57]. Choose something like an
editorial discussing a controversial topic. Is it:
Clear?
Accurate?
Precise?
Relevant?
Deep?
Broad?
Logical?
Significant?
Fair?
Summary
The story that began this chapter showed us that people reach faulty
conclusions even when they try to keep an open mind and discover
the truth: the police thought they had solved the crime, and Lin
thought she had found a better explanation. They were both wrong.
With a truly skeptical attitude, somebody would have doubted both
explanations, put them to one side, and investigated further. They
would have been open to alternative explanations and would not
have been averse to changing their mind even once they thought
they had the correct answer.
Scientific skepticism is not easy. It takes vigilance and discipline to
learn, but like critical thinking and other skills that we discuss here,
you can hone your skills. The processes can become more
automatic and less effortful as you develop your expertise.
Next, we will look at how to deal with claims you see in the media.
That includes social media, so it should be a great way to practice
your skeptical attitude!
Takeaways
1. When assessing claims, act like a scientist: see whether the claim
is verifiable and falsifiable. If not, perhaps somebody is asking you to
believe something without sufficient reason.
2. When making decisions and forming conclusions, keep a balance
between skepticism and open-mindedness.
3. To reach the truth, aim for lucidity. Sweep your preconceptions out
of the way and experience the world as it really is, without your
previous experience blinkering you to new facts and evidence.
4. Keep the postmodernist view in mind: perhaps we can never know
the truth, and perhaps meaning is completely relative. If that is the
case, many things are possible.
4
Action Steps
1. Media Literacy Practice
Perform a general web search for a topic of interest and assess two
of the resulting articles or webpages using CRAAP and lateral
reading. Notice whether the two approaches give you different
impressions of the sites, perhaps even leading to different
conclusions.
2. Deep Dive
Choose a news story that interests you; perhaps it relates to your
business or personal concerns. It could be one that you found during
Action Step 1. It should be sufficiently complex and mainstream for
you to find at least four different sources. Research the information
these sources report, choosing sources as diverse as you can. For
example, look for left and right-wing sources from both the mass and
social media. Chart on a piece of paper what they agree on and what
they disagree on. Can you see different 'facts' reported? What about
word choices indicating bias? You can repeat this exercise in the
future if you want to assess another news story in depth.
Summary
This chapter’s story showed how fake news concocted by extremists
snowballed and nearly spelled disaster for a company and a
community. This fictional story’s message was serious: many real-
world fake news stories have had terrible consequences. The few
examples given here should give you an idea.
The mass media is not immune to getting things wrong. Still, even
journalistic outlets vary in the quality standards they set for
themselves, so it is important to apply your critical thinking skills here
too. Three of the characters in the story displayed great analytical
skills in picking apart the mess of blogs and social media posts that
led to the misinformation problem. They presented their findings
rationally and calmly, which defused the situation. In the end, this
positive outcome may even have enhanced the company’s
reputation.
Next, we move on to look at how others try to deceive us to our
faces and how we can sort the truth from the lies in these everyday
situations.
Takeaways
1. To separate sense from nonsense in mass media and social
media, we need to apply the rules of logic and use our own
expertise.
2. We need to be alert to fake news, which is deliberately concocted
to fool people, and not confuse it with real news, satire, editorial
opinion, propaganda, or advertisement [69].
3. Take a skeptical approach even if the story feels true, and beware
of ‘news’ that seems too extreme to be true.
4. You can use media literacy tools and resources, such as CRAAP
and lateral reading, to evaluate the source publication and the
author, recognize bias and opinions, and assess the accuracy of
claims.
5
After reading the fine print, Alicia decided she was happy with the
terms of the business loan.
She had recently met Aaron Lowen, a business development
consultant from Howsville, the next town along from hers. He strolled
into her ice-cream parlor and quickly persuaded her to open another
cafe in Howsville. She refused at first: she liked running a single site
business, and her customers found it charming to buy ice cream
from a family-run concern. The expansion was too risky.
However, Aaron insisted.
“They don’t even have real gelato in Howsville! With this artisan
Italian ice-cream, you’ll make a fortune! I promise you, there are no
decent ice-cream cafes at all.”
A smile flitted across Aaron’s face. Quickly, he looked serious again.
“Is this really a good opportunity?” Alicia asked.
“Yes, definitely,” Aaron grinned.
Alicia noticed a strange wobble of the head, but thought no more of
it.
So here she was: opening a new café. Once the loan was in place, it
was all hands on deck.
However, Aaron had not been a hundred percent truthful. A local
gourmet ice-cream company was running trucks and pop-up cafes
across town, and they had no qualms about targeting her new store.
Sometimes the local kids would even come in to criticize her product:
“Not as good as Toni’s.”
The trouble at the new branch rapidly damaged the entire business.
It seemed time to cut their losses. Then, vandals broke into the new
café. They wrecked the displays and littered ice-cream and toppings
everywhere. Alicia closed for the day, and her employees cleaned up
while she called the police and the insurance company.
This was almost the end of the whole company, but Alicia smiled and
kept going. Her sister-in-law gifted her some of the profit from her
own business, which kept Alicia afloat for a while. Sadly, the new
café was still not viable, so Alicia decided to close down.
On the last day, they organized an ‘everything must go’ event, with
half-price ice-creams for all the local high school and college kids.
Late in the afternoon, this turned into free ice-creams for all.
Alicia confided in a middle-aged lady who was enjoying a cookie and
cream cone. The lady was sympathetic:
“It’s very sad, but Aaron from Toni’s has such a good grasp of the
local business environment and so many friends and contacts in the
town. You were brave to compete with him.”
“Aaron who?”
“Aaron Lowen, our local entrepreneur. He’s involved in most of the
businesses in town, and even wants to open up in your town as well.
Can I get some white chocolate sprinkles with this?”
In a flat tone, Alicia directed her to ask at the counter. So Aaron had
lied.
Finally, she had found the missing piece of the puzzle: Aaron was
deliberately trying to put her out of business, and it had almost
worked. He had almost cost her everything. If Aaron had succeeded,
he would have been the number one ice cream seller in both towns!
She had to applaud his audacity: pop-up ice cream cafes and trucks,
rather than fixed premises, meant she had not discovered that there
was already a popular artisan ice cream maker in Howsville. So she
was back to her initial position, but it could have been much worse.
A few months later, things had improved. Sympathetic locals who
heard about the diabolical deception flocked to Alicia’s home town
cafe. It was a warm spring, so she added two bicycle-based ice
cream sellers. All this led to record sales, as well as bad publicity for
her rival Aaron.
Alicia was intelligent and successful, but she missed the signs of
deception. Aaron gave away some clues: the quick smile that flitted
across his face when he claimed there were no ice-cream cafes in
Howsville, and the head wobble when he confirmed that it was a
good opportunity, betrayed his real opinions. He promised something
that sounded too good to be true, and he appeared trustworthy,
using his expertise as a business consultant to add credence to his
claims.
Alicia noticed these clues but did not know how to interpret them.
She did not know that even accomplished liars reveal themselves
occasionally, as the human body and face express our emotions
even when we work hard to suppress them.
Most people are basically honest, but one deliberate deception could
potentially cost us a lot. Therefore, as well as examining claims and
evidence in detail and being skeptical about ideas, we need to look
at other clues that can tell us if somebody is misleading us, whether
it is unintentional or deliberate on their part.
To detect lies of all types more successfully, we need to look at the
communication in context, including its content. We are more likely
to succeed using a rational approach, comparing what they say to
other evidence and our prior knowledge than by over-relying on
signals from the potential liar’s face or voice [81,86].
In terms of what they say, liars give less consistent accounts, told in
a less personal way, that feel less plausible to listeners. Their
stories seem to be told from a distance and are less clear; these
effects are more reliable than non-verbal cues such as eye gaze and
expressions [81].
Another clue is the structure of the story. Liars are more liable to tell
you what has happened in the order that it happened, whereas
somebody speaking the truth moves around the story’s timeline. This
structured approach suggests that liars have carefully composed
their story but, ironically, end up relating something that sounds less
natural [80].
Nobody Is Immune
Action Steps
Now that we have looked at how to use critical thinking and evidence
to spot lies and deception in everyday life, it is time to apply some of
this knowledge. Try the following action steps.
1. The Lying Game
Play a game of lie detection with somebody close to you. Each of
you can prepare a handful of lies and truths that you will try to
convince the other person are true. Remember this is a fun learning
exercise, so use humorous or innocuous facts about yourselves that
the other person does not necessarily know. Use some of the
techniques covered in the chapter to convince them and try to detect
the lies correctly, and have a conversation afterward about how it
went.
2. Proof Of Lies
Try some of the techniques for spotting a liar. Find an online video
from a few years ago of somebody you know is lying because
someone else exposed them or they confessed. This could be from
politics, an interview with a public figure, or a televised court case.
Watch the video in slow motion and look out for some of the signals
we have examined in this chapter:
You could then do the same but listen for any acoustic signals, such
as raised pitch and frequent hesitation, perhaps comparing their
verbal behavior to an example when you know they are not lying.
Summary
In the story at the start of this chapter, it turned out that the business
consultant had seriously misled the business owner: the rival
company was a serious threat to her business’ expansion after all.
How could she have picked up on this?
Unfortunately, there is no surefire way to tell if somebody is
deceiving you, especially if it is somebody you do not know well.
However, Alicia could have checked the facts: did the other
neighborhood have real gelato? Was the promise that there were no
decent ice cream cafes in that town too good to be true? The
deceiver also showed a possible micro-expression (a fleeting smile
at an odd time) and an emblem gesture when he slightly shook his
head, possibly revealing that he was saying the opposite of the truth.
She might have been able to figure it out, but perhaps assumed that
this man was telling the truth because most people are honest.
In the next chapter, we will explore what some people might call a
special category of scam. We look at pseudoscience and how to
distinguish it from real science and technology.
Takeaways
1. Tune into the visible and audible signs of potential deception: you
can learn them through careful observation and practice. However,
you need to apply critical thinking to what they say and pair this with
a keen observation to get closer to the truth.
2. There is no sure-fire way to detect lies, but knowing the person or
establishing a baseline will help. Even a host of behavioral clues
cannot prove that somebody is lying.
3. People believe others by default, and research suggests this is
warranted as most people are honest.
4. Selling products and ideas is perhaps the exception; scams and
frauds are sadly very common, but you can detect them and
overcome them using a skeptical, analytical approach.
6
What Is Pseudoscience?
Now that we have a clear definition of what science is and its
methods, we need to define pseudoscience. As the prefix ‘pseudo’
implies, pseudoscience refers to beliefs and activities that might
resemble science but are not science [12]. We call them ‘not
science’ because they diverge from mainstream scientific theory, and
in some cases, scientific methods cannot test them either [97].
The line between science and pseudoscience is not always clear,
though. Investigators working in pseudoscience are free to employ
hypothesis-testing and scientific techniques to examine evidence
and conclusions. But they sometimes make mistakes, produce
misinformation in the process, and end up presenting incorrect
conclusions.
Examples of pseudoscience:
Alternative medicine: alternative therapists sometimes fail to specify
how the therapy works or make general references to things like the
energy that the practitioner harnesses or directs into the client's
body. It would be difficult to devise an adequate control situation to
compare to these therapies. Pseudoscientific therapies often rely on
hearsay rather than clinical trials, and this can be subject to
confirmation bias and the hasty generalization fallacy [1].
Psychic powers: many people across the world believe in
supernatural powers like extra-sensory perception and clairvoyance.
Believers and scientists alike find it difficult to test these ideas, and
although many have tried, the evidence is inconclusive [1].
Astrology: predicting people’s personality traits and future events
from the position of the stars, the Moon, and planets is another
ancient practice that appeals to people across the world. The
predictions are vague and often not falsifiable, and therefore have
not been tested in a rigorous way like scientific theories [98].
Investigators have found no correlation between star signs and
personality traits [99].
We should not confuse folk remedies and young sciences with
pseudosciences. Be skeptical of ancient traditions: they might work
and might not, but age alone does not imply efficacy [95]. We should also be open-minded about young sciences while they are still establishing their methods and world views, even though further scientific investigation may yet falsify them. Germ theory is one example: the scientific establishment thought it implausible at first, but later investigations confirmed that microbes, not foul air, caused diseases [61].
Action Steps
1. Detective Work
Make a brief list of possible pseudosciences and use your skills to
gather evidence and decide whether you think they are real science
or pseudoscience. If you need ideas, choose a couple of these:
Iridology
Mycology
Homeopathy
Neurochemistry
Geomorphology
Macrobiotics
2. Study Skills
Devise a scientific theory within your field of expertise, and plan an
investigation. This could be something work-related, within a leisure
pastime (such as sports or creative work), or something silly and fun.
Whatever you choose, aim to be thorough. It is fine if you cannot conduct the study for real, for example, if it involves prohibitive time, resources, or ethical issues.
Work through the general scientific method to hone your idea and
generate something you can test. Make casual observations,
formulate a research question, narrow this to a testable hypothesis,
and consider how you would analyze the data. If you are not a
statistical expert, never fear - you can always draw a graph and
compare the data visually.
Finally, consider what valid conclusions you could draw from different
results. Congratulations, you have just proved you are not a
pseudoscientist!
Summary
In the anecdote at the start of the chapter, we met Marlon, who was
confused by his Mom’s insistence on keeping horse chestnuts in the
corners of her apartment. She said this was a well-known way of
keeping spiders out of the home, but she could not explain why
horse chestnuts put them off. This vague explanation is similar to
pseudoscience: people might believe something works, but they do
not know why.
Marlon’s Mom believed the practice worked because it was
traditional, and she also exhibited confirmation bias. Even Marlon
succumbed to it slightly when he reflected that he had never seen a
spider in his Mom’s house. However, anecdotal evidence like this is less reliable than objective evidence.
A scientific approach to any idea requires observation, followed by
defining a solid research question that you can test in the real world.
This kind of study does not always get done for pseudoscience. In
many instances, it cannot be done because there is no adequate
control condition to compare to pseudoscientific practice. Overall,
science and pseudoscience alike provide us with ample
opportunities to exercise our critical faculties.
Takeaways
1. Scientific methods and processes are the most reliable ways to
explore and find out about the world.
2. However, not everything that resembles science is actual science.
Mistakes and misrepresentations in the form of pseudoscience can
tempt people towards incorrect conclusions.
3. Pseudosciences persist for many reasons, including inherent
biases, wishful thinking, tradition, and certain personality traits.
4. Keep an open mind about novel ideas, but remember that some
ideas are more useful than others because they help us understand
and predict the world.
AFTERWORD
Marvin lazed on the decking at his lake house, watching the fish
whirling around in the clear water. His work phone vibrated on the
kitchen counter, but he let it ring. He knew the call spelled no good
for his summer retreat.
Hours later, the evening drew in, and Marvin finally got around to
checking his missed calls. He was surprised to see that his bank had called and left a message. He called back on the number they left and got through to an operator straight away. The line was terrible, but the voice on the other end
sounded urgent.
“I can’t hear you,” said Marvin.
“Give me your password, Mr. Keller,” the crackly voice said.
“Of course…” Marvin duly gave the operator his password and
further security details.
Apparently, there was a problem with his account, which meant he
had to wire his money to a different account urgently.
Did you guess? Scammers targeted Marvin and managed to get him
to transfer all his funds to them. He did not even notice until he
returned from his lake house to double trouble: his work colleagues
had realized he had ripped them off, and he realized he had gained
nothing because he had fallen for a telephone banking scam.
In this example, the protagonist was lax about checking the
credentials of the person calling him. The signs were there,
particularly the unscheduled contact and urgency of transferring the
funds. His lack of skepticism about the call ended up costing him a
lot.
Separating sense from nonsense is a massively difficult task, not
least because potential deceptions bombard us all the time, almost
as if they were waiting in line for us to drop our guards. However, we
can get closer to the truth by applying critical thinking techniques to
information we encounter each day. In summary:
Critical thinking approach: this means reasoning logically, using
evidence rather than working to justify conclusions we desire.
Gathering information to argue for a predetermined conclusion is
easy but wrong. With critical thinking, we can be sure that our decisions are conscious, deliberate, and based on facts. We
must be clear about the difference between facts, opinions, and
claims. We must know about the role emotions play in human
cognition. Lastly, we must seek evidence relating to purported facts,
including researching the source of and reason for any message.
Our complex minds: how our brains work can lead to blurred
boundaries between truth and non-truth, or even getting things
completely wrong without even being aware of it. Humans are
emotional creatures with a drive to learn from and believe others, so,
unsurprisingly, misinformation spreads. Furthermore, biases,
fallacies, and heuristics all have a significant influence on our
thinking, sometimes without us ever becoming aware of it.
Scientific skepticism: this is an attitude that can help gauge the truth
of claims. Be like a scientist and question whether a claim you hear
can be verified or falsified. Scientific skepticism means overcoming
our natural inclination to process information quickly and
automatically, and instead stepping back, slowing down, and really
analyzing what we encounter. Skepticism means doubt, not
necessarily disbelief, and it works best with an open-minded outlook.
The media: social media and the mass media are the major sources
of information for the vast majority these days, but they vary in
reporting accuracy. Some information can even be completely false,
designed to lure people in to spend money and/or time on websites
run by shysters. Use media literacy techniques like lateral reading to
get a deeper understanding of the information you see in the media,
rather than taking it at face value.
Deception: dishonesty is fairly widespread outside of the media, too.
Most people are honest enough about the things that matter, but we
would all be wise to stay alert for the signs that people are lying to
us. Faces, voices, and body language all provide clues, but we
should pay attention to what they say as well. Similarly, be alert to
the signs of fraudsters using scams like advance payment schemes.
Pseudosciences: these are explanations or techniques that claim a scientific basis or approach, but they are distinct from sciences in
several ways. Science uses a cycle of observation, testing, and
refinement of theories and methods, aiming to advance knowledge in
a specific area. In contrast, pseudosciences are sometimes difficult
to test in a truly scientific manner. However, cynics sometimes
mislabel progressive science as pseudoscience, so we should do
our best to assess new ideas in an open-minded and skeptical
manner.
In conclusion, now that you have the tools required to separate fact
from fiction, make sure to do your critical thinking as well as you can
and work to develop it. Critical thinking helps you recognize and
avoid harmful and useless thought patterns. It helps you to reach
better conclusions. It improves the quality of your thinking, raising
your chances of achieving your goals. Good luck!
ONE FINAL WORD FROM US
If this book has helped you in any way, we’d appreciate it if you left a review on
Amazon. Reviews are the lifeblood of our business. We read every single one
and incorporate your feedback in developing future book projects.
The most successful people in life are those who enjoy learning and asking
questions, understanding themselves and the world around them.
We created the Thinknetic Community so that like-minded individuals could
share ideas and learn from each other.
It’s 100% free.
Besides, you’ll hear first about new releases and get the chance to receive
discounts and freebies.
You can join us on Facebook by clicking the link below: