
Our Greatest Strength Is Now Our Biggest Weakness: Consider the Stick Bug...

By Dave McAlinden

After about age five, we no longer live in the present moment. We live in our predictions. This
means we are living in a milliseconds-ahead future we've created to align with how we think the
world should be based on our experiences. For example, you are not actually reading these
sentences in real-time. You are making predictions about what words come next based on how
cues align with your own predictive model for the English language. The only time we become
“present” is when those predictions are buttfarts. Excuse me... when those predictions are
broken. That feeling you just had when you read the oh-so-charming word ‘buttfarts’ was you
breaking out of your predictive stream and into the present moment. But now that my syntax
once again follows your model for it, you’re back to living in the future.

Structures in the mind that link concepts and the relationships among them are what
psychologists call ‘schemas’ or ‘schemata’. These are constructed when bits of information are
labeled and combined into clusters that connect with other clusters to codify what we encounter
in the world. Essentially, schemas are what make up our predictive models.
These structures have allowed us to survive and evolve, to create associations, and to recognize
and compare patterns. However, they can be a major hindrance when it comes to updating old or
accepting new information. Especially if you’re an adult. And that’s because many of our
existing schemas come with deeply embedded misconceptions.

To explain this more concretely, let's take a moment to unpack how color works:
Visible light is electromagnetic energy, and the human eye can detect only a narrow band of its
wavelengths. Within that band, our three types of cone cells are most sensitive to Red, Blue, and
Green light; these are the primary colors of the light spectrum, not to be confused with the
primary colors of pigment. The pigment of an object has to do with the electron arrangement in
its atoms, which determines what photons can be absorbed or reflected. Therefore, quite literally,
Washington Apples aren't really red; their electrons are just arranged to not absorb red
frequencies, and thus that's what gets reflected. What's reflected is what we see.
So, light bounces off of stuff around us and eventually hits the retinas at the back of the eyeballs
where it gets pieced together into a signal which is then zapped into our brains to be coded into
what we experience as vision. If we experience an object that reflects a specific frequency
enough times, that frequency becomes part of our schema for that object. In a way, that object
gets a kind of mental hashtag. For example, “Washington Apple: #red #fruit #doctor #crunch
#teacher #pie #etc”. Thus, the color red is embedded in our schema for Washington Apples.
Now, let’s try to manipulate that process by messing with white light.

White light is the combination of all the primary colors of light: Red, Blue, and Green. But if
you subtract any one color, the white ceases to be white. A combination of the remaining two
colors will emerge. For example, if you take out the Green light, you’ll see Magenta. If you take
out the Red light, you’ll see Cyan. Adding or taking away wavelengths at the source of the light
will change your experience of it. This means, since we construct our reality through the senses,
the brain only gives you a filtered version of the information that exists around us. One of the
crazy things that result from this filtration process is called "color constancy".
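As an aside, if it helps to see that color subtraction as plain arithmetic, here's a minimal sketch in Python. It treats light as simple 8-bit RGB triples, which is an idealization of additive mixing rather than real spectral physics:

```python
# A toy model of additive color: light as an (R, G, B) triple of 0-255 values.
WHITE = (255, 255, 255)  # full Red + full Blue + full Green

def remove_primary(color, primary):
    """Return the light with one primary ('r', 'g', or 'b') removed entirely."""
    index = "rgb".index(primary)
    remaining = list(color)
    remaining[index] = 0
    return tuple(remaining)

print(remove_primary(WHITE, "g"))  # (255, 0, 255) -> Magenta
print(remove_primary(WHITE, "r"))  # (0, 255, 255) -> Cyan
```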

Color constancy is the tendency of the color of a familiar object to look the same under any type
of light condition, even if the object isn't reflecting the actual color we perceive. For example, if
you have always loved eating strawberries and you've never seen one that wasn't red, and then
you see a bunch of strawberries, your brain will assume them to be red even if they are displayed
in light that subtracts the red wavelengths. That is, there is no red light for the object to reflect.
Our predictions automatically fill in gaps or replace information to bias the experience of reality
to match what we expect.

So, if you place a picture of strawberries into Photoshop and then remove all the red from it,
you will still experience red within the cyan-saturated image even though it isn't there to be
experienced. This means your mind lies to you so you can continue to live in what ought to be
the truth based on what in the past has always been the case. Due to our evolutionary wiring, in
such circumstances, the facade is more conducive to our survival than the reality would be. As a
result, that's what is projected in the mind’s eye.
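If you want to try the experiment yourself, here's a rough sketch in Python using the Pillow imaging library; the file name is just a placeholder, and any photo of ripe strawberries will do:

```python
# Strip all red light from a photo, leaving only the green and blue channels.
# Assumes Pillow is installed (pip install Pillow); "strawberries.jpg" is a placeholder path.
from PIL import Image

img = Image.open("strawberries.jpg").convert("RGB")
r, g, b = img.split()                                      # separate the three light channels
no_red = Image.merge("RGB", (r.point(lambda _: 0), g, b))  # zero out the red channel entirely
no_red.save("strawberries_no_red.jpg")
```

The saved file contains no red signal at all; whatever red you still see in it is your predictive model filling in the gap.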

This reveals that we don't see what is truly happening in the real world. We observe the effects of
reality on our brains through a survival filter based on our respective schemata rather than
observe reality as it is based on its contextual properties. In turn, pun intended, our predictive
models color our experiences.

This is just one reason why experience is poor evidence for claims about the nature of things.
But, it also indicates that we are rarely ever in the present moment. We live in our predictions for
how the world "should" be.

It doesn’t end at color.


Consider this basic insight from the field of statistics: a measurement can be reliable yet at the
same time not valid. For example, your bathroom scale can be consistent in giving you the same
weight when you step on, step off, and then on again. However, if the scale hasn’t been
calibrated correctly it will display a consistently inaccurate representation of your weight.
Therefore, the measurement is reliable but not valid.
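To make the distinction concrete, here's a tiny Python sketch with made-up numbers: the readings barely vary (reliable), but every one of them carries the same calibration error (not valid):

```python
import statistics

TRUE_WEIGHT = 70.0        # kg: what a correctly calibrated scale would show (invented figure)
CALIBRATION_ERROR = -4.5  # kg: the scale's fixed miscalibration (invented figure)

# Step on, step off, step on again: five readings from the same scale.
readings = [TRUE_WEIGHT + CALIBRATION_ERROR for _ in range(5)]

print("spread:", statistics.pstdev(readings))            # 0.0  -> perfectly reliable
print("bias:", statistics.mean(readings) - TRUE_WEIGHT)  # -4.5 -> consistently invalid
```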

Now apply that concept to how long-term memory works—long-term memory is like a piano:
combinations of data create meaning the same way a combination of piano keys creates a chord,
a melody, a movement, etc. The tuning of the piano is what determines the clarity of the
combinations and the accuracy of their execution. You could play the right combination of notes,
but if it’s not tuned properly then those combinations become discordant. This is what happens
when we develop bad habits and buy into misconceptions or incorrect information.

For the most part, we do have reliable prediction-models; but they're often invalid to some
degree. The validity of a measurement depends on how accurately it is set to the information it is
measuring. Unfortunately, adjusting our mental settings isn’t as easy as re-calibrating a bathroom
scale. Imagine if you experienced serious stomach cramps when you reached for your scale’s
reset button. You’d forget all about re-calibrating your scale and go lie down. In fact, you'd
inadvertently condition yourself to avoid the reset button altogether. Now, imagine if every time
you tried to tune your piano, the heavy hard-rock maple lid slammed down on your fingers.
You’d say, “Whatever. Sounds close enough” and then go ice your hand. That’s what it’s like for
our brains when faced with having to update previously encoded information. As a result, it
becomes exceedingly difficult to be critical of a measurement that leads to an otherwise dubious
prediction.

When our predictions are broken, we are brought into the present moment briefly and are faced
with a choice: adjust our predictions to align with the updated information, or ignore the
information and seek refuge in the false security of those predictions. Most of the time we dig in
our heels, double down, and discount the new information as an anomaly or perceive it as further
proof that we were correct all along. Social media, bar-guments, and Thanksgiving dinner
conversations are full of relevant examples, as I'm sure you have noticed. Researchers have
termed this phenomenon "the backfire effect".

We tend to disregard information that contradicts our beliefs. In turn, those beliefs are
strengthened even though they may have just been debunked. Since our beliefs are predicated
upon our experiences, you get a set of constants that are reliable but not valid. It walks
hand-in-hand with what is known as ‘confirmation bias’— our tendency to only seek out
information which confirms our beliefs rather than updates our knowledge-base so we can think
in terms of what is rather than what ought to be. And no one is the exception. Everybody does
this. Everybody. You, me, even Tom Hanks does this. And here's why: it all comes down to pain.

As it turns out, a major reason for belief-change resistance is that the act of deliberately changing
your mind physically hurts. The brain consists of roughly eighty-six billion neurons, each with up
to ten thousand connections to other neurons. Such a massively resource-intensive infrastructure
takes an incredible amount of energy to maintain. For example, your brain, at rest, uses twenty
percent of your total energy stores throughout the day. Learning something new, especially when
it requires you to alter your predictions about the world, seriously amps up that energy usage. As
a result, we experience pain. Since we are hardwired to avoid pain, it is tremendously difficult
for us to be actively disabused of our misconceptions. Enter an ensemble cast of cognitive biases,
and thus begins the mental theatrics of self-deception.

Put simply, a cognitive bias is a mental shortcut for making critical decisions without using a
painful amount of energy. Unfortunately, the term “bias” in the psychological context is often
confused with the terms “prejudice” or “partisan” in the socio-political one. Some cognitive
biases inflame prejudices and spur on partisanship, but they are very much not the same thing.
Cognitive bias is a neurological adaptation that meets the need to react quickly to possible
danger. For the same reason our vision has evolved into the amazing superpower it is today, so
have scores of these mental shortcuts.

Way back, when we were just primates in the trees, our natural predators were snakes. As such,
our vision adapted to spot subtle yet sudden changes in patterns and then react to them with
enough speed to avoid any danger those changes suggested. For example, if you could spot a
lush green pit-viper slithering nearby in the brush-rich periphery, you could effectively distance
yourself without going through a time-consuming analytical process. In effect, information that
doesn’t align with our predictions about our surroundings inspires certain knee-jerk reactions. A
blessing to the nomadic tribes who migrated across the planet to be fruitful and multiply; a curse
to the wired-in minds of the information age who struggle to establish a universal set of values
across cultures.

The drastic environmental shift from an ecosystem of predator and prey to an ecosystem of
industry and thought has caused one of our most critical adaptations to become more of a danger
than a protection. Consider the stick bug—this incredible creature has adapted to blend in
seamlessly with its environment to ensure its survival. However, if this same master of
camouflage were to be let loose in the city, a child might see it on the sidewalk and stomp on it
because he finds the snapping of twigs to be pleasing. Just imagine if there were an animal that
evolved to look like bubble wrap. Context is king when it comes to adaptation.

There are plenty of things we do when we think we see a snake in the tall grass just as there are
plenty of ways the world happens to us the way a child may happen to a stick bug. Quite often
they are the same. That is to say, there are several reactions we have that were once conducive to
our survival which now effectively put us in harm's way.

Here are some common reactions when it comes to having conversations that may challenge our values and beliefs:
We seek support from people who share the same beliefs.
Echo chambers give us a sense that we’re correct since the approval comes from what we think
is the outside world, even if it’s insular by nature. A person with a tribal bias, for example, may
voice negative experiences about other tribes to garner sympathy from peers with known
comparable experiences. But this doesn’t bring them any closer to the truth. Think back to the
concept of color constancy. If a person grows up thinking a group possesses any number of traits,
they will see those traits even if they’re not there. Those predictive models are designed through
social learning based on a value-set that excludes competing groups. If that prediction is
challenged, then comes the pain. Not only does it create cognitive dissonance, but it also
threatens one’s identity. And thus the mind chooses the comfort of familiar reassurance over the
discomfort of adjusting its schemas to incorporate new information.

We fall back on fallacious persuasive tactics to convince others that any contradictions are untenable or fake.
Call it ‘conspiracy logic’. For instance, a young-earth creationist will outright reject the efficacy
of carbon dating by applying a metaphysical counterpoint to an empirical claim, e.g. "the devil
put fossils here to trick us". A member of the Flat-Earth Society will reject physical laws like
gravity, criticizing well-established scientific premises as a conspiracy to hide the truth. So, they
create premises that suit their claims. You could say they re-calibrate their scales to display a
favorable weight. Then they proceed to show that weight to anyone who will look. Take the
rhetoric surrounding the anti-vax movement, for instance. It’s full of blatant rejection and ripe
cherry-picking that ultimately leads to self-sealing arguments… which are dangerous given a
public platform.

We use "What-About-isms" instead of providing a sound defense.
When criticized for something we can't defend, we shift the focus by criticizing something
related but not relevant. For example, if X is criticized for an immoral act, they might refute the
criticism by saying "what about Y and their immoral act?" Rhetorically, this lets X off the hook
without having to compromise. It's been a tried-and-true technique of Soviet propagandists for
decades. John Oliver has the clearest explanation of it I've heard: "[Whataboutisms] assume
that all actions, regardless of context, share a moral equivalency. And since no one is perfect, all
criticism is hypocritical and everyone should do whatever they want." It's a massively effective red
herring. But it doesn't just throw discourse off-topic. It incites resentment. And that's a
dangerous price to pay for not having to say you might need to self-correct.

When all else fails, we resort to the "Everyone is entitled to their opinion" argument.
This one I find the most intellectually insulting. Yes, of course, we are allowed to express our
views as we please. However, the statement is often imbued with the false assumption that others
must respectfully treat those views as serious candidates for the truth. As this shelter for
untenable views is fortified by tacit approval, the credibility gap between experts and laymen
begins to close in public discourse. Put simply, when people stop getting called
out for saying stupid shit, those people end up getting the same respect as those who don’t say
stupid shit. It’s bad news for progress when willful ignorance serves as an adequate justification
for sweeping generalizations and misinformed narratives. Yes, we are entitled to express our
views; but, we have an incredible responsibility to hold ourselves accountable for the impacts
those views may have, collectively as well as individually.

All of these fallacies are natural off-shoots of cognitive biases being activated. Nevertheless, there is one cognitive bias that is exceedingly mischievous in this regard.
Charles Bukowski, paraphrasing Bertrand Russell, said “the problem with the world is that
intelligent people are full of doubts, while the stupid ones are full of confidence.” As it turns out,
that sentiment has ample evidence to support it. The work of psychologists David Dunning and
Justin Kruger resulted in identifying what was so cleverly deemed the Dunning-Kruger Effect—
the tendency of people with the smallest amount of expertise to have the greatest amount of
confidence in their assertions. It’s the reason why some idiot might take one karate class, go out
and pick a fight, and then get the crap kicked out of him. Though the studies are recent, the notion has
been around for a long time. Darwin once observed, "Ignorance more frequently begets
confidence than does knowledge". Aristotle once declared, “The more I know, the more I know I
don’t know.” Confucius philosophized, “Real knowledge is to know the extent of one’s
ignorance”.
But is there a type of knowledge wherein the Dunning-Kruger Effect is most prevalent and long-lasting?

Since psychomotor knowledge has much more cut-and-dried indicators of failure (like getting the
crap kicked out of you), the Dunning-Kruger Effect is much more prevalent within the cognitive and
affective domains. Suffice it to say it's much easier to bullshit thoughts and feelings than it is to
bullshit physical ability. As a result, this principle especially applies to those who opine without
evidence. More evidence equals more nuance, and more nuance calls for unfounded
generalizations to be discounted. That means more time is needed to establish scope, define
terms, and then sort the wheat from the chaff even before the discussion begins. This is why
social media ‘debates’, without a qualified moderator, fall victim to “thread-death” so quickly or
become Google-copy-and-paste competitions. In other words, if people spoke to each other in
real life the way they do on the internet, there would be a lot of pauses in conversation to flip
through dictionaries and encyclopedias.

The point here is that an opinion based on a small amount of information and subjective
experience is not as valid as expertise rooted in deliberate practice, peer review, and
actual research. Regardless, the attitude that they are equal persists. Unfortunately, when matters of
preference and intuition are confused with matters of fact and intellect, fertile ground is
cultivated for misconceptions to grow. As a result, a lot of tin-foil hats get thrown into the ring.

Nevertheless, and all too often, one's opinions aren't actually what one believes deep down.
They merely act as a shibboleth of group membership and therefore are dog-whistle expressions
of the values associated with said group. Therefore, a disagreement with a point becomes
synonymous with an aversion to the values being signaled by it. At this point, we fall victim to
cognitive dissonance and revert to what our predictive models tell us is true. In simpler terms, if
a fact doesn’t align with one’s values then it must not be true. This is a defining symptom of
politics infecting science. It’s too bad that our need to belong exceeds our responsibility to be
ethical. And it’s doubly despairing that our mental models override our opportunities to be
accurate. On top of this, there is the issue of Pluralistic Ignorance, the phenomenon of not
raising your hand just because nobody else is, or the Bystander Effect, whereby the likelihood of
one person assisting another lessens as the number of people around increases.

All in all, our brains continue to apply old solutions to new problems, and that tends to make a
bad situation worse. It’s like we keep throwing water on an electrical fire. Unfortunately, we
can’t evolve at the speed of information. So, we have to leverage certain principles to save us
from ourselves. And even then, the success rate is negligible.

Keep in mind that learning is not an event, it’s a process—a consistent curved-line effort to be
slightly more capable than your seconds-earlier self. Consider that one way to excel at chess is to
play against yourself without allowing yourself to win, not by beating those you've already beaten.

Ultimately, it's our values that are the strongest predictor of whether or not we'll update in favor
of new information. If contradictory evidence is strongly aligned with an established value
set, the mind will allow the older notions to be updated. The pain of changing weaker schemata
is eclipsed by the pleasure of confirming stronger ones. So, if we want to change someone's
mind, we need to frame arguments in terms of that person's values. However, this can take a turn to the
manipulative and completely miss the point of doing it in the first place.

In such a connected world, filled with nuance and fluid context, the course of action with the
most integrity would be for us to simply argue from the other side ourselves. With humility,
perhaps we should try to prove wrong our own values and question the assertions which support
them. With each iteration, we can upgrade our prediction-models or, at the very least when we see a
change in pattern, linger a little longer in the present buttfarts. Sorry… the present moment.

I think of Wallace Stevens’ poem “The Snow Man”.

One must have a mind of winter...
not to think of any misery in the sound of the wind...
For the listener, who...
nothing himself, beholds
Nothing that is not there and the nothing that is.

He expresses that, as observers of a world greater than we can see and deeper than we can
fathom, we must realize that we aren’t really anything without a context. In order to behold
“Nothing that is not there and the nothing that is” we have to detach from the stories we've told
ourselves about what ought to be. That is to say, to be aware of the meaning we project onto the
world and to be mindful that reality just might not match.
