By Dave McAlinden
After about age five, we no longer live in the present moment. We live in our predictions. This
means we are living in a milliseconds-ahead future we’ve created to align with how we think the
world should be based on our experiences. For example, you are not actually reading these
sentences in real-time. You are making predictions about what words come next based on how
cues align with your own predictive model for the English language. The only time we become
“present” is when those predictions are buttfarts. Excuse me ...when those predictions are
broken. That feeling you just had when you read the oh-so-charming word ‘buttfarts’ was you
breaking out of your predictive stream and into the present moment. But now that my syntax
once again follows your model for it, you’re back to living in the future.
Structures in the mind that link concepts and relationships among other concepts are what
psychologists call ‘schemas’ or ‘schemata’. These are constructed when bits of information are
labeled and combined into clusters that connect with other clusters to codify what we encounter
in the world. Essentially, schemas are what make up our predictive models.
These structures have allowed us to survive and evolve, to create associations, and to recognize
and compare patterns. However, they can be a major hindrance when it comes to updating old or
accepting new information. Especially if you’re an adult. And that’s because many of our
existing schemas come with deeply embedded misconceptions.
White light is the combination of all the primary colors of light— Red, Blue, and Green. But if
you subtract any one color, the white ceases to be white. A combination of the remaining two
colors will emerge. For example, if you take out the Green light, you’ll see Magenta. If you take
out the Red light, you’ll see Cyan. Adding or taking away wavelengths at the source of the light
will change your experience of it. This means that, since we construct our reality through the
senses, the brain gives us only a filtered version of the information that exists around us. One of
the crazy things that results from this filtration process is called ‘color constancy’.
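The subtraction described above can be sketched in a few lines of Python. This is only a toy illustration on the usual 0–255 RGB scale, and the helper function is mine, not a standard library call:

```python
# Additive color mixing: white light is the sum of the three primaries.
# Removing one primary from white leaves the complement of that primary.

WHITE = (255, 255, 255)  # full red, green, and blue

def remove_channel(color, channel):
    """Return the color with one primary (0=R, 1=G, 2=B) subtracted."""
    result = list(color)
    result[channel] = 0
    return tuple(result)

print(remove_channel(WHITE, 1))  # no green -> (255, 0, 255), magenta
print(remove_channel(WHITE, 0))  # no red   -> (0, 255, 255), cyan
print(remove_channel(WHITE, 2))  # no blue  -> (255, 255, 0), yellow
```

Running it shows exactly the pairings named above: subtract Green and Magenta emerges; subtract Red and you get Cyan.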
Color constancy is the tendency of the color of a familiar object to look the same under any type
of light condition, even if the object isn’t reflecting the actual color we perceive. For example, if
you have always loved eating strawberries and you've never seen one that wasn't red, and then
you see a bunch of strawberries, your brain will assume them to be red even if they are displayed
in light that subtracts the red wavelengths. That is, there is no red light for the object to reflect.
Our predictions automatically fill in gaps or replace information to bias the experience of reality
to match what we expect.
So, if you placed a picture of strawberries into Photoshop and then removed all the red from it,
you will still experience red within the cyan-saturated image even though it isn’t there to be
experienced. This means your mind lies to you so you can continue to live in what ought to be
the truth based on what in the past has always been the case. Due to our evolutionary wiring, in
such circumstances, the facade is more conducive to our survival than the reality would be. As a
result, that's what is projected in the mind’s eye.
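The Photoshop edit described above can be imitated in a short Python sketch. The pixel values here are invented for illustration, not taken from any real strawberry photo:

```python
# A toy 2x2 "strawberry" image: each pixel is a reddish [R, G, B] triple.
image = [
    [[200, 40, 60], [220, 30, 50]],
    [[180, 50, 70], [210, 20, 40]],
]

# Zero out the red channel in every pixel, as removing the red in
# Photoshop would.
no_red = [[[0, g, b] for r, g, b in row] for row in image]

# Only green and blue remain, so every pixel shifts toward cyan --
# yet viewers primed by experience still tend to report seeing red.
print(no_red[0][0])  # -> [0, 40, 60]
```

Every pixel in the result contains literally no red, which is the point: any red you perceive in such an image is supplied by your predictive model, not by the data.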
This reveals that we don't see what is truly happening in the real world. We observe the effects of
reality on our brains through a survival filter based on our respective schemata rather than
observe reality as it is based on its contextual properties. In turn, pun intended, our predictive
models color our experiences.
This is just one reason why experience is poor evidence for claims about the nature of things.
But, it also indicates that we are rarely ever in the present moment. We live in our predictions for
how the world "should" be.
Now apply that concept to how long-term memory works—long-term memory is like a piano:
combinations of data create meaning the same way a combination of piano keys creates a chord,
a melody, a movement, etc. The tuning of the piano is what determines the clarity of the
combinations and the accuracy of their execution. You could play the right combination of notes,
but if it’s not tuned properly then those combinations become discordant. This is what happens
when we develop bad habits and buy into misconceptions or incorrect information.
For the most part, we do have reliable prediction-models, but they’re often invalid to some
degree. The validity of a measurement depends on how accurately the instrument is calibrated to
what it is measuring. Unfortunately, adjusting our mental settings isn’t as easy as re-calibrating a bathroom
scale. Imagine if you experienced serious stomach cramps when you reached for your scale’s
reset button. You’d forget all about re-calibrating your scale and go lie down. In fact, you'd
inadvertently condition yourself to avoid the reset button altogether. Now, imagine if every time
you tried to tune your piano, the heavy hard-rock maple lid slammed down on your fingers.
You’d say, “Whatever. Sounds close enough” and then go ice your hand. That’s what it’s like for
our brains when faced with having to update previously encoded information. As a result, it
becomes exceedingly difficult to scrutinize a measurement, even when it leads to a dubious
prediction.
When our predictions are broken, we are brought into the present moment briefly and are faced
with a choice: adjust our predictions to align with the updated information, or ignore the
information and seek refuge in the false security of those predictions. Most of the time we dig in
our heels, double down, and discount the new information as an anomaly or perceive it as further
proof that we were correct all along. Social media, bar-guments, and Thanksgiving dinner
conversations are full of relevant examples, as I’m sure you have noticed. Social scientists
have termed this phenomenon the ‘Backfire Effect’.
We tend to disregard information that contradicts our beliefs. In turn, those beliefs are
strengthened even though they may have just been debunked. Since our beliefs are predicated
upon our experiences, you get a set of constants that are reliable but not valid. This walks
hand-in-hand with what is known as ‘confirmation bias’— our tendency to seek out only
information which confirms our beliefs rather than updates our knowledge-base so we can think
in terms of what is rather than what ought to be. And no one is the exception. Everybody does
this. Everybody. You, me, even Tom Hanks does this. And here’s why: it all comes down to
pain.
As it turns out, a major reason for belief-change resistance is that the act of deliberately changing
your mind physically hurts. The brain consists of some eighty-six billion neurons, each forming
as many as ten thousand connections with others. Such a massively resource-intensive infrastructure
takes an incredible amount of energy to maintain. For example, your brain, at rest, uses twenty
percent of your total energy stores throughout the day. Learning something new, especially when
it requires you to alter your predictions about the world, seriously amps up that energy usage. As
a result, we experience pain. Since we are hardwired to avoid pain, it is tremendously difficult
for us to be actively disabused of our misconceptions. Enter an ensemble cast of cognitive biases,
and thus begins the mental theatrics of self-deception.
Put simply, a cognitive bias is a mental shortcut for making critical decisions without using a
painful amount of energy. Unfortunately, the term “bias” in the psychological context is often
confused with the terms “prejudice” or “partisan” in the socio-political one. Some cognitive
biases inflame prejudices and spur on partisanship, but they are very much not the same thing.
Cognitive bias is a neurological adaptation that meets the need to react quickly to possible
danger. For the same reason our vision has evolved into the amazing superpower it is today, so
have scores of these mental shortcuts.
Way back, when we were just primates in the trees, our natural predators were snakes. As such,
our vision adapted to spot subtle yet sudden changes in patterns and then react to them with
enough speed to avoid any danger those changes suggested. For example, if you could spot a
lush green pit-viper slithering near in the brush-rich periphery, you could effectively distance
yourself without going through a time-consuming analytical process. In effect, information that
doesn’t align with our predictions about our surroundings inspires certain knee-jerk reactions. A
blessing to the nomadic tribes who migrated across the planet to be fruitful and multiply; a curse
to the wired-in minds of the information age who struggle to establish a universal set of values
across cultures.
The drastic environmental shift from an ecosystem of predator and prey to an ecosystem of
industry and thought has caused one of our most critical adaptations to become more of a danger
than protection. Consider the stick bug—this incredible creature has adapted to blend in
seamlessly with its environment to ensure its survival. However, if this same master of
camouflage were to be let loose in the city, a child might see it on the sidewalk and stomp on it
because he finds the snapping of twigs to be pleasing. Just imagine if there were an animal that
evolved to look like bubble wrap. Context is king when it comes to adaptation.
There are plenty of things we do when we think we see a snake in the tall grass just as there are
plenty of ways the world happens to us the way a child may happen to a stick bug. Quite often
they are the same. That is to say, there are several reactions we have that were once conducive to
our survival which now effectively put us in harm's way.
Since psychomotor knowledge has much more cut-and-dried indicators of failure (like getting the
crap kicked out of you), the Dunning-Kruger Effect (the tendency of the least skilled to most
overestimate their own ability) is far more prevalent within the cognitive and
affective domains. Suffice it to say, it’s much easier to bullshit thoughts and feelings than it is to
bullshit physical ability. As a result, this principle especially applies to those who opine without
evidence. More evidence equals more nuance, and more nuance calls for unfounded
generalizations to be discounted. That means more time is needed to establish scope, define
terms, and then sort the wheat from the chaff even before the discussion begins. This is why
social media ‘debates’, without a qualified moderator, fall victim to “thread-death” so quickly or
become Google-copy-and-paste competitions. In other words, if people spoke to each other in
real life the way they do on the internet, there would be a lot of pauses in conversation to flip
through dictionaries and encyclopedias.
The point here is that an opinion based on a small amount of information and subjective
experience is not equally as valid as expertise rooted in deliberate practice, peer review, and
actual research. Regardless, the attitude that all opinions are equally valid persists.
Unfortunately, when matters of preference and intuition are confused with matters of fact and
intellect, fertile ground is cultivated for misconceptions to grow. As a result, a lot of tin-foil hats
get thrown into the ring.
Nevertheless, and all too often, one’s opinions aren’t actually what one deep down believes.
They merely act as a shibboleth of group membership and therefore are dog-whistle expressions
of the values associated with said group. Therefore, a disagreement with a point becomes
synonymous with an aversion to the values being signaled by it. At this point, we fall victim to
cognitive dissonance and revert to what our predictive models tell us is true. In simpler terms, if
a fact doesn’t align with one’s values then it must not be true. This is a defining symptom of
politics infecting science. It’s too bad that our need to belong exceeds our responsibility to be
ethical. And it’s doubly despairing that our mental models override our opportunities to be
accurate. On top of this, there is the issue of Pluralistic Ignorance— the phenomenon of not
raising your hand just because nobody else is. Or the Bystander Effect— the likelihood that one
person will assist another lessens as the number of people around increases.
All in all, our brains continue to apply old solutions to new problems and that tends to make a
bad situation worse. It’s like we keep throwing water on an electrical fire. Unfortunately, we
can’t evolve at the speed of information. So, we have to leverage certain principles to save us
from ourselves. And even then, the success rate is negligible.
Keep in mind that learning is not an event, it’s a process—a consistent, incremental effort to be
slightly more capable than your seconds-earlier self. Consider that one way to excel at chess is to
play against yourself without allowing yourself to win, not to keep beating opponents you have
already bested.
Ultimately, it’s one’s values that are the strongest predictor of whether or not we’ll revise our
beliefs in light of new information. If contradictory evidence is strongly aligned with an established value
set, the mind will allow the older notions to be updated. The pain of changing weaker schemata
is eclipsed by the pleasure of confirming stronger ones. So, if we want to change someone’s
mind we need to frame arguments in that person’s values. However, this can take a turn to the
manipulative and completely miss the point of doing it in the first place.
In such a connected world, filled with nuance and fluid context, the course of action with the
most integrity would be for us ourselves to simply argue from the other side. With humility,
perhaps we should try to prove wrong our own values and question the assertions which support
them. With each iteration, we can upgrade our prediction-models or, at the very least when we
see a change in pattern, linger a little longer in the present buttfarts. Sorry… the present moment.
In his poem “The Snow Man,” Wallace Stevens expresses that, as observers of a world greater
than we can see and deeper than we can fathom, we must realize that we aren’t really anything
without a context. In order to behold “Nothing that is not there and the nothing that is” we have
to detach from the stories we’ve told ourselves about what ought to be. That is to say, to be
aware of the meaning we project onto the
world and to be mindful that reality just might not match.