Astral Codex Ten

Book Review: The Scout Mindset
I.

You tried Carol Dweck’s Growth Mindset, but the replication crisis crushed your faith. You
tried Mike Cernovich’s Gorilla Mindset, but your neighbors all took out restraining orders
against you. And yet, without a mindset, what separates you from the beasts? Just in time,
Julia Galef brings us The Scout Mindset (subtitle: “Why Some People See Things Clearly And Others Don’t”).

Galef admits she’s a little behind the curve on this one. Books on rationality and
overcoming cognitive biases were big ten years ago (Thinking Fast And Slow, Predictably
Irrational, The Black Swan, etc). Nowadays “smiling TED-talk-circuit celebrity wants to help
you improve your thinking!” is more likely to elicit groans than breathless anticipation. And
that isn’t the least accurate description of Julia (you can watch her TED talk here).

But Galef earned her celebrity status honestly, through long years of hard labor in the
rationality mines. Back in ~2007, a bunch of people interested in biases and decision-
making joined the “rationalist community” centered around the group blogs Overcoming
Bias and Less Wrong. Around 2012, they mostly left to do different stuff. Some of them
went into AI to try to save the world. Others went into effective altruism to try to
revolutionize charity. Some, like me, got distracted and wrote a few thousand blog posts on
whatever shiny things happened to catch their eyes. But a few stuck around and tried to
complete the original project. They founded a group called the Center For Applied
Rationality (aka “CFAR”, yes, it’s a pun) to try to figure out how to actually make people
more rational in the real world.

Like - a big part of why so many people - the kind of people who would have read Predictably Irrational in 2008 or commented on Overcoming Bias in 2010 - moved on was because just learning that biases existed didn’t really seem to help much. CFAR wanted to find a way to teach people about biases that actually stuck and improved decision-making. To that end, they ran dozens of workshops over about a decade, testing various techniques and seeing which ones seemed to stick and make a difference. Galef is their co-founder and former president, and Scout Mindset is an attempt to write down what she learned.
Reading between the lines, I think she learned pretty much the same thing a lot of the rest
of us learned during the grim years of the last decade. Of the fifty-odd biases discovered by
Kahneman, Tversky, and their successors, forty-nine are cute quirks, and one is destroying
civilization. This last one is confirmation bias - our tendency to interpret evidence as
confirming our pre-existing beliefs instead of changing our minds. This is the bias that
explains why your political opponents continue to be your political opponents, instead of
converting to your obviously superior beliefs. And so on to religion, pseudoscience, and all
the other scourges of the intellectual world.

But she also learned that just telling people “Hey, avoid confirmation bias!” doesn’t work,
even if you explain things very well and give lots of examples. What does work? Research is
still ongoing, but the book concentrates on emotional and identity-related thought
processes. Above, I made fun of everyone and their brother having a “mindset”, but this
book uses the term deliberately: thinking clearly is about installing an entirely new mindset
in yourself in a bunch of different ways.

Galef’s preferred dichotomy is “soldier mindset” vs. “scout mindset”. Soldiers think of
intellectual inquiry as a battle; their job is to support their “side”. Soldiers are the people
who give us all the military and fortress-related language we use to describe debate:

Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments. They rest on solid foundations. We might hold a firm conviction or a strong opinion, be secure in our convictions or have an unshakeable faith in something. This soldier mindset leads us to defend against people who might “poke holes” in our logic, “shoot down” our beliefs, or confront us with a “knock-down” argument, all of which may leave our beliefs “undermined”, “weakened”, or even “destroyed”, so we become “entrenched” in them, lest we “surrender” to the opposing position.

A Soldier’s goal is to win the argument, much as real soldiers want to win the war. If you’re
an American soldier fighting the Taliban, you want to consider questions like “What’s the
most effective way to take that mountain pass?” or “How can I shoot them before they
shoot me?”, but definitely not “What are the strongest arguments for defecting and joining
the Taliban?” Likewise, someone with Soldier Mindset considers questions like “What’s the
most rhetorically effective way to prove this point?” or “How can I embarrass my opponents?”, but not “Am I sure I’m on the right side?” or “How do we work together to converge on truth?”
Scout Mindset is the opposite. Even though a Scout is also at war, they want to figure out what’s true. Although it would be convenient for them if the enemy was weak, if the enemy is in fact strong, they want to figure that out so they can report back to their side’s general. They can go on an expedition with the fervent hope that the enemy turns out to be weak, but their responsibility is still to tell the truth as they understand it.

But isn’t there still a war you have to win? Aren’t there some beliefs you want to fight for,
such that even if you need to be reasonable when figuring out the best way to proselytize
them, they themselves should be beyond challenge? Julia thinks this point is probably
further back than you expect. Even a true American patriot might want to consider the
possibility that, instead of trying really hard to win the war in Afghanistan, the best thing
for the US is to cut their losses and get out. If you were too focused on winning the war
because “that’s the most pro-America thing to do”, you might (might!) miss that.

And maybe you’re not just an American patriot. Maybe you only support America because
you think it best embodies certain values you really care about. If America had stopped
embodying those values, wouldn’t you want to know about it? When Andrew Jackson
toasted “Our federal Union - it must be preserved”, didn’t John Calhoun respond with “The Union - next to our liberty, the most dear”?

(not that John Calhoun was very good at promoting freedom - maybe he should have used
more scout mindset!)

II.

Are Scouts really better than Soldiers? Isn’t this just evidence-less cheerleading for your
team (ie Team Scout), exactly the sort of thing Scouts are supposed to avoid?

Julia Galef is extremely prepared for your trollish comments to this effect. She avoids the “Scouts are better than Soldiers” dichotomy, instead arguing that both these mindsets have their uses, but that right now we lean too hard in the direction of Soldier. She gives lots of evidence for this (including an evolutionary argument that Soldier was more useful in small bands facing generally simple problems). I’ll review a little of this; for the full story, read Chapter 3 of the book.

One justification for Soldier mindset is that you are often very sure which side you want to win. Sometimes this is because the moral and empirical considerations are obvious. Other times it’s something as simple as “you work for this company so you would prefer they beat their competitors.” But even if you know which side you’re supporting, you need an accurate picture of the underlying terrain in order to set your strategy. She gives the example of the Humane League, an animal rights group that was picketing laboratories to stop animal testing. After a while they evaluated that program and found it rarely worked, and when it did work, the animals saved were only a drop in the bucket. So they tried other strategies, and one of them (pressuring agribusinesses to improve animal welfare) worked really well and saved far more animals. Even though the Humane League remained good Soldiers for their cause of animal welfare, their Scout mindset let them abandon an unpromising strategy and switch to a promising one.

Galef spends a lot of time in Silicon Valley, where the tech crowd has a different objection:
don’t you need to be insanely overconfident to launch a startup? 90% of startups fail. But a
lot of good founders seem absolutely certain they can succeed. They act as good Soldiers for
Team “we’re definitely going to make a billion dollars”, and that certainty rubs off on
employees, investors, etc and inspires confidence in the company. Wouldn’t a more realistic
Scout Mindset doom them?

Galef says not necessarily. Did you know that Jeff Bezos said outright he started off with a
30% chance Amazon would succeed, even going so far as to tell investors “I think there’s a
70% chance you’re going to lose all your money”? Or that Elon Musk said the odds of
SpaceX working were “less than 10%”? Ethereum founder Vitalik Buterin said he’s “never
had 100% confidence in cryptocurrency as a sector…I’m consistent in my uncertainty”. And
since the book came out, I stumbled on this profile of billionaire Sam Bankman-Fried,
which says he believed his chances of success “were only 20% to 25%”.

Galef adds a story from the early days of Intel. They were making computer memory
components, and the Japanese were outcompeting them. The executives talked among
themselves, admitted they probably couldn’t beat the Japanese, pivoted to a different kind of
computer chip - microprocessors - and the rest is history. Even though on the larger scale
they remained Soldiers for their final goal (Intel should make money), being able to play
Scouts for their subgoal (what should our strategy be?) served them better than insane
overconfidence.

III.

The book divides learning Scout Mindset into an intellectual half (Part II) and an emotional
half (Part III - V). The intellectual half emphasizes probabilistic thinking and thought
experiments.

You’ve probably heard the probabilistic (aka Bayesian) side of things before. Instead of thinking “I’m sure global warming is fake!”, try to think in terms of probabilities (“I think there’s a 90% chance global warming is fake”). Instead of thinking in terms of changing your mind (“Should I surrender my belief, and switch to my enemy’s belief that global warming is true?”), think in terms of updating your probabilities (“Now I’m only 70% sure that global warming is fake”). This mindset makes it easier to remember that it’s not a question of winning or losing, but a question of being as accurate as possible. Someone who updates from 90% to 70% is no more or less wrong or embarrassing than someone who updates from 60% to 40%.
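To make the arithmetic concrete, here is a minimal sketch of what a probability update looks like in code - my illustration, not the book’s, with made-up numbers - using Bayes’ rule in its odds form:

```python
def update(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.

    likelihood_ratio = P(evidence | belief true) / P(evidence | belief false)
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start out 90% sure global warming is fake, then hit evidence you'd be
# three times likelier to see if it were real (so the ratio is 1/3):
p = update(0.90, likelihood_ratio=1 / 3)
print(f"{p:.0%}")  # 75% -- an update, not a surrender
```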

(this comes up again in the last part of the book, the part on how to be emotionally okay with changing your mind. “Probability update” is less emotionally devastating than “I said X, but actually ~X, so I was dead wrong.”)

Not sure how sure you are? The book contains a fun probability calibration exercise. I won’t violate its copyright, but you can find a very similar automated test here.

My results on the quiz above. See if you can get closer to the line than I did!
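If you’d rather score yourself by hand, the idea behind such a test is simple enough to sketch - this is a toy illustration of mine, not the book’s or the linked quiz’s actual scoring code: bucket your answers by stated confidence and check how often each bucket was actually right.

```python
from collections import defaultdict

# (stated confidence, whether the answer was actually correct) -- toy data,
# standing in for the dozens of trivia answers a real calibration quiz collects
answers = [(0.6, True), (0.6, False), (0.7, True), (0.7, False),
           (0.8, True), (0.8, True), (0.9, True), (0.9, True), (0.9, False)]

buckets = defaultdict(list)
for confidence, correct in answers:
    buckets[confidence].append(correct)

# Perfect calibration sits on the diagonal: things you call 70% likely
# should turn out true about 70% of the time.
for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"said {confidence:.0%} -> right {accuracy:.0%} ({len(outcomes)} answers)")
```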

But you probably already knew all of this. One of the genuinely new ideas in Scout Mindset is
its endorsement of various counterfactual “tests”. The idea is, imagine yourself considering
a similar question, under circumstances that would bias you the opposite direction. If you
stick with your opinion, it’s probably honest; if you’d change your opinion in the
counterfactual, you probably had it because of bias.

So for example, if a Republican politician is stuck in some scandal, a Republican partisan might stand by him because “there’s no indisputable evidence” or “everyone in politics does stuff like that” or “just because someone did one thing wrong doesn’t mean we should fire them”. But before feeling too sure, the partisan should imagine how they would feel if a Democrat committed exactly the same scandal. If they notice they’d feel outraged, then their pro-Republican bias is influencing their decision-making. If they’d let the Democrat off too, then they might be working off consistent principles.
I try to use this test when I remember. I talk a good talk about free speech, and “don’t
cancel other people for discussing policies you don’t like, they have a right to their opinion
and you should debate it instead”. But a while back I read an article about how Harvard hosted a
conference on “the risks of home schooling”, with an obvious eye towards seeing whether
they could get home schooling regulated or banned. My first twenty thoughts were
something like “is there some way to get revenge on Harvard for being the sorts of people
who associate with causes like this?”, plus anger that the administration was probably going
to pretend it was neutral on this issue and just “encouraging debate”. Then by my twenty-
first thought I remembered this is exactly the sort of thing I was supposed to be against,
and grudgingly decided to be more understanding and sympathetic toward everyone in the
future.

Or: sometimes pundits will, for example, make fun of excessively woke people by saying
something like “in a world with millions of people in poverty and thousands of heavily-
armed nuclear missiles, you’re really choosing to focus on whether someone said something
slightly silly about gender?” Then they do that again. Then they do that again. Then you
realize these pundits’ entire brand is making fun of people who say silly things (in a woke
direction) about gender, even though there are millions of people in poverty and thousands
of nuclear missiles. So they ought to at least be able to appreciate how strong the
temptation can be. As Horace puts it, “why do you laugh? Change the name, and the joke’s
on you!”

Some other counterfactual tests like this you can try:

Status Quo Test: If you’re defending the status quo, imagine that the opposite was the
status quo. Would you be tempted to switch to what you have now? For example, I
sometimes feel tempted to defend American measurements - the inch, the mile, Fahrenheit,
etc. But if America was already metric, and somebody proposed we should go to inches and
miles, everyone would think they were crazy. So my attraction to US measurements is
probably just because I’m used to them, not because they’re actually better.

(sometimes this is be fine: I don’t like having a boring WASPy name like “Scott”, but I don’t
bother changing it. If I had a cool ethnically-appropriate name like “Menachem”, would I
change it to “Scott”? No. But “the transaction costs for changing are too high so I’m not
going to do it” is a totally reasonable justification for status quo bias)
Conformity Test: Imagine that some common and universally-agreed idea was unusual; would you still want to do it? If not, you might be motivated by conformity bias. Suppose
only 5% of people got married or had kids; would you still want to be one of the 5%?
Suppose almost everyone started a business after high school, and going to college instead
was considered a weird contrarian choice - would you take it anyway?

Again, sometimes this is fine. Doing the same thing as everyone else earns you friends, and
is usually good evidence that you’re not making a terrible error. But it’s at least worth being
aware of. Julia writes:

When I was a kid, I idolized my cousin Shoshana, who was two years older than me… during a family camping trip one summer… as we sat in her tent, listening to the latest album on her cassette player, Shoshana said “Ooh, this next song is my favorite!” After
the song was over, she turned to me and asked me what I thought. I replied
enthusiastically “Yeah, it’s so good, I think it’s my favorite too.”

“Well, guess what?” she replied. “That’s not my favorite song. It’s my least favorite song. I
just wanted to see if you would copy me.”

It’s possible my cousin Shoshana crossed paths with Barack Obama at some point,
because he used a similar trick on his advisors when he was president. It was essentially
a “yes man” test: If someone expressed agreement with a view of his, Obama would
pretend he had changed his mind and no longer held that view. Then he would ask them
to explain to him why they believed it to be true. “Every leader has strengths and weaknesses, and one of my strengths is a good BS detector,” Obama said.

The Selective Skeptic Test: How credible would you consider the same evidence if it
supported the other side?

A meta-analysis of ninety careful studies, published by a prestigious psychology professor,


shows that there is no such thing as telepathy, p < -10^10. Does that put the final nail in the
coffin? Does it close the debate? Is anyone who tries to pick holes in it just a sore loser?
Does it mean that anyone who keeps believing in telepathy after this is a “science denier”?

In the real world, a study meeting that description shows there is such a thing as telepathy.
Hopefully you left yourself some room to say that you think the study is wrong.

This is another one with some subtlety. By Bayes’ Rule, you should believe evidence for plausible things more than you believe evidence for implausible things. If my friend says she saw a coyote out in the California hills, I believe her; if she says she saw a polar bear, I am doubtful. I think the best you can do here is understand that a giant meta-analysis proving telepathy is false doesn’t force a believer to change her mind any more than a giant meta-analysis proving it’s true forces you to change yours.
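The asymmetry is just the prior doing its job. A quick sketch (again mine, with invented numbers): feed the same strength of evidence into different priors and watch the posteriors diverge.

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Odds-form Bayes update, as in the earlier sketch."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# Say my friend's report is 20x likelier if she really saw the animal.
# The same testimony lands very differently on different priors:
coyote = posterior(prior=0.30, likelihood_ratio=20)        # coyotes: common here
polar_bear = posterior(prior=0.0001, likelihood_ratio=20)  # polar bears: not
print(f"coyote: {coyote:.0%}, polar bear: {polar_bear:.1%}")  # ~90% vs ~0.2%
```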
A lot of the best rationalists I know instinctively apply these tests to everything they think.
One technique for cultivating this practice (not the book’s recommendation) is to go on
Twitter, where the adage is “there’s always an old tweet”. Argue that people who say racist
things should be cancelled, and someone will dig up your old racist tweet and make you
defend why you shouldn’t face the same consequences. Argue that it’s disgraceful when the
other party uses extreme violent language about their outgroup, and someone will dig up an
old tweet where you used even more extreme language about yours. Demand that the
Republican senator resign for sexual misconduct, and someone will find the old tweet
where you said the Democratic senator should tough it out. Eventually, if you want to
maintain any dignity at all, you learn to double-check whether your beliefs are consistent
with one another or with what you’d believe in vaguely similar situations.

Scout Mindset says: why not try the same thing, even when you’re not on Twitter, just to
determine what’s true?

IV.

And one very likely answer is: because it would hurt.

Scout Mindset tries to differentiate itself from other rationality-and-bias books by caring a
lot about this. It argues that, while other rationality books just told you what to do, most
people wouldn’t do it; they’d be too emotionally attached to their existing beliefs. So after
giving a few intellectual suggestions, it goes on a deep dive into the emotional side.

At times, this sounds a little facile. There are lots of pages to the effect of “instead of relying
on false beliefs in order to feel good about yourself, have you considered just having true
beliefs but feeling good anyway?” The book phrases this a little more politely:

There is an abundance of different coping strategies, and you don’t need to be so quick
to go with the first thing you happen to pull out of the bucket. You can almost always
find something comforting that doesn’t require self-deception if you just rummage
around in there.

For example:
I once felt guilty about something inconsiderate I had done to a friend and spent a week trying to justify my behavior to myself. Should I apologize? “No, that’s unnecessary, she probably didn’t even notice” I told myself, at various times - and “She probably forgave me already anyway” at other times. Obviously I didn’t find these internally contradictory justifications fully satisfying, which is why I had to keep having the same argument with myself again and again.

Finally I asked myself: “Okay, suppose I had to apologize. How would I do it?” It didn’t
take me long to draft in my head the rough contours of an apology that I felt I could
deliver without too much angst. And when I imagined my friend’s reaction, I realized
that I expected her to be appreciative, not angry. Once the prospect of apologizing
seemed tolerable, I returned to my original question: “Should I apologize?” Now the
answer was much clearer: yes, I should. It’s striking how much the urge to conclude
“That’s not true” diminishes once you feel like you have a concrete plan for what you
would do if the thing were true.

I’m mentioning this story in particular because of how it straddles the border
between “rationality training” and “being-a-good-person training”. It reminds me of C.S.
Lewis - especially The Great Divorce, whose conceit was that the damned could leave Hell
for Heaven at any time, but mostly didn’t, because it would require them to admit that they
had been wrong. I think Julia thinks of rationality and goodness as two related skills: both
involve using healthy long-term coping strategies instead of narcissistic short-term ones.

I know some rationalists who aren’t very nice people (I also know others who are great).
There are lots of other facets of nice-person-ness beyond just an ability to acknowledge
your mistakes (for example, you have to start out thinking that being mean to other people
is a mistake!) But all these skills about “what tests can you put your thoughts through to see
things from the other person’s point of view?” or “how do you stay humble and open to
correction?” are non-trivial parts of the decent-human-being package, and sometimes they
carry over.

In one sense, this is good: buy one “rationality training”, and we’ll throw in a “personal
growth” absolutely free! In another sense, it’s discouraging. Personal growth is known to be
hard. If it’s a precondition to successful rationality training, sounds like rationality training
will also be hard. Scout Mindset kind of endorses this conclusion. Dan Ariely or whoever
promised you that if you read a few papers on cognitive bias, you’d become a better thinker.
Scout Mindset also wants you to read those papers, but you might also have to become a
good person.

(in case this is starting to sound too touchy-feely, Julia interrupts this section for a while to mercilessly debunk various studies claiming to show that “self-deluded people are happier”)

Here Scout Mindset reaches an impasse. It’s trying to train you in rationality. But it acknowledges that this is closely allied with making you a good person. And that can’t be trained - or, if it can, it probably takes more than one TED talk. So what do you do? Scout Mindset goes with peer pressure.

We hear about Jerry Taylor, a professional climate change skeptic who would go on TV
shows debating believers. During one debate, he started questioning his stance, did some
more research afterwards, decided he was wrong after all, and became an environmental
activist.

And about Joshua Harris, a pastor who led a “don’t date before marriage” movement. At age
21, he wrote a book I Kissed Dating Goodbye, which became a hit in evangelical circles. But
over the years, he got a lot of feedback from people who said they felt like the book really
hurt them. Twenty years later, he retracted the book and urged people to date after all.

And about scientist Bethany Brookshire, who complained online that men always wrote to
her as “Ms. Brookshire” vs. women’s “Dr. Brookshire”, proving something about how men
were too sexist to treat a female scientist with the respect she deserved. The post went viral,
but as it became a bigger deal, she wanted to make sure she was right. So she went over
hundreds of past emails and found that actually, men were more likely to call her Dr. than
women; the alternate pattern had been entirely in her imagination. So she wrote another
post saying I Went Viral. I Was Wrong, which was generally well-received and prompted
good discussion.

And:

The example of intellectual honor I find myself thinking about most often is a story
related by Richard Dawkins from his years as a student in the zoology department at
Oxford. At the time there was a major controversy in biology over a cellular structure
called the Golgi apparatus - was it real or an illusion created by our observational
methods?

One day, a young visiting scholar from the United States came to the department and
gave a talk in which he presented new and compelling evidence that the Golgi apparatus
was, in fact, real. Sitting in the audience of that talk was one of Oxford’s most respected
zoologists, an elderly professor who was known for his position that the Golgi apparatus
was illusory. So of course, throughout the talk, everyone was stealing glances at the professor, wondering: How’s he taking this? What’s he going to say?

At the end of the talk, the elderly Oxford professor rose from his seat, walked up to the front of the lecture hall, and reached out to shake hands with the visiting scholar, saying, “My dear fellow, I wish to thank you. I have been wrong these fifteen years.” The lecture hall burst into applause.
Dawkins says: “The memory of this incident still brings a lump to my throat.” It brings a
lump to my throat too, every time I retell that story. That’s the kind of person I want to
be - and that’s often enough to inspire me to choose scout mindset, even when the
temptations of soldier mindset are strong.

Julia says that these people were able to change their minds so effectively because they had
an identity as “scouts”, moreso than their identity as global-warming-skeptics or dating-
skeptics or Golgi-apparatus-skeptics or whatever. It was more psychologically painful for
them to be obstinate and irrational than for them to admit they were wrong. So they were
able to use healthy coping mechanisms and come out okay on the other side.

Once she’s finished bombarding you with examples of epistemically healthy people, she
moves on to epistemically healthy communities. The rationalist and effective altruist
communities get namedropped here, as does the r/changemyview subreddit. At every point,
Julia mentions how much she personally respects all these people - and, implicitly, how
much she is rooting for you to become like them.

All of this reminds me of a theory of psychotherapy, which is that one way people get
messed up is by knowing a lot of messed-up people, so much so that the little voice in their
head that tells them what to do gets trained on messed-up people. When you think “what’s
the right thing to do in this situation?” the abstracted voice of your community of epistemic
peers answers “Something messed-up!”

Then you get a therapist, who is (hopefully!) a really together, with-it, admirable person.
You talk about all your issues with them, so much so that when you have an issue, it’s your
therapist’s voice you hear in your head giving you advice about it. When you ask “what would
other people think of this?”, it’s your therapist you’re thinking of. Plus, your therapist is
credentialed as an Officially Correct High Status Person. She’s not just speaking for herself,
she’s serving as an ambassador for a whole world of healthy normal people; her words are
backed by the whole weight of polite society. So if you’re making a decision to, like, commit
a crime, instead of feeling internalized peer pressure from all your scummy friends to do it,
you feel internalized peer pressure from your therapist (and the normal world she
represents) not to.
This last section of Scout Mindset seems to be trying something like that. Julia is trying to normalize changing your mind, to assure you that lots of great people who you respect do it, that there are whole communities out there of people who do it, that she does it and she is a TED-talk-having celebrity who you implicitly trust.
One last story, which goes almost a little too far:

One week in 2010, I was following a heated debate online over whether a particular blog
post was sexist. The blogger, a man in his mid-twenties named Luke, chimed in to say
that he had considered his critics’ arguments carefully but didn’t think that there was
anything wrong with his post. Still, he said, he was open to changing his mind. He even
published a list titled “Why It’s Plausible I’m Wrong”, in which he summarized and
linked to some of the best arguments against him so far, while explaining why he wasn’t
persuaded by them.

A few days later - by which point the debate spanned over 1,500 comments across multiple blogs - Luke posted again. He wanted to let everyone know that he had found an
argument that had convinced him that his original post was harmful.

He had surely already alienated many readers who believed his original post to be
morally wrong, Luke acknowledged. “And now, by disagreeing with those who came to
my defense and said there was nothing wrong with my post, I’ll probably alienate even
more readers,” he said. “Well, that’s too bad, because I do think it was morally wrong.”

“Wow,” I thought. I admired both the fact that Luke didn’t change his mind in the face of
strong pressure, and the fact that he did change his mind in response to strong
arguments. I decided to message him and share my appreciation: “Hey, this is Julia Galef
- just wanted to tell you how much I appreciate your thoughtful writing! It feels like you
actually care what’s true.”

“Hey, thanks - I feel the same way about your writing,” Luke replied.

Ten years after that exchange, we’re engaged to be married.

I know Julia and Luke, they’re both great, and you should absolutely substitute them for
whoever was judging you in your mind before. If it would help to have a voice to attach to
the name, you can listen to Julia on the Rationally Speaking podcast.

Discussion

sohois 2 hr ago
"(sometimes this is be fine: I don’t like having a boring WASPy name like “Scott”, but I don’t
bother changing it. If I had a cool ethnically-appropriate name like “Menachem”, would I
change it to “Scott”? No. But “the transaction costs for changing are too high so I’m not
going to do it” is a totally reasonable justification for status quo bias)"
Hmm yes, transaction costs of changing a name are high. After all, it's not like you had been
writing under a pseudonym for many years that could have been anything you wanted,
Menachem Alexander
Reply
H. 2 hr ago
I mean, the transaction costs for switching pseudonyms are still pretty high. Setup costs
on any name you fancy are cheap, but a reputation builds on the name etc
Reply
sohois 2 hr ago
A fair point, except that Scott was originally writing under "Yvain" at LessWrong, and I believe started to use "Scott Alexander" once SlateStarCodex was started. So he accepted the costs then, but failed to call himself Menachem Awesomeman. Although I'm sure there are probably a bunch of good reasons for the original SA pseudonym which we aren't privy to
Reply
Presto 1 hr ago
I felt like his pseudonym of "Moldbug" worked pretty well
Reply
Knox Loveday 2 hr ago
I agree. I'm skeptical about the idea that the transaction costs of a name change are really that high, although I agree they are not zero. Women in the United States routinely change their last name when they marry, and then revert to their "maiden" name upon divorce. Marry more than once or twice... the transaction cost here is actually that institutions make you jump through hoops to update documents, but socially and professionally everyone gets it and just adapts, like they would if you changed your pronouns. Ah, but what about famous people? It seems likely easier for them; after all, a famous person has the benefit of people being interested in what they are doing and a desire to keep up with the news about that person. Re-branding in the consumer product/business world happens, sometimes due to company mergers and sometimes to escape the bad associations with the prior name. Anyway, just stream of consciousness thinking, but in support of how, imho, while not zero, the transaction costs of changing one's name are not prohibitive, and likely just high enough that one wouldn't do it without a compelling reason (e.g. marriage, disassociating from a hated family name, etc.) but not so high that if your whimsy is to do it, it's easily enough done.
Reply
MorningLightMountain 2 hr ago
Scout Mindset is, accidentally, a really great general self-help and CBT book that doesn't talk down to you
Reply
Rachael 2 hr ago
Typo thread: "sometimes this is be fine"
Reply
Robert Hoglund 2 hr ago
Agreed. I already think "what would Julia Galef do."
Reply
Hamiltonianurst 2 hr ago
I feel compelled to plug my friends' and my fan project, mindsetscouts.com. Everyone likes
earning cool merit badges! One of the merit badges even has a link to a calibration
spreadsheet template, if that is a thing you've always wanted but never wanted to bother making.
Reply
david roberts 2 hr ago
This excellent review convinced me to buy the book. One area of life where it is fine to have a
pure soldier mindset is being a sports fan. You start rooting for a team when you are young because of where you live, your family, whatever. You never consider changing your allegiance (although if, like me, you're a Mets fan, you sometimes wish you could). If we all have a certain amount of "soldier" tendency in us, then sports fandom is a good and healthy way to exercise and partially exhaust that tendency.
Reply
AJKamper 1 hr ago
I take a "scout mindset" to my sports fandom.
Oh, not really. But one of the interesting places where that soldier mindset comes out in
fandom is when there's a controversial play; people for one side are more likely to see
that the _other_ team, for example, committed pass interference or touched the ball last
before it went out of bounds.
So I use situations like that to remind myself to stay in scout mindset (not literally--I've
been doing this for years, and now have a framework for it).
Reply
david roberts 1 hr ago
I tend also to be a scout when it comes to calls. But I am 100% soldier when it
comes to my partisanship. Long ago, I should have, rationally, switched from being a
Mets fan to being a Yankees fan. But I grew up pro-Mets, hating the Yankees, so
that's that.
Reply
DavidP 1 hr ago
My team just lost a really big game, and then video of the 3 best players sniffing a
white powder (after the game) surfaced, and at first I was totally convinced that
either the other team provided the white powder, or made the video! But then I
thought, nah!
Reply
eccdogg 34 min ago
I try to take a scout mindset with regard to sports fandom when I set my
expectations around how good I think the team will be in a season or how they will
perform in a game.
Every pre season the talk is always all sunshine and rainbows about how great our
team will be this year. Then the season happens and everyone is upset that the pre
season expectations are not met.
Luckily it is very easy to find an outside view for your sports teams, because the folks in Las Vegas put a number out there that you can bet on for total wins in the season and odds of winning each game. There are also computer models that estimate those odds. So you can say "lots of people think we will be 6-5 this year, why might they think that?"
Reply
Donald Fagen 2 hr ago
I think a bronze age mindset reference would really round out that first paragraph.
Reply
Richard Gadsden 1 hr ago
On the political scandals thing, one noticeable trend is that people who support party X have to weigh the cost of having a politician who has done something scandalous against the risk of getting a politician from party Y, which is (for supporters of party X) intrinsically scandalous.
So it's not unreasonable to have a higher standard for your opponents (whose replacement is
costless) than for your own party.
The other factor is about who will replace them, which is why I favour political systems that
make it easy to replace a politician with another of the same party (and make that a separate
process from the electoral process where voters choose between parties). Note that Al
Franken was replaced as Senator easily - but his replacement, Tina Smith, is of the same
party. Ralph Northam was also in a scandal of comparable magnitude, but remained as
Governor of Virginia because his two same-party successors were involved in the same
scandal and the third in line was of the other party. You can see the same process with the
recent California recall; Gavin Newsom was able to ensure that the only possible replacement
was a Republican and was able to run against that Republican. From the perspective of the
Democratic majority in California, however bad Newsom was, a Republican would be worse.
The only case I can think of in recent years when a politician has been replaced by one of the
opposite party as a result of a scandal is the Alabama Senate election when Roy Moore was
defeated by Doug Jones.
Reply
Freddie deBoer 1 hr ago
Very interesting and a compelling endorsement. This review is a good prompt to think about
my own relationship to the whole rationalist project, and I need to read this book. I am much
more sympathetic to rationalism than I was say 5 years ago and think on balance it's a force
for good. I also think it's a giant motte and bailey, which is frequently discussed in grand and
outsized terms regarding its goals, but when challenged its members tend to say things like
"oh nobody's trying to achieve real rationality, we know that's impossible, we're just trying to
get people to be a bit more rational." But I think what I will do is read and review this book and
use it as a lens to think through the movement and its evolution. About
Reply
Nick 1 hr ago Archive
"They founded a group called the Center For Applied Rationality (aka "CFAR", yes, it's a pun) to try to figure out how to actually make people more rational in the real world."
As if Scientology wasn't enough...
Reply
J. Ott 1 hr ago
What would Scott’s review be if he wasn’t personal friends with Luke and Julia? (Probably
similar but longer?)
Reply
Alan L. 1 hr ago
Nobody's complained about p < -10^10 yet, which, depending on where you put your
parentheses is either impossible or certain :^)
Reply
Scott Aaronson 9 min ago
I came to this comment section to point that out. Otherwise, awesome review of an
awesome book! :-)
Reply
Mike H 1 hr ago
I think I might buy the book, then.
I feel like there's a kind of deeper level of reasoning that often goes into people being
unwilling to change their mind, or unwilling to adopt a 'scout mindset'. In the real world,
military scouts tend to get killed a lot. They go into enemy territory surrounded by soldiers
where they're at a disadvantage. Soldiers at least get to fight in groups. The epistemic
equivalent of a scout being "killed" would, I think, be being convinced or pressured to change
your mind based on non-rationalist tactics. If this happens a few times then your Bayesian
prior on "I am going to be misled/pressured/BSd into changing my mind" starts going up and
it stops making sense to be a scout. It starts looking smarter to be a soldier.
In the past I've changed my mind about lots of things - I can think of a few examples from
both politics and my job right now. But I sort of feel like this has happened to me with
everything about COVID. In the beginning I adopted the default position of "the scientists
have got this" and believed all their predictions. Then I read an article that gave me lots of
new information about the history of epidemiology and the unreliability of Ferguson's
modelling, and that caused me to go off and do lots of research of my own that backed it up,
so I changed and adopted a position of "this is all wrong and my god scientists don't seem to
care which is scandalous". But I tried to keep a scout mindset (not calling it that of course)
and would still read articles and talk to people who disagreed, in fact, I ended up debating a
scientist I was vaguely acquainted with in the street for about 20 minutes. We met, said hello, not seen you for a while, got talking and ended up in a massive debate about PCR testing, modelling and other things. He was very much still where I started so it was a vigorous debate.

The problem is, a very frequent experience was reading or hearing something that was causing me to update my beliefs somewhat to be closer to the "mainstream" (i.e.
media/government) narrative. But then a little demon on my shoulder would whisper,
"shouldn't you double check that? after all, these guys were misleading you before" and then
I'd go check and sure enough, there'd be some massive problem with whatever I'd just read.
In many cases it was just flatly deceptive.
After this happens a bunch of times your prior on "these people who disagree with me are
probably bullshitting me" goes high enough that it's really hard to maintain a scout mindset.
After all, clever bullshitters do it because sometimes they succeed. If I find myself becoming
less certain and updating my beliefs in the direction of "I was wrong", how do I know that this
is a correct decision, given all the previous times when it wouldn't have been? This feels like a
systematic problem with attempts to do scout-y Bayesian reasoning in a world where people
are actively trying to recruit you to their side via foul means as well as fair. I suspect this
explains a lot of politics, although it doesn't mean the perception of being misled is true, of
course.
Having written all that, I have no solutions. The scout mindset would work best in an
environment where people are genuinely playing fair. If they aren't, then maybe it's one of
those times the book alludes to when the soldier mindset actually works best.
Reply
Mr. Doolittle 33 min ago
I had a recent personal experience with being too much of a Scout. In dealing with a
younger person, I was accepting things told to me as honest-unless-proven-otherwise,
which is how I engage with just about everyone. Then this person very clearly lied to me,
and I slightly updated, and then they lied again, and I slightly updated. I was still treating
things said as true, but feeling skeptical. Eventually the skepticism became too large to
maintain a Scout mindset, because I realized there were more lies than truth. Trying to
be a Scout took a lot of additional time, and created numerous situations where the truth
of the matter was being blocked by trying to review whether the specific claims were or
were not accurate. By switching to a Soldier mentality, I was able to defeat false claims
more easily and not update in a false direction.
You can't be a Scout when the other side is constantly defecting. You can only be a
Scout when there can be some assurance that your willingness to update isn't going to
be abused. Otherwise you are signaling that the opponent has an opportunity to Win
using Soldier tactics, which will encourage them to be more of a Soldier. Being a Scout in
that situation means you will lose or be forced to disengage.
Reply
arbitrario 1 hr ago About
I find myself in disagreement with this use of the word "probability", but I realize it's because I
am a soldier for frequentism Archive
Reply
Wombat 1 hr ago
I'm curious about the theory of psychotherapy mentioned toward the end. Is there a name for this theory? I'd like to read more about it.
Reply
LGS 1 hr ago
So what I'm hearing is, "the best way to be a good person is to be like me and my personal
friends. The community I, personally, am involved in is the main one to avoid the scourge of
confirmation bias -- the bias in which people think they are never wrong". Got it.
Reply
Elena Yudovina 10 min ago
In fairness, I would hope that most people believe that being like them and their friends
is a path towards being a good person -- and if they don't, that's certainly not an
endorsement of their community!
Reply
Elriggs 7 min ago
Definitely not the best way to be a good person.
Besides that, your ironic statement is a fully general argument against any solution to
correcting confirmation bias. A specific example on why it’s wrong would be more
convincing.
Reply
Jack C 44 min ago
Does the book touch on the problem of Going Public? It's easier to change your mind when
your opinions are private. At least that's been my experience.
I see increasing polarization as partly an effect of more opinions being public because of
social media.
Reply
Mark P Xu Neyer (apxhard) 42 min ago
“It’s hard to be rational without also being a good person” looks like more evidence against the orthogonality thesis. If you’re intensely intelligent but don’t see “being mean to other people” as an error, you’re much more likely to dismiss them when they have true knowledge that conflicts with your priors.
And if rationality is just as hard as being a good person, doesn’t this suggest that an unaligned AI is likely to have biases which inhibit its abilities as well?
Reply
argentus 42 min ago About
On boring WASP names, look on the bright side. My partner's grandad's name was Hyman.
Reply
The Ancient Geek Writes RationalityDoneRight · 33 min ago Mistakes
Norbert Wiener's name was Norbert Wiener.
Reply
Magehat Writes bAd Reads · 34 min ago
I've seen a lot of press for this book, but reading this review was the first time I realized that
"scout" in the title refers to scouting something out, and not glorifying the boy scouts. I think
it was the cover image of the binoculars looking out over a landscape right out of a national
park.
Reply
The Ancient Geek Writes RationalityDoneRight · 28 min ago
Does Galef, in a spirit of fair play, mention Mercier and Sperber's theory of argumentative
reason?
Reply
Deiseach 5 min ago
I hate it, so it's probably good advice.
Well, to say "hate" is too strong. But the thing is, scouts are just as much part of the army as
soldiers. Scouts are still "on a particular side" and are working against the scouts from the
opposing army. What if you don't want to fight in any war at all, you just want to find out
things? Is there such a thing as The Nature Rambler Mindset?
Second, the Obama anecdote strikes me not as "wow, strong BS detector, how admirable!"
but as "what a jerk". (And yes, I did the "imagine it's a guy you like" part to check out my
reactions).
I'd hate a boss like that, who was constantly whip-sawing between "I love it/I hate it" in order
to 'catch me out'. You could never be sure what his real opinion was, when he had genuinely
changed his mind, and when it was "he always loved/hated it, he was just pretending the
opposite". Plus, if the people working with him have any brains at all, they will figure out the
strategy after he does it a few times, then they will always have two sets of opinions ready to
go at all times - if Obama says "I love this thing!", be ready to go with "Eh, I'm not so sure"; if
he says "I hate it!", be ready to go with "Oh, there are good parts". That way you can always
turn on a sixpence when he goes "Ha, fooled you, I hate/love it!"
It'll trip everybody up when he goes "No, honestly, I really do love this" "Yeah, sure, Mr.
President" *wink* "I know how this goes, you don't want yes-men!" "No, seriously, I do
believe this" "Ha ha, can't catch *me* out, I know this means you hate it!" but at least Mr. Big
Guy can flatter himself on his sterling bullshit detector.
You're correct that this will require a lot of change and a lot of work to improve yourself.
Reply About
Archive
Mistakes
Top New Community What is Astral Codex Ten?
pablo@stafforini.com
Astral Sign
Not subscribed Codexout Ten Subscribe Help
Statement on New York Times Article
...
Feb 13 821 1,463

Still Alive
You just keep on trying till you run out of cake
Jan 21 1,084 520

A Modest Proposal For Republicans: Use The Word


"Class"
Pivot from mindless populist rage to a thoughtful campaign to fight
classism.
Feb 25 482 1,628
See all

Ready for more?


pablo@stafforini.com Subscribe

© 2021 Scott Alexander. See privacy, terms and information collection notice

Publish on Substack About


Astral Codex Ten is on Substack – the place for independent writing Archive
Mistakes
pablo@stafforini.com
Astral Sign
Not subscribed Codexout Ten Subscribe Help

You might also like