Question Certainty


DECISION MAKING

by Walter Frick
From the October 2015 Issue

James Graham

According to legend, around 550 BC, Croesus, the king of Lydia, held one of the world's earliest prediction tournaments. He sent emissaries to seven oracles to ask them to foretell what he would be doing that day. Pythia, the Oracle of Delphi, answered correctly: He would be cooking lamb-and-tortoise stew.

Croesus didn’t perform this exercise out of mere curiosity. He had a decision to make. Confident that he’d discovered a reliable oracle, the king then asked Pythia whether he should
attack Persia. She said that if he did, he would destroy a mighty empire. Croesus attacked but was defeated. The problem was interpretation: Pythia never said which mighty empire
would be destroyed.

Whether the story is fact or fiction, Croesus’ defeat illuminates a couple of truths: Forecasting is difficult, and pundits often claim their predictions have come true when they
haven’t. Still, accurate predictions are essential to good decision making—in every realm of life. As Philip Tetlock, a professor of both management and psychology at the University
of Pennsylvania, and his coauthor, Dan Gardner, write in Superforecasting, “We are all forecasters. When we think about changing jobs, getting married, buying a home, making an
investment, launching a product, or retiring, we decide based on how we expect the future to unfold.”

So what is the secret to making better forecasts? From 1984 to 2004 Tetlock tracked political pundits’ ability to predict world events, culminating in his 2006 book Expert Political
Judgment. He found that overall, his study subjects weren’t very good forecasters, but a subset did perform better than random chance. Those people stood out not for their
credentials or ideology but for their style of thinking. They rejected the idea that any single force determines an outcome. They used multiple information sources and analytical tools
and combined competing explanations for a given phenomenon. Above all, they were allergic to certainty.

Superforecasting describes Tetlock’s work since. In 2011 he and his colleagues entered a prediction tournament sponsored by the U.S. government’s Intelligence Advanced Research
Projects Activity. They recruited internet users to forecast geopolitical events under various experimental conditions, and—harnessing the wisdom of this crowd—they won. In the
process, they found another group of “superforecasters” to study. Most weren’t professional analysts, but they scored high on tests of intelligence and open-mindedness. Like
Tetlock’s other experts, they gave weight to multiple perspectives and weren’t afraid to change their opinions. They were curious, humble, self-critical, and less likely than most other
people to believe in fate. And although they seldom used math to make their predictions, all were highly numerate. “I have yet to find a superforecaster who isn’t comfortable with
numbers,” Tetlock writes.

Not surprisingly, Jordan Ellenberg, a mathematician at the University of Wisconsin–Madison, also believes his discipline can help people make better judgments. "Knowing mathematics is like wearing a pair of X-ray specs that reveal hidden structures underneath the messy and chaotic surface of the world," he explains in his new book, How Not to Be Wrong. Common sense and "simple and profound" concepts, such as correlation, are rigorously extended to everyday reasoning. Notice that he says knowing math, not doing math. As Tetlock's research makes clear, the superforecasters aren't building elaborate statistical models from massive data sets. But the concepts in How Not to Be Wrong would be familiar to them, and they use their mathematical brains to find structure amid complexity and to estimate, understand, and accept probabilities.

Further Reading
Superforecasting: The Art and Science of Prediction, Philip E. Tetlock and Dan Gardner (Crown, 2015)
How Not to Be Wrong: The Power of Mathematical Thinking, Jordan Ellenberg (Penguin, 2015)
Red Team: How to Succeed by Thinking Like the Enemy, Micah Zenko (Basic Books, 2015)

Superforecasting shows how this is done by contrasting a scene from Zero Dark Thirty, the 2012 film about the hunt for Osama bin Laden, with real-life events. In the movie, the CIA director, played by James Gandolfini, demands to know if bin Laden really is in the compound in Abbottabad, Pakistan. "Is he there or is he not f–––––– there?" he asks. Analysts offer probabilities between 60% and 80%, until the protagonist, Maya (Jessica Chastain), chimes in: "A hundred percent he's there," she says. "OK, fine, 95%, because I know certainty freaks you guys out. But it's a hundred!"

According to Tetlock, the actual conversation never would have gone that way, because Leon Panetta, the real CIA director at the time, was comfortable using probability and diverse
estimates to make his decisions. In fact, as Micah Zenko recounts in another new book, Red Team, the CIA conducted three separate “red team” exercises before the raid, all designed
to check and challenge analysts’ assumptions. Although the real “Maya” did give that 95% estimate, she and her team were made to completely review their work. The CIA also
appointed four outside analysts to study the case, and the National Counterterrorism Center, a separate agency, conducted its own analysis, generating three probabilities that bin
Laden was in the compound: 75%, 60%, and 40%. President Obama concluded that this amounted to “a flip of the coin,” but he did, of course, authorize the raid. Tetlock dislikes
Obama’s analogy (his superforecasters would have been more precise) but not the overall process. Maya’s estimate was “more extreme than the evidence could support” and
therefore “unreasonable,” he explains. In his world, such confidence is cause for skepticism.

The Zenko book is a good complement to Superforecasting, because it shows how organizations, not just individuals, can overcome their biases toward false certainty and make good
predictions, in geopolitics and business, in public and private sectors. With simulations, vulnerability probes, and alternative analyses that offer fresh eyes on a complex situation or
intentionally oppose a certain position, red teams can greatly improve the accuracy of forecasts in the same way that Tetlock’s experts do.

Zenko adds that management must buy in, committing significant resources to red teams and empowering them to be brutally honest in their analyses. Tetlock agrees. Although
great leaders should be confident and decisive, they must also possess “intellectual humility” and recognize that the world is complex and full of uncertainty, he explains. They
should learn from and lean on superforecasters and red teams, using not just one but many. If Croesus had asked all seven oracles about his planned attack on Persia, for example, he
might not have lost his empire.

A version of this article appeared in the October 2015 issue (pp.130–131) of Harvard Business Review.

Walter Frick is the deputy editor of HBR.org.

