Panic Virus by Seth Mnookin Stanford Book Club April 2012

Published by Michael Benefiel on May 14, 2012
The Stanford Alumni book for April 2012 was Seth Mnookin's The Panic Virus. My piece reduces his chapter on conceptual biases and "availability cascades" to a list of ten conceptual biases: eight individual and two group. I think it explains a lot about why I find it difficult to change my mind once I have made a decision, even if I begin to suspect I've made a bad decision. I welcome your feedback.
Here's my list of conceptual biases, named and sometimes explained, from Chapter 16, "Cognitive Biases & Availability Cascades." When I tried two (Texas Sharpshooter & Availability Cascade) at lunch today, a group of very smart people who argue about religion, politics, Macs & PCs, Androids & iPhones, and anti-trust enforcement stopped side conversations long enough to listen. Thanks, Seth Mnookin and Enlightenment advocates for empiricism!

1. A brain that can't feel can't decide. This suggests that limbic activity is integral to our decision-making, doesn't it?

2. Pattern recognition produces more false positives than false negatives. Ancestors who were complacent about rustles (of snakes) or flickers (of predators) ceased to contribute to the evolutionary cascade that led to us.

3. The clustering illusion results because we are driven to connect the dots even when the dots are random. Lorraine found a high number of breast cancer cases on Long Island. Believing the dots weren't random, when they were, gave a sense of more control over fate/destiny/therapeutic outcomes than we, in fact, have.

4. Expectation bias and selection bias (as when SafeMinds members set out to write the academic paper to legitimize the hypothesis they believed was true) will produce errors of manipulation of data or misinterpretation of data.

5. The anchoring effect describes the preference to give past experiences too much weight when making decisions about the future.

6. When we decide how much more energy and attention to invest based on our past investments while discounting evidence, this is "irrational escalation," and my own wish to hold onto losing stocks so they can return my investment is an excellent example.

7. The Texas Sharpshooter's Fallacy allows us to craft a hypothesis to support our data, making it untestable. The metaphor is shooting holes in the side of a barn, then painting the targets' bull's-eyes where the holes are.

8. At the moment when I must be most self-disciplined to see how I was wrong, I want so much to be right that "confirmation bias" kicks in and I will put all the stress on data which shows I might be right.
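The clustering illusion in #3 is easy to see for yourself. Here's a short simulation of my own (a sketch, not from Mnookin's book): scatter points uniformly at random over a grid of equal cells. Every cell has the same expected count, yet some cells will look like "hot spots" purely by chance.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

N_POINTS = 200
GRID = 10  # 10 x 10 = 100 cells, so each cell expects 200/100 = 2 points

# Drop points uniformly at random and tally how many land in each cell.
counts = [[0] * GRID for _ in range(GRID)]
for _ in range(N_POINTS):
    x, y = random.random(), random.random()  # each in [0, 1)
    counts[int(y * GRID)][int(x * GRID)] += 1

expected = N_POINTS / (GRID * GRID)
hottest = max(max(row) for row in counts)
print(f"expected per cell: {expected:.0f}, busiest cell: {hottest}")
```

The busiest cell typically holds several times the expected count: an apparent "cluster" with no cause at all, which is exactly why a run of cancer cases on one map can look meaningful when it isn't.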
