
Blind Spots: The Roots of Unethical Behaviour at Work
Acknowledging the perils of the want self and addressing the informal
values in an organization can lead to more ethical behaviour.
by Max Bazerman and Ann Tenbrunsel

How ethical do you think you are, compared to the other people
reading this magazine? On a scale of 0 to 100, rate yourself relative
to them: if you believe you are amongst the most ethical in the
group, give yourself a score near 100; if you consider yourself average, give yourself a score of 50. Now rate your organization: on a scale of 0 to 100, how ethical is it compared to others?
If you're like most of the people we've asked, both scores
will be higher than 50. If we averaged the scores of all the people
reading this, we guess the average would be about 75. Yet that
can't be the case: the average score would have to be 50. Some
of you must be overestimating your ethicality. The fact is, most
of us overestimate our own ethicality at some point. In effect, we
are unaware of the gap between how ethical we think we are and
how ethical we truly are.
Traditional approaches to ethics lack an understanding of the
unintentional-yet-predictable cognitive patterns that result in
unethical behaviour. Our aim in this article is to alert you to the
blind spots that prevent us from seeing the gaps between our actual and desired behaviour. Our own research on bounded ethicality
and findings from the emerging field of Behavioural Ethics offer
insights that can help us understand why we often behave
contrary to our best intentions.

Ethical Gaps in Individuals

The notion that we experience gaps between who we believe ourselves to be and who we actually are is related to the problem of
bounded awareness. This term refers to the common tendency to
exclude relevant information from our decisions by placing arbitrary bounds around our definition of a problem, resulting in a systematic failure to see important information. To illustrate, take a
look at the following picture:

[Image: a black-and-white pattern that conceals a Dalmatian]

What do you see? Now, take a look at the Dalmatian sniffing
the ground. Most people don't see the Dalmatian on first look; but
once they know she is there, they can no longer look at the image
without noticing her. The context of the black-and-white background keeps us from noticing the Dalmatian, just as our
profit-focused work environments can keep us from seeing the
ethical implications of our actions. As this exercise demonstrates,
we are boundedly aware: our perceptions and decision making are
constrained in ways we don't realize.
We have found that in addition to falling prey to bounded
awareness, we are also subject to bounded ethicality, or systematic constraints on our morality that favour our own self-interest.
As a result, our evaluations of our own moral transgressions often
differ substantially from our evaluations of the same transgressions committed by others. In one study of this phenomenon,
participants were divided into two groups. In one, participants
were required to distribute a resource (such as time or energy) to
themselves and another person and could make the distribution
fairly or unfairly. The allocators were then asked to evaluate the
ethicality of their actions. In the other condition, participants
viewed another person acting in an unfair manner and subsequently evaluated the ethicality of this act. Individuals who made
an unfair distribution perceived this transgression to be less
objectionable than did those who saw another person commit
the same transgression. This widespread double standard (one
rule for ourselves, a different rule for others) is consistent with
the gap that often exists between who we are and who we think
we should be.
While you might accept the fact that most people have inflated perceptions of their own ethicality, in all likelihood you remain
skeptical that this applies to you. In fact, you are probably certain
that you are as ethical as you have always believed yourself to be.
Let's test this assumption. Imagine that you have volunteered to
participate in an experiment that requires you to try to solve a
number of puzzles. You are told that you will be paid according to
your performance: a set amount for each successfully-solved puzzle. The experimenter mentions in passing that the research
program is well funded, and explains that, once you have finished
the task, you will check your answers against an answer sheet,
count the number of questions you answered correctly, put your
answer sheet through a shredder, report the number of questions
you solved correctly to the experimenter, and receive the money
that you reported you earned. Would you truthfully report the
number of puzzles you solved, or would you report a slightly higher number? Note that there is no way for the experimenter to
know if you cheated.
While we can't predict whether you would cheat on this task,
we do know that lots of seemingly nice people do cheat just a little. They count a problem that they would have answered correctly,
if only they hadn't made a careless mistake; or they count a problem
they would have aced if they only had another ten seconds. In addition, when piles of cash are present on a table in the room, people
are even more likely to cheat than when less money is visible. In this
case, participants presumably justify their cheating on the grounds
that the experimenters have money to burn. This study was not
the exception to the rule: ample research confirms that people who
believe they are honest do in fact cheat when given an easy, unverifiable opportunity to do so.
Ethical Gaps in Organizations

Ethical gaps at the individual level are compounded when considered at the organizational level. One compelling example is the 1986
explosion of the Challenger space shuttle after it was launched at the
lowest temperature in its history. Extensive post-crash analyses documented that the explosion was caused by the failure of an O-ring on one of
the shuttle's solid rocket boosters to seal at low temperatures.
On January 27, 1986, the night before the launch, engineers and
managers from NASA and from shuttle contractor Morton
Thiokol met to discuss whether it was safe to launch the
Challenger at a low temperature. In seven of the shuttle program's
24 previous launches, problems with O-rings had been detected.
Now, under intense time pressure, Morton Thiokol engineers hurriedly put together a presentation. They recommended to their
superiors and to NASA personnel that the shuttle should not be
launched at low temperatures, citing their judgment that there was
a connection between low temperature and the magnitude of these
past O-ring problems.


Bounded awareness refers to the common tendency to exclude relevant
information from a decision by placing arbitrary bounds around our
definition of the problem.

NASA personnel reacted to the engineers' recommendation
not to launch with hostility, according to Roger Boisjoly, a Morton
Thiokol engineer who participated in the meeting. In response to
NASA's negative reaction to the recommendation not to launch,
Morton's managers asked for the chance to caucus privately. "Just as
[NASA manager] Larry Mulloy gave his conclusion," writes
Boisjoly, "[Morton Thiokol's] Joe Kilminster asked for a five-minute, off-line caucus to re-evaluate the data and as soon as the
mute button was pushed, our general manager, Jerry Mason, said
in a soft voice, 'We have to make a management decision.'"
In the caucus that followed, "No one in management wanted
to discuss the facts," writes Boisjoly. In his opinion, his superiors
were primarily focused on pleasing their customer, which had
placed Morton Thiokol in the position of proving that it was not safe
to fly rather than the more typical default of not launching until
there was reason to believe it was safe to fly. "The managers were
struggling to make a list of data that would support a launch
decision," he writes, "but unfortunately for them, the data actually supported a no-launch decision." Against the objections of their
own engineers, the four Morton Thiokol senior managers present
voted to recommend in favour of the launch. They then gave their
recommendation to NASA, which quickly accepted the recommendation to launch.
Perhaps the most startling aspect of this story was the data that
engineers analyzed when trying to determine whether low temperatures were connected to O-ring failure. NASA and Morton Thiokol
engineers argued about the possible role of temperature based on
the fact that low temperatures were present during many of the
seven launches that had O-ring problems. Many of the engineers on
both teams saw no clear observable pattern regarding the O-rings.
These were experienced engineers with rigorous analytic training.
They were talented enough to know that, to find out whether outdoor temperature was related to engine failure, they should
examine temperatures when problems occurred and temperatures
when they did not. Yet no one asked for the temperatures for the 17
past launches in which no O-ring failure had occurred.
An examination of all of the data shows a clear connection
between temperature and O-ring failure, predicting that the
Challenger had a greater than 99 per cent chance of failure. But
because the engineers were constrained in their thinking, they only
looked at a subset of the available data and missed seeing the connection. The failure of NASA and Morton Thiokol engineers to look
outside the bounds of the data in the room was an error committed
by well-intentioned people that caused seven astronauts to lose their
lives and delivered an enormous blow to the U.S. space program.
Unfortunately, it is very common for decision makers to err by
limiting their analysis to the data in the room, rather than asking
what data would best answer the question being asked. These decision makers were guilty of a common form of bounded ethicality:
moving forward too quickly with readily-available information,
rather than first asking what data would be relevant to answer the
question on the table and how the decision would affect other
aspects of the situation.
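
To make the engineers' blind spot concrete, here is a minimal sketch in Python. The temperatures and incident flags below are invented for illustration (they are not the actual NASA flight records), but the structure mirrors the situation described above: looking only at the launches that had O-ring problems shows temperatures scattered across a wide range, while comparing incident rates across all launches, cold and warm, makes the temperature effect obvious.

```python
# Illustrative sketch only: these launch temperatures (deg F) and incident
# flags are invented for demonstration; they are NOT the actual flight records.
launches = [
    (53, True), (57, True), (58, True), (63, True), (66, False),
    (67, False), (67, False), (67, False), (68, False), (69, False),
    (70, True), (70, False), (70, False), (70, False), (72, False),
    (73, False), (75, True), (75, False), (76, False), (76, False),
    (78, False), (79, False), (81, False), (58, True),
]

def incident_rate(data):
    """Fraction of launches in `data` that had an O-ring incident."""
    return sum(1 for _, incident in data if incident) / len(data)

# The framing used the night before the launch: look only at the seven
# launches that already had incidents. Their temperatures span a wide
# range, so "no clear pattern" seems to emerge.
incidents_only = [(t, i) for t, i in launches if i]
print("Temperatures of incident launches:", sorted(t for t, _ in incidents_only))

# The missing comparison: include the launches with NO incidents and
# compare incident rates below and above a temperature cut-off.
cold = [x for x in launches if x[0] < 65]
warm = [x for x in launches if x[0] >= 65]
print(f"Incident rate below 65F:    {incident_rate(cold):.0%}")   # 100% with these toy numbers
print(f"Incident rate at/above 65F: {incident_rate(warm):.0%}")   # ~11% with these toy numbers
```

The toy numbers matter less than the comparison itself: the question of how often problems occurred at each temperature can only be answered once the launches that went smoothly are included in the analysis.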
An organization's ethical gap is more than just the sum of the
individual ethical gaps of its employees. In fact, group work, the
building block of organizations, creates additional ethical gaps.
Groupthink, the tendency for cohesive groups to avoid a realistic
appraisal of alternative courses of action in favour of unanimity,
often prevents groups from challenging ethically-questionable
decisions, as was the case with NASA's fateful decision.
Changing Yourself

Confucius said that "Only the wisest and stupidest of [people]
never change," and as a result, virtually all of us are ripe for
change. Following are three ways to become more aware of your
ethical blind spots.
1. Anticipate your want self. System 1 thinking refers to our
fast, automatic, effortless and emotional decision processes, while
System 2 thinking refers to slower, conscious, logical and more
reasoned decision processes. Not surprisingly, System 1 responses
are more likely to be immoral than System 2 thoughts. This suggests
that learning to think before acting, in more reflective and analytical
ways, can help us move toward the ideal image we hold of ourselves.
But doing so entails being prepared for the hidden psychological
forces that crop up before, during and after we confront ethical
dilemmas. The want self (that part of us that behaves according
to self-interest and often without regard for moral principles) is
silent during the planning stage of a decision, typically emerging
and dominating at the time of the decision. Not only will your self-interested motives be more prevalent than you think, but they
likely will override whatever moral thoughts you have. If you find
yourself thinking "I would never do that" or "Of course I'll choose
the right path," it's likely that you'll be unprepared for the influence of self-interest at the time of the decision.
One technique for fighting the want self involves putting in
place pre-commitment devices that seal you to a desired course
of action. For example, in one study Philippine farmers who
saved their money by putting it in a lock box were able to save more
money than those who did not, even factoring in the small cost of
the lock box. By eliminating their ability to spend the money
immediately, the lock box effectively constrained the want self.
When faced with an ethical dilemma, we can use similar strategies to keep our want self from dominating more reasoned
decision making.
In addition, research on the widespread phenomenon of escalation of commitment (our reluctance to walk away from a chosen
course of action) shows that those who publicly commit to a decision in advance are more likely to follow through with it than are
those who do not make such a commitment. You might also pre-commit to your intended ethical choice by sharing it with an
unbiased individual who you believe to be highly ethical. In doing so,
you can induce escalation of commitment and increase the likelihood
that you will make the decision you planned and hoped to make.

2. Give voice to your should self. Focusing on the high-level
aspects of a situation is one way to do this. For example, a group of
researchers was able to reduce the immediate temptation of eating
a tasty pretzel by refocusing participants' attention away from the
concrete aspects of the temptation (how good the pretzel would
taste) and toward its abstract dimensions. They did so by asking
participants to imagine that they were looking at a photograph of
a pretzel rather than an actual pretzel. In a similar manner, when
we are faced with an ethical dilemma, we may be able to give the
should self a stronger voice by focusing on the abstract principles
that guide the decision: rather than thinking about the immediate
payoff of an unethical choice, think about which values and principles you believe should guide the decision to give your should self
a fighting chance.
Another efficient strategy involves reformulating a dilemma
into a choice between two options (the ethical choice and the unethical choice), which highlights the fact that by choosing the unethical
option, you are not choosing the ethical one. In one study, individuals who evaluated two options at a time, an improvement in air
quality (the should choice) and a commodity such as a new printer
(the want choice), were more likely to choose the option that
maximized the public good (improvement in air quality); by contrast, when participants evaluated the options independently of
one another, they more often chose the printer.

3. Evaluate your unethical choice accurately. Because we want
to see ourselves as ethical, our recollections of our behaviour tend
to be biased in that direction. De-biasing oneself isn't easy:
most of us need training to help us identify the distorted feedback
we give ourselves. Rather than focusing on how we should behave,
such training would emphasize the psychological mechanisms that
lead to unethical behaviour and inaccurate recollections of such
behaviour. In addition, it would incorporate techniques to help
people accurately recall their behaviour. Training individuals to
note the biases and distortions that impede accurate evaluation of
their actions can help mitigate the effects of these biases.
Changing Organizations

In The Ford Pinto Case, Douglas Birsch and John Fielder recount
that when it was discovered that the Pinto's gas tank was unsafe,
nobody reported it to Lee Iacocca. "Hell no," replied one Ford
engineer. "That person would have been fired. Whenever a problem
was raised that meant a delay on the Pinto, Lee would chomp on his
cigar, look out the window and say 'Read the product objectives and
get back to work.'" Iacocca was fond of saying, "Safety doesn't sell."
Clearly, without a leader who believes in ethical decision making, an organization will not behave ethically. But at the same time,
having an ethical leader is not sufficient: findings from the emerging field of Behavioural Ethics suggest that the following less
obvious aspects of unethical behaviour must also be addressed.
Hidden-but-powerful informal values. The informal values
imparted at work play a critical role in employee behaviour. What
pressures do employees feel, and why? What ethical challenges do
they face? What types of decisions does the organization actually
reward? What qualities characterize those who make it to the top?
One way to get to the heart of these questions is to try to identify
who really runs the company, which may not necessarily mean the
CEO. In the later days of Arthur Andersen, it was the consultants
who had the most power, and at Ford during the Pinto era, it was the
salespeople: "This company is run by salesmen, not engineers; so the
priority is styling, not safety," said one engineer. Identifying these
pockets of power can reveal a great deal about an organization's
true values. If winning consulting business is an accounting firm's
penultimate goal, what considerations are being pushed aside to
achieve it? If salespeople are running an auto manufacturer, whose
voices are being silenced? Paying attention to what isn't being talked
about within an organization can provide valuable information
about informal values, as exemplified in this quote from Barbara
Toffler, a former Arthur Andersen employee: "We were supposedly
still the guardians of the public trust, but no one ever mentioned
that. Everyone did, however, talk about making money all the time."
Ethics talk, or a lack thereof, also reveals a great deal about an
organization. How is unethical behaviour described? More
importantly, how is it disguised? For example, when someone is
found to have lied to a customer, is the word "lying" used, or is the
behaviour disguised as "misrepresenting the facts"? Is stealing
described as "an inappropriate allocation of resources"? There is
power in calling unethical behaviour by its name: if it isn't
labeled as such, it is unlikely that an intervention will be
attempted, let alone succeed.
Ethical sink holes. The difficult task of identifying how informal
values differ from desired values can be made easier by identifying
characteristics that make misalignment more likely. We suggest
paying close attention to three particular characteristics: uncertainty, time pressure and isolation.
The more uncertainty there is in an environment, the more likely
unethical behaviour is to occur. In environments characterized by
high uncertainty, individuals may be able to downplay the ethical
implications of a decision and, in doing so, become more likely to
code the decision as a business decision rather than an ethical one.
Uncertainty has also been identified as a catalyst in the divergence
between the want and should self. By introducing the idea that an
outcome may not have ethical implications, the want self may be
able to focus on its own desires, increasing the probability that the
individual will make an unethical choice. In the case of the Ford
Pinto, focusing on the likelihood that the engines would not combust upon impact faded other possible outcomes (including
engine combustion and subsequent loss of life) from consideration, allowing the decision to be re-coded as a business rather than
an ethical decision.
Time pressure is another likely source of unethical behaviour.
The busier and more rushed people are, the more likely they are to
rely on (automatic) System 1 thinking. Time pressure reduces the
cognitive resources available and decreases the odds of making
"should" choices. This clearly characterized the Ford Pinto scenario: described in the book as "the shortest production planning
period in modern automotive history," the Pinto's production
schedule was set at under 25 months, in stark contrast to the average of 43 months. We can increase our likelihood of making
"should" choices by analyzing ethical dilemmas in an environment
free of distractions and time pressures.
Isolated individuals and groups also tend to develop norms that
diverge from an organization's stated norms. From 1990 to 1994,
General Electric paid fines ranging from a $20,000 criminal fine
to a $24.6 million civil fine for unethical behaviour that included
money laundering, defective pricing and mail fraud. In one 1992
incident, GE pled guilty to defrauding the Pentagon and agreed to
pay $69 million in fines. The company took responsibility for the
behaviour of a former marketing employee who, working with an
Israeli Air Force general, helped to divert Pentagon funds to their
personal bank accounts and to Israeli military programs that were
unauthorized by the United States. As a result of these and other
incidents (and being shut out of government contracts for six
months), GE now strives to prevent isolated groups from hatching
fraudulent plots.
In closing

Once you identify your organization's own sink holes, focus on
promoting ethical values to key individuals, particularly those with
access to and control over information and staff. Communicating
desired values to these employees and finding ways to make them
stick will provide the biggest payoff in terms of reforming your
organization's informal culture.
While we have no way of knowing what ethical challenges you
are facing in your personal and professional life, we do know that
many of us fall short of our own ethical standards. Applying the lens
of Behavioural Ethics, we have identified some of the ways in which
you can make choices that better align with your own and your
organization's values. In the end, using the tools at our disposal,
each one of us can contribute to creating a more ethical world.

Max Bazerman is the Jesse Isidor Straus
Professor of Business Administration at
Harvard Business School. Ann Tenbrunsel is
the Rex and Alice Martin Professor of Business
Ethics in the Department of Management at
Mendoza College of Business at the University of Notre Dame and co-director
of the Institute for Ethical Business Worldwide. The preceding is adapted
from their new book, Blind Spots: Why We Fail to Do What's Right and What to Do
About It (Princeton University Press, 2011).
