
Ethics and Social Responsibility for Scientists and Engineers

Michael C. Loui
Professor of Electrical and Computer Engineering
University of Illinois at Urbana-Champaign

Friday Forum, University YMCA, Champaign, Illinois


February 6, 2009

Abstract. Too often instruction in professional responsibility emphasizes only compliance with
regulations, standards of conduct, and codes of ethics. As their primary professional
responsibilities, scientists strive to ensure the integrity of the research record, and engineers
strive to ensure the safety of the public. After analyzing different kinds of responsibilities, I will
advance a broader conception of professional responsibility for scientists and engineers, a
conception that encompasses stewardship for society and the environment. Finally, I will discuss
whether ethics instruction can or should cultivate a commitment to this broader conception.

30 to 45 minutes

Introduction

Wernher von Braun was a German physicist who led the development of rocket technology.
During the Second World War, he contributed to the development of the V-2 rocket that
devastated London. Toward the end of the war, he was captured by the United States Army, and
he later worked on the rockets used for the American manned spaceflight program. In 1965, von
Braun was the subject of a song written by satirist Tom Lehrer:
Gather round while I sing you of Wernher von Braun,
A man whose allegiance is ruled by expedience …
“Once the rockets are up, who cares where they come down?
That’s not my department,” says Wernher von Braun.

Despite the light-hearted tone, this song raises two profound questions. First, are scientists and
engineers merely “hired guns,” who provide technical services to whoever employs them? In von
Braun’s case, the contrast is stark, as he worked for both Nazi Germany and the United States.
Second, are scientists and engineers responsible for the uses of their creations? Or as the song
goes, “that’s not my department”?

In this talk, I will first review our current understandings of professional and ethical
responsibility for scientists and engineers. Second, I will analyze and distinguish different kinds
of responsibility. I will describe a broader conception of professional responsibility for scientists
and engineers, a conception that encompasses stewardship for society and the environment.
Third and finally, I will discuss whether ethics instruction can or should cultivate a commitment to
this broader conception.

First, the Current State of Professional Ethics for Scientists and Engineers

Within the discipline of philosophy, ethics encompasses the study of the actions that a
responsible individual ought to choose, the values that an honorable individual ought to espouse,
and the character that a virtuous individual ought to have. For example, everyone ought to be
honest, fair, kind, civil, respectful, and trustworthy. Besides these general obligations that
everyone shares, professionals have additional obligations that arise from the responsibilities of
their professional work and their relationships with clients, employers, other professionals, and
the public. For example, doctors and nurses must keep information about patients confidential.
Lawyers must avoid conflicts of interest: a lawyer must not simultaneously represent both
husband and wife in a divorce case.

Professionals have special obligations because they have specialized knowledge and skills, and
they are granted licenses by the public to use their knowledge and skills to significantly affect the
lives of their clients. For example, physicians can legitimately write prescriptions and perform
surgery—privileges that most of us do not have. A dentist can legitimately apply a drill to your
teeth, but you would not let me do the same. I like to say that professionalism makes all the
difference between oral surgery and assault. Because professionals have special powers, they
have special responsibilities to use those powers wisely for the benefit of their clients.

Many professional obligations are not required by laws or regulations. In this sense, professional
ethics goes beyond laws. That is, professional standards may require or forbid certain actions
even when no laws apply. For example, doctors and nurses should render assistance in cases of
medical emergency—for instance, when a train passenger suffers a heart attack. Conversely, an
action can be wrong even if it is allowed by law. So when an Enron executive (or in a
hypothetical case, an Illinois governor) claims he has done nothing wrong, he might mean that he
has done nothing illegal. But what may be legal, or allowed by law, may still be wrong.

The primary professional responsibility of the scientist is to ensure the integrity of the research
record. Together, the responsibilities of scientists are called the Responsible Conduct of
Research. Scientists must not fabricate experimental data. They must not falsify data, that is,
alter data in unacceptable ways. They must not plagiarize, that is, represent the ideas of other
scientists as their own without proper attribution. When scientists collaborate to write a
manuscript, the authors should include only those who made substantial intellectual contributions
to the reported research. When scientists review a manuscript submitted to a journal, they should
evaluate the strengths and weaknesses of the manuscript fairly, they should keep the ideas
confidential, and they should return a report promptly. Scientists should comply with regulations
regarding the treatment of human and animal subjects, and the handling of hazardous substances.

All of these practices in the responsible conduct of research aim to increase the trustworthiness
of research results. While few of these practices are expressed in laws or regulations, most are
stated in standards of professional practice and codes of ethics. None of these professional and
ethical concerns of scientists, however, has to do with the social consequences of scientific
research. As we know, the research of some scientists migrates quickly from findings in the
laboratory to commercialization in the marketplace. In a sense, these scientists are becoming
engineers, with the attendant professional obligations of engineers.

Engineers understand professional responsibilities primarily at the level of individuals and
organizations, as specified in the codes of ethics of engineering societies. Though the codes
differ in emphases and details, all say that engineers should be
technically competent and should continue to improve their competence. In their technical work,
engineers should report laboratory data honestly, and they should affix their signatures only to
plans that they have prepared or supervised. In their relationships with other professionals,
engineers should subject their work to peer review, and experienced engineers should mentor
junior engineers. In their relationships with employers, engineers should honor the
confidentiality of trade secrets and other proprietary information, and engineers should avoid
conflicts of interest. For example, an engineer should not specify the use of parts manufactured
by a firm owned by the engineer’s spouse. Avoiding conflicts of interest contrasts with
contracting habits in the City of Chicago, where a former mayor said, “There’s nothing wrong
with nepotism, as long as you keep it within the family.”

Engineering codes of ethics state that engineers should “hold paramount” the safety, health, and
welfare of the public, above other obligations to clients and employers. This commitment to the
public good echoes the public commitments of other professions: medicine is committed to
human health, law to justice. But engineering and engineering societies have provided little
guidance on how to prioritize the public good, except in extreme cases such as whistle-blowing.
When an engineer finds an immediate threat to the safety of the public, the engineer must notify
appropriate authorities. If the threat is not handled within the organization, the engineer is
expected to blow the whistle. For example, Michael DeKort was employed by Lockheed Martin
as the lead engineer on the Deepwater project, to modernize the Coast Guard’s patrol boats. In
2005, DeKort found that the new camera surveillance systems had large blind spots, much
equipment would not withstand bad weather, and the communication system used unshielded
cable instead of the specified, more secure shielded cable. He alerted officials at Lockheed
Martin about these problems, without success. In fact, he was dismissed from the project.
Finally, in August 2006, in desperation, DeKort blew the whistle by posting a YouTube video,
which attracted attention from the news media and led to investigations by the U.S. Department
of Homeland Security and by the Transportation and Infrastructure Committee of the U.S.
Congress. These investigations confirmed his allegations of wrongdoing.
[http://abcnews.go.com/Technology/story?id=2371149&page=1,
http://www.ieeessit.org/about.asp?Level2ItemID=5]

While stories of whistle-blowing are dramatic, how can or should an engineer serve the public
good in everyday cases, which do not require whistle-blowing? Engineering codes of ethics are
largely silent. As a consequence, instruction in everyday engineering ethics, like instruction in
scientific research ethics, emphasizes professional responsibilities to individuals and
organizations, instead of professional responsibilities to the public.

Second, Different Kinds of Professional Responsibility [1300 words so far]

The culture of science and engineering also encourages a conception of professional
responsibility limited to fulfilling assignments competently and conscientiously. As the noted
civil engineer Samuel Florman wrote in his book The Civilized Engineer:
Engineers do not have the responsibility, much less the right, to establish goals for
society. Although they have an obligation to lead, like most professionals, they have an
even greater obligation to serve. [p. 95] … I propose that the essence of engineering
ethics be recognized as something different and apart from white-hat heroics….
Conscientiousness is fairly close to what I have in mind…. Engineers are accustomed to
having their fellow citizens rely on them. As a consequence, if engineers do their jobs
well they are being more than competent; in serving fellow humans, and serving them
well, they are being good. Reliability is a virtue. Therefore, the conscientious, effective
engineer is a virtuous engineer…. Practically all disasters—think of Bhopal and
Chernobyl… are attributable to one or another type of ineptitude. One can only conclude
that the great need in engineering ethics is an increased stress on competence. [pp. 101–
103]
Of course, like Florman, I am all in favor of competence and conscientiousness. But I think we
can expect more.

In October 2008, I attended a conference on engineering, social justice, and sustainable
development at the National Academy of Engineering. At this conference, I heard an elderly
distinguished engineer state categorically that engineers are not asked to consider the wider
social implications of their work, because those implications are not their responsibility. In
essence, this speaker thought of the engineer as merely a “hired gun,” someone who
serves a client by solving technical problems, without considering the social consequences.

It seems to me that this conception of professional responsibility is quite limited. To explore the
concept of responsibility further, let me tell you a tragic story, one of the many tragedies in the
history of technology.

In the early 1980s, Atomic Energy of Canada Limited manufactured and sold a cancer radiation
treatment machine called the Therac-25, which relied on computer software to control its
operation. Between 1985 and 1987, the Therac-25 caused the deaths of three patients and serious
injuries to three others. Who was responsible for the accidents? The operator who administered
the massive radiation overdoses, which produced severe burns? The software developers who
wrote and tested the control software, which contained several serious errors? The system
engineers who neglected to install the backup hardware safety mechanisms that had been used in
previous versions of the machine? The manufacturer? Government agencies? We can use the
Therac-25 case to distinguish four different kinds of responsibility.

First, causal responsibility. Responsibility can be attributed to causes: for example, “the tornado
was responsible for damaging the house.” In the Therac-25 case, the proximate cause of each
accident was the operator, who started the radiation treatment. But causal responsibility does
not entail moral blame: just as the tornado cannot be morally faulted for the damage, the
Therac-25 operators cannot be blamed, because they followed standard procedures, and the
information displayed on the computer monitors was cryptic and misleading.

Second, role responsibility. An individual who is assigned a task or function is responsible for
fulfilling that role. In the Therac-25 case, the software developers and system
engineers were assigned the responsibility of designing the software and hardware of the
machine. Insofar as their designs were deficient, they were responsible for those deficiencies
because of their roles. Even if they had completed their assigned tasks as specified, however,
their role responsibility may not encompass the full extent of their professional responsibilities.

Third, legal responsibility. An individual or an organization can be legally responsible, or liable,
for a problem. That is, the individual could be charged with a crime, or the organization could be
liable for damages in a civil lawsuit. Similarly, a physician can be sued for malpractice. In the
Therac-25 case, the manufacturer could have been sued.

Fourth and finally, moral responsibility. Causal, role, and legal responsibilities are exclusive: if one
individual is responsible, then another is not. In contrast, moral responsibility is shared: many
engineers are responsible for the safety of the products that they design, not just a designated
safety engineer. Furthermore, rather than assign blame for a past event, moral responsibility
focuses on what individuals should do in the future. In the moral sense, responsibility is a virtue:
a “responsible person” is careful, considerate, and trustworthy; an “irresponsible person” is
reckless, inconsiderate, and untrustworthy.

Responsibility is shared whenever multiple individuals collaborate, as in most scientific and
engineering teams. Teamwork is not a way to evade responsibility—although I once read in the
preface of a co-authored book, “each of the four authors would like to say that any mistakes are
the fault of the other three.” Or, as an alumnus once told me, “Always work in teams: if
something goes wrong, you can blame someone else.” On the contrary, teamwork should not be
used to shift blame or responsibility to someone else.

When moral responsibility is shared, responsibility is not atomized to the point at which no one
in the group is responsible. Rather, each member of the group is accountable to the other
members of the group and to those whom the group’s work might affect, both for the
individual’s own actions and for the effects of their collective effort. For example, suppose a
computer network monitoring team has made mistakes in a complicated statistical analysis of
network traffic data, and these mistakes have changed the interpretation of the reported results. If
the team members do not reanalyze the data themselves, they have an obligation to seek the
assistance of a statistician who can analyze the data correctly. Different team members might
work with the statistician in different ways, but they should hold each other accountable for their
individual roles in correcting the mistakes. Finally, the team has a collective moral responsibility
to inform readers of the team’s initial report about the mistakes and the correction.

I wish to argue for a broad notion of professional moral responsibility that obligates scientists
and engineers to act even when they do not have an assigned role responsibility for a task. For
example, even when the specifications for a new product ignore issues of economic sustainability
and environmental impact, the product engineer is professionally responsible for considering
those issues. In another case, suppose an electrical engineer notices a design flaw that could
result in a severe electrical shock to someone who opens a personal computer system unit to
replace a memory chip. Even if the engineer is not specifically assigned to check the electrical
safety of the system unit, the engineer is professionally responsible for calling attention to the
design flaw.

This broad notion of professional responsibility comports with our understanding that
professional responsibilities go beyond complying with laws or regulations, because laws and
regulations often lag behind advances in technology. For example, in 1989, Hasbro introduced
the Playskool Travel-Lite portable crib. This lightweight crib folded up for easy transportation.
The crib met all existing safety standards, but only because there were no government or
industry test standards at the time for portable cribs, which were a new kind of product. Quite
likely the new
crib was not adequately tested. Eventually three children were killed, in separate incidents, when
the top rails collapsed and strangled them. [http://faculty.chicagobooth.edu/linda.ginzel/more/]

Today we have a dazzling variety of new products based on new nanotechnologies and
biotechnologies, whose long-term consequences are difficult to foresee, and which are thus
largely unregulated. Again, scientists and engineers have a professional responsibility to test
these new products sufficiently, to monitor these products as they are used, and to mitigate their
harmful effects. In the absence of technical standards, they must use their professional judgment.

This broader notion of professional responsibility can even motivate exemplary behavior: an
engineer might go beyond a job description or a project assignment, to make products safer,
more reliable, less expensive, or more efficient—and thereby improve the lives of users and the
public. In the 1930s, Val Roper and Daniel Wright were engineers at General Electric who
worked on headlights for automobiles. Persisting on their own for several years, they found new
materials to make the headlights brighter so that motorists could drive more safely at night, and
their designs became the industry standards. [Fleddermann, pp. 112–114]

What Can Instruction Do? [2600 words so far]

Now I turn to the third and final part of this talk. Can or should instruction in professional ethics in
science and engineering cultivate this broader notion of responsibility?

By instruction, I do not mean a boring 45-minute lecture with the single message “Do Not Steal”
or “Be Honest.” These one-shot presentations insult the intelligence of our students. Sadly, for
many students in science and engineering, a single presentation is often all they receive. So let’s
suppose instead that we are talking about instruction through a high-quality educational program
in professional ethics: an academic course or a series of substantial workshops.

As a professor, I believe in the power of education to develop the mind, the hand, and the heart.
This belief arises from decades of educational research, including research on college students.
According to research by psychologists, moral growth continues throughout young adulthood. So
although you might think that everything you needed to know about right and wrong you learned
in kindergarten, in fact, moral development continues throughout the college years. In particular,
while traditional-age college students in late adolescence know they should be honest, they
generally do not have well-developed understandings of fairness. They are still overcoming
self-centeredness and learning to understand the perspectives of others. As a consequence, beginning
college students have difficulty perceiving fairness issues such as conflict of interest, which
require taking the perspectives of others.

But does moral development come from natural maturation, from social interactions with other
people, or specifically from instruction? We do know from solid research studies—including my
own research of course—that ethics instruction can improve students’ moral reasoning skills.
After a full course in philosophical ethics or in professional ethics, students can identify moral
issues, analyze moral problems, and reconcile moral values in finding wise solutions. They can
apply and interpret codes of ethics, imagine the consequences of possible solutions, and make
reasoned judgments.

While academic courses can definitely improve students’ cognitive moral skills, can they
improve students’ affective moral skills? In plain terms, an individual can know what is right,
but might lack the confidence or courage to act accordingly. So, can instruction promote moral
courage?

Or should academic courses attempt to change students’ attitudes and behaviors? Aren’t we on
dangerous ground when we advocate changes beyond improved cognitive skills? As my
colleague Sara Pfatteicher at the University of Wisconsin–Madison wrote,
A professor’s job is to teach students how to think, not what to think. [p. 140]
Demonstrating [that] students “understand ethics” need not (indeed, should not) imply
that we assess whether students “behave ethically,” either before or after graduation. [p. 137]
Nevertheless, our courses may affect students in ways that go beyond the improvements in
cognitive skills that we routinely assess for course grades.

Collaborating with both undergraduate and graduate students, I have conducted several studies on
the effectiveness of ethics instruction for scientists and engineers. I have decided to exercise
considerable restraint here by describing the findings of only two of our studies. Both studies
examined the student outcomes of a full undergraduate course on engineering ethics on this
campus. The course relates general ethical theory to concrete problems in engineering, using
readings, videos, short scenarios about everyday problems, and case studies about major events
such as the Challenger space shuttle disaster, the Bay Area Rapid Transit system case, and the
Citicorp Center skyscraper case. For their research papers, students study ethical issues in a
contemporary technological controversy, such as stem cell research, mammalian cloning,
genetically modified foods, nuclear weapons, and copyright laws for digital media. The course
uses a variety of pedagogies, such as small-group discussions, formal debates, and even role-plays.

According to final reflections at the end of the engineering ethics course in the 2003–2004
academic year, students became more confident in their ability to identify and reason about
moral problems. Students learned about specific ethical obligations of engineers beyond
honesty; for example, engineers should avoid conflicts of interest and maintain confidentiality
of proprietary information. Most significant, some students developed a deeper, richer
understanding of professional responsibilities beyond completing assigned tasks competently and
conscientiously. They realized that engineers are morally responsible for the social consequences
of their technical decisions. One student wrote,
I now realize that engineers have a larger social responsibility…. I now understand
engineering as using technical knowledge to bring about a social change. As a computer
engineer, this means creating something new, or improving something that will have an
impact on how some part of the population lives their lives.
These students articulated a capacious notion of professional responsibility that encompasses
stewardship for society and the environment. One student wrote,
Now, I understand how broadly engineers can influence society … But, with this power
comes the ability to do harm as well. The professional engineer, as with all professionals,
should consider the implications of their actions, especially with respect to the public….
Some of the most interesting and most influential articles we read were the ones that
empowered me to be the best engineer I can be. This is not only because my parents
influenced me to become an honest and hard working person, but rather because of the
power and responsibility I will have when I graduate.

In another study, we interviewed three groups of engineering students: six students who had
completed this engineering ethics course, six who had registered for the course but had not yet
started it, and six who had not taken or registered for the course. We asked the students what
they would do as the central character, an engineer, in each of two short cases that posed moral
problems. Students who had completed the ethics course considered more options before making
a decision. For both cases, even when they were not directly involved, they were more likely to
feel responsible and take corrective action. Students who had not taken the course seemed to
have weaker feelings of responsibility: they would say that a problem was “not my business.”
We concluded that instruction in ethics can increase awareness of responsibility, knowledge
about how to handle a difficult situation, and confidence in taking action.

Conclusions [3700 words so far]

Let me summarize. As their primary professional responsibilities, scientists strive to ensure the
integrity of the research record, and engineers strive to ensure the safety of the public. Too often,
however, scientists and engineers receive instruction in professional responsibility, in a few brief
sessions, that emphasizes only compliance with regulations, standards of conduct, and codes of
ethics—minimal responsibilities that encompass only the conscientious completion of assigned
tasks, and the competent execution of technical work. From this kind of ethics instruction,
scientists and engineers are likely to adopt a narrow conception of professional responsibility
similar to Tom Lehrer’s fanciful Wernher von Braun:
“Once the rockets are up, who cares where they come down?
That’s not my department,” says Wernher von Braun.
By contrast, it seems that students who have completed a well-taught full course on professional
ethics would be less likely to say, “that’s not my department.” These students would understand
that their professional responsibility does not end when they fulfill their assignments, because
they are not mere “hired guns.”

In his inaugural address in January 2009, President Barack Obama said, “What is required of
us now is a new era of responsibility—a recognition, on the part of every American, that we have
duties to ourselves, our nation, and the world.” What if, indeed, scientists and engineers
construed their own professional responsibilities to encompass a broad set of such duties,
including stewardship for our society and our planet? As a group of engineering students told
Lee Shulman, president of the Carnegie Foundation for the Advancement of Teaching,

An engineer is someone who uses math and the sciences to mess with the world—by
designing and making things that people will buy and use; and once you mess with the
world, you are responsible for the mess you’ve made. [Sheppard et al., check page no.]
May we all feel so responsible for the messes that we have made.

Thank you for your attention. I would be happy to take questions.

References

C. Fleddermann (1999). Engineering ethics. Upper Saddle River, N.J.: Prentice Hall.

S. Florman (1987). The civilized engineer. New York: St. Martin’s Press.

G. Hashemian and M. C. Loui (submitted). Can instruction in engineering ethics change
students’ feelings about professional responsibility?

T. Lehrer (1981). Too many songs by Tom Lehrer. New York: Pantheon Books.

N. G. Leveson and C. S. Turner (1993). An investigation of the Therac-25 accidents. Computer,
26 (7), 18–41.

M. C. Loui (2005). Ethics and the development of professional identities of engineering students.
Journal of Engineering Education, 94 (4), 383–390.

M. C. Loui and K. W. Miller (2008). Ethics and professional responsibility in computing. In
Wiley Encyclopedia of Computer Science and Engineering, ed. B. W. Wah. New York: Wiley.
dx.doi.org/10.1002/9780470050118.ecse909.

S. K. Pfatteicher (2001). Teaching vs. preaching: EC2000 and the engineering ethics dilemma.
Journal of Engineering Education, 90 (1), 137–142.

S. Sheppard, K. Macatangay, A. Colby, and W. Sullivan (2008). Educating engineers: designing
for the future of the field. New York: Wiley/Jossey-Bass.
