
Professional Ethics

What is Professional Ethics?


When applied to computing and
cybertechnology, professional ethics is
a field of applied ethics concerned with
moral issues that impact computer
professionals.

Why a Separate Category of Professional Ethics?

The same ethical rules involving honesty, fairness, and so forth should apply to professionals as well as to ordinary individuals.
So, if it is wrong for ordinary people to steal,
cheat, lie, and so forth, then it is wrong for
professionals to do so as well.
So, we might conclude that a separate field of
study called "professional ethics" is not really
needed.

Separate Category of
Professional Ethics (continued)

Some argue that some moral issues affecting professionals are sufficiently distinct and specialized to warrant a separate field of study.
Others argue that professionals can have
special moral obligations that exceed those of
ordinary individuals.
To grasp the arguments for this view, it is
useful first to understand what is meant by
the terms profession and professional.

What is a Profession?

Harris, Pritchard, and Rabins (2004) note that the term profession has evolved from a concept that was once associated with people professing a religious or monastic life to one that now has a more secular meaning.
Profession was used to describe a person who
made a public promise to enter a distinct way of life
with allegiance to high moral ideals.
Later, the term came to refer to anyone who
professed to be duly qualified.
Profession has now come to mean an occupation that one professes to be skilled in and to follow.

What is a Profession (continued)?

According to Allan Firmage (1991), a profession can be understood in terms of the attributes and requirements of a professional practice, such as a "calling in which special knowledge and skill are used in...the service of mankind."

Ernest Greenwood (1991) believes that professions are occupational fields distinguishable in terms of five characteristics:
(i) systematic theory,
(ii) authority,
(iii) community sanction,
(iv) ethical codes,
(v) a culture.

Who is a Professional?

Professionals who comprise a given profession also tend to have certain defining attributes and requirements.
Medical doctors, lawyers, accountants, etc.,
find themselves in situations in which their
decisions and actions can have significant
social effects; their roles and responsibilities
can exceed those of ordinary individuals.
Sometimes these roles and responsibilities
differentiate professionals from others.

Who is a Computer
Professional?

A computer professional might be interpreted to mean anyone who is employed in the computer, information-technology, or information/communications fields.
Or a computer professional might be thought of in narrower terms, in which case only software engineers would be included.
There are various gradients in between the
two ends of this spectrum.

Definition of a Computer
Professional (Continued)

A computer professional could be defined in such a way that, in addition to software engineers, it includes software quality analysts, software technical writers, network administrators, and software managers and supervisors.
A software engineering team includes those
who contribute by direct participation to the
analysis, specification, design, development,
certification, maintenance, and testing of
software systems.

Do Computer Professionals
Have Special Responsibilities?

Don Gotterbarn (2001) believes that software engineers and their teams have special responsibilities because they have significant opportunities to:
(i) do good or cause harm;
(ii) enable others to do good or cause harm;
(iii) influence others to do good or cause
harm.

Safety-Critical Software

Gotterbarn suggests that the roles and responsibilities involved in the development of safety-critical systems are a differentiating factor.
The term "safety-critical system" is often used to refer to computer systems that can have a direct life-threatening impact.

Safety-Critical Software
(Continued)

Examples of safety-critical software systems and applications typically include:
aircraft and air traffic control systems
mass transportation systems
nuclear reactors
missile systems
medical treatment systems.

Additional Safety-Critical
Systems

Kevin Bowyer (2002) extends the range of safety-critical applications to include software used in the:
design of bridges and buildings;
selection of waste disposal sites;
development of analytical models for
medical treatment.

Professional Codes of Ethics

Many professions have established professional societies, which in turn have adopted codes of conduct.

The medical profession established the AMA (American Medical Association), and the legal profession established the ABA (American Bar Association).

Both associations have formal codes of ethics/conduct for their members.

Professional Codes for Computer Societies

The computing profession also has professional societies, which include:
The Association for Computing Machinery (ACM);
The Australian Computer Society (ACS);
The British Computer Society;
The Institute of Electrical and Electronics Engineers (IEEE);
IEEE Computer Society (IEEE-CS).

Purpose of Professional Codes

Professional codes of ethics are often designed to motivate members of an association to behave in certain ways.
Four primary functions of codes are to:
inspire,
guide,
educate,
discipline the members.

Criticisms of Ethical Codes

John Ladd (1995) argues that ethical codes rest on a series of confusions that are both "intellectual and moral."
His argument has three main points:
(1) ethics is basically an "open-ended, reflective, and critical intellectual activity";
(2) codes introduce confusions with respect to micro-ethics vs. macro-ethics;
(3) giving codes a disciplinary function makes them
more like legal than ethical rules.

In Defense of Professional
Codes

Gotterbarn argues that we need to distinguish among three aspects of professional codes:
codes of ethics;
codes of conduct;
codes of practice.

In Defense of Professional
Codes (Continued)

Codes of ethics are "aspirational," because they often serve as mission statements for the profession and thus can provide vision and objectives.
Codes of conduct are oriented more toward
the professional and the professional's
attitude and behavior.
Codes of practice relate to operational
activities within a profession.

Table 4-1: Some Strengths and Weaknesses of Professional Codes

Strengths:
Codes inspire the members of a profession to behave ethically.
Codes guide the members of a profession in ethical choices.
Codes educate the members of a profession about their professional obligations.
Codes discipline members when they violate one or more of the code's directives.
Codes sensitize members of a profession to ethical issues and alert them to ethical aspects they otherwise might overlook.
Codes inform the public about the nature and roles of the profession.
Codes enhance the profession in the eyes of the public.

Weaknesses:
Directives included in many codes tend to be too general and too vague.
Codes are not always helpful when two or more directives conflict.
A professional code's directives are never complete or exhaustive.
Codes are ineffective (have no "teeth") in disciplinary matters.
Directives in codes are sometimes inconsistent with one another.
Codes do not help us distinguish between micro-ethics issues and macro-ethics issues.
Codes can be self-serving for the profession.

Should Computer Professionals Be Licensed or Certified?

Don Gotterbarn (2000) has argued in favor of certification for software engineers.
The ACM has not endorsed any proposals for
licensing software engineers.

The ACM points out that a software engineering license could be interpreted as an authoritative statement that the licensed engineer is capable of producing software systems of consistent reliability, dependability, and usability.

According to the ACM:
the current state of knowledge and practice in the field of software engineering is too immature to give assurances of this type (White and Simons, 2002).

Licensing and Certification (Continued)

John Knight and Nancy Leveson (2002) sum up the ACM's position on the licensing of software engineers:
Licensing software engineers who work on safety-critical
systems would be neither practical nor effective in achieving
the goal of public interest, and it could even have serious
negative ramifications. The real issue, however, is not
licensing per se but determining how best to protect the
public without unduly affecting engineering progress, the
economy, the engineering and computing professions, or
individual rights.

Licensing and Certification (Continued)

Knight and Leveson believe that approaches other than licensing and certification might be more effective and that those approaches need to be evaluated.
The state of Texas requires software
engineers to be certified, and other states are
currently considering certification or licensing
requirements for software engineers.

Conflicts of Professional Responsibility: Employee Loyalty and Whistle-blowing

What, exactly, is employee loyalty?


Do employees and employers have a special
obligation of loyalty to each other?
Should loyalty to one's employer ever preclude an employee from "blowing the whistle" in critical situations?
In which cases can whistle-blowing be
justified?

Do Employees Have a Special Obligation to Employers?

Some believe we have a prima facie obligation of loyalty in employment contexts.
In other words, all things being equal,
an employee should be loyal to his or
her employer and vice versa.

Does Employee Loyalty Still Make Sense in the Context of a Large Computer Corporation?

Ronald Duska (1991) argues that in employment contexts, loyalty only arises in special relationships based on a notion that he calls "mutual enrichment."
So in relationships in which parties are
pursuing their self-interests, the notion
of loyalty would not be applicable.

Duska's Argument (continued)

Duska believes that employer-employee relationships, at least where corporations are concerned, are based on self-interest and not on mutual enrichment.
He concludes that employees should not
necessarily feel any sense of obligation of
loyalty to corporate employers.
Corporations like employees to believe that they have an obligation of loyalty to their employers because believing that serves the corporation's interests.

Ladd's Criticism of Employee Loyalty

Ladd also believes that for corporations, loyalty can only be in one direction.
He argues that a corporation cannot be loyal
to an employee in the same sense that
employees are supposed to be loyal to it.
A corporation's goals are competitively linked to the benefits employees bring to the corporation.
A corporation can be good to employees only because it is good for business, i.e., it is in the company's own self-interest.

Ladd's and Duska's Criticisms (Continued)

Both Duska and Ladd cite corporate self-interest as an obstacle to a balanced employer-employee relationship that is required for mutual loyalty.
Consider that corporations often go through
downsizing phases in which loyal employees
who have served a company faithfully for
several years are dismissed as part of
restructuring plans.

Employee Loyalty and the Outsourcing of Programming Jobs

Many computer programming jobs in the U.S. are now being outsourced by major corporations, such as IBM, to countries where programmers are willing to write the code for much lower wages.
Not only does this practice raise questions about employee loyalty, but it could also affect the future of the computer profession in the U.S. if young people are discouraged from entering the programming field because high-skilled jobs are being eliminated via outsourcing (Baker and Kripalani, 2005).

Sometimes Employers Have Been Loyal

Consider a case in which an employer continues to keep an employee on the payroll even though that employee has a chronic illness, which causes her to miss several months of work.
Also consider a case in which several
employees are kept on by a company despite
the fact that their medical conditions have
caused the corporation's health insurance
costs to increase significantly, thereby
reducing the company's overall earnings.

Employer Loyalty (continued)

Consider a recent case involving the owner of Malden Mills, whose physical plant in Massachusetts was destroyed by fire.
The mill's proprietor, Aaron Feuerstein, could have chosen to rebuild his facility in a different state or country where employees would work for lower wages.
Instead, Feuerstein continued to pay and provide benefits for his employees while a new facility was being built in Massachusetts.

Do Computer Professionals Have Special Obligations of Loyalty to Their Employers?

They have to balance the obligation of loyalty owed to an employer against other obligations of loyalty they may also have.
Loyalty is not something that an employee must give exclusively or blindly to one's employer.
Loyalty should also be seen as an obligation
that individuals have to society as a whole,
especially where safety and health issues are
at stake.

Divided Loyalties

Divided loyalties can result in serious conflicts for employees.
In certain cases, the moral dilemmas
they generate are so profound that an
employee must determine whether to
"blow the whistle."

Whistle-blowing

What, exactly, is whistle-blowing?


John Rowan and Samuel Zinaich (2003) note that the expression "blowing the whistle" comes from the effort by individuals to get the public's attention.

Whistle-blowing (Continued)

Whistle-blowing occurs when one or more employees go outside the organization (e.g., to the press) to shed some light on misconduct within the organization.
In the context of engineering, whistle-blowing
incidents often occur in attempts to alert the
public to a potentially unsafe product.

Whistle-blowing (Continued)

Whistle-blowing incidents can occur because of either:
(a) overt wrongdoing (where an employee
informs the public about the immoral or
illegal behavior of an employee or
supervisor);
(b) negligence (e.g., where one or more
individuals in an organization have failed to
act).

Whistle-blowing (Continued)

Michael Martin (2003) notes that the act of blowing the whistle can be:

condemned as an action taken by disloyal troublemakers who rat on their companies and undermine teamwork based on the hierarchy of authority within the corporation;
regarded as a tragedy to be avoided (though it may
sometimes be a necessary evil);
affirmed unequivocally as an obligation that is
paramount in certain circumstances where it
overrides all other considerations, whatever the
sacrifice involved in meeting it.

When Should an Employee Blow the Whistle?

Colleen Rowley, an FBI employee, came forth to describe the way in which critical messages had failed to be sent up the Federal Bureau's chain of command in the days immediately preceding the tragic events of September 11, 2001.
Was it appropriate for this individual to blow
the whistle on her supervisor?
Was she also possibly being disloyal to her
supervisor and fellow employees in doing so?

When Should an Employee Blow the Whistle? (continued)

Should individuals in positions of authority in corporations such as Enron and WorldCom have blown the corporate whistle about the illegal accounting practices in those firms?
One could argue that failing to blow the
whistle in the Enron case resulted in
thousands of individuals losing their
retirement savings, and in some cases their
entire life savings.

Cases Where Whistle-blowing Could Have Saved Lives

Consider three classic cases (described in the text):
Challenger Space Shuttle (because of faulty O-rings);
Ford Pinto (because of a faulty gas tank);
BART case (because of a faulty computer system affecting a mass transportation system).

Controversial Political Issues in Whistle-blowing

SDI (Star Wars) Case


David Parnas blew the whistle on Star Wars
because of three factors:
1. The specifications for the software could not be
known with any confidence.
2. The software could not undergo realistic testing.
3. There would not be sufficient time during an
attack to repair and reinstall failing software (no
"real-time" debugging).

Criteria for Blowing the Whistle in an Engineering Context

Richard De George (1999) offers some specific conditions for when an engineer is either:
(a) permitted to blow the whistle;
(b) obligated to do so.

When an Engineer is
Permitted to Blow the Whistle

One is permitted to blow the whistle when the:

1) harm that will be done by the product to the public is serious and considerable.
2) engineers (or employees) have made their
concerns known to their superiors.
3) engineers (or employees) have received no
satisfaction from their immediate supervisors and
they have exhausted the channels available within
the corporation, including going to the board of
directors.

When an Engineer is Required to Blow the Whistle

Two additional conditions are needed for requiring an engineer to blow the whistle:
4) The engineer has documented evidence that
would convince a reasonable, impartial observer that
his/her view of the situation is correct and the
company policy wrong.
5) There is strong evidence that making the
information public will in fact prevent the threatened
serious harm.

Evaluating De George's Criteria

Gene James (1991) believes that De George's conditions are too lenient.
James holds that an individual has a moral obligation (not merely permission) to blow the whistle when the first three conditions are met.
We have a prima facie obligation to "disclose
organizational wrongdoing" that we are
unable to prevent, which could also occur
when De George's first three conditions are
satisfied.

James' Critique of De George's Criteria (continued)

For James, the degree of the obligation depends on the extent to which we are capable of foreseeing the severity and the consequences of the wrongdoing.
He worries that De George's model leaves us
with no guidance when we are confronted
with cases involving sexual harassment,
violations of privacy, industrial espionage,
and so forth.
There is also a problem with the word "harm."

Alpern's Criticism of De George's Criteria

Kenneth Alpern (1991) argues that De George's model lets engineers off too easily from their whistle-blowing responsibilities.
Alpern believes that engineers must be willing
to make greater sacrifices than others
because they are in a greater position to do
certain kinds of social harm.
He believes that these obligations come from
a fundamental principle of "ordinary morality"
viz., we must do no harm.

Ladd's Defense of De George's Criteria

Ladd (1991) believes that requiring engineers to blow the whistle in non-extraordinary cases (such as in De George's conditions 1-3) can be undesirable from an ethical point of view because it demands that these individuals be "moral heroes."
Engineers should not have to be heroes or
"saints."

An Alternative Strategy

De George and Ladd seem correct in claiming that engineers should not be required to be moral heroes or saints.
James and Alpern also seem to be
correct in noting that engineers,
because of the positions of
responsibility they hold, should be
expected to make greater sacrifices.

A Compromise View

Michael McFarland (1991) argues that, collectively, engineers might be held to a higher standard of social responsibility than ordinary individuals.
However, the onus of responsibility should
not fall directly on engineers as individual
engineers.
Rather, it should be shouldered by engineers
as members of the engineering profession.

McFarland's View (Continued)

McFarland's model is based on the assumption that, as moral agents, we have a prima facie obligation to come to the aid of others.
In describing the nature of this
obligation, he uses a non-engineering
analogy involving the infamous Kitty
Genovese case.

McFarland's Argument (continued)

The analogy McFarland draws from the Genovese case for engineers is that, when no other sources of help are available, engineers should take responsibility by banding together.
If engineers act as individuals, they might not
always have the ability to help.
If they act collectively, however, they might
be able to accomplish goals that would
otherwise not be possible.

McFarland's Argument (continued)

McFarland believes that an engineer's work must be seen in a wider social context, i.e., in its relation to society.
Without that context, an adequate account of moral responsibility for engineers cannot be given.
Unless engineers work collaboratively on
ethical matters, they will not be able to meet
all of their responsibilities.

McFarland's Argument (continued)

McFarland's model encourages engineers to shift their thinking about responsibility issues from:
the level of individual responsibility (at
the micro-ethical level), to responsibility
at the broader level of the profession
itself (i.e., the macro-ethical level).

Responsibility, Liability, and Accountability

Responsibility requires that two conditions be satisfied:
causality,
intent.

Some agent, X, is held morally responsible for an act, Y, if X caused Y.

Responsibility (Continued)

A person could be held responsible even if he or she did not intend the outcome.
Robert Morris, who launched the "Internet
worm" in 1988, claimed that he did not
intend for the Internet to be brought to a
standstill.
Morris was held responsible for the outcome
caused by his act of unleashing the computer
worm.

Responsibility (continued)

Agents can also be held responsible when they intend for something to happen, even if they ultimately fail to cause (or bring about) the intended outcome.
Suppose a disgruntled student intends to
blow up a computer lab, but is discovered at
the last minute and prevented from doing so.
Even though the student failed to carry out
his objective, we hold the student morally
culpable because of his intentions.

Liability vs. Responsibility

Liability is a legal concept.


It is sometimes used in the narrow sense of
"strict liability."
To be strictly liable for harm is to be liable to compensate for it even though one did not necessarily bring it about through faulty action (e.g., when someone is injured on a person's property).
The moral notion of "blame" may be left out.

Accountability (vs. Liability and Responsibility)

According to Helen Nissenbaum (1995), responsibility is only part of what is covered by the notion of accountability.
Accountability means that
someone, or some group of
individuals, or perhaps even an
entire organization is answerable.

Accountability (Continued)

Nissenbaum notes that
"...there will be someone, or several people, to answer not only for malfunctions in life-critical systems that cause or risk grave injuries and cause infrastructure and large monetary losses, but even for the malfunctions that cause individual losses of time, convenience, and contentment."

Table 4-2: Responsibility, Liability, and Accountability

Moral Responsibility:
Attributes blame (or praise) to individuals.
Usually attributed to individuals rather than "collectivities" or groups.
Notions of guilt and shame apply, but no legal punishment or compensation need result.

Legal Liability:
Does not attribute blame or fault to those held liable.
Typically applies in the case of corporations and property owners.
Compensation can be required even when responsibility in a formal sense is not admitted.

Accountability:
Does not necessarily attribute blame (in a moral sense).
Can apply to individuals, groups of individuals, and corporations.
Someone or some group is answerable (i.e., it goes beyond mere liability).

The Problem of Many Hands in a Computing Context

Computer systems are the products of engineering teams or of corporations, as opposed to the products of a single programmer working in isolation.
So "many hands" are involved in their
development.
It is difficult to determine who exactly is responsible whenever one of these safety-critical systems results in personal injury or harm to individuals.

The Problem of Many Hands (continued)

Two problems for assigning responsibility (e.g., in the Therac-25 case) are that:
(a) we tend to think of responsibility as something
that applies to individuals but not to groups (or
collectivities);
(b) we tend to think of responsibility in exclusionary
terms: If X is responsible, then Y is not, and vice
versa.

Accountability is a broader concept than responsibility; it is non-exclusionary, and it can apply to groups as well as individuals.

Assessing Risk

Gotterbarn (2004) argues that "ethical risks" associated with the entire "software development life cycle" must also be taken into consideration.
The life cycle of software includes the
maintenance phase, as well as the
design and development stages.

Risk Assessment (continued)

Gotterbarn worries that the concept of risk has typically been understood in terms of three conditions, where the software:
(i) is behind schedule;
(ii) is over budget;
(iii) fails to meet the system's specified requirements.

Software can avoid all three conditions (i.e., be on schedule, within budget, and meet its specified requirements) and still fail to meet an acceptable standard of risk assessment (e.g., the Aegis radar system).

Risk Assessment (continued)

Gotterbarn believes that failures like the Aegis system are due to:
(1) an overly narrow conception of risk;
(2) a limited notion of "system
stakeholders."

Risk Assessment (continued)

Gotterbarn argues that a model of risk assessment based solely on cost-effectiveness, i.e., in terms of criteria such as budget and schedule, is not adequate.
Instead, the notion of risk analysis must
be enlarged to include social, political,
and ethical issues.

Risk Assessment (continued)

Gotterbarn also notes that the stakeholders who are typically given consideration in risk assessment models for software development are limited to the:

(a) software developers,

(b) customers.

This limited notion of "system stakeholders" leads to developing systems that have unanticipated negative effects (as in the Aegis case).
We need a more robust model of risk assessment for
software development.
