ETHICAL DILEMMAS IN SCIENCE AND TECHNOLOGY
DILEMMA:
- a situation in which a difficult choice has to be made between two or more alternatives, especially equally undesirable ones.
- a situation where there is no clear, easy choice or answer.
- a difficult situation in which you have to choose between two or more alternatives.
- an argument forcing an opponent to choose either of two unfavorable alternatives.
ETHICAL DILEMMAS
The John J. Reilly Center for Science, Technology, and Values at the University of Notre Dame named these ten emerging ethical dilemmas and policy issues in science and technology for 2018:
1. Helix. A digital app store designed to help you read your genome
2. The Robot Priest. BlessU-2 and Pepper are the first robot priest and monk, respectively.
3. Emotion Sensing Facial Recognition. Optimizing retail experiences by assessing your reactions.
4. Ransomware. Malware that threatens to publish your data or hold it hostage until you pay up, whether you're an individual or a large corporation.
5. The Textalyzer. This is a new tool in the battle against texting and driving that tells police if
you were on your phone before an accident.
6. Social Credit Systems. A system that scores people based on their actions, monitored through surveillance cameras.
7. Google Clips. This little camera will watch you all day and capture your most picturesque moments.
8. Sentencing Software. Americans are now being sentenced with the help of a mysterious algorithm.
9. The Rise of Robot Friendship. An app that stores a deceased person's digital footprint. Can we create a chatbot out of our loved ones' old texts and social media posts?
10. The Citizen App. An app that notifies users of events happening in a particular area (e.g., heavy traffic, ongoing crimes).
Will you use these technologies? Will you allow them to be used on you?
How will you choose or decide? What will your bases be?
The problem here is that beauty companies market themselves as "clinically proven" when that is, in fact, not the case. Most research done by manufacturers does not follow the scientific method and is not reproducible. The experts hired to tout these products are often not scientists either.
Here Baron asks a startling question: are you your data? While hiring companies can already see a candidate's social media history, some companies are going a level beyond, using neurological games and emotion-sensing recognition as part of their assessments. Taken to the extreme, this means a machine could decide whether you are right for a position based entirely on your responses to a game or your facial expressions. Never mind your resume, your phone interview, your in-person interview, or your impressive track record; it could all be for naught.
3. Predatory Journals
Researchers estimate there are roughly 8,000 predatory journals, that is, journals that lack ethical practices such as peer review and have extremely low standards. The trouble is that when these journals publish anything, the information becomes fodder for unknowing researchers and scientists who are duped into believing it's the truth. Given the immense pressure on academics to publish, many turn to predatory journals. As you'll read later in this list, fake data is not something we can afford much more of.
President Donald Trump's White House is considering a controversial plan to monitor the mentally ill as a way to stop mass shootings in the U.S., a program that sounds a lot like a real-life Minority Report. HARPA, run by a third-party pancreatic cancer foundation with no governmental ties, would leverage data available on phones and smartwatches to detect when mentally ill people are about to turn violent. Beyond the infringement of civil liberties, research has not found reliable benchmarks to predict violent behavior, or even to reliably distinguish the mentally ill from the non-mentally ill.
ClassDojo is a popular online tool that, through recordings made in the classroom, scores children on their behavior and then shares those scores with the class as well as with parents. The company behind the system says it is meant to foster positive behavior in the classroom, but critics raise more than a few concerns, including whether the information can be hacked and how good behavior is quantified and defined, among others.
6. Grinch Bots
Aptly named, "Grinch bots" are automated programs that buy up popular goods as soon as they go on sale. Once the goods are sold out, they are resold on the secondary market at inflated prices. This isn't a new problem, but there isn't a new solution either. In 2016, Congress passed the Better Online Ticket Sales (BOTS) Act, but it hasn't been very effective. The Stopping Grinch Bots Act of 2018 was introduced last year and is currently awaiting further action in the House. However, the bill would only make it illegal to resell products purchased by automated bots, and, as U.S. law, it doesn't apply to the rest of the world.
7. Project Nightingale
Dubbed Project Nightingale, this partnership sees Ascension, the second-largest health care system in the U.S., collaborate with Google to host health records on the Google Cloud. With roughly 2,600 hospitals, doctors' offices, and other related facilities spread over 21 states, Ascension holds tens of millions of patient records. Both companies signed a business associate agreement under HIPAA (the Health Insurance Portability and Accountability Act), meaning Google can't do anything with the records other than provide a cloud hosting service. However, The Wall Street Journal reported that neither doctors nor patients had been informed of what was happening with these records, and that roughly 150 Google employees had access to the data. As data increasingly moves to the cloud and other storage options, and as companies such as Microsoft and Apple also launch health projects, we have to ensure our data is protected.
Some college websites use software that reveals the name, age, ethnicity, address, and contact information of a candidate, as well as which specific college sub-pages he or she visited and how long was spent on each page. The college then uses these factors to compute an "affinity score" that estimates how likely a candidate is to accept an offer from the college. But, Baron says, when colleges assign scores to students based on income and interest, it strips applications of much of their context and discriminates against low-income students or those without dedicated Internet access. These analytics have the potential to harm a prospective student's chances of admission based on an opaque algorithm.
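Baron's concern about affinity scores is easier to see with a toy model. The sketch below is purely hypothetical (the factors and weights used by real admissions-analytics vendors are not public); it only illustrates how a weighted score over tracked page visits, time on site, and an income proxy can mechanically disadvantage applicants without reliable Internet access:

```python
# Hypothetical "affinity score" sketch. None of these factors or weights
# come from a real vendor; they illustrate how such a score could work.

def affinity_score(visits: int, seconds_on_site: float, household_income: float) -> float:
    """Toy weighted score in [0, 1]; higher = 'more likely to accept an offer'."""
    engagement = min(visits, 10) / 10                   # capped page-visit signal
    dwell = min(seconds_on_site, 600) / 600             # capped time-on-site signal
    income = min(household_income, 200_000) / 200_000   # income proxy
    return round(0.4 * engagement + 0.3 * dwell + 0.3 * income, 3)

# A wealthy applicant with broadband browses often and at length;
# an equally qualified low-income applicant barely registers.
high = affinity_score(visits=8, seconds_on_site=500, household_income=180_000)
low = affinity_score(visits=1, seconds_on_site=60, household_income=30_000)
print(high, low)
```

Because two of the three inputs measure web activity and the third tracks income, two equally qualified applicants can receive very different scores, which is exactly the discrimination Baron describes: the model never sees grades, essays, or context at all.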
When CRISPR-Cas9 gene editing went mainstream in 2012, researchers immediately called for a moratorium due to the powerful potential of the system. There were then nationwide meetings and international meetings, and multiple groups got involved; overall, it went exactly as it was supposed to. Now, however, the legitimacy of the ethical researcher is taking a hit as lawyers, business people, journalists, and others muddy the waters. Ethics officers need to have rigorous training and understand the frameworks for ethical decision making. Otherwise, ethics turns into a merry-go-round.
A recent application of deep learning to create hard-to-identify fakes is more sophisticated, and more concerning. States are attempting to build legislation against deep fakes, and companies like Facebook and Microsoft want to help develop tools to spot them. But these days, just about anyone can download deep fake software to create fake videos or audio recordings that look and sound authentic.