
Examination Questions 17 – Ethical, moral and cultural issues (AS / A Level) UNIT 1

Specification Points / Learning Objectives:                    PG Online text book page ref: 249-258

AS Level    A Level    Specification point description
1.5.2a 1.5.2a The individual (moral), social (ethical) and cultural opportunities and risks of digital technology:
 1. Computers in the workforce
 2. Automated decision making
 3. Artificial intelligence
 4. Environmental effects
 5. Censorship and the Internet
 6. Monitor behaviour
 7. Analyse personal information
 8. Piracy and offensive communications
 9. Layout, colour paradigms and character sets

Expectations / Learning Outcomes:
•  For this Learning Record you must take each of the bullet points listed above in spec point 1.5.2a and make them a focus. You can do this in any form you like such as a table for each one, or a mind map with 9 parts to it.
•  You must then provide evidence around each point which shows you have thought about the individual, social and cultural opportunities and risks of each.

Computers in the workforce: Computers in the workplace should have adequate security, as required by the Data Protection Act, and a method for each user to have their own login.

Automated decision making: Automated decision making is only as good as the developer and algorithms that coded the decision, and all algorithms should be rigorously tested before use to ensure this.

Artificial intelligence: Has the ability to write better algorithms than humans could, by the use of neural networks, but computers should be used carefully, as giving them the power to have their own goal and full computer access could create risks.

Environmental effects: Computers could benefit the environment with the use of self-driving cars, as they could find more efficient routes and drive at the most efficient speed constantly, therefore reducing fuel consumption, damage to the road and the amount of CO2 emitted.

Censorship and the Internet: The internet should be an open space, and it should have the same sort of freedom of speech rules as the real world. But I believe services should at least put warnings in place to warn for certain views, sensitive messages and other such things that are released.

Monitor behaviour: ISPs have the right to monitor users' activity without the user's consent and without telling them, according to the Regulation of Investigatory Powers Act.

Analyse personal information: Personal data should only ever be analysed with the consent of the data subject, and any data stored must be stored according to the Data Protection Act, meaning it is stored securely.

Piracy and offensive communications: The copying or use of another person's work goes against the Copyright, Designs and Patents Act and therefore should not be allowed. Users should also have the option to filter communications, but due to the law about freedom of speech I believe this option should never be changed by the service provider.

1. The UK Government has committed to an investment in testing self-driving cars. The hope is that one day, humans will not need to drive cars or freight vehicles on the roads. Discuss the ethical and moral issues and the social benefits of this, explaining whether you would recommend further investment in this research. [12]

One consideration is that the car will have people's lives in its hands and could get it wrong, just as the driver would have. This could be a benefit, as cars could be deemed to be safer because they can be set to always follow the road rules without the ability to break them. This would in theory reduce road accident deaths, and the laws are there for that very reason. On the other hand, they raise the moral issue that if the car was to cause a death while driving automatically, whose fault would it be? As the law currently states it is the driver's fault, but in reality that doesn't seem fair, as it might not have been the driver's actions that caused the death.

One issue with self-driving cars is the situation where the car's current path will set it to hit a group of people, but they could be saved by automatically steering the car off in another direction; however, this would result in killing one person who was previously not in harm's way. From a moral point of view you would say it is better for the total number of deaths to be as low as possible. But if a developer was to implement this, they would have to create a list ranking the importance of different types of people, and the numbers they are in, for the car to decide whom to save. That would be very immoral, and it also raises the thought that, by implementing this, would the developer be responsible for the deaths of the other people?

On the whole I think the aim should be to reduce total deaths and injuries at all costs, and maybe with the implementation of self-driving cars this figure could be zero. So I think we should continue to invest money in their development, and cars should only ever be allowed on the road with extremely high quality algorithms, as they can potentially have people's lives at stake.


Grade   TG                                               Breadth       Depth         Presentation   Understanding
A/A*    LINK / FORMULATE: Create, Generate,              Quad Core     Quad Core                    ALL
        Hypothesis, Reflect, Theorise, Consider
B/C     EXPLAIN / ANALYSE: Apply, Argue, Compare,        Dual Core     Dual Core                    MOST
        Contrast, Criticise, Relate, Justify
D/E     DESCRIBE / IDENTIFY: Name, Follow Simple         Single Core   Single Core                  SOME
        Procedure, Combine, List, Outline
U       Very little depth of understanding shown                                                    FEW

MY ASSESSMENT GRADE IN THIS TOPIC IS:


How To Improve:

My Response Is: (Set yourself specific targets / objectives as to how you will achieve your HTI)
