BIK21003 - Final - Exam - FBK - semII 20222023 - 240130 - 211107
FINAL EXAMINATION
SEMESTER II SESSION 2022/2023
DEGREE PROGRAMME
INSTRUCTIONS TO CANDIDATES
1. This Booklet consists of FOUR (4) sections: SECTION A, SECTION B, SECTION C, and
SECTION D.
10. In the sentence Din felt comfortable in class, Din plays the role of
A. Subject
B. Modifier
C. Direct object
D. Indirect object
(10 marks)
SECTION B:
1. Form TWO (2) new words from each of the following lexemes as listed in the table.
a. admit
b. commit
c. permit
d. remit
e. transmit
(10 marks)
4/8
BIK21003/FBK SEMESTER II SESSION 2022/2023
2. 10 words are underlined and bolded in the text below. Please state: (1) the type of word
formation process (e.g., clipping, compounding, blending, derivation, etc.) for each of the
underlined words, and (2) the changes from the base/root. No. 0 is given as an example.
Improved technology has made synthetic media ever more (0) realistic - but there are ways
to detect such content. Images circulating recently on Twitter featured a (1) flustered-looking
Donald Trump being arrested, devastation in Oregon after a 2001 earthquake and
the Pope wearing a big white puffer jacket.
To the casual (2) observer they look like real photographs, with only their fictitious subject
matter in some instances indicating these are actually fakes - created in this case using
an (3) artificial intelligence image generator called Midjourney.
Political (4) deepfakes are seen as particularly problematic. "Deepfake content can be
used maliciously, such as discrediting politicians or spreading (5) disinformation," says
Dr Subhajit Basu, associate professor of information technology law at the University of
Leeds in the UK. "For example, deepfake videos can create false statements or actions
attributed to political candidates, influencing public opinion and (6) undermining trust in
the democratic process." Although it may be possible to detect deepfakes, by the time a
video or image has been shown to be fake, it may already have gone viral.
"This can be incredibly damaging in countries where the general population does not have
'digital awareness'. This can contribute to the erosion of trust in democratic institutions,
media, and public figures," Dr Basu says. A (7) politically incendiary deepfake video could
be released shortly before polling day, or politicians could also argue that controversial or
(8) ill-conceived comments they made were never actually said.
Dr Antoniou says indirect methods may involve considering an image's context, such as
the location or event that it is supposed to come from. Sometimes simply looking carefully
at a poorly produced fake image will reveal (10) giveaway flaws. He cautions that as
deepfakes become more realistic, it may become "increasingly difficult" for the untrained
eye to tell apart real and artificially generated videos.
a. flustered-looking
b. observer
c. artificial
d. deepfakes
e. disinformation
f. undermining
g. politically
h. ill-conceived
i. metadata
j. giveaway
(20 marks)
SECTION C:
Instructions: Identify and write the verb phrase for each of the sentences below in the space
provided.
(10 marks)
SECTION D:
Instructions: Identify and write the phrase markers of each phrase in the following
sentences. No. 0 is given as an example.
0. Several victims of COVID-19 (NP) were buried (VP) this morning (ADVP).
1. The few surprising facts about COVID-19 were released this morning.
2. The fact that many people died because of the pandemic has hastened vaccination
distributions.
4. To eliminate several leading people from the administration is not an easy task.
(20 marks)