Link video: https://www.youtube.com/watch?v=_m2dRDQEC1A

Could deepfakes weaken democracy? | The Economist

Task 1: Listen to a talk about deepfake artworks and decide whether each statement is T, F or NG:

1. AI and machine-learning technologies are used in creating deepfake artworks to imitate the
appearance of famous celebrity influencers.
2. The fake video of Mark Zuckerberg is a testament to people’s ability to spread disinformation
online.
3. Deepfake technology can be used to manipulate votes in any country.
4. “Reality apathy” exploits people’s belief in democracy.
5. The program that Sander is working on spurs the invention of preventative medicine.

Answer:

1. T <So the deepfake artworks used artificial intelligence and machine-learning technologies to
kind of hack the bodies, if you like, of famous celebrity influencers.>
2. T <But that didn’t stop this fake clip of Facebook boss Mark Zuckerberg going viral. That showed
the potential for spreading disinformation online through deepfakes.>
3. F <As the technology advances, the danger is that deepfakes will be used to mislead voters in
democratic countries.>
4. NG
5. F <Dr van der Linden’s team have drawn inspiration from preventative medicine in their hunt to
cure fake news.>

Transcript:

Democracy is easy. It’s like stealing ice cream from a baby. I genuinely love the process of manipulating
people online for money. We just want to predict your future behaviours. These videos are all deepfakes.
Synthesised content created using artificial intelligence. Fake, fake, disgusting news. Deepfakes will make
for even more complicated arguments about what is fake news and what is real. And if seeing is no longer
believing the very real question is could deepfakes weaken democracy? Democracy just doesn’t work if
people don’t believe in it. So the deepfake artworks used artificial intelligence and machine-learning
technologies to kind of hack the bodies, if you like, of famous celebrity influencers. Bill Posters is the artist
behind these deepfake videos known as the Spectre Project. Spectre is almost too powerful to
comprehend. Two of the main questions we wanted to explore with the Spectre Project are: what does it
feel like when our personal data is used in unexpected ways by powerful tech companies, and how, as a
result, can that change our understandings of today? To test Facebook’s response, Bill posted the deepfake
videos on Instagram, a social-media platform owned by Facebook. The company downgraded the videos’
visibility. Spectre showed me how to manipulate you into sharing intimate data about yourself and all
those you love for free. But that didn’t stop this fake clip of Facebook boss Mark Zuckerberg going viral.
That showed the potential for spreading disinformation online through deepfakes. A danger that’s likely
to increase as long as tech companies and politicians remain unsure how to deal with it. The power of
deepfakes is an area of great concern whilst these technologies exist in what is essentially a regulatory
black hole. Image manipulation is already exploited by autocratic regimes. It’s a dark art that goes back to
Joseph Stalin who made his enemies disappear. AI today is capable of making deepfake videos like this
where comedian Bill Hader morphs into Tom Cruise. As the technology advances, the danger is that
deepfakes will be used to mislead voters in democratic countries. If you take away those tools that
enable us to be able to sort out what’s real from what’s not, you make very poor decisions. Aviv Ovadya is
the founder of Thoughtful Technology Project. He worries about another problem that deepfakes could
be used as an excuse to help politicians escape scrutiny. “You have the corrupt politician being able to say
‘oh yeah, that video of me, that was fake’. That brings us into a world where people won’t know what
they can trust.” He believes the ultimate threat from deepfakes could be that more and more people opt
out of democratic politics. A phenomenon he calls “reality apathy”. Reality apathy is when it’s so hard to
make sense of what’s happening. People just sort of give up. Democracy just doesn’t work if people don’t
believe in it. So what can be done to fight back? A group of scientists at Cambridge University are having
a go. They have developed a computer game to teach people how to spot disinformation. So in the game
people essentially step into the shoes of a fake news producer and you build your way up to a fake news
empire by spreading fake content online. Dr Sander van der Linden, the game’s designer, believes it will
help people to distinguish fact from fiction. “So your goal is to get as many followers as possible while
maintaining your online credibility. So you can’t be too ridiculous. And the first badge in the game is
about impersonating other people online. And of course one example that we’ve talked about is
deepfakes. So in the game we test people before and after and at the beginning we found that people are
duped by a lot of these techniques, but once they’ve played the game they become resistant and are able
to identify them later on.” Dr van der Linden’s team have drawn inspiration from preventative medicine
in their hunt to cure fake news. So just as you inject someone with a severely weakened dose of a virus to
trigger antibodies in the immune system, you can do the same with information. People can essentially
create mental antibodies and become immune against fake news. And essentially everyone is their own
bullshit detector. Today I’m president, not because I’m the greatest though probably I am. Deepfake
technology means that faking videos is becoming as easy as faking words and photos. Until people learn
to look at video with a more critical eye, there’s a danger that deepfakes could be used to undermine
democracy.
