
NARRATOR: It’s sometimes said that “Seeing is believing.” But in the digital age, how can you be so sure that the photo, video or audio being shared on social media hasn’t somehow been altered or manipulated? In today’s edition of Intersections: The RIT Podcast, associate professor of photography Christye Sisson and motion picture science undergraduate student Owen Thompson discuss media forensics and its role in protecting us all from fake news.

OWEN: How do you define media forensics?

CHRISTYE: I guess I would define media forensics as the process of determining whether or not any sort of media has been manipulated, whether it’s images or video or audio. Breaking that down to determine not only if it’s been manipulated, but how it’s been manipulated.

OWEN: Well, whenever I hear the word ‘forensics’ I think of crime TV. I think about the media vessel as a crime scene in and of itself. And you’re looking through it for tiny little hints that might give you some indication of what’s going on here.

CHRISTYE: Sort of under the hood.

OWEN: Yeah.

CHRISTYE: To figure out, under the hood, what evidence there is of the trail.
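
One classic example of what that under-the-hood evidence can look like – not a technique named in the episode, just an illustration of the idea – is error level analysis (ELA): re-save a JPEG at a known quality and compare it with the original, since edited or spliced regions often recompress differently from the rest of the image. A minimal sketch using the Pillow library:

```python
import io
from PIL import Image, ImageChops

def error_level_analysis(path, quality=90):
    """Re-save a JPEG at a known quality and diff it against the original.

    Edited or spliced regions often recompress differently from the
    rest of the image, so they stand out in the difference image.
    """
    original = Image.open(path).convert("RGB")
    buffer = io.BytesIO()
    original.save(buffer, "JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer)
    diff = ImageChops.difference(original, resaved)
    # The raw differences are faint; stretch them so they are visible.
    max_diff = max(hi for _, hi in diff.getextrema()) or 1
    return diff.point(lambda px: min(255, px * 255 // max_diff))

# Usage: error_level_analysis("suspect.jpg").show()
```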

OWEN: How did you first get involved in this project?

CHRISTYE: We were first approached by the government to be involved. Really, RIT’s role is to create the ground truth, or the ability to have this set of test manipulations so that we can make the algorithms of detection that much better. Our job is to create the images, the media, the video to be able to make those manipulations and then turn around and manipulate those raw materials as if we were the bad guys. As if we were the ones trying to deceive, or trying to make those images and that media counter to their original, sort of, truth. And so, in that process we work very closely with these other universities to determine whether or not the things that we’re producing are being successfully detected. And then we take that information and are constantly working it back into what we are doing, to ensure that those algorithms of detection are finding all those clues, all those traces that we might be leaving. That trail of breadcrumbs helps determine not only that these images were manipulated, but how. Really, RIT’s unique in this project in that we’re pretending to be the bad guys. We are creating those manipulations as if we were trying to put one past the algorithms of detection.

OWEN: Why would you say this project is important?

CHRISTYE: I think the project is really important because the nature of any sort of digital media is that it’s pretty malleable compared to analog processes, and because of the ease with which we can manipulate things in the public realm. We can do it with an app, we can do it with free tools, and it can be pretty undetectable. I think in the media consumption sphere it’s pretty important to know what is truth, for lack of a better term.

OWEN: Fake news?

CHRISTYE: Fake news, yeah. And then there’s the idea of being able to put your faith in the things that you see – the images that you see, the video that you see. That has larger societal implications: what happens if we can’t put our faith in footage, in news footage, in audio recordings and interviews, and how the repercussions of that sort of distrust can pervade society. This has certainly been coming up more and more in the last few years. So, the timing of this project is ideal given the current culture and the current status of what we hear all the time about fake news and whether or not images have been manipulated.

OWEN: As far as I can tell, there’s a lot of catching up to do at this point with the
capabilities of things like deep fakes and stuff like that. They seem pretty alarming. And
it only seems like it’s accelerating. Even if we are pretty far behind, it doesn’t mean we
should give up, obviously. But I think it’s possible to catch up.

CHRISTYE: Even over the course of this project we’ve seen the acceleration. When we started, deep fakes didn’t really exist, at least in the public sphere. Now we’ve seen them go from sitting in that uncanny valley – where you can tell something doesn’t look right – to the point where they’re pretty compelling, especially if it’s a low-quality image or video that somebody is seeing. Especially if it ties in with the viewer’s belief system. If they take it at face value, that can be pretty alarming.

OWEN: What kinds of things do you think this research will help stop?

CHRISTYE: The hope is that this project will provide the tools to determine whether or not an image may have been manipulated. And then the second, and even more important, piece is how it’s been manipulated. So ultimately, being used by the government, by law enforcement agencies, to support both our national security interests and domestic security. The value of having a tool like this is really significant: being able to stay that one step ahead in this kind of digital arms race, to combat this idea of digital untruth, fake news, fake media. And so, the hope is that this tool will take steps in that direction. I think you said earlier it’s alarming how quickly these things are moving, so we’re trying to stay that one step ahead. I see this as being very iterative. This is not going to be done when we’re done with the project. It’s going to constantly need to be updated and responsive to the sorts of challenges and threats we face both in this country and abroad. One of the most popular questions I get when I talk about this project, especially from the general public, is how do we tell? How do we know if the images that we’re seeing can be trusted? The project’s tools are still in the future, so for people’s day-to-day consumption of media I tell them that Google reverse image search is your best friend. Find out what the breadcrumb trail is to the creation of that image or that video, go to trusted sources, and look at multiple sources, because it’s a little bit of a game of telephone: depending on how many versions of something you’re seeing, it can accumulate things along the way that may not have been in the original. Perform that check as a concerned citizen, especially if it’s an image or a video that is very compelling or potentially inflammatory – and social media contributes to the rampant sharing of inflammatory images and videos in particular. Do that check before re-sharing, before re-posting, before contributing to the potential for misinformation. Just doing that quick check for yourself – doing the reverse image search, doing the reverse video search, finding out where it came from as an educated consumer of media – is one of the things people can do to feel at least more secure in what they’re seeing and what they’re sharing.
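
As a concrete illustration of that tip – not something from the episode – the sketch below builds a Google reverse-image-search URL for a suspect image and opens it in a browser. The searchbyimage URL pattern is an unofficial, long-standing convention rather than a documented API, so treat this as a hedged sketch that may stop working if Google changes the endpoint.

```python
import webbrowser
from urllib.parse import quote

def reverse_image_search(image_url: str) -> None:
    """Open a Google reverse image search for an image hosted at image_url.

    Note: 'searchbyimage' is an unofficial, long-standing URL pattern,
    not a documented API; Google may redirect it (e.g. to Lens) or
    retire it at any time.
    """
    url = "https://www.google.com/searchbyimage?image_url=" + quote(image_url, safe="")
    webbrowser.open(url)

# Example: trace where a suspicious photo has appeared before re-sharing it
# (the URL below is a placeholder, not a real image).
reverse_image_search("https://example.com/suspicious-photo.jpg")
```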

OWEN: To give everybody an idea, what volume of media has been accumulated by
this project so far?

CHRISTYE: I know the project at large has generated an incalculable number of images and videos and so forth. At RIT, our two jobs were, first, to create trusted media that we captured ourselves. We really left no stone unturned in terms of what types of devices we used to do that – everything from your very high-end camera to cellphone cameras to point-and-shoot cameras, and every permutation therein in terms of the camera settings, on-board features and sensors they might have. Our job was to exhaust what is currently available in terms of the kinds of media we could be looking at and producing. From there, the second job: we take those raw materials, those trusted materials, and we manipulate them. The manipulations range from simple to very complex. But with every manipulation, we chronicle everything that we’ve done, to act as a digital cheat sheet that goes along with the manipulation, so it’s clear what has been done to the image and, when we use it for testing, we can tell exactly how successful we were. Over the course of the project we were really focusing on images to start with, and that has evolved to incorporate a lot more video and audio. We’ve generated several thousand image manipulations and just under 500 videos. The work involved in creating these has all been accomplished by our undergraduate students, ranging from motion picture science to photographic sciences to imaging science to advertising photography, fine art photography, photojournalism and visual media. We’ve tapped the skills of all the students here at RIT to capture both the technical side and the artistic side of those manipulations. It’s really a huge undertaking that spans so many different organizations, so many different universities, so many private companies – this incredible brain trust that is working together to solve this problem. For me, the feeling of being part of something so important and really large has been really great.
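
To make the “digital cheat sheet” idea concrete, here is a hypothetical sketch of what one per-manipulation record might look like. The field names and values are invented for illustration; the episode does not describe the project’s actual schema.

```python
import json

# Hypothetical manipulation record; field names are invented for
# illustration and do not reflect the project's actual schema.
manipulation_record = {
    "source_asset": "trusted/IMG_0412.dng",      # untouched "ground truth" capture
    "output_asset": "manipulated/IMG_0412_v3.jpg",
    "capture_device": "high-end DSLR",           # device class from the capture campaign
    "operations": [                              # ordered edit history
        {"step": 1, "tool": "clone-stamp", "region": [412, 230, 120, 80]},
        {"step": 2, "tool": "splice", "donor_asset": "trusted/IMG_0098.dng"},
        {"step": 3, "tool": "recompress", "jpeg_quality": 75},
    ],
}

print(json.dumps(manipulation_record, indent=2))
```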

NARRATOR: Thanks for listening to Intersections: The RIT Podcast, a production of RIT Marketing and Communications. To learn more about our university, go to www.rit.edu and to hear more podcasts, find us on iTunes and TuneIn or visit us at www.soundcloud.com/rittigers
