Table of Contents
Meaning and Definition of Editing for Electronic Media
Three Decision Making Areas in Cutting a Film
The Rationale for Editing
Types/Modes/Process of Editing
Different Types of Video Editing
The Hardware and Software Requirements in Video Editing Assignments
Digital Technology and Editing
Digital Video
Analog versus Digital Video
Frame Rates and Resolution
Interlaced and Non-interlaced Video
Analog Video Formats
Broadcast Standards
Getting Video into Your Computer
The Process of Capturing Video into the Computer
Audiences’ Expectations
Approaches to Editing
Editing Principles
General Editing Principles
Basic Rules in Editing
Editing Interviews
The Advantages of Digital Editing with a Video Server
Six Quick Tips for File Server Editing
Mixing and Uses of Basic Effects in Editing
Effects in Film Editing
Distinguishing Optical Sound Track from Magnetic Sound Track
Editing Sound
Four Essential Audio Elements
Analog and Digital Audio
Characteristics of Digital Audio
Audio File Formats
Audio Streaming
Recording, Copying, and Converting Digital Audio
Copying Audio (Hardware)
Copying Audio (Software)
Converting Audio
Procedures Involved in Sound Editing
Audio Editing Technical Terms
This unit is designed to help you understand what editing is all about, especially as it
concerns broadcast production.
Editing is also referred to as the cutting of film. It is defined as the process of selecting the
parts of the shots that are good and that serve the needs of the film, and discarding the rest
(Mamer, 2009). It therefore requires extensive knowledge of the mechanics of cutting. It is
the process of choosing creative materials that fit a subject matter and blending the various
photographed frames of a film, in a convincing manner, in order to transmit the message of
the artistic work to the audience (Owuamalam, 2007).
Each scene has generally been photographed and recorded several times, with each
filming regarded as a take. During the shooting exercise, the director decides which takes
are good enough to print. The printed takes then form a work print to work with during
editing (Kogah, 1999).
Film editing involves the use of plot in arranging the presentational sequence of the story
line. The strategy enables the idea of the creative work, as conceived, to be actualized,
through a technical process. The process requires the use of equipment, and script, to
match the interpretative capacity of the editor. The editor applies skill, knowledge and
experience, to produce the synergy called film.
Video editing is the process of manipulating and rearranging video shots to create a new
work. Editing is usually considered to be one part of the post production process — other
post-production tasks include titling, colour correction, sound mixing, etc.
Many people use the term editing to describe all their post-production work, especially in
non- professional situations. Whether or not you choose to be picky about terminology is
up to you. In this case we are reasonably liberal with our terminology and we use the word
editing to mean any of the following:
Rearranging, adding and/or removing sections of video clips and/or audio clips.
Editing is a “specialty” skill or occupation in video and film production i.e. “post -
production” process. It is the process of assembling visual & aural elements into a creative
product.
For pictures, editing entails going through the shots and determining their specific order,
then deciding on the precise transition point from one shot to the next. The order of
shots may be predetermined in a narrative film, though that order may not be as rigid as
first assumed. In documentary and experimental film, you may have to devise the order
yourself.
Cutting or editing sound includes a number of approaches such as cutting sync tracks in
conjunction with the picture, determining the relationship between music and picture and
building complicated, layered sound effects after the picture is mostly or completely cut.
An optical effect is a graphic effect that is created in the lab. Optical effects
include split screens, keyholes, freeze-frames, supers, wipes and a host of other effects
executed by the lab at the filmmaker’s instruction, prior to the final printing.
They are difficult to get right and may take several tries to obtain the precise effect.
There are many reasons for editing materials electronically and any editing approach will
depend on the desired outcome. Before you begin you must clearly define your editing
goals, which could include any of the following:
Specifically, Owuamalam (2007) lists the following functions that editing performs
in any film:
1. Editing facilitates the removal of film footage that could harm society, such as
scenes inciting racial or ethnic hatred, derogatory gender scenes, offensive
stereotyping, obscene and lurid scenes that debase morality, and legally
blamable scenes. It enables the producer to correct impressions that could
adversely affect the image and reputation of the production.
2. Editing trims the footage to fit into a specific duration as dictated by the
medium of presentation (television or film theatre/cinema).
3. Editing combines shots in a spectacular way, in order to achieve an
understanding of the film. It brings discretely shot scenes together.
B-roll helps you keep the video interesting rather than staying on one long, boring shot.
19. Finally, editing serves as a structural transformer, which provides the salient
aspect of a work, in a clear and focused way, within a specific length of the film,
adjusted to suit viewership interest.
The editor plays a vital role in the post-production phase. Like the director,
cameraman and designer, the editor has a direct impact on any programme production.
Harmony and coordination between the editor and the producer are essential, as the
editing can either polish or tarnish the finished product.
During the editing process, close coordination between the producer and the editor is
very important: it speeds up the pace of the work and improves the programme quality.
The choice of linear or non-linear editing depends solely on the producer, who has to
anticipate the needs according to the available technical facilities. Non-linear editing is in
vogue nowadays due to its variety of effects, more options and provision of more audio
and video layers to make a programme more colourful and bright; linear editing,
however, is speedy and less time-consuming.
Editing involves the use of plot in arranging the presentational sequence of the story
line. The strategy enables the idea of the creative work, as conceived, to be actualized
through a technical process. These processes, or modes, as outlined by Owuamalam (2007), are
real-time editing and post-production editing. The real-time editing mode makes it possible to
present live shows and programmes as the events happen. The editing process involves the
use of materials from various sources and blending them synergically to produce the
screen experience, known as film. For example, the news coverage of the visit of President
Kenyatta to launch a development project can show the following arrival ceremonies, as a
live programme: the presidential jet is seen touching down on the airport runway; another
scene is shown where government officials are waiting in front of a red carpet, laid for the
president; activities inside the VIP lounge, showing journalists in the front seat, where
the president is to address them, traditional or cultural dance troupe outside the arrival
hall, entertaining the crowd etc. It is a blend of the various scenes and sound, as a package,
that produces the live programme, which is enjoyed on the television screen.
There are two basic types of film editing. They originate from the equipment and process
that are applicable in realizing the editing objective. The type considered and used, is a
matter of convenience and available technology. The two types or forms of editing are
linear and non- linear editing.
Linear Editing
It is time-bound, as a particular time code is followed to access different data. It is done
on videocassettes and tapes.
The editing apparatus consists of a panel, like a computer keyboard, with a round knob
to shuttle and jog on the machines holding the recorded material, and separate monitors
for display. Usually three machines are used:
1. Player 1
2. Player 2
3. Recorder
Players are the sources that supply the chunks recorded in bits and pieces. Normally the
first player has the basic visuals, the main video, and the second player has the secondary
data, which can be background voice, strips, graphics, or names to be superimposed,
overlaid or inserted on the primary video, while the recorder takes the final output from
both players.
In linear editing, video tapes are used for playback and recording. It is a tape-based
recording system, whether analog or digital. The sequence of review is orderly and
progressive. Continuity of events takes place in a specific order, which is not to be
altered. In a Linear Editing System (LES), two videotape recorders (VTRs) are required.
One plays back the recorded tape while the other records selected shots from
the former, according to the editing plan. The shots to be selected can be identified
on the recorded tape, using the tape counter to find the exact location of each shot
in the produced tape. The editor notes the numbers and arranges them according to the
takes desired to produce the finished product. The editor uses two monitors: the
preview monitor and the final view monitor. The preview monitor is used to watch and
select shots, or takes, from the review VTR. The pictures on this monitor enable
the editor to pause the review tape and select shots. The other monitor shows the
recorded images from the editing VTR. The picture shown on this screen tells the
editor whether the plot and storyline have been followed as indicated by the technical
desire of the director and the expectation of the producer.
The Non-Linear Editing (NLE) system is disk based. It uses the computer for storage,
reviews and the editing of video and audio data files. The system allows one to jump from
one shot or take, to the other, irrespective of the location of the desired shot in the file.
One can jump from, say, shot 1 to shot 7 without accessing shots 2, 3, 4, 5 and 6 in
between. This capability of random access means that any desired shot can be reached
at will.
In the NLE, the programme to be edited is converted digitally into electronic signals and
recorded on a disk. The disk is loaded into a computer’s disk drive, which enables the
system to accept and respond to commands. The shot identification takes place, within
the shortest imaginable time frame. It provides one, the opportunity of taking editing
decisions that enable the shots to relate and blend with each other, to produce a thrilling
synergy, which tells the story of the plot.
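The random access described above can be sketched as a toy model: in an NLE, shots live on disk as independent clips, and the cut is just an ordered list of references to them (an edit decision list). All file names and functions below are illustrative, not from any real editing system.

```python
# Minimal sketch of non-linear editing: clips sit on disk, and the edit is
# an ordered list of references, so any shot can be reached or reordered
# at will without touching the shots in between.

shots = {n: f"shot_{n:02d}.avi" for n in range(1, 8)}  # clips 1..7 on disk

# Jump straight from shot 1 to shot 7 without accessing shots 2-6:
edit_decision_list = [1, 7]

# Re-editing is just rearranging references; no footage is re-copied.
edit_decision_list = [7, 1]

def render(edl, shots):
    """Return the playout order of clip files for this cut."""
    return [shots[n] for n in edl]

print(render(edit_decision_list, shots))
```

Because only references move, changing an early decision never forces the later part of the cut to be rebuilt, which is the key contrast with the linear method described below.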
1. Film Splicing
Technically this isn't video editing, it's film editing. But it is worth a mention as it was the
first way to edit moving pictures and conceptually it forms the basis of all video editing.
Traditionally, film is edited by cutting sections of the film and rearranging or discarding
them. The process is very straightforward and mechanical. In theory a film could be edited
with a pair of scissors and some splicing tape, although in reality a splicing machine is the
only practical solution. A splicing machine allows film footage to be lined up and held in
place while it is cut or spliced together.
2. Tape to Tape (Linear)
Linear editing was the original method of editing electronic video tapes, before editing
computers became available in the 1990s. Although it is no longer the preferred option, it is
still used in some situations.
In linear editing, video is selectively copied from one tape to another. It requires at least
two video machines connected together — one acts as the source and the other is the
recorder. The basic procedure is quite simple:
1. Place the video to be edited in the source machine and a blank tape in the recorder.
The idea is to record only those parts of the source tape you want to keep. In this way
desired footage is copied in the correct order from the original tape to a new tape. The new
tape becomes the edited version.
This method of editing is called "linear" because it must be done in a linear fashion; that is,
starting with the first shot and working through to the last shot. If the editor changes their
mind or notices a mistake, it is almost impossible to go back and re-edit an earlier part of
the video. However, with a little practice, linear editing is relatively simple and trouble-
free.
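The linear constraint described above can be made concrete with a short sketch: segments are copied onto the record tape strictly in order, and fixing an earlier edit means re-recording everything after it. The segment names and functions are purely illustrative.

```python
# Toy model of tape-to-tape (linear) editing: selected source segments are
# appended to the record tape in order; changing an earlier edit forces
# everything after that point to be re-recorded.

record_tape = []

def record_edit(segment):
    record_tape.append(segment)  # copy the next chosen segment to the end

for seg in ["open", "interview", "cutaway", "close"]:
    record_edit(seg)

def reedit(tape, index, new_segment, remaining):
    """Replace an earlier edit: the tail of the tape must be redone."""
    return tape[:index] + [new_segment] + remaining

# To change "interview", everything after it has to be laid down again:
record_tape = reedit(record_tape, 1, "interview_take2", ["cutaway", "close"])
print(record_tape)
```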
3. Digital/Computer (Non-linear)
In this method, video footage is recorded (captured) onto a computer hard drive and then
edited using specialized software. Once the editing is complete, the finished product is
recorded back to tape or optical disk.
Non-linear editing has many significant advantages over linear editing. Most notably, it is a
very flexible method which allows you to make changes to any part of the video at any
time. This is why it's called "non-linear" — because you don't have to edit in a linear
fashion.
One of the most difficult aspects of non-linear digital video is the array of hardware and
software options available. There are also several common video standards which are
incompatible with each other, and setting up a robust editing system can be a challenge.
The effort is worth it. Although non-linear editing is more difficult to learn than linear, once
you have mastered the basics you will be able to do much more, much faster.
4. Live Editing
In some situations multiple cameras and other video sources are routed through a central
mixing console (video switcher) and edited in real time. Live television coverage is an
example of live editing. Live editing is a fairly specialist topic.
Production Switcher
The switcher is an 18-input video switcher designed for both live production and
post-production. Like any video switcher, it can perform video transitions such
as wipes, mixes, keys, etc. On its own, however, it is not capable of digital video effects. The
switcher is a very complex device with many levels of operational functions. To
keep things as simple as possible, we will focus primarily on the Downstream Bus
Rows consisting of the preview and program buses (bottom two) with only
occasional use of the Key Bus Row and the M/E Bus Rows (top two). For most
projects the set-up functions of the switcher will be preset by the instructor. During
rehearsal, Technical Directors (TDs) are encouraged to ask questions of the
instructor in order to fully understand the steps necessary for a successfully
switched program. However, instead of simply memorizing a set of keystrokes,
we encourage you to attempt to fully understand the process that is at work.
The 18-input switcher has sixteen video bus inputs and two auxiliary inputs.
• BLK Black Video
• R.S. Routing Switcher
• CAM 1
• CAM 2
• CAM 3
• VTR 1
• VTR 2
• DEKO 1 Graphics
• DEKO 2 Graphics
• CB Color Bars
• COLOR Background color generator
• M/E Mix/Effects bank
In addition, you will utilize the Downstream Transition Group consisting of the
following buttons:
• BKG MIX: mixes between the program and preview buses
• UNI KEY MIX: mixes in the video signal feeding the uni keyer
• DS KEY MIX: mixes in the video signal feeding the downstream keyer
• FADE: mixes to or from black
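Numerically, the mix and fade transitions above are simple weighted averages of two video signals. The sketch below simplifies a frame to a flat list of 8-bit pixel values; it is an illustration of the idea, not how any particular switcher is implemented.

```python
# A dissolve (MIX) is a weighted average of the program and preview
# signals; a FADE is a mix toward an all-black signal.

def mix(program, preview, t):
    """Dissolve from program to preview; t runs from 0.0 to 1.0."""
    return [round((1 - t) * p + t * q) for p, q in zip(program, preview)]

def fade_to_black(program, t):
    return mix(program, [0] * len(program), t)

program = [200, 150, 100]   # three pixels of the program bus
preview = [50, 50, 50]      # three pixels of the preview bus

print(mix(program, preview, 0.5))   # halfway through the dissolve
print(fade_to_black(program, 1.0))  # fully faded: black
```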
Switcher Inputs
0 BLACK
1 ROUTING SWITCHER
2 VTR 1 (DVC PRO 1)
3 VTR 2 (DVC PRO 2)
4 VTR 3 (BETA SP PB/REC)
5 VTR 4 (BETA SP PB)
6 VTR 5 (3/4 INCH)
7 CAMERA 1
8 CAMERA 2
9 CAMERA 3
10 CAMERA 4
11 SYNCHRONIZER
12 COLOR BARS
13 DEKO 1
14 DEKO 2
15 TOASTER 2
16 A51
17 COLOR
18 M/E
1. A source device to play the original tape or disk. Typically a VCR or camera.
There are hundreds of digital video cameras, or camcorders, on the market today from
manufacturers like Sony, Panasonic, JVC and Canon. Most of them use what are known as
MiniDV tapes.
Just about every camcorder based on the MiniDV tape format includes a FireWire (IEEE
1394) port on the camera so that you can load the video onto your computer quickly and
easily. Whichever type of camera you pick, it needs to have a FireWire connection so you
can hook it to your computer. This sort of FireWire connector is common on digital
camcorders. You attach a FireWire cable to this connector, and attach the other end to your
computer.
2. A computer with at least these specs: a high-speed processor, ample RAM, and a fast
hard drive with 1 GB or more of free space.
Note: Some editing software requires a high-performance computer to even work properly.
A Pentium 4 machine or a late-model Mac with 8GB of RAM and a big hard disk is a nice
machine to have when you are rendering and writing files.
You can use just about any desktop computer for video editing, as long as it has enough
CPU power, hard disk space and bus bandwidth to handle the data flowing in on the
FireWire cable. Video processing in general uses lots of CPU power and moves tons of data
on and off the hard disk. There are two different places where you will most feel the
benefits of a fast machine and the sluggishness of a slow one: When you render a movie
that you have created or write it out to hard disk, you will definitely feel the speed of the
machine. On a fast machine, rendering and writing can take minutes. On a slow machine it
can take hours.
A more important issue comes when you are reading data from or writing data to the
camera. When the video data stream is coming in from the camera through the FireWire
cable, the computer and hard disk must be able to keep up with the camera or the computer
will lose frames. When sending a completed movie back to the camera, the processor must
be able to stream the data quickly enough or the camera will lose frames.
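The dropped-frames problem above is simple arithmetic: the disk must sustain the camera's data rate. The figures below (DV video at roughly 25 Mbit/s) are commonly quoted approximations, not from this text.

```python
# Rough arithmetic behind dropped frames: the hard disk's sustained
# throughput must be at least the camera's stream rate.

DV_VIDEO_MBIT_S = 25               # approximate DV video bitrate
stream_mb_s = DV_VIDEO_MBIT_S / 8  # ~3.1 MB/s for the video alone

def can_keep_up(disk_mb_s, stream_mb_s):
    """True if the disk's sustained write speed can absorb the stream."""
    return disk_mb_s >= stream_mb_s

print(round(stream_mb_s, 2))
print(can_keep_up(30.0, stream_mb_s))  # a fast disk keeps up
print(can_keep_up(2.0, stream_mb_s))   # a slow disk drops frames
```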
3. A video capture device. To capture video from an analogue source (such as VHS or
Video8) you need a device to convert the video into a digital format. This can be a
standalone device which plugs into the computer or a video capture card which becomes
part of the computer.
If you are using a source device which outputs a digital signal (such as FireWire or USB)
you don't need a capture device, but you do need to make sure your computer has the
appropriate input available.
A FireWire port to connect the camera to - If your computer does not have a FireWire port,
you can buy a FireWire card and install it.
4. Connecting leads to plug the source device into the capture device or computer.
There are many software packages available for editing video on your computer. Windows
XP even ships with software that's built into the operating system. Machines from Sony and
Apple have software that comes with the machines.
Adobe Premiere is a full-featured and well respected video editing package that can do
almost anything you would want to do.
In order to use a package like Adobe Premiere, you need to understand several basic
concepts. Once you understand those basic concepts, however, the whole process is
remarkably easy. After you are familiar with the fundamentals, it is extremely easy to
expand your repertoire to include all sorts of advanced techniques.
b) Insert or “Cover”: overlaying one clip on top of another. One clip is “inserted” or
overlaid on others.
Notice that the new video clip covers only the picture, without covering the audio.
Advantage
There are several ways in which modern information and communication technology (ICT)
or digital technology has improved electronic media editing and production.
Dis-advantage
1. On the "down" side, video has to be "captured," or transferred from tape into the
computer. Capturing is in itself an editing process. Clips have to be identified and
transferred in real time. You might expect capturing to take up to twice as long as
the total length of the video you are transferring, even if the process is automated.
2. The way video is captured and stored in one editing system may not be compatible
with another. In other words, while you can be sure that an NTSC VHS
videocassette can be played back on any NTSC VHS machine, a video computer
file may not be readable by any software other than the software used to create it.
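The capture-time estimate in point 1 above is easy to express as arithmetic. The factor of two is the document's rule of thumb for logging and handling overhead, not a fixed constant.

```python
# Rule-of-thumb capture time: real-time transfer plus clip identification
# and handling can roughly double the total (the factor is an estimate).

def capture_time_minutes(footage_minutes, overhead_factor=2.0):
    return footage_minutes * overhead_factor

print(capture_time_minutes(30))  # 30 minutes of tape: budget about an hour
```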
The world of video is in the middle of a transition from analog to digital. This transition is
happening at every level of the industry. In broadcasting, standards have been set and
stations are moving towards digital television (DTV). Many homes already receive digital
cable or digital satellite signals. Video editing has moved from the world of analog tape-to-
tape editing and into the world of digital non-linear editing (NLE). Home viewers watch
crystal clear video on digital versatile disk (DVD) players. In consumer electronics, digital
video cameras (DV) have introduced impressive quality at an affordable price. The
advantages of using a computer for video production activities such as non-linear editing
are enormous. Traditional tape-to-tape editing was like writing a letter with a typewriter. If
you wanted to insert video at the beginning of a project, you had to start from scratch.
Desktop video, however, enables you to work with moving images in much the same way
you write with a word processor. Your movie “document” can quickly and easily be edited
and re-edited to your heart’s content, including adding music, titles, and special effects.
The quality of the movies you watch is not only dependent upon frame rate. The amount of
information in each frame is also a factor. This is known as the resolution of the image.
Resolution is normally represented by the number of individual picture elements (pixels)
that are on the screen, and is expressed as a number of horizontal pixels times the number
of vertical pixels (e.g. standard resolution 720x576 PAL or 720x480 NTSC). All other
things being equal, a higher resolution will result in a better quality image (e.g. HDTV
1920x1080, UHDTV (4K) 3840x2160, and FUHD (8K) 7680x4320).
You may find yourself working with a wide variety of frame rates and resolutions. For
example, if you are producing a video that is going to be shown on VHS tape, CD-ROM,
and the Web, then you are going to be producing videos in three different resolutions and
at three different frame rates. The frame rate and the resolution are very important in digital
video, because they determine how much data needs to be transmitted and stored in order
to view your video. There will often be trade-offs between the desire for great quality video
and the requirements imposed by storage and bandwidth limitations.
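The storage and bandwidth trade-off above can be put in numbers: the uncompressed data rate is width x height x bytes-per-pixel x frames-per-second. The sketch assumes 24-bit colour (3 bytes per pixel) and uses MB to mean 10**6 bytes; both are conventions chosen here for illustration.

```python
# Uncompressed video data rate from resolution and frame rate.

def data_rate_mb_s(width, height, fps, bytes_per_pixel=3):
    """Megabytes per second, assuming 24-bit colour and MB = 10**6 bytes."""
    return width * height * bytes_per_pixel * fps / 1e6

pal_sd = data_rate_mb_s(720, 576, 25)    # standard-definition PAL
hd     = data_rate_mb_s(1920, 1080, 25)  # HDTV at the same frame rate

print(f"PAL SD: {pal_sd:.1f} MB/s")  # ~31 MB/s before compression
print(f"HD:     {hd:.1f} MB/s")      # ~155 MB/s - hence compression
```

Figures like these are why digital video is almost always compressed for storage and transmission.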
Early television screens used phosphors with very short persistence (i.e., they remained
illuminated for only a brief time). Consequently, in the time it took the electron beam to
scan to the bottom of the screen, the phosphors at the top were already going dark. To
combat this, the early television engineers
designed an interlaced system. This meant that the electron beam would only scan every
other line the first time, and then return to the top and scan the intermediate lines. These
two alternating sets of lines are known as the “upper” (or “odd”) and “lower” (or “even”)
fields in the television signal. Therefore a television that is displaying 25 frames per second
is really displaying 50 fields per second. Why is the frame/field issue of importance?
Imagine that you are watching a video of a ball flying across the screen. In the first 1/50th
of a second, the TV paints all of the even lines in the screen and shows the ball in its
position at that instant. Because the ball continues to move, the odd lines in the TV that are
painted in the next 1/50th of a second will show the ball in a slightly different position. If
you are using a computer to create animations or moving text, then your software must
calculate images for the two sets of fields, for each frame of video, in order to achieve the
smoothest motion. This also applies in the NTSC system that runs at 30 fps. The frames/fields
issue is generally only of concern for video which will be displayed on televisions. If your
video is going to be displayed only on computers, there is no issue, since computer
monitors use non-interlaced video signals.
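The field structure described above can be sketched by modelling a frame as a list of scan lines and splitting it into its two alternating fields. The line labels are illustrative.

```python
# Interlacing: a frame's scan lines split into the "upper" (odd) and
# "lower" (even) fields, which are displayed alternately.

def split_fields(frame_lines):
    """Split a frame (list of scan lines, line 1 on top) into its fields."""
    upper = frame_lines[0::2]  # lines 1, 3, 5, ... (odd lines)
    lower = frame_lines[1::2]  # lines 2, 4, 6, ... (even lines)
    return upper, lower

frame = ["line1", "line2", "line3", "line4", "line5", "line6"]
upper, lower = split_fields(frame)
print(upper)
print(lower)

# At 25 frames per second, fields arrive at twice that rate:
frames_per_second = 25
print(frames_per_second * 2)  # 50 fields per second
```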
Composite: The simplest type of analog connection is the composite cable. This cable uses
a single wire to transmit the video signal. The luminance and colour signals are composited
together and transmitted simultaneously. This is the lowest quality connection because of
the merging of the two signals. At some point almost all video will be digital, but that does
not mean you can ignore the analog video world.
S-Video: The next higher quality analog connection is called S-Video. This cable separates
the luminance signal onto one wire and the combined color signals onto another wire. The
separate wires are encased in a single cable.
Component: The best type of analog connection is the component video system, where
each of the YCC signals is given its own cable.
How do you know which type of connection to use? Typically, the higher the quality of the
recording format, the higher the quality of the connection type.
Broadcast Standards
There are three television standards in use around the world. These are known by the
acronyms NTSC, PAL, and SECAM. Most of us never have to worry about these different
standards. The cameras, televisions, and video peripherals that you buy in your own
country will conform to the standards of that country. Kenya, most African countries and
Europe use PAL; North America and Japan use NTSC. It will become a concern
for you, however, if you begin producing content for international consumption, or if you
wish to incorporate foreign content into your production. You can translate between the
various standards, but quality can be an issue because of differences in frame rate and
resolution. The multiple video standards exist for both technical and political reasons.
Remember that the video standard is different from the videotape format. For example, a
VHS format video can have either NTSC or PAL video recorded on it.
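The standards above can be summarised as a small lookup table. The frame rates follow this document's rounded figures (30 fps for NTSC, 25 fps for PAL), and the region notes in the comments are commonly cited associations, not taken from this text.

```python
# The three broadcast standards as a lookup table, using the rounded
# frame rates quoted in the text and the usual active SD resolutions.

STANDARDS = {
    "NTSC":  {"fps": 30, "resolution": (720, 480)},  # North America, Japan
    "PAL":   {"fps": 25, "resolution": (720, 576)},  # Kenya, most of Africa, Europe
    "SECAM": {"fps": 25, "resolution": (720, 576)},  # France, parts of Africa
}

def needs_conversion(source, target):
    """Standards conversion is needed when frame rate or resolution differs."""
    a, b = STANDARDS[source], STANDARDS[target]
    return a["fps"] != b["fps"] or a["resolution"] != b["resolution"]

print(needs_conversion("PAL", "NTSC"))   # frame rate and lines differ
print(needs_conversion("PAL", "SECAM"))  # same rate/size (colour encoding still differs)
```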
Analog: Traditional (analog) video camcorders record what they “see and hear” in the real
world, in analog format. So, if you are working with an analog video camera or other
analog source material (such as videotape), then you will need a video capture device that
can “digitize” the analog video. This will usually be a video capture card that you install in
your computer. A wide variety of analog video capture cards are available. The differences
between them include the type of video signal that can be digitized (e.g. composite or
component), as well as the quality of the digitized video.
After you are done editing, you can then output your video for distribution. This output
might be in a digital format for the Web, or you might output back to an analog format like
VHS or Beta-SP.
Digital: Digital video camcorders have become widely available and affordable. Digital
camcorders translate what they record into digital format right inside the camera, so your
computer can work with this digital information as it is fed straight from the camera. The
most popular digital video camcorders use a format called DV. Getting DV from the camera
into the computer is a simpler process than for analog video because the video has already
been digitized. The camera therefore just needs a way to communicate with your computer
(and vice versa). The most common form of connection is known as IEEE 1394.
There are two main ways of capturing video into the computer:
a) Capturing analog video into the computer.
b) Capturing Digital video into the computer
a) Analog Capture
The conversion process may be carried out entirely by the hardware on the card, or by
software running on your computer, or by some combination of both. In general, hardware
conversion is more reliable than software conversion.
On Windows computers the digital conversion product is generally an AVI file. An AVI
(Audio Video Interleaved) file is a sound and motion picture file that conforms to the
Microsoft Windows Resource Interchange File Format (RIFF) specification.
Macintosh files conform to Apple's QuickTime format. In either case, the converted file is
almost always compressed; that is, much of the picture information is truncated or
discarded according to a compression scheme called a codec.
If the file format is dependent on hardware on the capture card, or deviates from one of the
generally accepted codecs, your ability to play back files will be limited. You should not
assume that all AVI files are alike in the way that all VHS tapes are alike.
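Since not all AVI files are alike, a quick sanity check is to look at the container signature itself. The sketch below (a minimal illustration, not a full parser) checks the first twelve bytes of a file against the RIFF layout described above:

```python
import struct

def looks_like_avi(header: bytes) -> bool:
    """Check the first 12 bytes of a file for the RIFF/AVI signature.

    A RIFF file starts with the FourCC 'RIFF', a 4-byte little-endian
    chunk size, and a form-type FourCC ('AVI ' for AVI files).
    """
    if len(header) < 12:
        return False
    riff, _size, form = struct.unpack("<4sI4s", header[:12])
    return riff == b"RIFF" and form == b"AVI "

# A minimal synthetic header (not a playable file), purely for illustration:
fake_header = b"RIFF" + struct.pack("<I", 0) + b"AVI "
print(looks_like_avi(fake_header))                   # True
print(looks_like_avi(b"RIFF\x00\x00\x00\x00WAVE"))   # False: a WAV, not an AVI
```

Passing this check only means the file claims to be an AVI container; whether you can decode it still depends on the codec used inside.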
Most capture cards do not compress audio. In fact, rather than using the 44,100 Hz sample
rate found on commercial audio CDs, audio for video is sampled at 48,000 Hz. That
equates to 1.536 Mbps.
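The 1.536 Mbps figure follows directly from the sample rate once you assume 16-bit samples and two channels (both typical for audio-for-video, though not stated above). A quick sketch of the arithmetic:

```python
def pcm_bitrate_mbps(sample_rate_hz: int, bit_depth: int, channels: int) -> float:
    """Uncompressed PCM bit rate in megabits per second."""
    return sample_rate_hz * bit_depth * channels / 1_000_000

# 48 kHz, 16-bit, stereo audio-for-video:
print(pcm_bitrate_mbps(48_000, 16, 2))   # 1.536
# 44.1 kHz CD audio, for comparison:
print(pcm_bitrate_mbps(44_100, 16, 2))   # 1.4112
```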
You probably will not be able to monitor audio or video levels on the computer as your
video is captured. If possible, you should use a time base corrector and waveform monitor
to make sure the signal going to the computer meets broadcast standards. Although some
capture cards have time base correctors built in, most do not.
It is not sufficient to monitor the audio input, since the computer has software audio level
and balance controls.
b) Digital Capture
Digital capture is a much simpler process than analog capture. The most common digital
recording format is “DV.”
DV is already digital, already compressed to about 4 MB/sec, and already compatible
with the Microsoft AVI format.
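Because DV runs at a roughly constant data rate, disk-space needs are easy to estimate. A small sketch, assuming the approximate 4 MB/sec figure above:

```python
def dv_storage_mb(seconds: float, rate_mb_per_sec: float = 4.0) -> float:
    """Approximate disk space needed for DV footage at ~4 MB/sec."""
    return seconds * rate_mb_per_sec

print(dv_storage_mb(60))            # 240.0 MB per minute of footage
print(dv_storage_mb(3600) / 1000)   # 14.4 -> roughly 14 GB per hour
```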
To move it to your computer you need to connect your camcorder or DV recorder to your
computer using the IEEE-1394 Interface, also called “FireWire.”
There is no loss of audio or video quality in the transfer. That is the good news. The bad
news is that there is no way to adjust the video (level, setup, chroma, hue) or audio (level,
balance, equalization).
Capture software for digital transfer generally offers the additional advantage of “machine
control.” The playback device can be controlled by the computer.
DV tapes can have two different times embedded in the video signal. One is zeroed at the
beginning of the recording and shows the time on tape. It can be displayed on the
camcorder monitor in the upper right hand corner.
Your capture software depends on the tape time recorded on your DV tape. If that signal is
not continuous, it will zero itself and start over (a "timecode break"). This is
confusing for people, and it is fatal for some digital capture software, since the tape times are no
longer unique and the software uses the machine-control interface to search the tape for
specific time points.
DV tapes can also have a digital time stamp that records the actual date and time each
frame is recorded. Clip detection can be based on the digital time stamp on the tape. A
discontinuity in the time stamp indicates the tape was stopped and restarted, ending one clip
and beginning a new one. It may also be possible to detect clips by looking for sudden
changes in video content. Whether you want to detect clips depends on how the software
treats clips and the nature of the project.
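Clip detection from the digital time stamp can be sketched very simply: walk the per-frame time stamps and start a new clip wherever the gap is noticeably larger than one frame interval. A minimal illustration (the 25 fps frame interval and the 1.5x tolerance are illustrative assumptions, not part of the DV specification):

```python
from datetime import datetime, timedelta

def split_into_clips(frame_times, frame_interval=timedelta(seconds=1) / 25):
    """Group per-frame time stamps into clips.

    A gap noticeably larger than one frame interval means the camcorder
    was stopped and restarted, so a new clip begins there.
    """
    clips = []
    current = []
    for t in frame_times:
        if current and (t - current[-1]) > frame_interval * 1.5:
            clips.append(current)
            current = []
        current.append(t)
    if current:
        clips.append(current)
    return clips

base = datetime(2024, 5, 1, 10, 0, 0)
step = timedelta(seconds=1) / 25                                      # 25 fps (PAL DV)
times = [base + i * step for i in range(5)]                           # clip 1
times += [base + timedelta(minutes=3) + i * step for i in range(5)]   # clip 2, after a stop
print(len(split_into_clips(times)))   # 2
```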
Audiences’ expectations
Audiences tend to have fairly high standards when they watch anything on video. There are
features that they expect to see in any video production, and you should include them in
your own work:
1. A set of "shots" cut together in a way that tells a story. A shot is a specific subject
filmed from a specific angle. If you are telling the story of a birthday party, for example,
different shots from the event might include the guests arriving, the candles being blown
out and the presents being opened.
2. A fairly high number of shots. It is rare for the camera angle to stay the same for more
than 10 or 15 seconds. The director will cut between different angles to keep things
interesting or to make different points. For example, the screen might show a man's face
while he is talking for five seconds, and then switch to a shot of his hands holding a tissue
(while the soundtrack continues uninterrupted with him talking) to show the emotion.
3. Interesting transitions between the shots. For example, some shots might fade into
others, some might spin into others, and some might cut very simply from one to another in
a quick chain.
4. Perhaps static shots (like a chart or graph) mixed in with the normal video.
5. Titles or legends on some of the shots to identify people, places and things.
6. End credits listing the actors and the characters/roles played by the individuals taking
part in the production.
Approaches to Editing
There are two main approaches to editing
a) Continuity editing
b) Thematic editing
a) Continuity Editing
Continuity editing refers to arranging the sequence of shots to suggest a progression of
events.
Given the same shots, an editor can suggest many different scenarios. Consider just two
shots: one of a man and one of a gun being fired. In one order it appears that the man was
shot; however, if you reverse the two shots, he is merely watching a shooting.
When hundreds of scenes and takes of scenes are available to an editor, which is normally
the case in dramatic productions, the editor has tremendous control over the basic
continuity and message of the production.
In dramatic television good editors sometimes break from the expected to achieve a
dramatic effect. Unfulfilled expectations can be used to create audience tension.
b) Thematic Editing
In thematic editing, also referred to as montage editing, images are edited together based
only on a central theme. In contrast to most types of editing, thematic editing is not
designed to tell a story by developing an idea in a logical sequence.
This type of editing is often used in music videos, commercials, and film trailers
(promotional clips).
The intent is not to trace a story line, but to simply communicate action, excitement,
danger, or even the "good times" we often see depicted in commercials.
Editing Principles
A number of principles influence both shooting and editing. These principles as
enumerated by Mamer (2009) merit discussion.
Transitions
This term is used to describe shots that bridge one setting to another or that mark the
passage of time. The term covers a wide range of approaches, but often transitional shots
have the added burden of being establishing shots as well. The common approach is to
show a setting, establishing both the place and, by extension, the time of day.
There are many ways of handling transitions but editors are advised to find those that are
effective but not predictable.
Economy and Pace
These terms refer to employing each individual shot for the shortest time possible (its
economy) while still allowing it to achieve its purpose, because each individual film, scene
and shot demands its own pace. Economy and pace are achieved primarily through control
of the physical lengths of the shots, though many other elements affect the sense of a
film's internal rhythm. Usually it is a question of how long each individual piece of film
should be on screen: if a point can be made in two seconds, it certainly does not need ten
seconds devoted to it. But some filmmakers, like the late Italian director Michelangelo
Antonioni, as described by Mamer (2009:349), exploited both a slower pace and the
psychological intensity of the close-up. A long, lingering close-up of the main character can
force the viewer to identify with, or experience, contemplative and environmental effects
that emphasize the spaces in between dialogue. Still, a film can be kept lean and efficient,
depending on the desired visual presentation and the amount of weight a scene should
carry in relation to the rest of the film.
1) Edits work best when they are motivated. In making any cut or transition from one shot
to another there is a risk of breaking audience concentration and subtly pulling attention
away from the story or subject matter. When cuts or transitions are motivated by production
content they are more apt to go unnoticed.
2) Whenever possible, cut on action. If cuts are prompted by action, that action will divert
attention from the cut, making the transition more fluid. Small jump cuts are also less
noticeable because viewers are caught up in the action.
For example, if a man is getting out of a chair, you can cut at the midpoint in the action. In
this case some of the action will be included in both shots. In cutting, keep the 30-degree
rule in mind.
3) Keep in mind the strengths and limitations of the medium. Remember that television is a
close-up medium.
An editor must remember that a significant amount of picture detail is lost in video images,
especially in the 525- and 625-line television systems.
Except for establishing shots designed to momentarily orient the audience to subject
placement, the director and the editor should emphasize medium shots and close-ups.
4) Cut away from the scene the moment the visual statement is made. The pace of a
production rests largely with the editing, although the best editing won't save bad acting or
a script that is boring to start with.
First, keep in mind that audience interest quickly wanes once the essential visual
information is conveyed. Shots with new information stimulate viewer interest.
Shot length is in part dictated by the complexity and familiarity of the subject matter.
5) Emphasize the B-Roll. A great movie is made with cutaways and inserts. In a dramatic
production the B-roll might consist of relevant details (insert shots and cutaway shots) that
add interest and information.
One critical type of cutaway, especially in dramatic productions, is the reaction shot -- a
close-up showing how others are responding to what's going on. Sometimes this is more
telling than holding a shot of the person speaking.
For example, would you rather see a shot of the reporter or the person being interviewed
when the reporter springs the question: "Is it true that you were caught embezzling a million
shillings?" By using strong supplementary footage the amount of information conveyed in a
given interval increases. More information in a shorter time results in an apparent increase
in production tempo.
If you don't think that a particular scene adds needed information, leave it out. By including
it, you will probably slow down story development, and maybe even blur the focus of the
production and sidetrack the central message.
Above all, have a reason for every edit. This is the great, overriding, unbreakable rule:
when you cut from one image to another, you must have a purpose for that choice. It may
be to show a response, to emphasize an action, or to keep shots from becoming too long or
too static.
The 30-degree rule says that if you want to cut to a closer shot of a subject, the second
shot should vary by at least 30 degrees from an axis drawn from the original camera
position to the subject. The bottom line is that you should not move the camera
toward the subject in a straight line; if you do, the possibility of a disagreeable jump cut
is great.
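The 30-degree rule can be checked numerically by measuring the angle between the two camera positions as seen from the subject. A small sketch using flat 2-D floor-plan coordinates (an assumption made purely for illustration):

```python
import math

def camera_angle_change(subject, cam_a, cam_b):
    """Angle in degrees between two camera positions, seen from the subject."""
    ax, ay = cam_a[0] - subject[0], cam_a[1] - subject[1]
    bx, by = cam_b[0] - subject[0], cam_b[1] - subject[1]
    ang = math.degrees(math.atan2(by, bx) - math.atan2(ay, ax))
    return abs((ang + 180) % 360 - 180)   # normalize to 0..180

def obeys_30_degree_rule(subject, cam_a, cam_b):
    """True when the second setup moves at least 30 degrees off the first axis."""
    return camera_angle_change(subject, cam_a, cam_b) >= 30

subject = (0.0, 0.0)
print(obeys_30_degree_rule(subject, (10, 0), (9, 1)))   # False: nearly straight in
print(obeys_30_degree_rule(subject, (10, 0), (7, 7)))   # True: 45 degrees around
```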
Emphasis
Visual interest
In any film there will be footage that is not visually interesting. If you film a
dance rehearsal, for example, there will be parts in which the dancer is turned away from
the camera or is framed poorly, or parts in which the action is simply not engaging. Select
the segments in which what was in front of the camera interrelates with the film frame
in a visually exciting way.
Variety
Your shots must employ a variety of approaches. Vary between close-ups and long
shots, low angles and eye-level shots, images with different balances of
compositional interest, moving and static camera, and so on. In other words, use the
camera resources available to you.
If a film is composed entirely of long shots, it risks becoming visually dull and
predictable. If the area of interest in all of the compositions is in the same part of the
frame, the same problem can occur. Obviously there are exceptions: several films done
largely in long shot have been successful, as have some films shot almost exclusively in
close-up. But they are exceptions, and they do not represent the kind of explorations and
experiments that provide useful learning experiences for beginners.
In conclusion, general editing principles refer to a number of factors that influence the
editor's decisions while editing a film; such principles include transitions, economy and
pace. Apart from these principles, some basic rules are applied while editing a film.
Rules such as the 30-degree rule, variety, visual interest, emphasis and the
unbreakable, overriding rule of giving a reason for every cut must be adhered to while
editing a film.
Editing Interviews
Interviews are almost never run in their entirety. An audience used to short, pithy sound
bites will quickly get bored by answers that wander from the topic, are less than eloquent,
or that are just boring. Explain how you can bridge an interview edit.
In interviews you may shoot ten times more footage than you end up using. It is the job of
the video editor to cut the footage down to the essential material.
To start with, cutting a section out of dialogue will normally result in an abrupt and
noticeable jump in the video of the person speaking.
One solution is to insert a three- or four-second cutaway shot over the jump in the video.
This assumes, of course, that you've already made a smooth audio edit between the
segments.
These cutaways, which are typically done in editing with an insert edit, are often reaction
shots ("noddies") of the interviewer.
If videotape is being used, these cutaway shots are typically from a separate videotape (a B-
roll) as opposed to the recording of the interview answers (the A-roll). In linear editing
having two separate video sources (an A-roll and a B-roll) can make editing easier.
With nonlinear editing everything can be recorded on a hard disk or solid-state memory
card and the segments can be instantly accessed from a single source.
Even so, the supplementary footage is commonly referred to as B-roll footage. Editors
depend greatly on this supplementary B-roll footage to bridge a wide range of editing
problems. Therefore, you should always take the time to record a variety of B-roll shots on
every interview -- insert shots, cutaways, whatever you can get that might be useful during
editing.
Another (and somewhat less than elegant) way of handling the jump cut associated with
editing together nonsequential segments of an interview is to use an effect such as a
dissolve between the segments. This makes it obvious to an audience that segments have
been cut out, and it smooths out the "jump."
Digital recordings can be made in the studio or on location and uploaded (transferred)
directly to an editing computer or video server for editing. Once this transfer is made, there
will be no danger of tape damage in editors, no matter how many times the footage is
previewed. (Digital information stored on a computer disk does not gradually degrade with
repeated access the way it does when it's recorded on videotape.)
When a video server is used, the original footage can be viewed and edited by anyone with
a computer link to the server. This is generally someone within the production facility; but,
thanks to high-speed Internet connections, it could even be someone in another city-or even
in another country. Animation and special-effects projects, which are labor intensive, are
often electronically transferred to countries where labor is less expensive.
The latest non-linear editors have many features that both speed up and improve video and
audio editing. For example, some editors can "read" or understand the spoken dialogue in
video footage and match it up with a written script or with words you type in. If you
happen to have hours of video footage and are looking for the point where someone said,
"Eureka, I found it," the editing system can search through the footage and cue up the part
of the video where that phrase is spoken.
Another useful feature is image stabilization. Let's assume you have some shaky footage,
possibly involving a moving vehicle. First you freeze the beginning of the footage on the
screen. Then you find a clearly defined object near the center of the scene and draw a box
around it; this becomes an anchor-point reference. The whole image is then slightly
enlarged to give the process "working room." Once you roll the footage, the editor holds
the selected area still, eliminating the shake and movement in the original scene.
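The anchor-point idea can be sketched in miniature: once the anchor object has been tracked in every frame, each frame is shifted by the anchor's drift from its position in the first frame. A simplified illustration only (real stabilizers also handle rotation and scaling, and use the enlarged border to hide the shifts):

```python
def stabilization_offsets(anchor_positions):
    """Per-frame (dx, dy) corrections that hold the anchor point still.

    anchor_positions: the tracked (x, y) of the chosen anchor object in
    each frame. Each frame is shifted so the anchor stays where it was
    in the first frame; enlarging the image slightly beforehand gives
    these shifts some 'working room' at the edges.
    """
    ref_x, ref_y = anchor_positions[0]
    return [(ref_x - x, ref_y - y) for x, y in anchor_positions]

# Shaky footage: the anchor drifts right/down, then left/up, over three frames.
tracked = [(100, 80), (103, 82), (98, 79)]
print(stabilization_offsets(tracked))   # [(0, 0), (-3, -2), (2, 1)]
```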
1. Although you may want to shoot everything on location that you think you could
possibly use, when it comes to uploading or capturing this footage on a file server or
computer hard disk, you will want to use a bit of discretion.
After reviewing the footage and making a rough paper-and-pencil edit, upload only the
footage that you are reasonably certain you will use. Not only does excess footage take up
valuable hard drive space, but wading through it during editing adds considerable time to
the editing process.
2. After the footage is uploaded, trim the front and back ends of the segments to get rid of
anything you're not going to use. This will also speed up editing and reduce storage space;
plus, it will make the clips easier to identify on the editing screen.
3. Once this is done (#2 above), look for connections between segments; specifically, how
one segment you are considering will end and another will start. Look for ways to make
scenes flow together without jarring jump cuts in the action, composition, or technical
continuity.
4. Find appropriate cutaways. In addition to enhancing the main theme or story they should
add visual variety and contribute to the visual pace.
5. Use transition effects sparingly. Although some editing programs feature 101 ways to
move from one video source to another, professionals know that fancy transitions can be
distracting and can get in the way of the message — not to mention looking pretentious.
Mixing and effects are used to polish a production. They add colour to the programme
through animations, graphics, windows and effects such as zoom in, zoom out, page turn,
dip to black, fade in, fade out, dissolve, cross fade, wipe and swap. Colour toning, titles,
end credits, breaks, bumpers, scrolls, strips and superimpositions (names, caller details,
phone numbers, email addresses and websites) are also handled in mixing, and promos and
recaps are prepared at this stage.
Music is also brought in through audio mixing, with the music placed in the foreground,
midground or background as appropriate. The choice of music is exercised here, and sound
levelling is done.
Fade-out and Fade-in
A fade-out is simply an effect in which the scene is gradually taken out, or the picture
fades to black. It is usually followed by a fade-in, during which a new scene gradually
becomes bright enough to be seen clearly. Fade-outs and fade-ins are used as transitional
devices, either to get from one location to another or to signify the passage of time.
Occasionally, filmmakers fade to shades and colours other than black.
Dissolve
This is a technique in which one shot is faded out while the next shot is faded in on top
of it. In this process the screen never goes completely dark as one scene replaces another.
The dissolve is used to signify a change of time or place, just as the fade-out is. It is not
used frequently, because it is mainly employed to soften an otherwise terrible shot.
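Mechanically, a dissolve is a weighted blend of the outgoing and incoming frames, with the weight moving from one image to the other over the length of the effect. A minimal sketch on per-pixel brightness values:

```python
def dissolve_frame(frame_a, frame_b, t):
    """Blend two frames for a dissolve.

    frame_a, frame_b: lists of pixel brightness values (0-255).
    t: dissolve progress, 0.0 = all frame_a, 1.0 = all frame_b.
    """
    return [round(a * (1 - t) + b * t) for a, b in zip(frame_a, frame_b)]

outgoing = [200, 200, 200, 200]   # a bright scene
incoming = [0, 50, 100, 150]      # a darker scene
print(dissolve_frame(outgoing, incoming, 0.0))   # [200, 200, 200, 200]
print(dissolve_frame(outgoing, incoming, 0.5))   # [100, 125, 150, 175]
print(dissolve_frame(outgoing, incoming, 1.0))   # [0, 50, 100, 150]
```

At the midpoint both scenes are half-visible, which is why the screen never goes completely dark during a dissolve.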
Superimposition
This is composed of one shot overlaid on another. It can be achieved in the camera
while shooting or, more commonly, during the editing and final printing process.
Optical effects
These refer to all graphic effects created in the lab. Optical effects include split screens,
keyholes, freeze frames, spins, wipes, etc. They were produced on an optical printer
prior to the digital age. The optical printer is a projector that has a camera shooting
straight into it. Both the projector and camera can be advanced one frame at a time.
The camera can also be repositioned to focus on specific parts of the projected frame.
The projected image can be manipulated in terms of both coloration and the speed of the
film going through the gate. The camera and projector can be controlled separately in
order to allow frames to be repeated, skipped, run in reverse, or held for many frames, an
effect known as freeze-frame. The optical printer can equally be used to enlarge or
reduce a particular film gauge, depending on the desire of the filmmaker. Digital
nonlinear editing (NLE) has made it possible to achieve almost any effect and to
eliminate the generation problem completely, but the optical printer is still used in
creating many visual effects and can be an exciting tool for beginning filmmakers.
An optical sound track is produced photographically: the sound is represented by variations
in the pattern or density of the film strip, depending on the particular process that was used
when the film was manufactured.
Inside the projector is a sound head, which has a photoelectric cell and an exciter
lamp. The film is threaded in such a way that the optical sound track passes between
the exciter lamp and the photocell. The light falling on the cell varies according to the
pattern on the sound track, thereby creating an electric current which, when amplified,
reproduces the original sound.
The magnetic sound track, on the other hand, uses a specially produced film stock with a
sound strip running along the edge of the film. The sound strip consists of the same
iron oxide particles used in audio recording, and the recording and reproduction
processes are identical to those used for audio tape.
A tiny recording head inside the camera selectively magnetizes particles on the
sound track during filming. Inside the projector, an identical head reads the track as the
film runs, generating a tiny electrical signal which, when amplified, is an exact
reproduction of the originally recorded sound.
In conclusion, effects can be created in the camera, but the norm is to plan them in editing
and have them executed by the lab in the final print. The most commonly created effects
are fade-in and fade-out, dissolve, superimposition and optical effects. There is also a
difference between the optical sound track and the magnetic sound track: while the optical
sound track is produced photographically and used on all standard prints, the magnetic
sound track uses a specially produced film stock with a sound strip running along one edge
of the film.
Editing Sound
Four Essential Audio Elements
A good sound track is the key to a great video. Not sure you agree? Try this: pop a movie you've never
seen into your DVD player. Start the movie, but play it with your eyes closed. Just listen to the
soundtrack. No peeking!
As you listen, notice how much of the plot you can follow using only your ears. Notice how the ambient
audio (including some subtle sound effects) indicates where the scene takes place, whether in a busy
office, a quiet cottage, a nuclear submarine, or wherever the action is set. Listen to the way that the music
sets the mood. Note whether the scene is tense or tender. Is the pace of the scene frantic or relaxed?
Notice how the dialogue introduces the key characters and identifies important plot themes. It won't take
long to figure out that the bulk of the story enters your brain through your ears, not your eyes. Next, take
off your blindfold, press the Mute button on your remote, and watch the visuals without listening to the
accompanying audio. Before long, you'll likely be lost. Without the soundtrack, it can be difficult to tell
what's going on. Who's on the other end of that phone call? Did the leading lady turn because she heard a
gunshot, the doorbell or a baby crying? Who is the guy that just walked in? And why are they suddenly
on an airplane? OK, OK! Now that you're convinced of the importance of audio, un-mute the movie and
rewind it so you know what's happening!
There are four important elements that come together to make a complete soundtrack, and every editor
should know them. They are natural sound, music, narration and sound effects. When they are used
together effectively, the viewer won't notice them. The best audio edits are smooth and subtle, and do not
call undue attention to themselves. When the mix is off, the viewer will notice. Crackling audio, music
that's too loud or too soft, and fake-sounding audio effects will distract the viewer from the message of
your movie. When it comes to watching video, hearing really is believing.
Let's take a closer look at each of the four essential audio elements that we have at our disposal.
Element #1 - Nat Sound
Nat sound (also known as "natural sound" or "wild sound") includes any
audio recorded along with the video that you shoot. This includes ambient audio. When you record a
scene on a bench in a city park, the sound of children laughing and playing in the background is part of
the nat sound in that environment. In the context of this article, we'll also consider on-camera dialogue
that is recorded along with the visuals to be natural sound.
Nat sound is often the only audio that amateur videographers include in their videos. While home videos
typically consist entirely of this kind of audio, Hollywood producers go to great lengths to avoid using it
altogether. The audio recorded with the images that are shot for most feature films is usually used only as
a guideline by a team (or multiple teams) of people that re-creates every sound and every word of
dialogue in a scene. Foley artists create and record everything from footsteps to keyboard clicks as
separate sound elements that can be mixed together with great precision and control. While all of this
may be unrealistic for the videos that you produce, there is an important principle to learn: if you want
your videos to sound more like professional productions, you'll have to spend some time editing your
audio.
If you record dramas, documentaries, events, interviews or instructional videos, Nat sound is the
foundation of your soundtrack. It's critical that you start with the highest quality audio possible;
this means that your camcorder's built-in microphone is not the best choice. An external
microphone and a good pair of headphones are essential.
Element #2 - Music
Simply adding a track of background music can greatly improve your videos.
Music has great power to impact your viewers emotionally, and the pros use it all the time to add zing to
a scene. Listen carefully to the music tracks that accompany the programs that you watch on TV tonight.
Music often creeps in quietly, unnoticed by the viewer, then builds as emotions heighten. Want to tell
your audience how to feel? Use music. Some of the most suspenseful movies of all time are known by
their music tracks (Jaws and Psycho, for instance). The anticipation at the sound of the music in these
movies could scare an audience out of its seat.
In the same way that it can build tension and fear, music can build joy or excitement. Imagine how the
music would swell (along with the hearts of the viewers) as a hero triumphantly emerged from a
smoldering building with a child in his arms.
You can change the entire feel of a scene by simply changing the music track. The same sequence of
shots can feel spooky, silly or sad, depending on the music that you select. To see for yourself, shoot this
short sequence, edit it together, then play it back with several different music tracks.
Background music is an easy way to add professionalism to a video with dialogue or narration.
Background music should be mixed low, so as not to interfere with the words that are spoken.
In other productions, music may be the only audio track. Music montages without Nat sound can be
particularly moving if you make wedding or event videos. You'll find that it helps to lay the song on your
timeline, and then edit your footage to the music.
Element #3 - Sound Effects
These days, sound effects are more than just gunshots and explosions.
As we said, most of the sounds you hear when you watch a feature film were created in a studio and
edited into the production. You can use subtle sound effects to enhance your videos. Remember that
interview in the park? Adding the sound of a rolling stream or some chirping birds can enhance the
pleasant feeling of the setting.
Look again at the three-shot sequence where the subject hears a sound before leaving the room. Just what
did he hear? A single CD of sound effects provides a variety of options to consider. Capture a few
sounds, then edit them into your sequence. Did he hear a scream, a knock at the door, a police siren?
Maybe it was a dog barking, the phone ringing or the sound of a marching band. Try a few of them, then
revisit the music track. If you selected a dramatic sound like a gun or car crash, pick a suspenseful music
track to coincide. You get the idea.
Sound effect CDs are easy to find, and provide a number of options for just a few dollars. A sound effect
CD is a good investment for a videographer, especially a CD that includes natural sounds. However, look
closely at the contents of the disc before you buy. You may not have much use for a collection of
carnival sounds.
Element #4 - Narration
If you produce documentaries, travel video, personal histories, instructional
videos or any type of video that seeks to explain a procedure or tell a story, narration is invaluable. The
best narration is well-scripted and planned to match the visuals in your production, not off-the-cuff
rambling. If possible, it's a good idea to script out your narration before you shoot, then gather the shots
that you need to match. If you cannot script the narration before you shoot, at least do it before you begin
editing. It is much easier to lay the narration on the timeline and edit your footage to the words than it is
to write a tight narration that matches your edited video.
Make an effort to use all four of these essential audio elements in your production. Used together, or in a
variety of different combinations, Nat sound, music, sound effects and narration are powerful tools that
can greatly improve your productions.
Mixing and editing audio is easiest with a timeline-based editing program that offers at least four audio
tracks. The timeline allows you to position sounds in relation to the visuals, and other audio clips, with
frame accuracy.
The best way to evaluate the mix of your audio is to play the scene back on the worst television you can
find. If the mix sounds good from its speakers, it will sound good wherever you play it.
1) What is natural sound? What are essential for recording natural sound?
2) What are some ways background music can affect your productions?
Computers cannot work with analog audio directly. Therefore, it is necessary to convert audio from
the analog format to the digital format before using it on a computer.
Digital audio technology allows you to record, edit, and play digital audio files on a computer. Advanced
digital audio technology also lets you communicate with the computer by just speaking. This lesson
introduces you to digital audio technology. It also briefly describes the concepts of copying and
converting digital audio.
Another important characteristic of digital audio is that it can be edited on a computer by using audio
editing software. For example, you can use this software to move sections within an audio file or add
effects. You can also use audio editing software to store an audio file in different formats on the
computer.
Windows Media® Audio (WMA): This format was developed by Microsoft and is used to store digital
audio files.
Wave (WAV): This format is part of a series of standards for audio and video developed for Microsoft
Windows® 95 as a universal sound file format. It is used to store audio files in the wave audio format.
Audio files stored in this format have good audio quality but this format is used sparingly these days.
This is because the audio files in this format are larger when compared with other formats.
MPEG Audio Layer 3 (MP3): This format was developed by the Moving Picture Experts Group to
allow compression of audio and video for digital distribution. MP3 is a popular format for storing
digital audio files because MP3 files are generally much smaller than WAV files.
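The size difference between WAV and MP3 is easy to quantify: uncompressed size depends on sample rate, bit depth and channel count, while MP3 size depends only on its bit rate. A rough sketch (the 128 kbps MP3 rate is an assumed, typical value, and container overhead is ignored):

```python
def wav_size_mb(seconds, sample_rate=44_100, bit_depth=16, channels=2):
    """Uncompressed WAV audio data size in megabytes."""
    return sample_rate * (bit_depth / 8) * channels * seconds / 1_000_000

def mp3_size_mb(seconds, bitrate_kbps=128):
    """Approximate MP3 file size in megabytes at a given bit rate."""
    return bitrate_kbps * 1000 / 8 * seconds / 1_000_000

# A 3-minute song:
print(round(wav_size_mb(180), 1))   # 31.8
print(round(mp3_size_mb(180), 1))   # 2.9
```

At these settings the MP3 is roughly one-eleventh the size of the WAV, which is why MP3 became the dominant distribution format.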
Audio Streaming
Digital audio also allows streaming of digital audio files. With audio streaming, you do not have to wait
to completely download a large audio file from the Internet to play it. Instead, you can use a streaming
audio player or a browser plug-in to play the audio file from the Internet. In this case, the audio file is
sent to your computer in a continuous stream.
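The chunk-by-chunk delivery behind streaming can be sketched in a few lines. Here io.BytesIO stands in for a network connection, and the chunk size is an arbitrary illustration:

```python
import io

def stream_chunks(source, chunk_size=4096):
    """Yield successive chunks so playback can begin before the file ends."""
    while True:
        chunk = source.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Stand-in for a network response: 10 KB of audio data.
audio = io.BytesIO(b"\x00" * 10_240)
sizes = [len(c) for c in stream_chunks(audio)]
print(sizes)  # [4096, 4096, 2048]
```

A streaming player consumes each chunk as it arrives instead of waiting for the whole file.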
You can then copy the stored audio files to storage devices, such as recordable CDs and DVDs, in
various formats, such as WAV and MP3.
You can also convert audio from a CD or a DVD to a different format before you store it on your
computer’s hard disk.
Converting Audio
You must have audio conversion software, such as Microsoft Windows Media Player, installed on your
computer to convert audio files. The software changes the format of the audio and might also compress
the audio so that the files take up less space on the hard disk. You can then transfer these audio files from
the computer to a portable device, such as a PDA or a cell phone.
Warning: Converting audio from CDs and DVDs to a different format may infringe copyright. Ensure
that you have permission to convert audio to a different format before converting it.
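As an illustration of software-based conversion, the sketch below builds a command line for the ffmpeg tool, one common way to convert a WAV file to MP3. The file names are placeholders, ffmpeg is assumed to be installed, and the command only runs if both the tool and the source file exist:

```python
import os
import shutil
import subprocess

def wav_to_mp3_cmd(src, dst, bitrate="192k"):
    # Build the argument list for an ffmpeg WAV-to-MP3 conversion.
    return ["ffmpeg", "-y", "-i", src, "-b:a", bitrate, dst]

# "track.wav" is a placeholder name, not a real file from this text.
cmd = wav_to_mp3_cmd("track.wav", "track.mp3")
print(" ".join(cmd))

# Only attempt the conversion if ffmpeg and the source file both exist.
if shutil.which("ffmpeg") and os.path.exists("track.wav"):
    subprocess.run(cmd, check=True)
```

GUI converters such as Windows Media Player wrap the same encode-and-compress step behind a menu.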
The editor must attend to the following when working with sound:
1) Sound selection
2) Volume controls
3) Pitch and tone controls
4) Control of distractive interference
5) Sound juxtaposition
6) Maintaining consistency in sound levels
7) Selection of words or music
8) Deliberate sound omissions
1) Fade In
This is the systematic incremental introduction of music or a certain SFX to a given scene. The volume
of the new sound is increased gradually to the appropriate levels.
2) Fade Out
This is the systematic decremental removal of music or a certain SFX from a given scene. The volume of
the sound is decreased gradually to the minimum.
3) Cross Fade
This is the systematic introduction of one sound effect by gradually increasing its volume while
gradually removing another, i.e. fading in a new sound effect while fading out another.
4) Segue to
This is moving directly from one sound or piece of music into the next without interruption.
5) Fade under
This involves the systematic reduction of volume to a lower level at a selected scene. The sound
remains audible as a background SFX.
6) Sound element
It refers to sound or dialogue recorded either exclusively for radio or in sync with pictures for television.
7) Signature tune
This is a song or tune used to signify the beginning or commencement of a given program.
8) Narration
Narration consists of the actual words spoken by the presenter directly to the audience. It normally
carries the storyline of the programme.
9) Dialogue
This refers to the words spoken when two or more persons are involved in a conversation.
10) Ambience
This refers to the sound effects from surroundings. Such may include cheers, car horns, and wind
blowing sounds.
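The fade techniques above (items 1 to 5) all come down to multiplying sample values by a gain curve. A minimal sketch, assuming audio samples as floating-point values between -1.0 and 1.0:

```python
def fade_in(samples, fade_len):
    """Ramp gain from 0 to 1 over the first fade_len samples."""
    return [s * min(1.0, i / fade_len) for i, s in enumerate(samples)]

def fade_out(samples, fade_len):
    """Ramp gain from 1 down toward 0 over the last fade_len samples."""
    n = len(samples)
    return [s * min(1.0, (n - i) / fade_len) for i, s in enumerate(samples)]

def cross_fade(outgoing, incoming, fade_len):
    """Mix an outgoing clip (fading out) with an incoming clip (fading in)."""
    return [a + b for a, b in zip(fade_out(outgoing, fade_len),
                                  fade_in(incoming, fade_len))]

quiet = fade_in([1.0] * 8, 4)
print(quiet)  # [0.0, 0.25, 0.5, 0.75, 1.0, 1.0, 1.0, 1.0]
```

A fade under is the same idea with the gain held at a constant low level instead of ramped to zero.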
Student/Shot                Time code in  Time code out  Audio cue in            Audio cue out
                            :13           :14            Fall                    Ball
Clara                       1:26          1:27           8 am                    Is really early
                            1:37          1:40           Going up the Hill       Longer than expected
Linnea & Jennifer           3:19          3:28           I love going around     Up to various locations
                            3:28          3:35           I just love             All the colorful trees
Sonia & David               5:31          5:32           The Leaves
                            7:06          7:13           It’s impossible         Unless you’re a girl
Matt                        8:06          8:11           Swear to God            Ben Affleck
                            8:28          8:30           This campus             Is amazing
Roman                       9:25          9:26           Homecoming
Stanley                     10:55         10:58          All the students        Coming back
                            11:13         11:16          Stanley                 It’s all gonna be okay
Rezwan                      12:00         12:10          My best fall memory     An amazing job
Fahmil                      15:00         15:08          I just remember my      We’re from Boston
Rebecca                     15:52         16:03          Seeing all the          Of our community
Andrew, Alan, & Sam         17:13         17:23          The thing I like        Old memories
                            17:46         17:47          Free Food
                            19:15         19:19          Fall at Tufts           To go to school
Ricky                       20:02         20:10          I sat in the class      What I was doing there
Nick                        21:30         21:34          I actually met          My best friends
Russell, Sterling, & Rayna  23:15         23:17          A lot of free food      Always good
                            23:30         23:42          I like when the leaves  Playing Ultimate
                            24:07         24:25          I did Pre-Orientation   So much in common
Dominique                   24:48         25:07          Being from California   Really exciting for me
                            25:10         25:21          I did not visit         Never left home
A.J.                        27:15         27:19          I do remember my        On the sidelines
Logan & Lindsey             28:58         29:03          I love seeing everyone  Really exciting
                            29:03         29:11          I love Halloween        Around campus
Mae-ling                    29:42         29:53          The sun’s out           Great to be back
A sample editing script to be written for a video project, containing all of the audio and
video information to be presented in the program.
VIDEO                                         AUDIO
Copyright Warning
CU aerial shot                                MUSIC UP
Key: MP Production & Services                 Music over CU aerial shot
POC at Vesuvius intro
Fire truck in parade
Wiring, Ill Bell Champaign office             MUSIC UNDER
WS of special "easy to ride" bicycle
at Parkland College
"Time Pinnacle" laser show
Non-Linear Editing (Digital Editing) Film and video recorded to a hard disk and edited using a
computer showing a timeline and utilizing random access to the material.
Linear Editing Film and video assembled and cut in linear fashion, i.e. film and tape reels that are
physically wound around a core, with access limited by winding and rewinding.
On-Line Broadcast quality editing and outputs using greater computing power, hi-resolution
masters, larger storage capacity and more robust visual effects.
E.D.L (Edit Decision List) Sometimes referred to as a paper list but usually a specially formatted
digital file reflecting all the audio and video cuts and some visual effects and
transitions.
False Edit An edit in the timeline that separates or cuts between two continuous
frames. Sometimes used to key in effects or transitions.
Jump Cut An edit where one or many frames are missing causing the subject matter
to "jump" in time or space.
Continuity Series of cuts that form an entire sequence usually in a specific order
according to script or story logic. Since films are often shot out of sequence
it is up to the editor to follow the continuity.
Assemble Edit Linear editing term which refers to material recorded to tape in its entirety
without stopping while adding timecode at the same time.
Insert Edit Linear editing term where video and audio can be re-recorded in sections
on an assembled edit using the original pre-recorded timecode.
Also known as "punching in".
Non-Drop Non-drop timecode counts a full 30 f.p.s. and so does not match clock time; because
NTSC video actually runs at 29.97 f.p.s., a program runs slightly longer than its
timecode indicates. If you are not concerned with the exact running length of your
project, then this is the best one to use.
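The difference between non-drop and drop-frame counting can be made concrete in code. The sketch below counts non-drop timecode at a flat 30 fps, and applies the standard drop-frame rule (skip frame numbers 00 and 01 at the start of each minute, except every tenth minute) so the count tracks 29.97 fps clock time. Drop-frame timecode is conventionally written with a semicolon before the frames, which is omitted here for simplicity:

```python
def non_drop_tc(frame, fps=30):
    """Non-drop: simply count 30 whole frames per second."""
    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return "%02d:%02d:%02d:%02d" % (hh, mm, ss, ff)

def drop_frame_tc(frame):
    """Drop-frame: skip frame numbers 00 and 01 each minute,
    except every tenth minute."""
    frames_per_10min = 17982          # 10*60*30 - 9*2
    frames_per_min = 1798             # 60*30 - 2
    tens, rem = divmod(frame, frames_per_10min)
    if rem > 2:
        frame += 18 * tens + 2 * ((rem - 2) // frames_per_min)
    else:
        frame += 18 * tens
    return non_drop_tc(frame)

print(non_drop_tc(1800))    # 00:01:00:00
print(drop_frame_tc(1800))  # 00:01:00:02 (frames 00 and 01 were dropped)

# One hour of non-drop timecode at the real 29.97 fps rate lasts longer:
real_seconds = 108000 / 29.97
print(round(real_seconds, 1))  # 3603.6
```

That 3.6-second-per-hour discrepancy is exactly why broadcasters who care about running length use drop-frame.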
Log A database used to keep notes and manage separate clips and source
reels using timecode and slate information. A log book is essential for
every project.
Key Frame A key frame is the beginning, middle or end marker of an effect. The
key frame also contains the information or values of the specific effect. A
minimum of two keyframes is necessary to trigger a transition or change in
values over time.
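The "change in values over time" between two keyframes is, in the simplest case, a linear interpolation. A sketch, with opacity as the animated value (the function name and the opacity example are illustrative, not from any particular editing package):

```python
def value_at(frame, kf_start, kf_end):
    """Linearly interpolate an effect value between two keyframes.
    Each keyframe is a (frame_number, value) pair."""
    (f0, v0), (f1, v1) = kf_start, kf_end
    t = (frame - f0) / (f1 - f0)   # 0.0 at the first keyframe, 1.0 at the second
    return v0 + t * (v1 - v0)

# A one-second (30-frame) fade of opacity from 0% to 100%:
print([value_at(f, (0, 0.0), (30, 100.0)) for f in (0, 15, 30)])
# [0.0, 50.0, 100.0]
```

Editing software typically offers smoother (eased or Bezier) curves, but two keyframes and an interpolation rule is the underlying model.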
Black & Code Video tapes are sometimes pre-formatted by first laying down a black
signal and timecode. New tapes (rawstock) as a rule must be Blacked &
Coded (B & C) before an insert edit can take place.
Bars & Tone Prior to programs being recorded to tape, SMPTE Color Bars & Tone
must be placed at the head of the show as reference to give the dubbing
operator something to "line up" to. Audio should be a continuous tone
of -18 dB; 30 seconds to 1 minute is preferred.
Tape Formats Digital: D-5, HD Cam, DigiBeta, HDV, DVCPro, DVCam, Mini DV.
Analog: Beta SP, 3/4", Hi 8, VHS.
After Effects Adobe software for compositing and editing of movies using imported video &
audio layers and graphics. Also utilizes plug-ins and filters.
Alpha Channel Information contained in graphic files to create mattes or “negative space”.
Can be invisible or refer to a colored background. The alpha channel must be
expressly saved in order for other programs to “see” it and make use of while
keying effects.
Anti-Alias Aliasing is the problem of creating curves using (square) pixels.
Hard jagged lines can sometimes be softened by defocusing or feathering the
image.
Artifact Similar to aliasing except this is a digital glitch or “burp”. Since video files are
usually compressed, the program often fails to completely or accurately render
an image, thus creating artifacts. Low resolution images often show more artifacts.
Back Light This light just gives the subject an edge so as to be distinguishable from the
background. This light is also important when using the Green Screen to get a
clean edge when keying. This light is usually as intense or almost as intense as the
Key Light.
Bandwidth Amount of digital information pumped through a system or network such as the
Internet. DSL or Cable modems offer improved bandwidth performance and are
better suited to streaming audio and video.
Batch Render (Queue) A feature which creates a list of movies that need rendering and tells the
computer to work on them in the order displayed. See also Smart Render.
Beta SP Professional/Industrial grade Sony video format. “Near” broadcast quality.
Blue Screen (Green Screen) Objects shot in front of a color field and later composited onto a different
background by pulling out or eliminating the original color.
CAD Computer-Aided Design. Archaic term referring to the original programs used
in drafting and design work. Term not generally used in multimedia.
Capture Sometimes referred to as digitize. The importing of video and audio in real time.
CD-Rom Compact Disc developed in the eighties for storage and retrieval of digital files
and music. Audio files distributed commercially are usually sample rate 44.1k
files.
Channel (Tracks) Video, Audio, Titles, Alpha Channels all can be shown as tracks or channels
playing in a timeline.
Compression Video compression refers to reducing the quantity of data used to represent video
images and is a straightforward combination of image compression and motion
compensation.
Chroma Key or Key(ing) A process which allows superimposition or replacement of one video picture in
a predetermined area of another one. The first picture is photographed with an
object or person against a special, single-color background. The complete color
content of this particular signal is removed and the second picture is inserted in the
area where the background was.
Chyron Archaic expression referring to computer generated titles, possibly still used by
major broadcast studios but not applicable to desktop systems.
Cross Platform The ability of a file to be read on both PC and Mac systems. Files should have a
dot extension to be read on a PC. Programs, on the other hand, cannot “cross
over” and must be written specifically for each platform. Apple recently
introduced its Intel platform, which requires new code to be written for all
programs.
Clock Speed Refers to CPU computing power as measured in gigahertz (GHz). The Apple G5 you
are using usually features dual 1.8 GHz processors.
Clone A digital copy or exact duplicate. Clones are often created from digital master
tapes for safety reasons in case the original is lost or damaged.
Even DVDs use compression so that an entire feature film can fit on one disc.
Composite Video Analog video signal, such as from a VHS deck, carried over a single coaxial
cable.
Component Video Three separate video signals, each carrying a single color value (RGB). See
Beta SP.
Compression (files) Files can be compressed for size by a utility. Macintosh files can be compressed
using a utility such as Stuffit; on a PC, one well-known utility is WinZip. It
is important to note that you can NOT compress video using these utilities.
De-Saturate To remove colors. If enough de-saturation occurs the image will be black and
white or sepia. See also RGB.
Digi-Beta Component Video but in a digital format. The master tape format for
broadcasting.
Digitize The act of transferring media from an analog system such as magnetic tape to a
digital format on a computer. Modern DV cameras record a digital signal onto a
magnetic tape, but transferring that media to a computer is still called
digitizing. This is also called Capturing.
Director Macromedia authoring program used to create Shockwave content, primarily for
web animations. This program is less popular since the advent of the newer Flash
program.
Dissolve A term used in video and audio editing to describe a procedure whereby one
signal is gradually faded out while a second signal is faded in until it fully replaces
the first signal. Also known as Cross-Fade.
Dolby Formerly a noise reduction system for analog tapes, Dolby’s biggest contribution
is Surround Sound for home and theater systems using 5.1 technology, which
offers six channels of audio contained in a single audio stream on DVD known as
AC-3.
Dreamweaver Macromedia Web Authoring Tool featuring drag and drop features and
WYSIWYG.
Drop Shadow Effect which creates depth usually in context of lettering or graphic images.
Dual Stream Video High end editing systems such as Avid Symphony or Nitris offer dual stream
uncompressed video with real time effects. This allows for faster workflow as it
eliminates the need to render effects before viewing. See also Preview.
Duration The time measured between an In and Out point. See Time Code.
DVD-Audio Audio enthusiasts can now enjoy turbo-charged audio using sampling rates of
96k or higher.
DVD-RAM Use of a recordable DVD, much like a hard drive, to store files.
Encode (Hardware) Expensive systems use video cards (hardware) which are dedicated to “real
time” encoding tasks.
Encode (Software) Inexpensive systems use software encoders, which are slower because they do
not operate in real time. Instead the video is captured or imported and then must
be rendered through processing via the CPU.
Export/Import Transfer of files into and out of one program and into the next. Importing does not
actually move the file but instead creates a link where the program can then see it
and refer to it. If the file is moved then the program will be unable to see it and
must be told again where to find it.
Fade Can be a fade to black or a fade to any other color you choose.
Feathering Hard, jagged lines can be softened or feathered. See also Anti-Alias.
Fibre A fast method of data transfer similar in speed to SCSI but with the advantage
of being able to use longer cables.
Fill Light The Fill Light is just as its name implies: it fills in the details left out by the key
light. This light also defines the contrast of the image; a dimmer fill light will add
contrast to the picture.
Filter Just what the word suggests. Audio or Video filters alter, adjust or tweak the
channel with a certain desired effect.
Final Cut Pro Apple’s Desktop Editing System which is best suited for editing DV and HD
projects.
Firewire (IEEE 1394) A standardized cable used to connect video equipment to computer equipment.
It allows for high-speed transfers and device control.
The latest cabling technology allows bi-directional signal paths. Firewire is
favored by DV cameras because it provides “machine control” from the computer
to the camera and allows the user to shuttle the videotape remotely from the
keyboard or mouse.
Flash The ultimate in Web animation, now featured in many recent TV spots. Flash
players are among the most universal playback systems in the world.
Flatten Photoshop (.psd) files are flattened when saving as a JPEG (.jpg). This literally
means the combining of various layers into one.
Frame Rate Film runs at 24 fps. NTSC video runs at 30 fps (more precisely, 29.97 fps).
HD can also run at 24 fps. Video on the web often runs at 15 fps to save on file size.
Frame Size NTSC DV frame size is measured at 720 x 480 pixels. HD is 1920 x 1080
pixels.
Futz To degrade or alter a sound or image. See also Filter.
HDV HDV is a recording format that compresses the video before it is recorded; this
extensive compression can cause motion problems in the final video called
compression artifacts. This generally only happens when shooting fast
movement.
Html Hyper Text Markup Language. The basic language of the web offering links to
other pages or programs.
Interlace Scan (I) Older video technologies such as NTSC use interlacing. When importing or
capturing it is sometimes necessary to “de-interlace” the picture. See also
Progressive (P) Scan.
Key Frame A key frame is the beginning, middle or end marker of an effect. The key frame
also contains the information or values of the specific effect. A minimum of two
keyframes is necessary to trigger a transition.
Key Light The Key Light is the primary light in this setup and is usually the largest,
brightest light. This light gives most of the detail to the image, though details on
the side of the face away from it remain indistinguishable without a fill light.
Layers Each video channel or photoshop layer is combined to make a composite image.
Log A database used to keep notes and manage separate clips and source reels
using timecode and slate information. Sometimes stored as an Excel spreadsheet.
Motion Apple’s low cost answer to After Effects. The advantage to using it in conjunction
with Final Cut Pro is the ability to make changes in nested effects without having to
re-export.
MP3 Popular form of music file for use on the web or playing on a set-top or portable
player.
MP4 One of the latest efficient codecs used for delivering content over the web.
Nested Effect Denotes an effect combined with another effect.
Noise Analog noise is visible as scratches in the image or noticeable grain. Audible
noise can be clicks or pops in the track. Digital technology can re-create this by use
of filters or plug-ins such as Cinelook or Film Damage.
NTSC Video signal used in the U.S., Japan and most of Latin America.
Usually 525 lines of resolution; it is generally inferior to PAL. See Frame Size.
PAL Video signal used in most of Europe (except Russia and France, which use
SECAM) with added lines of resolution. See Frame Size.
Peel A transition effect similar to a wipe but usually with a curled hard edge.
Photoshop (.psd) Photoshop files usually come in layers and must be flattened before saving as
JPEGs.
Pixel The basic unit of measurement for digital graphics. The smallest "dot" on a
monitor or image. The pixel count is a measure of the screen's maximum
resolution. Pixels for a screen work a little differently than pixels for an image:
an image pixel can be many different colors, but a screen pixel is limited to only
three colors, red, green and blue. A pixel is usually square.
HD
1080 - 1920×1080 - 2,073,600 pixels
720 - 1280×720 - 921,600 pixels
SD
480 - 720×480 - 345,600 pixels (640×480 - 307,200 pixels on a TV)
*720×480 for a computer, 640×480 effective on a TV; the smaller number is the
height.
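The pixel counts above are simple multiplications, and they drive file sizes directly. A quick check, including the data rate of an uncompressed 8-bit RGB SD stream (a simplifying assumption, since real video formats subsample color):

```python
formats = {
    "HD 1080": (1920, 1080),
    "HD 720":  (1280, 720),
    "SD 480":  (720, 480),
}
for name, (width, height) in formats.items():
    print(name, width * height, "pixels")

# Pixel counts drive storage: uncompressed 8-bit RGB at 30 fps for SD 480.
bytes_per_second = 720 * 480 * 3 * 30
print(bytes_per_second)  # 31,104,000 bytes (~31 MB) per second
```

Data rates like this are why every practical video format relies on compression.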
Preview Sophisticated editing systems will preview an effect in real time. If not then the
effect must be rendered before viewing. The preview function saves time and disk
space.
Progressive Scan (P) The latest hi-definition video technology or HD uses progressive scan instead of
interlace, which is similar to how computer monitors update their images. See also
Refresh Rate.
Pulldown 30 fps video is often “pulled down” or slowed down to 29.97 fps in order to
accommodate the transfer of film rates. “3/2 Pulldown” actually refers to the
difference in frames between 24 fps and 30 fps, in which certain frames are
repeated or skipped. See also Reverse Pulldown.
Quicktime Apple’s media player for audio and video files (.mov)
Loop When timeline sequences reach the end they often start again, unless you tell
them to stop. The same applies to menus on DVDs, which will play endlessly
unless programmed to “time out”. See also Time Out.
RAM Random Access Memory. Allows computers to have multiple programs open at
a time.
Real Time Effects Hardware cards or fast computers can often preview or play out visual effects
without having to first render. See also Preview.
Refresh Rate Computer monitors will update themselves anywhere from 60 to 100 times a
second depending on the refresh rate.
Render Before a transition effect can be saved to the hard drive it must first be
rendered. See also Smart Render.
Reposition (Repo) Moving an image within the frame. Resizing and repositioning is commonplace
in T.V.
Resolution (Audio) Sometimes referred to as 8, 16 or 32 bit audio but more likely referred to as
“Sample Rates” of 22k, 44.1k, 48k or 96k.
Resolution (Video) D5 is the ultimate in video resolution, followed by HD Cam (8 bit or 10 bit)
and Digi-Beta. DV comes shortly thereafter because it is even more compressed.
See also Compression.
Reverse Pulldown The extra frames added during pulldown are removed from a digitized video
sequence to restore the original 24 frames per second.
RGB Red, Green, Blue values expressed as three different components of the entire
color spectrum. Adding or subtracting one value causes the image to shift in
color. See also Component Video. See also Luminance.
Rotation You can take an object or image and rotate it on the X, Y or Z axis.
Rotoscope A process where the action is first filmed live and later traced using an
animation process.
Sequence A group of clips ordered in a timeline containing cuts, transitions and titles.
Scaleable Resolution Independent. See Vector Based.
SCSI A fast method of data transfer achieved using a “SCSI bus”. Firewire
and USB connections are frequently too slow for many media applications.
Scripting (Logic) Directions placed in a timeline which can trigger different events. Used in
Flash.
Skip Frame (Stutter) An effect created by removing frames and/or repeating frames. Popular in
music videos.
Soften (De-Focus) Also known as blurring. See also Feathering.
Source Reel (Master Reel) Logs contain information such as timecode and source reel so that
master tapes can be vaulted and later brought back for re-editing.
Smart Render “Smart Rendering” is the ability to partially render an effect and later go back
and pick up where it left off. The editor can start and stop a render, which saves
time.
Speed Change Film is 24 fps and video is 30 fps. The frame rate can be changed to create an
effect. As a rule, it is best to try even multiple divisions such as 15, 12, 10 fps.
Streaming (VOD) Video on demand. Video files are now available over the internet and can play
back in real time.
Stuffed Files (.sit) Compressed files for the mac using a utility called Stuffit.
Stutter (Skip Frame) Frames can be repeated or skipped thus creating an effect. Popular among
music videos.
Super-Imposition (Super) The layering of one image or text file over another.
Telecine (Film to Tape Transfer) Sound and picture are usually synced up at this point and
transferred to tape. A log is created of all the shots and included in a
database or shot list.
Time Out The maximum time before an effect or change occurs in a program or DVD.
Title Card Text over a color field or super imposed over an image.
Transition (Transition Effect) Any type of movement or change from one shot to the next.
Examples: Fade, Wipe, Key, Dissolve, Blur, Reposition etc.
USB A type of cable connection which can also use a hub. The USB2 format is also
available, which is backward compatible with USB1 and is almost as fast as
FireWire.
Vector based Vector based graphics can be scaled to any size without loss of resolution.
Video-CD(VCD) Low resolution (near-VHS quality) video files that can be stored on a CD-Rom
or read on a DVD set-top box using MPEG-1 files. Format is very popular in
China.
Video File Formats AVI - Audio Video Interleaved. A computer graphics animation format used in
Microsoft Video for Windows.
MOV - Format used by Apple QuickTime.
WMV - Format currently used by Microsoft.
FLV - File format used for Flash Video.
Web enabled DVD programming information which opens a browser and steers user to a
Web page.
White Balance An electronic process used in video cameras to retain true colors. White
balancing is performed prior to recording a specific scene. The camera is
pointed at a white object (a wall, for example) and controls on the camera are
adjusted until a hairline in the viewfinder is brought to a particular point. This
ensures
that the tints in the videotape will be natural.
White balance with a sheet of white paper works best, balancing with a colored
paper will give you a hue of the inverse color.
Wipe Transition effect using a continuous motion with either a hard or soft edge or
border.
Windows Media Player Microsoft’s media player; its video format is sometimes known as .wmv. See also AVI.
Wire Frame 3D modeling programs use a tracing tool which creates an object out of wires
only. Later, animators will fill or cover the object so it is completely solid.
Wire Removal Visual Effects such as a person flying through the air require harnesses that are
shot in live action and therefore must be digitally erased.
X,Y,Z values Values which measure position within the frame and the ability to create 3D images:
X = Left/Right (Horizontal), Y = Up/Down (Vertical), Z = Scaling (Size)
Zipped Files Compressed files using the .zip extension. If you see a .zip file it usually is a
PC file. WinZip is a popular utility for this. See also Stuffed Files.