
BJL 3206 Editing for Electronic Media

MOUNT KENYA UNIVERSITY

BACHELOR OF JOURNALISM AND MASS COMMUNICATION

BJL 3206: EDITING FOR ELECTRONIC MEDIA


Student Guides
COMPILED BY J. K. GITHUA

Compiled by J. K. Githua Mount Kenya University 2015



Table of Contents
Meaning and Definition of Editing for Electronic Media
Three Decision Making Areas in Cutting a Film
The rationale for Editing
Types/Modes/Process of Editing
Different Types of Video Editing
The hardware and software requirements in video editing assignments
Digital Technology and Editing
Digital video
  Analog versus Digital Video
  Frame Rates and Resolution
  Interlaced and Non-interlaced Video
  Analog Video Formats
  Broadcast Standards
Getting Video into Your Computer
  The process of capturing video into the computer
Audiences’ expectations
Approaches to Editing
Editing Principles
  General Editing Principles
  Basic Rules in Editing
Editing Interviews
  The advantages of Digital Editing with a Video Server
  Six quick tips for file server editing
Mixing and uses of basic effects in Editing
  Effects in film editing
  Distinguishing Optical Sound Track from Magnetic Sound Track
Editing Sound
  Four Essential Audio Elements
Analog and Digital Audio
  Characteristics of Digital Audio
  Audio File Formats
  Audio Streaming
Recording, Copying, and Converting Digital Audio
  Copying Audio (Hardware)
  Copying Audio (Software)
  Converting Audio
Procedures involved in sound editing
  Audio Editing technical terms


Meaning and Definition of Editing for Electronic Media


The inevitable clichés about editing are true: “this is where the real magic of filmmaking
happens”, “this is where the film comes alive”, and “you can’t make a silk purse out
of a sow’s ear”. Yet stories abound of films botched in their conception and shooting
that were made substantially better in the editing.

This unit is designed to help you understand what editing is all about, especially as it
concerns broadcast production.

Editing is also referred to as the cutting of film. It is defined as the process of selecting the
parts of the shots that are good and that serve the needs of the film, and discarding the
rest (Mamer, 2009). It therefore requires extensive knowledge of the mechanics of cutting.
It is also the process of choosing creative materials that fit a subject matter and blending
various photographed frames of a film, in a convincing manner, in order to transmit the
message of the artistic work to the audience (Owuamalam, 2007).

Each scene is generally photographed and recorded several times, with each filming
regarded as a take. During the shooting exercise, the director decides which takes are
good enough to print. The printed takes then form a work print with which to work
during editing (Kogah, 1999).

Film editing involves the use of plot in arranging the presentational sequence of the story
line. The strategy enables the idea of the creative work, as conceived, to be actualized,
through a technical process. The process requires the use of equipment, and script, to
match the interpretative capacity of the editor. The editor applies skill, knowledge and
experience, to produce the synergy called film.

Video editing is the process of manipulating and rearranging video shots to create a new
work. Editing is usually considered to be one part of the post production process — other
post-production tasks include titling, colour correction, sound mixing, etc.

Many people use the term editing to describe all their post-production work, especially in
non-professional situations. Whether or not you choose to be picky about terminology is
up to you. In this case we are reasonably liberal with our terminology, and we use the word
editing to mean any of the following:

Rearranging, adding and/or removing sections of video clips and/or audio clips.

Applying colour correction, filters and other enhancements.

Creating transitions between clips.

Editing is a “specialty” skill or occupation in video and film production, i.e. a “post-
production” process. It is the process of assembling visual and aural elements into a creative
product.

Three Decision Making Areas in Cutting a Film


Mamer (2009:347) outlines three decision-making areas in cutting a film: cutting
picture, cutting sound, and determining optical effects such as dissolves, fades and
special effects.

For pictures, editing entails going through the shots and determining their specific order,
then deciding on the precise transition point from one shot to the next. The order of
shots may be predetermined in a narrative film, though that order may not be as rigid as
first assumed. In documentary and experimental film, you may have to devise the order
yourself.

Cutting or editing sound includes a number of approaches, such as cutting sync tracks in
conjunction with the picture, determining the relationship between music and picture, and
building complicated, layered sound effects after the picture is mostly or completely cut.
An optical effect is a graphic effect that is created in the lab. Optical effects
include split screens, keyholes, freeze-frames, spins, wipes and a host of other effects
executed by the lab at the filmmaker’s instruction, done prior to the final printing.
They are difficult to get right and may take several tries to obtain the precise effect.

The rationale for Editing


The functions of editing in a film cannot be overemphasized. No film is ever produced
and shown to the audience exactly the way it was shot. Some scenes may be omitted and
others added in error; editing makes it possible to add missing scenes and remove
unwanted ones, even after the production has been completed.

There are many reasons for editing materials electronically and any editing approach will
depend on the desired outcome. Before you begin you must clearly define your editing
goals, which could include any of the following:

Specifically, Owuamalam (2007) lists the functions which the act of editing performs
in any film as follows:

1. Editing facilitates the removal of film footage that could harm society, such as
racially or ethnically inciting scenes, derogatory gender scenes, offensive stereotyping,
obscene and lurid scenes that debase morality, and legally blamable scenes. It
enables the producer to correct impressions that could adversely affect the image and
reputation of the production.
2. Editing trims the footage to fit into a specific duration as dictated by the
medium of presentation (television or film theatre/cinema).
3. Editing combines shots in a spectacular way, in order to achieve an
understanding of the film. It brings discretely shot scenes together, in a
convincing manner, through the use of appropriate transition devices, in order to
express an idea convincingly.
4. Editing enables a film to be constructed from various sources and camera takes.
For example, a documentary about, say, “Iraq after Saddam Hussein” can be
built from various video and audio sources. A portrait of Saddam Hussein as a
still photograph, in his days as the president of Iraq, can be shot and obtained on
tape; his trial, conviction and death can be obtained as video clips; the increase in
American casualties in Iraq, as well as the insurgency in that country, can also be
obtained as video clips (Owuamalam, 2007:219). The editor can use a myriad of
takes from various sources, as shot in different countries of the world, particularly
in Iraq and the United States of America, to show life in Iraq before and
after Saddam Hussein. The editing process will address the combination of the
various scenes in a specific manner and order, so as to articulate the major
idea of the plot and tell the story in a convincing and believable manner.
5. Editing eliminates waste and overshoots, keeping the creative composition
within the provisions of the storyline, in a lucid and comprehensible manner.
6. Remove unwanted footage. This is the simplest and most common task in editing.
Many videos can be dramatically improved by simply getting rid of the flawed or
unwanted bits.
7. Choose the best footage. It is common to shoot far more footage than you actually
need and choose only the best material for the final edit. Often you will shoot
several versions (takes) of a shot and choose the best one when editing.
8. Create and improve flow. Most videos serve a purpose such as telling a story or
providing information. Editing is a crucial step in making sure the video flows in a
way which achieves this goal.
9. To introduce/add titles, graphics, music and sound effects to add to your program.
This is often the "wow" part of editing. You can improve most videos (and have a
lot of fun) by adding extra elements.
10. Alter the style, pace or mood of the video. A good editor will be able to create
subtle mood prompts in a video. Techniques such as mood music and visual effects
can influence how the audience will react.
11. Give the video a particular "angle". Video can be tailored to support a particular
viewpoint, impart a message or serve an agenda.
12. To join together a series of separately recorded clips to create a continuous, smooth-flowing
sequence.
13. To discard poor shots i.e. omit action that was not shot well, is irrelevant or
distracting
14. To ensure that the program is in line with the legal and ethical requirements.
15. To ensure that the program is commensurate with the time allocated, i.e. that the
allotted time is neither exceeded nor underused.
16. To add transition effects between your video clips to help give your video a
“produced” and “professional” feel and style.
17. To ensure that both visuals and audio are in sync.
18. To introduce other footage (B Roll) to the program.
 B Roll allows you to show other viewpoints
 B Roll allows you to show support graphics, video, or images
 B Roll helps you keep the video interesting versus staying on one long boring shot
19. Finally, editing serves as a structural transformer, which provides the salient
aspect of a work, in a clear and focused way, within a specific length of the film,
adjusted to suit viewership interest.
The editor plays a vital role in the post-production phase. Like the director,
cameraman and designer, the editor has a direct impact on any programme production.
Harmony and coordination between the editor and the producer are essential, as
the editing can either polish or tarnish the finished product.

During the process of editing, synchronization between the producer and the
editor is very important: it speeds up the pace of the work as well as improving the
programme quality.

The choice of either linear or non-linear editing depends solely on the producer, who has
to anticipate needs according to the available technical facilities. Non-linear editing is in
vogue nowadays due to its variety of effects, more options and provision of more audio
and video layers to make a programme more colourful and bright; linear editing,
however, is speedy and less time-consuming.

Therefore, a skilled editor can …


 Take footage shot at different times, plus various outtakes, and make it seem seamless,
as if it was all shot continuously at one time.
 Take a single-camera shoot that re-shot the same scenes from different angles and
make it appear as though several cameras were following the action from
several different viewpoints.
 Imply relationships or action that did not actually exist when recording.
 Control the audience’s attention and interpretation of an event.
 Adjust the duration of the shots and influence pace.
 Change the entire significance of an action.
 Seamlessly cut in retakes to replace unsatisfactory material.
 Increase or decrease the duration of the entire program.

Types/Modes/Process of Editing


Editing involves two modes: the real time and the post-production editing mode. Real
time editing deals with editing live shows and programmes as the events happen, while
post-production editing deals with the arrangement of shots or picture frames obtained from
a performance. Two types of editing, linear and non-linear, are involved in
the process of editing a film. Non-linear editing is the digital aspect of editing a film,
which involves the use of a computer, while the linear editing form follows a
largely analogue format and is largely manual in operation.

Editing involves the use of plot in arranging the presentational sequence of the storyline.
The strategy enables the idea of the creative work, as conceived, to be actualized
through a technical process. These processes or modes, as outlined by Owuamalam (2007),
are real time editing and post-production editing. Real time editing mode makes it possible
to present live shows and programmes as the events happen. The editing process involves the
use of materials from various sources and blending them synergically to produce the
screen experience, known as film. For example, the news coverage of the visit of President
Kenyatta to launch a development project can show the following arrival ceremonies as a
live programme: the presidential jet is seen touching down on the airport runway; another
scene shows government officials waiting in front of a red carpet laid for the
President; activities inside the VIP lounge show journalists in the front seats, where
the President is to address them; a traditional or cultural dance troupe outside the arrival
hall entertains the crowd; and so on. It is the blend of the various scenes and sound, as a
package, that produces the live programme enjoyed on the television screen.

Post-production editing deals with the arrangement of shots or picture frames
obtained from a performance. It is an after-performance production, designed to match the
plot and storyline with what the audience is expected to watch on the screen. It is not as
time-pressured as real time editing, which happens simultaneously as the event is
recorded for transmission, and it takes a longer time than real time editing. It is, however,
time determined, so that the edited version of the film fits into a specific time
frame, as desired by the director, without losing any major aspect of the work.

There are two basic types of film editing. They originate from the equipment and process
that are applicable in realizing the editing objective. The type considered and used, is a
matter of convenience and available technology. The two types or forms of editing are
linear and non- linear editing.

Linear Editing
It is time bound, as a particular timecode is followed to access different data. It is done on
and by videocassettes and tapes.
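The timecode arithmetic behind this is simple enough to sketch. A hypothetical helper (assuming the PAL rate of 25 frames per second; the function names are illustrative, not from any real edit controller) converts between "HH:MM:SS:FF" timecode and absolute frame counts, which is the calculation a controller performs when locating a shot on tape:

```python
# Hypothetical timecode helpers, assuming PAL (25 fps, non-drop-frame).
FPS = 25

def timecode_to_frames(tc: str) -> int:
    """Convert "HH:MM:SS:FF" to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

def frames_to_timecode(frames: int) -> str:
    """Convert a frame count back to "HH:MM:SS:FF"."""
    ff = frames % FPS
    total_seconds = frames // FPS
    return (f"{total_seconds // 3600:02d}:"
            f"{total_seconds % 3600 // 60:02d}:"
            f"{total_seconds % 60:02d}:{ff:02d}")

# A shot logged from 00:01:10:05 to 00:01:22:15 lasts:
duration = timecode_to_frames("00:01:22:15") - timecode_to_frames("00:01:10:05")
print(duration, frames_to_timecode(duration))  # 310 frames, i.e. 00:00:12:10
```

Broadcast systems that run at 29.97 fps use drop-frame timecode, which complicates this arithmetic; the 25 fps case shown here avoids that.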

The editing apparatus consists of a panel like a computer keyboard, with a round
knob to shuttle and jog on the machines holding the recorded material, and with separate
monitors for display. Usually three machines are used:

1. Player 1
2. Player 2
3. Recorder
Players are the sources that supply the chunks recorded in bits and pieces. Normally, the first
player has the basic visuals, i.e. the main video, and the second player has the secondary data,
which can be background voice or strips, graphics, or names to be superimposed, overlaid or
inserted on the primary video, while the recorder takes the final output from both
players.

In linear editing, video tapes are used for playbacks and recording. It is a tape-based
recording system, whether analog or digital. The sequence of review is orderly and
progressive. Continuity of events takes place in a specific order, which is not to be
altered. In a Linear Editing System (LES), two videotape recorders (VTRs) are required.
One plays back the recorded tape while the other is used in recording selected shots from
the former, according to the editing plan. The shots to be selected can be identified
from the recorded tape, using the tape counter to find the exact location of the said shot,
in the produced tape. The editor notes the numbers and arranges them according to the
takes desired to produce the finished product. The editor uses two monitors: the
preview monitor and the final view monitor. The preview monitor is used to watch and
select shots or takes from the review VTR. It is the pictures on this monitor that enable
the editor to pause the review tape and select shots. The other monitor shows the
recorded images from the editing VTR. It is the picture shown on this screen that tells the
editor whether the plot and storyline have been followed, as indicated by the technical desire
of the director and the expectation of the producer.

The Linear Editing System is largely manual in operation. It follows a largely
analogue format in reviews and selection, although the location of pictures, based on the
numbering plan, can be digitized. The editing system does not allow for skipping over
any shot in order to get at another desired one. There is, therefore, no random
access to shots as produced on the tape. It insists on guided access, which selects shots
from one tape, in the order of recording, onto another, as may be desired. The system
copies the desired shots in a specific order and places them in a predetermined sequence
on another tape. The linear editing system, therefore, is copy oriented.

Non-linear Editing (NLE)


It is not time bound: with a single keystroke or mouse click, different data can be accessed.
It is done on computers, as the data is transferred from tapes to computers. Various software
programmes are used for it.

The Non-Linear Editing (NLE) system is disk based. It uses the computer for storage,
reviews and the editing of video and audio data files. The system allows one to jump from
one shot or take to another, irrespective of the location of the desired shot in the file.
One can jump from, say, shot 1 to shot 7, without accessing shots 2, 3, 4, 5 and 6 in
between. The capability of random access exists, since one can jump to and
access any desired shot at will.

In the NLE, the programme to be edited is converted digitally into electronic signals and
recorded in a disk. The disk is loaded into a computer’s disk drive, which enables the
system to accept and respond to commands. Shot identification takes place within
the shortest imaginable time frame. It provides one the opportunity to take editing
decisions that enable the shots to relate and blend with each other, producing a thrilling
synergy, which tells the story of the plot.
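The essence of random access can be sketched in a few lines of Python. The file names and data structures here are illustrative only, not from any real editing package; the point is that once shots are files on disk, an edit is just a new ordering of references:

```python
# Minimal sketch of disk-based random access: captured shots are entries
# in a dictionary, and the edited sequence is a list of references.
shots = {n: f"shot_{n}.mov" for n in range(1, 8)}  # shot files on disk

# Jump straight from shot 1 to shot 7 -- no need to wind past 2..6:
sequence = [shots[1], shots[7]]

# Changing an earlier decision is a list operation, not a tape re-copy:
sequence.insert(1, shots[4])  # drop shot 4 between them

print(sequence)  # ['shot_1.mov', 'shot_4.mov', 'shot_7.mov']
```

Contrast this with the linear system above, where reaching shot 7 requires spooling past every shot before it, and revising an earlier edit means re-recording everything after it.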


Different Types of Video Editing


There are several different ways to edit video and each method has its pros and cons.
Although most editors opt for digital non-linear editing for most projects, it makes sense to
have an understanding of how each method works.

1. Film Splicing

Technically this isn't video editing, it's film editing. But it is worth a mention as it was the
first way to edit moving pictures and conceptually it forms the basis of all video editing.
Traditionally, film is edited by cutting sections of the film and rearranging or discarding
them. The process is very straightforward and mechanical. In theory a film could be edited
with a pair of scissors and some splicing tape, although in reality a splicing machine is the
only practical solution. A splicing machine allows film footage to be lined up and held in
place while it is cut or spliced together.

2. Tape to Tape (Linear)

Linear editing was the original method of editing electronic video tapes, before editing
computers became available in the 1990s. Although it is no longer the preferred option, it is
still used in some situations.

In linear editing, video is selectively copied from one tape to another. It requires at least
two video machines connected together — one acts as the source and the other is the
recorder. The basic procedure is quite simple:

1. Place the video to be edited in the source machine and a blank tape in the recorder.

2. Press play on the source machine and record on the recorder.

The idea is to record only those parts of the source tape you want to keep. In this way
desired footage is copied in the correct order from the original tape to a new tape. The new
tape becomes the edited version.

This method of editing is called "linear" because it must be done in a linear fashion; that is,
starting with the first shot and working through to the last shot. If the editor changes their
mind or notices a mistake, it is almost impossible to go back and re-edit an earlier part of
the video. However, with a little practice, linear editing is relatively simple and trouble-
free.
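The procedure above amounts to copying marked in/out spans from the source, in order, onto the record tape. A minimal Python sketch (illustrative frame numbers, not real deck-control code) makes the linear constraint visible: each span is appended after the last, so lengthening an earlier edit would mean redoing everything after it.

```python
# Sketch of tape-to-tape (linear) editing: selected spans are copied
# from the source tape onto the record tape, strictly in order.
source = list(range(100))  # stand-in for 100 frames on the source tape

# Edit decisions as (in, out) frame points, listed in program order:
edit_list = [(10, 20), (40, 55), (70, 80)]

record_tape = []
for mark_in, mark_out in edit_list:
    # "Press play on the source and record on the recorder":
    record_tape.extend(source[mark_in:mark_out])

print(len(record_tape))  # 35 frames on the edited master
```

Note that `record_tape` can only ever grow at the end; there is no cheap way to widen the first span once the second has been recorded, which is exactly the limitation the text describes.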

3. Digital/Computer (Non-linear)

In this method, video footage is recorded (captured) onto a computer hard drive and then
edited using specialized software. Once the editing is complete, the finished product is
recorded back to tape or optical disk.


Non-linear editing has many significant advantages over linear editing. Most notably, it is a
very flexible method which allows you to make changes to any part of the video at any
time. This is why it's called "non-linear" — because you don't have to edit in a linear
fashion.

One of the most difficult aspects of non-linear digital video is the array of hardware and
software options available. There are also several common video standards which are
incompatible with each other, and setting up a robust editing system can be a challenge.

The effort is worth it. Although non-linear editing is more difficult to learn than linear, once
you have mastered the basics you will be able to do much more, much faster.

4. Live Editing

In some situations multiple cameras and other video sources are routed through a central
mixing console (video switcher) and edited in real time. Live television coverage is an
example of live editing. Live editing is a fairly specialist topic.

Production Switcher

The switcher is an 18-input video switcher designed for both live production and
post-production. Like any video switcher, it can perform video transitions such
as wipes, mixes and keys; on its own, however, it is not capable of digital video effects.
The switcher is a very complex device with many levels of operational functions. To
keep things as simple as possible, we will focus primarily on the Downstream Bus
Rows, consisting of the preview and program buses (bottom two), with only
occasional use of the Key Bus Row and the M/E Bus Rows (top two). For most
projects the set-up functions of the switcher will be preset by the instructor. During
rehearsal, Technical Directors (TDs) are encouraged to ask questions of the
instructor in order to fully understand the steps necessary for a successfully
switched program. However, instead of simply memorizing a set of keystrokes,
we encourage you to attempt to fully understand the process at work.

The 18-input switcher has sixteen video bus inputs and two auxiliary inputs.
• BLK Black Video
• R.S. Routing Switcher
• CAM 1
• CAM 2
• CAM 3
• VTR 1
• VTR 2
• DEKO 1 Graphics
• DEKO 2 Graphics
• CB Color Bars
• COLOR Background color generator
• M/E Mix/Effects bank


In addition, you will utilize the Downstream Transition Group consisting of the
following buttons:
BKG MIX Mixes between the program and preview buses
UNI KEY MIX Mixes in the video signal feeding the uni keyer
DS KEY MIX Mixes in the video signal feeding the downstream keyer
FADE Mixes to or from black

The Downstream Fader and the Downstream Auto-transition Group, consisting of the
following buttons, can also be used:
TRAN Performs the selected transition at a rate specified in the setup panel
TRAN REV Performs the selected transition in reverse order
CUT Performs a cut or instantaneous transition.
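Numerically, a mix is a weighted blend of the two bus signals, with the fader position as the weight; FADE is the same operation with black as the second source. A sketch (single pixel values standing in for whole frames; the function name is illustrative):

```python
# What the MIX/FADE lever does, reduced to one pixel: the output is a
# weighted blend of the program and preview buses.
def mix(program: float, preview: float, t: float) -> float:
    """Blend two pixel values; t runs from 0.0 to 1.0 over the transition."""
    return (1 - t) * program + t * preview

program_pixel, preview_pixel = 200.0, 50.0  # one pixel from each bus
for t in (0.0, 0.5, 1.0):
    print(mix(program_pixel, preview_pixel, t))  # 200.0, then 125.0, then 50.0
```

A CUT is the degenerate case where t jumps from 0 to 1 between two frames; a fade to black is the same blend with the preview value fixed at 0.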

Switcher Inputs
0 BLACK
1 ROUTING SWITCHER
2 VTR 1 (DVC PRO 1)
3 VTR 2 (DVC PRO 2)
4 VTR 3 (BETA SP PB/REC)
5 VTR 4 (BETA SP PB)
6 VTR 5 (3/4 INCH)
7 CAMERA 1
8 CAMERA 2
9 CAMERA 3
10 CAMERA 4
11 SYNCHRONIZER
12 COLOR BARS
13 DEKO 1
14 DEKO 2
15 TOASTER 2
16 A51
17 COLOR
18 M/E

The hardware and software requirements in video editing assignments
There are a million different ways to do video editing. You can buy a complete solution
from a company like Avid at the high end, and at the low end you can use your camera and
a VCR to cut things together. The solution involves the following different parts:

1. A source device to play the original tape or disk. Typically a VCR or camera.

There are hundreds of digital video cameras, or camcorders, on the market today from
manufacturers like Sony, Panasonic, JVC and Canon. Most of them use what are known as
MiniDV tapes.


Just about every camcorder based on the MiniDV tape format includes a FireWire (IEEE
1394) port on the camera so that you can load the video onto your computer quickly and
easily. Whichever type of camera you pick, it needs to have a FireWire connection so you
can hook it to your computer. This sort of FireWire connector is common on digital
camcorders. You attach a FireWire cable to this connector, and attach the other end to your
computer.

2. A computer with at least these specs: high speed processor / big RAM / Fast hard drive
with 1 GB or more free space.

Note: Some editing software requires a high-performance computer to even work properly.

A Pentium 4 machine or a late-model Mac with 8GB of RAM and a big hard disk is a nice
machine to have when you are rendering and writing files.

You can use just about any desktop computer for video editing, as long as it has enough
CPU power, hard disk space and bus bandwidth to handle the data flowing in on the
FireWire cable. Video processing in general uses lots of CPU power and moves tons of data
on and off the hard disk. There are two different places where you will most feel the
benefits of a fast machine and the sluggishness of a slow one: When you render a movie
that you have created or write it out to hard disk, you will definitely feel the speed of the
machine. On a fast machine, rendering and writing can take minutes. On a slow machine it
can take hours.

A more important issue comes when you are reading data from or writing data to the
camera. When the video data stream is coming in from the camera through the FireWire
cable, the computer and hard disk must be able to keep up with the camera or the computer
will lose frames. When sending a completed movie back to the camera, the processor must
be able to stream the data quickly enough or the camera will lose frames.

3. A video capture device. To capture video from an analogue source (such as VHS or
Video8) you need a device to convert the video into a digital format. This can be a
standalone device which plugs into the computer or a video capture card which becomes
part of the computer.

If you are using a source device which outputs a digital signal (such as Firewire or USB)
you don't need a capture device, but you do need to make sure your computer has the
appropriate input available.
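To see why disk space and bus bandwidth matter so much here, a back-of-envelope calculation helps. Assuming the commonly quoted figures for the DV format (roughly 25 Mbit/s of video, about 3.6 MB/s once audio and overhead are included), the arithmetic looks like this:

```python
# Back-of-envelope DV capture math. The 3.6 MB/s figure is an assumed
# rule of thumb for DV (25 Mbit/s video plus audio and overhead).
MBYTES_PER_SEC = 3.6

per_minute = MBYTES_PER_SEC * 60        # megabytes per minute of footage
per_hour_gb = per_minute * 60 / 1024    # gigabytes per hour

print(round(per_minute), round(per_hour_gb, 1))  # about 216 MB/min, 12.7 GB/hour
```

At that rate, the "1 GB or more free space" minimum quoted above holds under five minutes of DV footage, and a sustained 3.6 MB/s write is exactly the stream the hard disk must keep up with to avoid dropped frames.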

A FireWire port to connect the camera to - If your computer does not have a FireWire port,
you can buy a FireWire card and install it.

4. Connecting leads to plug the source device into the capture device or computer.

5. Software to control the capturing, editing and outputting.



There are many software packages available for editing video on your computer. Windows
XP even ships with editing software built into the operating system. Machines from Sony and
Apple come bundled with their own editing software.

Adobe Premiere is a full-featured and well respected video editing package that can do
almost anything you would want to do.

In order to use a package like Adobe Premiere, you need to understand several basic
concepts. Once you understand those basic concepts, however, the whole process is
remarkably easy. After you are familiar with the fundamentals, it is extremely easy to
expand your repertoire to include all sorts of advanced techniques.

6. A video monitor (or television).

There are two basic non-linear edits for video:

a) Assemble or "Add-on". Clips are assembled sequentially, in linear order; each new
clip is "added on" to the end of the previous clip.

b) Insert or "Cover". One clip is "inserted" into, or overlaid on top of, another.
Notice that the new video clip covers only the picture, without covering the audio.
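
These two edit types can be illustrated with a toy timeline in Python (a sketch, not part of any editing package; the clip names "A", "B" and "C" are hypothetical):

```python
# Sketch: modelling the two basic non-linear edits on a toy timeline.
# Each track is a list of one-second "frames" labelled by the clip they come from.

def assemble(video, audio, clip, length):
    """Assemble ("add-on") edit: append the clip to the end of both tracks."""
    video.extend([clip] * length)
    audio.extend([clip] * length)

def insert(video, clip, start, length):
    """Insert ("cover") edit: overlay the clip on the video track only,
    leaving the existing audio untouched."""
    for i in range(start, start + length):
        video[i] = clip

video, audio = [], []
assemble(video, audio, "A", 4)
assemble(video, audio, "B", 4)
insert(video, "C", 2, 3)   # cover three seconds of picture, keep the sound
print(video)               # ['A', 'A', 'C', 'C', 'C', 'B', 'B', 'B']
print(audio)               # ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B']
```

Note how the insert edit changes only the video track; this is exactly how a cutaway is laid over a continuing interview soundtrack.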

Digital Technology and Editing


The rules for editing are the same for film, videotape, and computer editing. The medium
changes, not the message.
In addition to playback and record VCRs, video monitors and audio gear, computer editing
requires a computer that is fast enough and has enough memory to process video, a device
to capture video and audio and turn them into computer files, and hard drives that are big
enough and fast enough to handle all of the video and audio you will need to store. On
Windows computers you want to make sure your capture drive uses the NTFS file format.

Advantages
There are several ways in which modern information and communication technology (ICT)
or digital technology has improved electronic media editing and production. Editing video
on a computer offers several advantages over using videotape.


1. The first is the ability to change the content or length of any part of a program
without having to re-edit everything from that point to the end. In computer editing
you are constructing a list of instructions, describing how the program is to be
assembled by the computer. Because you can work on any part of the program
without adversely affecting subsequent parts, computer editing is commonly
referred to as "non-linear editing."
2. A second advantage of computer-based editing is the ability to use more audio and
video tracks.
3. Tape-to-tape editing allows for only one or two video tracks and (for most systems)
two monaural audio tracks. In theory, computer-based editors could have virtually
unlimited audio and video tracks available. In practice, four or five video tracks
and the same number of stereo audio tracks are usually sufficient. On the video side,
this allows the usual "A" and "B" video rolls, the transitions between the A and B
rolls, a track for titles and other luminance or chroma keys, and even enough tracks
to do a "quad split." Audio would generally consist of the location sound (two
tracks to cross-fade with the video transitions), the narration, and two music tracks
(to cross-fade between cuts).
4. Another advantage of computer editing lies in the ability to duplicate files without
loss. As you move files from memory to a hard drive or to tape backup or to a CD
or DVD-ROM and back again, there is no loss of quality.
5. Once video editing becomes totally digital with equipment that can handle video
with minimal compression, there will be no need for the traditional on-line and off-
line editing phases -- it can all be done on-line.
6. Digital recordings can be made in the studio or on location and uploaded
(transferred) directly to an editing computer or video server for editing. Once this
transfer is made, there will be no danger of tape damage in editors, no matter how
many times the footage is previewed. Digital information stored on a computer disk
does not gradually degrade with repeated access the way it does when it's recorded
on videotape.
7. When a video server is used, the original footage can be viewed and edited by
anyone with a computer link to the server. This is generally someone within the
production facility; but, thanks to high-speed Internet connections, it could even be
someone in another city, or even in another country. In the case of animation and
special effects, which are labor intensive, projects are often electronically
transferred to countries where labor is less expensive.
8. The latest non-linear editors have many features that both speed up and improve
video and audio editing.

Disadvantages
1. On the "down" side, video has to be "captured," or transferred from tape into the
computer. Capturing is in itself an editing process. Clips have to be identified and
transferred in real time. You might expect capturing to take up to twice as long as
the total length of the video you are transferring even if the process is automated.
2. The way video is captured and stored in one editing system may not be compatible
with another. In other words, while you can be sure that an NTSC VHS
videocassette can be played back on any NTSC VHS machine, a video computer
file may not be readable by any software other than the software used to create it.
3. Almost all video on computers is "compressed." Uncompressed video is equivalent,
more or less, to a 20 Megabytes per second data stream. The fastest "safe" video
data rate for most hard drives is half of the tested sustained speed, or between two
and eight Megabytes per second. Compression schemes that reduce the effective
data rate to three to six Megabytes per second produce excellent video and
manageable file sizes. For example, one hour of video at four Megabytes per
second would fit on a thirteen gigabyte hard drive.
4. Depending on the sophistication of your editing hardware, some, most, or all, of the
transitions, keys, and other effects that you apply to clips in your editing program
have to be created and saved on disk by the program before they can be viewed or
played back. This process is called "rendering." It can be quite time-consuming,
especially in low-end editing systems.
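
The storage arithmetic in point 3 above can be checked in a few lines of Python (a sketch; it assumes decimal megabytes and also reports binary gigabytes, which is what drive capacities are often compared against):

```python
# Sketch: one hour of video at a given data rate, in decimal GB and binary GiB.
def hour_of_video_gb(mb_per_sec):
    total_bytes = mb_per_sec * 1_000_000 * 3600    # one hour of capture
    return total_bytes / 1e9, total_bytes / 2**30  # (decimal GB, binary GiB)

gb, gib = hour_of_video_gb(4)       # the four-megabytes-per-second example
print(round(gb, 1), round(gib, 1))  # 14.4 decimal GB, about 13.4 binary GiB
```

The 13.4 binary gigabytes is consistent with the "thirteen gigabyte hard drive" figure in the text.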
Digital video
There’s a lot to know about the technology of video. But there’s no need to be intimidated
by all this technology. As video has migrated to the desktop, it has gotten increasingly
easier to produce high quality work with little technical know-how. This will give you a
foundation in the basics.

Analog versus Digital Video


One of the first things you should understand is the difference between analog and digital
video. Your television (the video display with which we are all most familiar) is an analog
device. The video it displays is transmitted to it as an analog signal, via the air or a cable.
Analog signals are made up of continuously varying waveforms. In other words, the value
of the signal, at any given time, can be anywhere in the range between the minimum and
maximum allowed. Digital signals, by contrast, are transmitted only as precise points
selected at intervals on the curve. The type of digital signal that can be used by your
computer is binary, describing these points as a series of minimum or maximum values —
the minimum value represents zero; the maximum value represents one. These series of
zeroes and ones can then be interpreted at the receiving end as the numbers representing the
original information. There are several benefits to digital signals. One of the most important
is the very high quality of the transmission, as opposed to analog. With an analog signal,
there is no way for the receiving end to distinguish between the original signal and any
noise that may be introduced during transmission. And with each repeated transmission or
duplication, there is inevitably more noise accumulated, resulting in the poor fidelity that is
attributable to generation loss. With a digital signal, it is much easier to distinguish the
original information from the noise. So a digital signal can be transmitted and duplicated as
often as we wish with no loss in fidelity.
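
The sampling idea described above can be sketched in Python (an illustration only; eight samples and 256 levels are arbitrary choices, not broadcast parameters):

```python
# Sketch: why digital copies are lossless. An "analog" waveform is sampled
# at intervals and each sample is rounded to one of a fixed set of levels;
# the resulting integers can be copied exactly, whereas each analog
# re-recording would add noise (generation loss).
import math

def digitize(signal, num_samples, levels=256):
    samples = []
    for n in range(num_samples):
        value = signal(n / num_samples)                        # sample the curve
        samples.append(round((value + 1) / 2 * (levels - 1)))  # quantize to 0..255
    return samples

original = digitize(lambda t: math.sin(2 * math.pi * t), 8)
copy = list(original)       # a "duplication": bit-for-bit identical
print(copy == original)     # True - no generation loss
```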

The world of video is in the middle of a transition from analog to digital. This transition is
happening at every level of the industry. In broadcasting, standards have been set and
stations are moving towards digital television (DTV). Many homes already receive digital
cable or digital satellite signals. Video editing has moved from the world of analog tape-to-
tape editing and into the world of digital non-linear editing (NLE). Home viewers watch
crystal clear video on digital versatile disk (DVD) players. In consumer electronics, digital
video cameras (DV) have introduced impressive quality at an affordable price. The
advantages of using a computer for video production activities such as non-linear editing
are enormous. Traditional tape-to-tape editing was like writing a letter with a typewriter. If
you wanted to insert video at the beginning of a project, you had to start from scratch.
Desktop video, however, enables you to work with moving images in much the same way
you write with a word processor. Your movie "document" can quickly and easily be edited
and re-edited to your heart's content, including adding music, titles, and special effects.

Frame Rates and Resolution


When a series of sequential pictures is shown to the human eye, an amazing thing happens.
If the pictures are being shown rapidly enough, instead of seeing each separate image, we
perceive a smoothly moving animation. This is the basis for film and video. The number of
pictures being shown per second is called the frame rate. It takes a frame rate of about 10
frames per second for us to perceive smooth motion. Below that speed, we notice jerkiness.
Higher frame rates make for smoother playback. The movies you see in a theatre are filmed
and projected at a rate of 24 frames per second. The movies you see on television are
projected at about 25 frames per second (PAL) or 30 frames per second (NTSC),
depending on the country and the video standard in use there.

The quality of the movies you watch is not only dependent upon frame rate. The amount of
information in each frame is also a factor. This is known as the resolution of the image.
Resolution is normally represented by the number of individual picture elements (pixels)
that are on the screen, and is expressed as a number of horizontal pixels times the number
of vertical pixels (e.g. Standard resolution 720x576 PAL or 720x480 NTSC). All other
things being equal, a higher resolution will result in a better quality image [e.g. HDTV
1920x1080, UHDTV (4K) 3840x2160, and FUHD (8K) 7680x4320].

You may find yourself working with a wide variety of frame rates and resolutions. For
example, if you are producing a video that is going to be shown on VHS tape, CD-ROM,
and the Web, then you are going to be producing videos in three different resolutions and
at three different frame rates. The frame rate and the resolution are very important in digital
video, because they determine how much data needs to be transmitted and stored in order
to view your video. There will often be trade-offs between the desire for great quality video
and the requirements imposed by storage and bandwidth limitations.
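
As a rough illustration of how frame rate and resolution drive those data requirements, the raw (uncompressed) data rate can be computed directly; the 3 bytes per pixel here is a simplifying assumption, since real video formats subsample colour:

```python
# Sketch: raw data rate implied by a resolution and frame rate,
# assuming 3 bytes per pixel (one byte per colour component).

def raw_rate_mb_per_sec(width, height, fps, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps / 1_000_000

print(raw_rate_mb_per_sec(720, 576, 25))    # PAL SD: about 31.1 MB/s
print(raw_rate_mb_per_sec(1920, 1080, 25))  # HD: about 155.5 MB/s
```

Figures like these are why compression (discussed earlier) is unavoidable for desktop video.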

Interlaced and Non-interlaced Video


There is one more thing you should know about video frame rates. Standard (non-digital)
televisions display interlaced video. An electron beam scans across the inside of the screen,
striking a phosphor coating. The phosphors then give off light we can see. The intensity of
the beam controls the intensity of the released light. It takes a certain amount of time for the
electron beam to scan across each line of the television set before it reaches the bottom and
returns to begin again. When televisions were first invented, the phosphors available had a
very short persistence (i.e., the amount of time they would remain illuminated).
Consequently, in the time it took the electron beam to scan to the bottom of the screen, the
phosphors at the top were already going dark. To combat this, the early television engineers
designed an interlaced system. This meant that the electron beam would only scan every
other line the first time, and then return to the top and scan the intermediate lines. These
two alternating sets of lines are known as the “upper” (or “odd”) and “lower” (or “even”)
fields in the television signal. Therefore a television that is displaying 25 frames per second
is really displaying 50 fields per second. Why is the frame/field issue of importance?
Imagine that you are watching a video of a ball flying across the screen. In the first 1/50th
of a second, the TV paints all of the odd lines on the screen and shows the ball in its
position at that instant. Because the ball continues to move, the even lines in the TV that are
painted in the next 1/50th of a second will show the ball in a slightly different position.
you are using a computer to create animations or moving text, then your software must
calculate images for the two sets of fields, for each frame of video, in order to achieve the
smoothest motion. The same applies in the NTSC system, which displays 30 frames (60 fields) per second. The frames/fields
issue is generally only of concern for video which will be displayed on televisions. If your
video is going to be displayed only on computers, there is no issue, since computer
monitors use non-interlaced video signals.
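
The field structure described above can be sketched in Python; a frame is modelled as a simple list of scan lines (the line labels are hypothetical):

```python
# Sketch: splitting a frame into its two interlaced fields. The upper
# ("odd") field takes every other line starting from the first; the lower
# ("even") field takes the lines in between.

def split_fields(frame):
    upper = frame[0::2]   # lines 1, 3, 5, ... (the "odd" field)
    lower = frame[1::2]   # lines 2, 4, 6, ... (the "even" field)
    return upper, lower

frame = ["line%d" % n for n in range(1, 7)]
upper, lower = split_fields(frame)
print(upper)  # ['line1', 'line3', 'line5']
print(lower)  # ['line2', 'line4', 'line6']
```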

Analog Video Formats


At some point almost all video will be digital, in the same way that most music today is
mastered, edited and distributed (via CD or the Web) in a digital form. These changes are
happening, but it doesn’t mean that you can ignore the analog video world. Many
professional video devices are still analog, as well as tens of millions of consumer cameras
and tape machines. You should understand the basics of analog video. Because of the noise
concerns mentioned earlier, in analog video the type of connection between devices is
extremely important. There are three basic types of analog video connections.

Composite: The simplest type of analog connection is the composite cable. This cable uses
a single wire to transmit the video signal. The luminance and color signal are composited
together and transmitted simultaneously. This is the lowest quality connection because of
the merging of the two signals.

S-Video: The next higher quality analog connection is called S-Video. This cable separates
the luminance signal onto one wire and the combined color signals onto another wire. The
separate wires are encased in a single cable.

Component: The best type of analog connection is the component video system, where
each of the component signals (the luminance signal and the two color-difference signals) is given its own cable.

How do you know which type of connection to use? Typically, the higher the quality of the
recording format, the higher the quality of the connection type.

Broadcast Standards
There are three television standards in use around the world. These are known by the
acronyms NTSC, PAL, and SECAM. Most of us never have to worry about these different
standards. The cameras, televisions, and video peripherals that you buy in your own
country will conform to the standards of that country. In Kenya, most of the African
countries and Europe we use PAL. North America and Japan use NTSC. It will become a concern
for you, however, if you begin producing content for international consumption, or if you
wish to incorporate foreign content into your production. You can translate between the
various standards, but quality can be an issue because of differences in frame rate and
resolution. The multiple video standards exist for both technical and political reasons.
Remember that the video standard is different from the videotape format. For example, a
VHS format video can have either NTSC or PAL video recorded on it.

Getting Video into Your Computer


Since your computer only “understands” digital (binary) information, any video with which
you would like to work will have to be in, or be converted to, a digital format.

Analog: Traditional (analog) video camcorders record what they “see and hear” in the real
world, in analog format. So, if you are working with an analog video camera or other
analog source material (such as videotape), then you will need a video capture device that
can “digitize” the analog video. This will usually be a video capture card that you install in
your computer. A wide variety of analog video capture cards are available. The differences
between them include the type of video signal that can be digitized (e.g. composite or
component), as well as the quality of the digitized video.

After you are done editing, you can then output your video for distribution. This output
might be in a digital format for the Web, or you might output back to an analog format like
VHS or Beta-SP.

Digital: Digital video camcorders have become widely available and affordable. Digital
camcorders translate what they record into digital format right inside the camera. So your
computer can work with this digital information as it is fed straight from the camera. The
most popular digital video camcorders use a format called DV. To get DV from the camera
into the computer is a simpler process than for analog video because the video has already
been digitized. Therefore the camera just needs a way to communicate with your computer
(and vice versa). The most common form of connection is known as IEEE 1394.

The process of capturing video into the computer

There are two main ways of capturing video into the computer
a) Capturing analog video into the computer.
b) Capturing Digital video into the computer


a) Analog Video Capture


To capture analog video you will need a video capture card that can convert composite
video or S-video to digital video. High-end cards may also be able to convert component
video to digital. Some capture cards convert both audio and video and some rely on the
sound card to handle the audio.

The conversion process may be carried out entirely by the hardware on the card, or by
software running on your computer, or by some combination of both. In general, hardware
conversion is more reliable than software conversion.

On Windows computers the digital conversion product is generally an AVI file. An AVI
(Audio Video Interleaved) file is a sound and motion picture file that conforms to the
Microsoft Windows Resource Interchange File Format (RIFF) specification.

Macintosh files conform to Apple's QuickTime format. In either case, the converted file is
almost always compressed. That is, much of the picture information is truncated or
discarded according to a compression scheme called a codec.

If the file format is dependent on hardware on the capture card, or deviates from one of the
generally accepted codecs, your ability to play back files will be limited. You should not
assume that all AVI files are alike in the way that all VHS tapes are alike.

Most capture cards do not compress audio. In fact, rather than using the 44,100 Hz sample
rate found on commercial CD audio disks, audio for video is sampled at 48,000 Hz. For
16-bit stereo, that equates to 1.536 Mbps.
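
The 1.536 Mbps figure follows directly from the sample rate if 16-bit stereo is assumed (an assumption; the text does not state the bit depth):

```python
# Sketch: uncompressed PCM audio bitrate = sample rate x bits per sample x channels.
def pcm_bitrate_mbps(sample_rate, bits_per_sample=16, channels=2):
    return sample_rate * bits_per_sample * channels / 1_000_000

print(pcm_bitrate_mbps(48000))  # 1.536 Mbps, the video-audio rate in the text
print(pcm_bitrate_mbps(44100))  # 1.4112 Mbps for CD audio
```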

You probably will not be able to monitor audio or video levels on the computer as your
video is captured. If possible, you should use a time base corrector and waveform monitor
to make sure the signal going to the computer meets broadcast standards. Although some
capture cards have time base correctors built in, most do not.

It is not sufficient to monitor the audio at the input alone, since the computer's own
software level and balance controls can alter the signal before it is recorded.

b) Digital Capture

Digital capture is a much simpler process than analog capture. The most common digital
recording format is “DV.”

This format is already digital, already compressed to about 4 MB/sec, and already
compatible with the Microsoft AVI format.


To move it to your computer you need to connect your camcorder or DV recorder to your
computer using the IEEE-1394 Interface, also called “FireWire.”

There is no loss of audio or video quality in the transfer. That is the good news. The bad
news is that there is no way to adjust the video (level, setup, chroma, hue) or audio (level,
balance, equalization).

Capture software for digital transfer generally offers the additional advantage of “machine
control.” The playback device can be controlled by the computer.

DV tapes can have two different times embedded in the video signal. One is zeroed at the
beginning of the recording and shows the time on tape. It can be displayed on the
camcorder monitor in the upper right hand corner.

Your capture software depends on the tape time recorded on your DV tape. If that signal is
not continuous, it will zero itself and start over (the "time code break" problem). This is
confusing for people, and it is fatal for some digital capture software: because the tape
times are no longer unique, the software cannot reliably use the machine control interface
to search the tape for specific time points.

DV tapes can also have a digital time stamp that records the actual date and time each
frame is recorded. Clip detection can be based on the digital time stamp on the tape. A
discontinuity in the time stamp indicates the tape was stopped and restarted, ending one clip
and beginning a new one. It may also be possible to detect clips by looking for sudden
changes in video content. Whether you want to detect clips depends on how the software
treats clips and the nature of the project.
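
Clip detection from time-stamp discontinuities can be sketched as follows (a sketch, not the algorithm of any particular capture package; integer-second time stamps are used for simplicity):

```python
# Sketch: frames whose recorded time stamps are contiguous belong to one
# clip; a gap means the tape was stopped and restarted, so a new clip begins.

def detect_clips(timestamps, frame_interval=1):
    clips, current = [], [timestamps[0]]
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev == frame_interval:
            current.append(cur)          # same take continues
        else:                            # discontinuity: tape stopped, new clip
            clips.append(current)
            current = [cur]
    clips.append(current)
    return clips

stamps = [100, 101, 102, 200, 201, 500]  # hypothetical stamps from three takes
print(len(detect_clips(stamps)))         # 3
```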

Audiences’ expectations
Audiences tend to have fairly high standards when they watch anything on video. There are
features that they expect to see in any video production, and you should include them in
your own productions.

1. A title at the beginning

2. A set of "shots" cut together in a nice way to tell a story. A shot is a specific subject
filmed from a specific angle. For example, if you are telling the story of a birthday party,
different shots from the event might include:

o a shot of the cake

o a shot of the presents before they are opened

o a shot of the kids at the party sitting at the table

o a shot of blowing out the candles

o a shot of unwrapping a present

A fairly high number of shots is expected. It is rare for the camera angle to stay the same for more
than 10 or 15 seconds. The director will cut between different angles to keep things
interesting or to make different points. For example, the screen might show a man's face
while he's talking for five seconds, and then switch to a shot of his hands holding a tissue
(while the sound track continues uninterrupted with him talking) to show the emotion.

3. Interesting transitions between the shots. For example, some shots might fade into
others, some might spin into others, and some cut very simply from one to another in a
quick chain.

4. A decent soundtrack, often involving narration and/or background music

5. Perhaps static shots (like a chart or graph) mixed in with the normal video

6. Titles or legends on some of the shots to identify people, places and things

7. Slow motion or fast motion to change the tempo

8. The End Credits/ list of actors and the characters/ roles played by the individuals taking
part in the production

Approaches to Editing
There are two main approaches to editing
a) Continuity editing
b) Thematic editing

a) Continuity Editing
Continuity editing refers to arranging the sequence of shots to suggest a progression of
events.

Continuity editing primarily suggests guiding an audience through a sequence of events,
and, in the process, showing them what they want to see when they want to see it. In the
end, you've told a story or logically traced a series of events to their conclusion.

Given the same shots, an editor can suggest many different scenarios. Consider just these
two shots.

• a man glancing up in surprise

• another man pulling a gun and firing toward the camera

In this order it appears that the first man was shot. However, if you reverse the order of
these two scenes, the first man is watching a shooting.


When hundreds of scenes and takes of scenes are available to an editor, which is normally
the case in dramatic productions, the editor has tremendous control over the basic
continuity and message of the production.

In dramatic television good editors sometimes break from the expected to achieve a
dramatic effect. Unfulfilled expectations can be used to create audience tension.

b) Thematic Editing
In thematic editing, also referred to as montage editing, images are edited together based
only on a central theme. In contrast to most types of editing, thematic editing is not
designed to tell a story by developing an idea in a logical sequence.

In a more general sense, thematic editing refers to a rapid, impressionistic sequence of
disconnected scenes designed to communicate feelings or experiences.

This type of editing is often used in music videos, commercials, and film trailers
(promotional clips).

The intent is not to trace a story line, but to simply communicate action, excitement,
danger, or even the "good times" we often see depicted in commercials.

Editing Principles
A number of principles influence both shooting and editing. These principles as
enumerated by Mamer (2009) merit discussion.

General Editing Principles


The general editing principles refer to a number of factors that affect shooting and
editing. They are therefore considered very important since they influence the decisions
taken by the editor while editing a film. Such principles as enumerated and discussed by
Mamer (2009:348) include the following:

• Transitions

This term is used to describe shots that bridge one setting to another or that mark the
passage of time. The term covers a wide range of approaches, but often transitional shots
have the added burden of being establishing shots as well. The common approach is to
show a setting, establishing both the place and, by extension, the time of day.

There are many ways of handling transitions but editors are advised to find those that are
effective but not predictable.


• Economy and Pace

These terms refer to employing each individual shot for the shortest time possible (its
economy) while still allowing it to achieve its purpose. This is because each
individual film, scene and shot demands its own pace. Economy and pace are
achieved through control of the physical lengths of the shots, though many other
elements affect the sense of a film's internal rhythm. Usually, it is a question of how long
each individual piece of film should be on screen. For instance, if a point can be made in
two seconds, it certainly does not need ten seconds devoted to it. But some filmmakers,
like the late Italian director Michelangelo Antonioni, as described by Mamer (2009:349),
exploited both a slower pace and the psychological intensity of the close-up. A long,
lingering close-up shot of the main character can force the viewer to identify with the
character, or can create contemplative and environmental effects that emphasize the spaces
in between dialogue. But a film can be kept lean and efficient, depending on the desired
visual presentation and the amount of weight the scene should carry in terms of the rest of
the film.

Even though the traditional rules of editing seem to be regularly transgressed in
commercials and music videos, the more substantive productions, especially serious
dramatic productions, seem to generally adhere to some accepted editing guidelines.

1) Edits work best when they are motivated. In making any cut or transition from one shot
to another there is a risk of breaking audience concentration and subtly pulling attention
away from the story or subject matter. When cuts or transitions are motivated by production
content they are more apt to go unnoticed.

2) Whenever possible cut on subject movement.

If cuts are prompted by action, that action will divert attention from the cut, making the
transition more fluid. Small jump cuts are also less noticeable because viewers are caught
up in the action.

For example, if a man is getting out of a chair, you can cut at the midpoint in the action. In
this case some of the action will be included in both shots. In cutting, keep the 30-degree
rule in mind.

3) Keep in mind the strengths and limitations of the medium. Remember that television is a
close-up medium.

An editor must remember that a significant amount of picture detail is lost in video images,
especially in the 525- and 625-line television systems.

The only way to show needed details is through close-ups.


Except for establishing shots designed to momentarily orient the audience to subject
placement, the director and the editor should emphasize medium shots and close-ups.

4) Cut away from the scene the moment the visual statement is made. The pace of a
production rests largely with the editing, although the best editing won't save bad acting or
a script that is boring to start with.

First, keep in mind that audience interest quickly wanes once the essential visual
information is conveyed. Shots with new information stimulate viewer interest.

Shot length is in part dictated by the complexity and familiarity of the subject matter.

5) Emphasize the B-Roll. A great movie is made with cutaways and inserts. In a dramatic
production the B-roll might consist of relevant details (insert shots and cutaway shots) that
add interest and information.

One critical type of cutaway, especially in dramatic productions, is the reaction shot -- a
close-up showing how others are responding to what's going on. Sometimes this is more
telling than holding a shot of the person speaking.

For example, would you rather see a shot of the reporter or the person being interviewed
when the reporter springs the question: "Is it true that you were caught embezzling a million
shillings?" By using strong supplementary footage the amount of information conveyed in a
given interval increases. More information in a shorter time results in an apparent increase
in production tempo.

6) The final editing guideline is: if in doubt, leave it out.

If you don't think that a particular scene adds needed information, leave it out. By including
it, you will probably slow down story development, and maybe even blur the focus of the
production and sidetrack the central message.

Basic Rules in Editing


Some films are cut so that their editing is as seamless as possible. This is what is
known as invisible editing. In this approach, any cut that is abrupt or calls attention to
itself is considered a bad cut. The rules that guide the editing of films have their
genesis in this conventional method. The rules apply not only to editing but also to how
the shooting of a scene is approached. They are as follows:

• There must be a reason for a cut.

This is the great overriding, unbreakable rule. When you cut from one image to
another, you must have a purpose for that choice. It may be to show a response, to
emphasize an action, or to keep shots from being too long or too static.

Compiled by J. K. Githua Mount Kenya University 2015

 The 30-degree rule

The 30-degree rule says that if you want to cut to a closer shot of a subject, the second
camera setup should vary by at least 30 degrees from an axis drawn from the original
camera position to the subject. The bottom line is that you should not move the camera
toward the subject in a straight line; if you do, the possibility of a disagreeable jump
cut is great.
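The geometry behind the rule can be checked numerically. The sketch below is purely illustrative (the function names and floor-plan coordinates are hypothetical, not part of any editing package): it measures the angle between two camera setups as seen from the subject and flags a cut that violates the 30-degree guideline.

```python
import math

def camera_angle_change(subject, cam_a, cam_b):
    """Angle in degrees between the subject-to-camera axes of two setups.

    All positions are (x, y) floor-plan coordinates; the subject is the
    vertex of the angle.
    """
    ax, ay = cam_a[0] - subject[0], cam_a[1] - subject[1]
    bx, by = cam_b[0] - subject[0], cam_b[1] - subject[1]
    dot = ax * bx + ay * by
    mag = math.hypot(ax, ay) * math.hypot(bx, by)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def passes_30_degree_rule(subject, cam_a, cam_b):
    return camera_angle_change(subject, cam_a, cam_b) >= 30.0

# Moving straight toward the subject: zero degrees of change, jump-cut risk.
print(passes_30_degree_rule((0, 0), (10, 0), (5, 0)))   # False
# Swinging around the subject by 45 degrees: safe.
print(passes_30_degree_rule((0, 0), (10, 0), (7, 7)))   # True
```

In other words, a "push in" that also swings around the subject changes the axis enough to motivate the cut, while a straight-line push does not.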

 Emphasis

Cutting on a subject or an action exaggerates the significance of that subject. In essence,
the implicit message is that this subject is important enough to warrant more than a single
perspective. If there is a shot of an object on a table, say a knife, followed by a cut to
a close-up of the knife, the implication is that the knife is an important element.

 Visual interest

Every shot in the film must be visually interesting. If you film a dance rehearsal, for
example, there will be parts in which the dancer is turned away from the camera or is
framed poorly, or parts in which the action is just not engaging. Select the segments in
which what was in front of the camera interrelates with the film frame in a visually
exciting way.

 Variety

Your shots must employ a variety of approaches. Vary between close-ups and long
shots, low angles and eye-level shots, images with different balances of
compositional interest, moving and static camera, and so on. In other words, use the
camera resources available to you.

If a film is composed entirely of long shots, it risks becoming visually dull and
predictable. If the area of interest in all of the compositions is in the same part of the
frame, the same problems can occur. Obviously there are exceptions: several films that
were done largely in long shot have been successful, and some films shot exclusively
in close-up have succeeded as well. But they are exceptions, and they do not represent
the kind of explorations and experiments that provide useful learning experiences for
beginners.

In conclusion, general editing principles refer to a number of factors that influence the
editor's decisions while editing a film, such as transitions, economy, and pace. Apart
from these principles, some basic rules are applied while editing a film. Rules such as
the 30-degree rule, variety, visual interest, emphasis, and the overriding, unbreakable
rule of having a reason for every cut must be adhered to while editing a film.


Editing Interviews
Interviews are almost never run in their entirety. An audience used to short, pithy sound
bites will quickly get bored by answers that wander from the topic, are less than eloquent,
or are simply dull. Explain how you can bridge an interview edit.

In interviews you may shoot ten times more footage than you end up using. It's the job of
the video editor to cut the footage down --

• without leaving out anything important

• without distorting what was said, and

• without abrupt changes in mood, pacing, energy, or rhetorical direction

To start with, cutting a section out of dialogue will normally result in an abrupt and
noticeable jump in the video of the person speaking.

One solution is to insert a three- or four-second cutaway shot over the jump in the video.
This assumes, of course, that you've already made a smooth audio edit between the
segments.

These cutaways, which are typically done in editing with an insert edit, are often reaction
shots ("noddies") of the interviewer.
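Conceptually, an insert edit of this kind replaces only a span of the video track while the audio continues uninterrupted. A minimal sketch of the idea, with hypothetical frame labels standing in for real clips:

```python
def insert_edit(video_track, cutaway, start):
    """Replace a span of the video track with cutaway frames.

    The audio track is left untouched, so the (already smooth) audio
    edit plays on while the cutaway hides the visual jump cut.
    """
    out = list(video_track)
    out[start:start + len(cutaway)] = cutaway
    return out

# A-roll frames around an audio edit point; "JUMP" marks the visual jump.
a_roll = ["A1", "A2", "JUMP", "A4", "A5", "A6"]
noddy = ["N1", "N2", "N3"]              # interviewer reaction shot
print(insert_edit(a_roll, noddy, 1))    # the jump cut is now covered
```

The three-frame cutaway laid over positions 1 through 3 hides the "JUMP" frame entirely, which is exactly what a three- or four-second noddy does over a dialogue edit.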

If videotape is being used, these cutaway shots are typically from a separate videotape (a B-
roll) as opposed to the recording of the interview answers (the A-roll). In linear editing
having two separate video sources (an A-roll and a B-roll) can make editing easier.

With nonlinear editing everything can be recorded on a hard disk or solid-state memory
card and the segments can be instantly accessed from a single source.

Even so, the supplementary footage is commonly referred to as B-roll footage. Editors
depend greatly on this supplementary B-roll footage to bridge a wide range of editing
problems. Therefore, you should always take the time to record a variety of B-roll shots on
every interview -- insert shots, cutaways, whatever you can get that might be useful during
editing.

Another (and somewhat less than elegant) way of handling the jump cut associated with
editing together nonsequential segments of an interview is to use an effect such as a
dissolve between the segments. This makes it obvious to an audience that segments have
been cut out, and it smooths out the "jump."


The advantages of Digital Editing with a Video Server


Once video editing becomes totally digital with equipment that can handle video with
minimal compression, there will be no need for the traditional on-line and off-line editing
phases -- it can all be done on-line.

Digital recordings can be made in the studio or on location and uploaded (transferred)
directly to an editing computer or video server for editing. Once this transfer is made, there
will be no danger of tape damage in editors, no matter how many times the footage is
previewed. (Digital information stored on a computer disk does not gradually degrade with
repeated access the way it does when it's recorded on videotape.)

When a video server is used, the original footage can be viewed and edited by anyone with
a computer link to the server. This is generally someone within the production facility; but,
thanks to high-speed Internet connections, it could even be someone in another city, or even
in another country. Animation and special-effects projects, which are labor intensive, are
often electronically transferred to countries where labor is less expensive.

The latest non-linear editors have many features that both speed up and improve video and
audio editing. For example, some editors can "read" or understand the spoken dialogue in
video footage and match it up with a written script or with words you type in. If you
happen to have hours of video footage and are looking for the point where someone said,
"Eureka, I found it," the editing system can search through the footage and cue up the part
of the video where that phrase is spoken.

Another useful feature is image stabilization. Let's assume you have some shaky footage --
possibly involving a moving vehicle. The first thing you do is freeze the beginning of the
footage on the screen. Then you find a clearly defined object near the center of the scene
and draw a box around it. This becomes an anchor point reference. The whole image is
then slightly enlarged to give the process "working room." Once you roll the footage, the
editor holds the selected area still, eliminating the shake and movement in the original
scene.
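Under the hood, this kind of stabilization amounts to finding the anchor patch in each new frame and shifting the frame by the opposite offset. The following is a deliberately simplified, brute-force sketch of that idea (real stabilizers are far more sophisticated); frames are represented as small grids of pixel intensities, and all names are hypothetical:

```python
def find_offset(ref_patch, frame):
    """Locate ref_patch in frame by brute-force sum of squared differences.

    frame and ref_patch are 2-D lists of pixel intensities; returns the
    (row, col) of the best match's top-left corner.
    """
    ph, pw = len(ref_patch), len(ref_patch[0])
    best, best_pos = None, (0, 0)
    for r in range(len(frame) - ph + 1):
        for c in range(len(frame[0]) - pw + 1):
            ssd = sum((frame[r + i][c + j] - ref_patch[i][j]) ** 2
                      for i in range(ph) for j in range(pw))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def stabilize(frames, anchor_row, anchor_col, ph, pw):
    """Compute the shift needed to hold the anchor patch where it started."""
    patch = [row[anchor_col:anchor_col + pw]
             for row in frames[0][anchor_row:anchor_row + ph]]
    shifts = []
    for frame in frames[1:]:
        r, c = find_offset(patch, frame)
        shifts.append((anchor_row - r, anchor_col - c))  # correction to apply
    return shifts

f0 = [[0] * 5 for _ in range(5)]
f1 = [[0] * 5 for _ in range(5)]
for (r, c), v in {(0, 0): 9, (0, 1): 8, (1, 0): 7, (1, 1): 6}.items():
    f0[1 + r][1 + c] = v      # anchor patch at (1, 1) in the first frame
    f1[2 + r][2 + c] = v      # same patch drifted to (2, 2) in the next
print(stabilize([f0, f1], 1, 1, 2, 2))   # [(-1, -1)]: shift back up-left
```

The enlargement step mentioned above exists precisely because these corrective shifts would otherwise expose empty pixels at the frame edges.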

Six quick tips for file server editing


A savvy editor can take the same script, footage, and on-camera performances and subtly
or even dramatically change the meaning of a video piece. So, in a sense, editing has the
potential of being the most creative phase of the production process. With the move to
tapeless production well underway, this is an area that will see major changes in the next
few years.

1. Although you may want to shoot everything on location that you think you could
possibly use, when it comes to uploading or capturing this footage on a file server or
computer hard disk, you will want to use a bit of discretion.


After reviewing the footage and making a rough paper-and-pencil edit, upload only the
footage that you are reasonably certain you will use. Not only does excess footage take up
valuable hard drive space, but wading through this footage during editing adds
considerable time to the editing process.

2. After the footage is uploaded, trim the front and back ends of the segments to get rid of
anything you're not going to use. This will also speed up editing and reduce storage space;
plus, it will make the clips easier to identify on the editing screen.

3. Once this is done (#2 above), look for connections between segments; specifically, how
one segment you are considering will end and another will start. Look for ways to make
scenes flow together without jarring jump cuts in the action, composition, or technical
continuity.

4. Find appropriate cutaways. In addition to enhancing the main theme or story they should
add visual variety and contribute to the visual pace.

5. Use transition effects sparingly. Although some editing programs feature 101 ways to
move from one video source to another, professionals know that fancy transitions can be
distracting and can get in the way of the message — not to mention looking pretentious.

6. Use music creatively and appropriately. "Silence" is generally distracting, causing
viewers to wonder what's wrong with the sound. Finding music that supports the video
without calling attention to itself (unless, of course, it's a music video) can be a major task
in itself. If you have a bit of talent in the music area, you might consider a do-it-yourself
approach to electronically composing your music. The Fruity Loops and Sonic Desk Smart
Sound programs, among others, will not only give you full control of your music but will
also eliminate copyright problems. Sometimes simple music effects will be all that you
need.

Mixing and uses of basic effects in Editing


There are some terminologies that are referred to as effects in the process of editing a
film. Many of these effects can be created in camera, but the norm is to plan them in
editing and have them executed by the lab in the final print.

Mixing and effects are used to beautify a production. They fill the programme with colour
through animations, graphics, windows, brackets, and effects such as zoom out, zoom
in, page turn, dip to black, fade in, fade out, dissolve, cross fade, wipe, and swap.
Colour toning, titles, end credits, breaks, bumpers, scrolls, strips, and superimpositions
(names, callers, phone numbers, email and website addresses) are also done in mixing,
and promos and recaps are prepared as well.

Music is also adjusted in audio mixing by keeping it in the foreground, midground, or
background. The choice of music is exercised here, and sound levelling is done.

Effects in film editing


There are some actions taken with the aid of a camera, computer, optical printer,
etc., that aim at achieving particular effects in the film. These effects, as discussed
by Kogah (1999) and Mamer (2009), are enumerated as follows:

 Fade-out and fade-in.

A fade-out is simply an effect in which the scene is gradually taken out or the picture
fades to black. It is usually followed by a fade-in, during which a new scene gradually
becomes bright enough to be seen clearly. Fade-outs and fade-ins are used as transitional
devices, either to get from one location to another or to signify the passage of time.
Occasionally, filmmakers fade to shades and colours other than black.

 Dissolve

This is a technique in which one shot is faded out while the next shot is faded in on top
of it. In this process, the screen is not completely dark as one scene replaces another.
Like the fade-out, this is used to signify a change of time or place. The dissolve is not
used frequently because it is mainly used to soften an otherwise terrible shot.
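Numerically, a dissolve is a weighted average of the outgoing and incoming frames, with the weight moving from one frame to the other over the length of the effect. A minimal sketch, treating each frame as a flat list of pixel values (the names are illustrative only):

```python
def dissolve(frame_a, frame_b, t):
    """Blend two frames: t=0.0 gives frame_a, t=1.0 gives frame_b.

    Frames are flat lists of pixel intensities of equal length.
    """
    return [round((1.0 - t) * a + t * b) for a, b in zip(frame_a, frame_b)]

def dissolve_sequence(frame_a, frame_b, n_frames):
    """Generate the n_frames of a dissolve, from pure A to pure B."""
    return [dissolve(frame_a, frame_b, i / (n_frames - 1))
            for i in range(n_frames)]

mids = dissolve_sequence([0, 0, 0], [100, 200, 50], 5)
print(mids[0])   # [0, 0, 0]      -> pure outgoing shot
print(mids[2])   # [50, 100, 25]  -> halfway through the dissolve
print(mids[4])   # [100, 200, 50] -> pure incoming shot
```

A fade-out is the special case where the incoming "frame" is all black, and a fade-in the reverse.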

 Superimposition

This is composed of one shot overlaid on another. It can be achieved in the camera
while shooting or, more commonly, during the editing and final printing process.

 Optical effect

These refer to all graphic effects created in the lab. Optical effects include split screens,
keyholes, freeze frames, spins, wipes, etc. They were produced on an optical printer
prior to the digital age. The optical printer is a projector that has a camera shooting
straight into it. Both the projector and camera can be advanced one frame at a time.
The camera can also be repositioned to focus on specific parts of the projected frame.
The projected image can be manipulated in terms of both coloration and the speed of the
film going through the gate. The camera and projector can be controlled separately in
order to allow frames to be repeated, skipped, run in reverse, or held for many frames, an
effect known as freeze-frame. The optical printer can equally be used to enlarge or
reduce a particular film gauge, depending on the desire of the filmmaker. Digital
nonlinear editing (NLE) has made it possible to achieve any such effect and to
completely eliminate the generation problem, but the optical printer is still used in
creating many visual effects and can be an exciting tool for beginning filmmakers.

Distinguishing Optical Sound Track from Magnetic Sound Track


The optical sound track is produced photographically and used on all standard prints. At
one edge of the film, a close observation will reveal some wavy lines or variations in
the density of the film strip, depending on the particular process that was used when the
film was manufactured.

Inside the projector is a sound head which has a photoelectric cell and an exciter
lamp. The film is threaded in such a way that the optical sound track passes between
the exciter lamp and the photo cell. The light falling on the cell varies according to
the pattern on the sound track, thereby creating an electric current which, when
amplified, reproduces the original sound.

The magnetic sound track, on the other hand, uses a specially produced film stock with a
sound strip running along the edge of the film. The sound strip consists of the same
iron oxide particles used in audio recording. The recording and reproduction
processes are identical with those used for audio tape.

A tiny recording head inside the camera selectively magnetizes particles on the
sound track during filming. Inside the projector, an identical head reads the track as
the film runs, generating a tiny electrical signal which, when amplified, is an exact
reproduction of the originally recorded sound, faithful to the original.

In conclusion, effects can be created in the camera, but the norm is to plan them in editing
and have them executed by the lab in the final print. The most commonly created effects
are fade-ins and fade-outs, dissolves, superimpositions, and optical effects. There is a
difference between the optical sound track and the magnetic sound track: while the
optical sound track is produced photographically and used on all standard prints, the
magnetic sound track uses a specially produced film stock with a sound strip running
along one edge of the film.


Editing Sound
Four Essential Audio Elements
A good sound track is the key to a great video. Not sure you agree? Try this: pop a movie you've never
seen into your DVD player. Start the movie, but play it with your eyes closed. Just listen to the
soundtrack. No peeking!

As you listen, notice how much of the plot you can follow using only your ears. Notice how the ambient
audio (including some subtle sound effects) indicates where the scene takes place, whether in a busy
office, a quiet cottage, a nuclear submarine, or wherever the action is set. Listen to the way that the music
sets the mood. Note whether the scene is tense or tender. Is the pace of the scene frantic or relaxed?
Notice how the dialogue introduces the key characters and identifies important plot themes. It won't take
long to figure out that the bulk of the story enters your brain through your ears, not your eyes. Next, take
off your blindfold, press the Mute button on your remote, and watch the visuals without listening to the
accompanying audio. Before long, you'll likely be lost. Without the soundtrack, it can be difficult to tell
what's going on. Who's on the other end of that phone call? Did the leading lady turn because she heard a
gunshot, the doorbell or a baby crying? Who is the guy that just walked in? And why are they suddenly
on an airplane? OK, OK! Now that you're convinced of the importance of audio, un-mute the movie and
rewind it so you know what's happening!

There are four important elements that come together to make a complete soundtrack, and every editor
should know them. They are natural sound, music, narration and sound effects. When they are used
together effectively, the viewer won't notice them. The best audio edits are smooth and subtle, and do not
call undue attention to themselves. When the mix is off, the viewer will notice. Crackling audio, music
that's too loud or too soft, and fake-sounding audio effects will distract the viewer from the message of
your movie. When it comes to watching video, hearing really is believing.

Let's take a closer look at each of the four essential audio elements that we have at our disposal.

Element #1 - Nat Sound

Nat sound (also known as "natural sound" or "wild sound") includes any audio recorded
along with the video that you shoot. This includes ambient audio. When you record a
scene on a bench in a city park, the sound of children laughing and playing in the background is part of
the nat sound in that environment. In the context of this article, we'll also consider on-camera dialogue
that is recorded along with the visuals to be natural sound.

Nat sound is often the only audio that amateur videographers include in their videos. While home videos
typically consist entirely of this kind of audio, Hollywood producers go to great lengths to avoid using it
altogether. The audio recorded with the images that are shot for most feature films is usually used only as
a guideline by a team (or multiple teams) of people that re-creates every sound and every word of
dialogue in a scene. Foley artists create and record everything from footsteps to keyboard clicks as
separate sound elements that can be mixed together with great precision and control. While all of this
may be unrealistic for the videos that you produce, there is an important principle to learn: if you want


your videos to sound more like professional productions, you'll have to spend some time editing your
audio.

If you record dramas, documentaries, events, interviews or instructional videos, Nat sound is the
foundation of your soundtrack. It's critical that you start with the highest quality audio possible-this
means that your camcorder's built-in microphone is not the best choice. An external microphone and a
good pair of headphones are essential.

Element #2 - Music

Simply adding a track of background music can greatly improve your videos.
Music has great power to impact your viewers emotionally, and the pros use it all the time to add zing to
a scene. Listen carefully to the music tracks that accompany the programs that you watch on TV tonight.

Music often creeps in quietly-unnoticed by the viewer-then builds as emotions heighten. Want to tell
your audience how to feel? Use music. Some of the most suspenseful movies of all time are known by
their music tracks (Jaws and Psycho, for instance). The anticipation at the sound of the music in these
movies could scare an audience out of its seat.

In the same way that it can build tension and fear, music can build joy or excitement. Imagine how the
music would swell (along with the hearts of the viewers) as a hero triumphantly emerged from a
smoldering building with a child in his arms.

You can change the entire feel of a scene by simply changing the music track. The same sequence of
shots can feel spooky, silly or sad, depending on the music that you select. To see for yourself, shoot this
short sequence, edit it together, then play it back with several different music tracks.

Shot 1 - (medium shot) A man sits in a chair reading a newspaper.
Shot 2 - (closeup) He lowers the paper and tilts his head as he strains to hear something (off camera).
Shot 3 - (medium shot) With a puzzled look on his face, he folds the paper, lays it on the table, stands and
exits the frame.
Notice how the music you select changes the scene, even though the actions on screen are the same.
Scary music makes the viewer think something is amiss. Carefree music raises no such suspicion.
Because it has such great impact on the emotional response of your audience, music selection is an
important part of video production. Used well, music has the power to turn ordinary occurrences into
extraordinary experiences.

Background music is an easy way to add professionalism to a video with dialogue or narration.
Background music should be mixed low, so as not to interfere with the words that are spoken.

In other productions, music may be the only audio track. Music montages without Nat sound can be
particularly moving if you make wedding or event videos. You'll find that it helps to lay the song on your
timeline, and then edit your footage to the music.

Element #3 - Sound Effects

These days, sound effects are more than just gunshots and explosions.
As we said, most of the sounds you hear when you watch a feature film were created in a studio and

edited into the production. You can use subtle sound effects to enhance your videos. Remember that
interview in the park? Adding the sound of a rolling stream or some chirping birds can enhance the
pleasant feeling of the setting.

Look again at the three shot sequence where the reader hears a sound before leaving the room. Just what
did he hear? A single CD of sound effects provides a variety of options to consider. Capture a few
sounds, then edit them into your sequence. Did he hear a scream, a knock at the door, a police siren?
Maybe it was a dog barking, the phone ringing or the sound of a marching band. Try a few of them, then
revisit the music track. If you selected a dramatic sound like a gun or car crash, pick a suspenseful music
track to coincide. You get the idea.

Sound effect CDs are easy to find, and provide a number of options for just a few dollars. A sound effect
CD is a good investment for a videographer, especially a CD that includes natural sounds. However, look
closely at the contents of the disc before you buy. You may not have much use for a collection of
carnival sounds.

Element #4 - Narration

If you produce documentaries, travel video, personal histories, instructional
videos or any type of video that seeks to explain a procedure or tell a story, narration is invaluable. The
best narration is well-scripted and planned to match the visuals in your production, not off-the-cuff
rambling. If possible, it's a good idea to script out your narration before you shoot, then gather the shots
that you need to match. If you cannot script the narration before you shoot, at least do it before you begin
editing. It is much easier to lay the narration on the timeline and edit your footage to the words than it is
to write a tight narration that matches your edited video.

Make an effort to use all four of these essential audio elements in your production. Used together, or in a
variety of different combinations, Nat sound, music, sound effects and narration are powerful tools that
can greatly improve your productions.

Mixing and editing audio is easiest with a timeline-based editing program that offers at least four audio
tracks. The timeline allows you to position sounds in relation to the visuals, and other audio clips, with
frame accuracy.
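The mixing step itself reduces to summing the tracks sample by sample after applying a gain to each, with background music given a low gain as advised above. A simplified sketch (the track names and gain values here are illustrative only, not standard settings):

```python
def mix_tracks(tracks, gains):
    """Sum equal-length sample lists after applying a per-track gain.

    tracks: dict of name -> list of samples; gains: dict of name -> multiplier.
    """
    n = len(next(iter(tracks.values())))
    mixed = [0.0] * n
    for name, samples in tracks.items():
        g = gains.get(name, 1.0)
        for i, s in enumerate(samples):
            mixed[i] += g * s
    # Clip to the legal range for 16-bit audio samples.
    return [max(-32768, min(32767, round(v))) for v in mixed]

tracks = {
    "nat":       [1000, 1200, 900, 1100],
    "narration": [4000, 4200, 4100, 3900],
    "music":     [3000, 3000, 3000, 3000],
    "sfx":       [0, 0, 2000, 0],
}
# Background music ducked well below the spoken word.
gains = {"nat": 0.8, "narration": 1.0, "music": 0.2, "sfx": 0.9}
print(mix_tracks(tracks, gains))
```

Four named tracks plus per-track gain is exactly what a four-track timeline gives you interactively; the clipping step is why an overloaded mix crackles.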

The best way to evaluate the mix of your audio is to play the scene back on the worst television you can
find. If the mix sounds good from its speakers, it will sound good wherever you play it.

Please answer the following questions in detail:

1) What is natural sound? What is essential for recording natural sound?

2) What are some ways background music can affect your productions?

3) How can sound effects complement and enhance your productions?


Analog and Digital Audio


Audio is of two types, analog and digital. When you speak, the sound you create is in the analog or wave
format. The sound that you hear is also in analog format. Computers are basically digital devices.
Therefore, it is necessary to convert audio from the analog format to the digital format before using it on
a computer.

Digital audio technology allows you to record, edit, and play digital audio files on a computer. Advanced
digital audio technology also lets you communicate with the computer by just speaking. This lesson
introduces you to digital audio technology. It also briefly describes the concepts of copying and
converting digital audio.
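The analog-to-digital conversion mentioned above works by sampling the sound wave at a fixed rate and quantizing each sample to an integer. A toy sketch of the idea (the function names are hypothetical; a real ADC does this in hardware):

```python
import math

def digitize(signal, sample_rate, duration, bit_depth=16):
    """Sample a continuous (analog) signal and quantize each sample.

    signal: a function of time in seconds, returning values in [-1.0, 1.0].
    Returns integer sample values, as an analog-to-digital converter would.
    """
    levels = 2 ** (bit_depth - 1) - 1           # e.g. 32767 for 16-bit audio
    n = int(sample_rate * duration)
    return [round(signal(i / sample_rate) * levels) for i in range(n)]

# A 440 Hz tone ("concert A") captured for 10 ms at CD quality (44.1 kHz).
tone = lambda t: math.sin(2 * math.pi * 440 * t)
samples = digitize(tone, 44100, 0.01)
print(len(samples))        # 441 samples for 10 ms of audio
```

The sample rate and bit depth chosen here (44.1 kHz, 16-bit) are the familiar CD-quality values; higher values capture the wave more faithfully at the cost of larger files.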

Characteristics of Digital Audio


One of the important characteristics of digital audio is that it can be compressed. Audio files are
generally large. Compressed audio files save space, allow portability, and are easier to transfer over the
Internet. When you compress audio files, however, the quality of the audio is generally affected.

Another important characteristic of digital audio is that it can be edited on a computer by using audio
editing software. For example, you can use this software to move sections within an audio file or add
effects. You can also use audio editing software to store an audio file in different formats on the
computer.

Audio File Formats


Some of the common formats that you need to be familiar with are:

Windows Media® Audio (WMA): This format was developed by Microsoft and is used to store digital
audio files.

Wave (WAV): This format was developed by Microsoft and IBM as a universal sound file format for
Windows. It is used to store audio files in the wave audio format. Audio files stored in this format have
good audio quality, but the format is used sparingly these days because audio files in this format are
larger than those in other formats.

MPEG Audio Layer 3 (MP3): This format was developed by the Moving Picture Experts Group to allow
compression of audio and video for digital distribution. The MP3 format is a popular format for storing
digital audio files because MP3 files are generally much smaller than WAV files.
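The size difference between WAV and MP3 follows directly from arithmetic: uncompressed PCM size depends on sample rate, bit depth, and channel count, while MP3 size depends only on bit rate and duration. A quick back-of-the-envelope calculation (the 128 kbps figure is just a common example rate):

```python
def wav_size_bytes(sample_rate, bit_depth, channels, seconds):
    """Uncompressed PCM (WAV) audio size, ignoring the small file header."""
    return sample_rate * (bit_depth // 8) * channels * seconds

def mp3_size_bytes(bitrate_kbps, seconds):
    """MP3 size at a constant bit rate, ignoring metadata tags."""
    return bitrate_kbps * 1000 // 8 * seconds

minute_wav = wav_size_bytes(44100, 16, 2, 60)   # CD-quality stereo
minute_mp3 = mp3_size_bytes(128, 60)            # typical 128 kbps MP3
print(minute_wav)                  # 10584000 bytes, roughly 10 MB
print(minute_mp3)                  # 960000 bytes, roughly 1 MB
print(minute_wav // minute_mp3)    # about 11 times smaller
```

This order-of-magnitude saving is why MP3 became the dominant format for distribution while WAV remained a production format.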

Audio Streaming
Digital audio also allows streaming of digital audio files. With audio streaming, you do not have to wait
to completely download a large audio file from the Internet to play it. Instead, you can use a streaming
audio player or a browser plug-in to play the audio file from the Internet. In this case, the audio file is
sent to your computer in a continuous stream.


Recording, Copying, and Converting Digital Audio


 The technique of recording and storing audio files in a digital format is called digital recording.

 You can then copy the stored audio files to storage devices, such as recordable CDs and DVDs, in
various formats, such as WAV and MP3.

 You can also convert audio from a CD or a DVD to a different format before you store it on your
computer’s hard disk.

Copying Audio (Hardware)


You can copy audio from storage devices, such as a computer’s hard disk, and store it on a recordable
CD or DVD. This process of copying audio to a recordable CD or DVD is referred to as burning. You
need a special hardware device, such as a CD writer or a DVD writer, to copy audio to recordable CDs or
DVDs. A CD writer allows you to copy audio only to recordable CDs whereas most DVD writers allow
you to copy audio to recordable CDs and DVDs.

Copying Audio (Software)


Along with hardware, you also need software to copy audio to a recordable CD or DVD. You can use the
software to create different types of CDs. You can create data CDs, audio CDs, and mixed mode CDs.
Mixed mode CDs contain audio, video, and data files. For example, you can create an audio CD, and
include video files and text files on it to create a mixed mode CD.

Converting Audio
You must have audio conversion software, such as Microsoft Windows Media Player, installed on your
computer to convert audio files. The software changes the format of the audio and might also compress
the audio so that the files take up less space on the hard disk. You can then transfer these audio files from
the computer to a portable device, such as a PDA or a cell phone.

Warning: Converting audio from CDs and DVDs to a different format may be illegal. Ensure that you have
permission to convert audio to a different format before converting it.

Procedures involved in sound editing


The following procedures are involved in sound editing to come up with a recorded radio program:

1) Sound selection
2) Volume controls
3) Pitch and tone controls
4) Control of distractive interference
5) Sound juxtaposition
6) Maintaining consistency in sound levels
7) Selection of words or music
8) Deliberate sound omissions


Audio Editing technical terms:


1) Fade In

This is the systematic incremental introduction of music or a certain SFX to a given scene. The volume
of the new sound is increased gradually to the appropriate levels.

2) Fade Out

This is the systematic decremental removal of music or a certain SFX from a given scene. The volume of
the sound is decreased gradually to the minimum.

3) Cross Fade

This is the systematic introduction of a sound effect by gradually increasing its volume while gradually
removing another, i.e. fading in a new sound effect while fading out another.

4) Segue to

It is the systematic ending of a given sound and introducing another immediately.

5) Fade under

This involves the systematic reduction of volume to a lower level, at a selected scene. The sound is
allowed to be audible as a background SFX.

6) Sound element

It refers to sound or dialogue recorded either exclusively for radio or in sync with pictures for television.

7) Signature tune

This is a song or tune used to signify the beginning or commencement of a given program.

8) Narration

These are the actual words spoken by the presenter directly to the audience. It normally contains the
storyline of the programme.

9) Dialogue

This refers to the words spoken when two or more persons are involved in a conversation.

10) Ambience

This refers to the sound effects from surroundings. Such may include cheers, car horns, and wind
blowing sounds.
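Several of the terms above (fade in, fade out, cross fade, segue) describe simple volume envelopes applied to the audio samples. A minimal sketch of how an editor might compute them with linear ramps (all names are illustrative; real editors offer curved envelopes as well):

```python
def fade_in(samples, fade_len):
    """Linearly ramp the first fade_len samples up from silence."""
    out = list(samples)
    for i in range(min(fade_len, len(out))):
        out[i] *= i / fade_len
    return out

def fade_out(samples, fade_len):
    """Linearly ramp the last fade_len samples down to silence."""
    out = list(samples)
    n = len(out)
    for i in range(max(0, n - fade_len), n):
        out[i] *= (n - i - 1) / fade_len
    return out

def cross_fade(a, b, fade_len):
    """Fade out the tail of a while fading in the head of b, overlapped."""
    a, b = fade_out(a, fade_len), fade_in(b, fade_len)
    overlap = [x + y for x, y in zip(a[-fade_len:], b[:fade_len])]
    return a[:-fade_len] + overlap + b[fade_len:]

def segue(a, b):
    """End one sound and introduce another immediately: a simple butt edit."""
    return a + b

mixed = cross_fade([1.0] * 4, [1.0] * 4, 2)
print(mixed)   # [1.0, 1.0, 0.5, 0.5, 1.0, 1.0]
```

A fade under is the same ramp idea, except the gain settles at a low background level instead of reaching zero.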


VIDEO #1 EDIT DECISION LIST (FOR VIDEO/AUDIO EDITING):

Student/Shot | Time code in | Time code out | Audio cue in | Audio cue out
Clara | :13 | :14 | Fall | Ball
Clara | 1:26 | 1:27 | 8 am | Is really early
Clara | 1:37 | 1:40 | Going up the Hill | Longer than expected
Linnea & Jennifer | 3:19 | 3:28 | I love going around | Up to various locations
Linnea & Jennifer | 3:28 | 3:35 | I just love | All the colorful trees
Sonia & David | 5:31 | 5:32 | The | Leaves
Sonia & David | 7:06 | 7:13 | It's impossible | Unless you're a girl
Matt | 8:06 | 8:11 | Swear to God | Ben Affleck
Matt | 8:28 | 8:30 | This campus | Is amazing
Roman | 9:25 | 9:26 | Homecoming |
Stanley | 10:55 | 10:58 | All the students | Coming back
Stanley | 11:13 | 11:16 | Stanley | It's all gonna be okay
Rezwan | 12:00 | 12:10 | My best fall memory | An amazing job
Fahmil | 15:00 | 15:08 | I just remember my | We're from Boston
Rebecca | 15:52 | 16:03 | Seeing all the | Of our community
Andrew, Alan, & Sam | 17:13 | 17:23 | The thing I like | Old memories
Andrew, Alan, & Sam | 17:46 | 17:47 | Free | Food
Ricky | 19:15 | 19:19 | Fall at Tufts | To go to school
Ricky | 20:02 | 20:10 | I sat in the class there | What I was doing
Nick | 21:30 | 21:34 | I actually met | My best friends
Russell, Sterling, & Rayna | 23:15 | 23:17 | A lot of free food | Always good
Russell, Sterling, & Rayna | 23:30 | 23:42 | I like when the leaves | Playing Ultimate
Russell, Sterling, & Rayna | 24:07 | 24:25 | I did Pre-Orientation | So much in common
Dominique | 24:48 | 25:07 | Being from California | Really exciting for me
Dominique | 25:10 | 25:21 | I did not visit | Never left home
A.J. | 27:15 | 27:19 | I do remember my | On the sidelines
Logan & Lindsey | 28:58 | 29:03 | I love seeing everyone | Really exciting
Logan & Lindsey | 29:03 | 29:11 | I love Halloween | Around campus
Mae-ling | 29:42 | 29:53 | The sun's out | Great to be back
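An EDL like the one above is essentially structured data, and an editor (or a script) can sort and search it by timecode. The sketch below is illustrative only; the field names and the `to_seconds` helper are assumptions, not part of any real EDL format:

```python
from dataclasses import dataclass

def to_seconds(tc):
    """Convert a 'M:SS' or ':SS' cue time (as used in the table) to seconds."""
    minutes, _, seconds = tc.rpartition(":")
    return int(minutes or 0) * 60 + int(seconds)

@dataclass
class EdlEntry:
    shot: str       # student/shot name
    tc_in: str      # time code in
    tc_out: str     # time code out
    cue_in: str     # first words of the audio cue
    cue_out: str    # last words of the audio cue

# Two rows from the table above, sorted into timeline order.
entries = [
    EdlEntry("Matt", "8:06", "8:11", "Swear to God", "Ben Affleck"),
    EdlEntry("Clara", "1:26", "1:27", "8 am", "Is really early"),
]
entries.sort(key=lambda e: to_seconds(e.tc_in))
```

Sorting by the in point reproduces the assembly order the editor will follow.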


A sample editing script for a video project, containing all of the audio and video information to be presented in the program.

TITLE: MOVING PICTURES          DRAFT NO: 004          PAGE NO: 1/1
PRODUCER: MOSHY-WORLD

VIDEO                                           AUDIO
Copyright Warning
CU aerial shot                                  MUSIC UP over CU aerial shot
Key MP Production & Services
POC at Vesuvius intro
Fire truck in parade
Wiring, Ill Bell Champaign office               MUSIC UNDER
WS of special "easy to ride" bicycle at
Parkland College
"Time Pinnacle" laser show                      NARRATOR: MOVING PICTURES is a
WS Skylights in Village Market Mall             television production company.
MP at Business Expo                             We make television programs for business and
Danville Village Mall                           government.
Key over stack of labeled tapes:                We also provide a full range of services related
  Tape Duplication                              to making and showing television programs,
  Facilities Design for                         from tape duplication to consulting to
  both Production and Playback                  designing and building production and
                                                playback facilities.
POC in edit suite                               POC: At MOVING PICTURES we make
                                                pictures that move; images on videotape that
                                                move in the literal sense, but also move your
                                                audience to understanding or to action. You
                                                have a number of ways to communicate with
                                                your various audiences;


Glossary of Terms for Desktop Compositing, Editing and Animation

Non-Linear Editing Film and video recorded to a hard disk and edited using a computer
(Digital Editing) showing a timeline and utilizing random access to the material.

Linear Editing Film and video assembled and cut in linear fashion i.e. film and tape reels
that are physically wound around a core with access limited by winding and
rewinding.

On-Line Broadcast-quality editing and output using greater computing power, hi-resolution masters, larger storage capacity and more robust visual effects.

Off-Line Non-broadcast-quality editing that uses video compression, less computing power, limited storage capacity and limited titling and effects. Outputs are generally at a low resolution, but an EDL can be created to take to an on-line facility for re-assembly at hi-res.

Digitize (Capture) Transfer of videotape to a disc-based computer system. This can be automated using a telecine log or transferred "on the fly" if logs don't exist.

E.D.L. (Edit Decision List) Sometimes referred to as a paper list, but usually a digital file specially formatted to reflect all the audio and video cuts and some visual effects and transitions.

False Edit An edit in the timeline which separates or cuts between two continuous frames. Sometimes used to key in effects or transitions.

Jump Cut An edit where one or many frames are missing causing the subject matter
to "jump" in time or space.

Continuity Series of cuts that form an entire sequence usually in a specific order
according to script or story logic. Since films are often shot out of sequence
it is up to the editor to follow the continuity.

Assemble Edit Linear editing term which refers to material recorded to tape in its entirety
without stopping while adding timecode at the same time.

Insert Edit Linear editing term where video and audio can be re-recorded in sections
on an assembled edit using the original pre-recorded timecode.
Also known as "punching in".


Time Code A time code is a sequence of numeric codes generated at regular intervals by a timing system. Time codes are used extensively for synchronization and for logging material in recorded media.
 Hours:Minutes:Seconds:Frames
 00:01:58:11
Drop Frame T.C. matches clock time exactly, which is why television shows always use it. This is done by skipping the count of the first two frame numbers of every minute, except every tenth minute. The frames themselves still exist: only their numbers are skipped.

Non-Drop T.C. is not clock time: because NTSC video actually runs at 29.97 fps rather than an even 30 fps, a non-drop count runs slightly longer than real time. If you are not concerned with the exact running length of your project, this is the simpler one to use.
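The drop-frame counting rule can be sketched in code. This is an illustrative conversion from a 0-based frame count to a drop-frame timecode label (assuming NTSC's 30 fps numbering base; the function name is an assumption, not a standard API):

```python
def df_timecode(frame):
    """Label a 0-based frame count with 29.97 fps drop-frame timecode.

    Frame numbers 00 and 01 of each minute are skipped, except every
    tenth minute, so the label tracks real clock time.
    """
    # 17982 frames elapse per 10 minutes; 1798 per dropped minute.
    d, m = divmod(frame, 17982)
    if m < 2:
        m = 2  # guard: the first two numbers of the minute are dropped
    frame += 18 * d + 2 * ((m - 2) // 1798)
    ff = frame % 30
    ss = frame // 30 % 60
    mm = frame // 1800 % 60
    hh = frame // 108000 % 24
    # Drop-frame is conventionally written with a semicolon before frames.
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"
```

For example, the first frame of minute one is labelled ;02 because numbers ;00 and ;01 were skipped.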

Log A database used to keep notes and manage separate clips and source
reels using timecode and slate information. A log book is essential for
every project

Visual Effects (Transitions) Dissolve, Fade, Wipe, Key, Super, Title, Speed Change, Blur/De-focus, Filter, Stutter, Re-Position.

Audio Effects (Transitions) Gain, EQ, Reverb, Echo, Flange, Futz, Phase, Fade, Cross Fade, Pan.
The above are samples of the most popular types of transitions, which can be used solo or in combination using false edits and/or key frames.

Key Frame A key frame is the beginning, middle or end marker of an effect. The
key frame also contains the information or values of the specific effect. A
minimum of two keyframes is necessary to trigger a transition or change in
values over time.
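The idea that two keyframes define a change in values over time can be shown with a small interpolation sketch (illustrative only; real systems also offer eased and spline curves, and these names are assumptions):

```python
def interpolate(key0, key1, frame):
    """Linearly interpolate an effect value between two keyframes.

    Each keyframe is a (frame_number, value) pair; `frame` is a frame
    between them (assumed key0's frame < key1's frame).
    """
    f0, v0 = key0
    f1, v1 = key1
    t = (frame - f0) / (f1 - f0)  # 0.0 at key0, 1.0 at key1
    return v0 + (v1 - v0) * t
```

With only one keyframe there is nothing to interpolate between, which is why a minimum of two is needed to trigger a transition.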

Black & Code Video tapes are sometimes pre-formatted by first laying down a black
signal and timecode. New tapes (rawstock) as a rule must be Blacked &
Coded (B & C) before an insert edit can take place.

Bars & Tone Prior to a program being recorded to tape, SMPTE Color Bars & Tone must be placed at the head of the show as a reference to give the dubbing operator something to "line up" to. Audio should be a continuous tone of -18 dB; 30 seconds to 1 minute is preferred.


Tape Formats Digital: D-5, HD Cam, DigiBeta, HDV, DVCPro, DVCam, Mini DV. Analog: Beta SP, 3/4", Hi 8, VHS.

Broadcast Standards NTSC (U.S., Japan, Latin America), PAL (most of Europe, Australia, South Africa), SECAM (Russia and France).
NTSC is the original TV broadcast format, using 525 lines of resolution at 29.97 fps.
PAL uses 625 lines, and thus offers greater resolution, at 25 fps.
HI-DEF is now available at 1080 or 720 lines of resolution in various frame rates.

After Effects Adobe software for compositing and editing of movies using imported video &
audio layers and graphics. Also utilizes plug-ins and filters.

Alpha Channel Information contained in graphic files to create mattes or “negative space”.
Can be invisible or refer to a colored background. The alpha channel must be
expressly saved in order for other programs to “see” it and make use of while
keying effects.
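The way an alpha channel controls keying can be shown with the classic "over" mix, sketched here per channel value (an illustration of the principle, not any program's actual compositing pipeline):

```python
def over(fg, bg, alpha):
    """Composite one channel value over another using an alpha weight.

    alpha 1.0 shows only the foreground, 0.0 only the background,
    and values in between blend the two.
    """
    return fg * alpha + bg * (1 - alpha)
```

Applied per pixel and per channel, this is how a matte's "negative space" lets the background show through.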

Anti-Alias Aliasing is the jagged-edge problem that occurs when curves are drawn with (square) pixels. Hard jagged lines can sometimes be softened by defocusing or feathering the image.

Artifact Similar to aliasing, except this is a digital glitch or "burp". Since video files are usually compressed, the program often fails to completely or accurately render an image, thus creating artifacts. Low-resolution images often show more artifacts.

Aspect Ratio Frame size as it relates to width vs. height. TV is 4 x 3. Hi-Def & Letterbox is 16 x 9. Theatrical film projection is 1.85 x 1.

Asset Any original sound, picture or graphic file used to create a movie or animation.

Avi Windows version (.avi) of a video clip. The newer codec from Microsoft is .wmv.

Avid The workhorse editing system of the TV and movie industry, which comes in a variety of configurations using both off-line (compressed) video and on-line (uncompressed) video.

Back Light This light just gives the subject an edge so as to be distinguishable from the
background. This light is also important when using the Green Screen to get a
clean edge when keying. This light is usually as intense or almost as intense as the
Key Light.

Bandwidth Amount of digital information pumped through a system or network such as the Internet. DSL or cable modems offer improved bandwidth performance and are better suited to streaming audio and video.

Batch Render (Queue) A feature which creates a list of movies that need rendering and tells the computer to work on them in the order displayed. See also Smart Render.

Beta SP Professional/industrial-grade Sony video format. "Near" broadcast quality.

Blue Screen (Green Screen) Objects shot in front of a color field and later composited onto a different background by pulling out or eliminating the original color.

CAD Computer-Aided Design (or Drafting). Archaic term referring to the original programs used in drafting and design work. Term not generally used in multimedia.

Capture Sometimes referred to as digitize. The importing of video and audio in real time.

CD-Rom Compact Disc developed in the eighties for storage and retrieval of digital files
and music. Audio files distributed commercially are usually sample rate 44.1k
files.

Channel (Tracks) Video, Audio, Titles, Alpha Channels all can be shown as tracks or channels
playing in a timeline.

Compression Video compression refers to reducing the quantity of data used to represent video images; it is a straightforward combination of image compression and motion compensation.
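One simple way to see how redundancy removal works is run-length encoding: repeated values are stored once with a count. This is not how real video codecs work, but it illustrates the principle behind image compression:

```python
def run_length_encode(pixels):
    """Collapse runs of identical values into [value, count] pairs.

    A toy illustration of redundancy removal; real codecs use far more
    sophisticated transforms plus motion compensation between frames.
    """
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1   # extend the current run
        else:
            runs.append([p, 1])  # start a new run
    return runs
```

A flat blue sky compresses extremely well under such a scheme; noisy, detailed footage does not, which is one reason compression artifacts appear in busy images.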

Chroma Key or Key(ing) A process which allows superimposition or replacement of one video picture in
a predetermined area of another one. The first picture is photographed with an
object or person against a special, single-color background. The complete color
content of this particular signal is removed and the second picture is inserted in the
area where the background was.
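The keying process above can be sketched per pixel: wherever the foreground is close enough to the key color, substitute the background. This is a simplified sketch (real keyers work on color difference with soft edges); the function name and tolerance parameter are assumptions:

```python
def chroma_key(fg_pixels, bg_pixels, key_color, tolerance=30):
    """Replace foreground pixels near the key color with background pixels.

    Pixels are (R, G, B) tuples; `tolerance` is the maximum summed
    per-channel difference that still counts as the key color.
    """
    out = []
    for fg, bg in zip(fg_pixels, bg_pixels):
        distance = sum(abs(a - b) for a, b in zip(fg, key_color))
        out.append(bg if distance <= tolerance else fg)
    return out
```

A clean, evenly lit single-color background keeps the distance test reliable, which is why back light and even screen lighting matter so much when shooting for a key.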

Chyron Archaic expression referring to computer-generated titles, possibly still used by major broadcast studios but not applicable to desktop systems.

Cross Platform The ability of a file to be read on both PC and Mac systems. Files should have a dot extension to be read on a PC. Programs, on the other hand, cannot "cross over" and must be written specifically for each platform. Apple recently introduced their Intel platform, which requires new code to be written for all programs.

Clock Speed Refers to CPU computing power as measured in gigahertz (GHz). The Apple G5 you are using usually features dual 1.8 GHz processors.

Clone A digital copy or exact duplicate. Clones are often created from digital master
tapes for safety reasons in case the original is lost or damaged.

Codec Short for COmpression/DECompression. Refers to the compression scheme for a specific video asset. If your computer doesn't have the right codec, it cannot play back the video.


Even DVDs use compression so that an entire feature film can fit on one disc.

Composite Video Analog video signal, such as from a VHS deck, carried over a single coaxial cable.

Component Video Three separate video signals, each carrying a single color value (RGB). See Beta SP.

Compression (files) Files can be compressed for size by a utility. Macintosh files can be compressed using a utility such as StuffIt. On a PC, one such utility is WinZip. It is important to note that you can NOT compress video using these utilities.

Decode To playback a file format using a Codec. See also Encode.

De-Focus An effect which softens an image, also known as blurring.

De-Saturate To remove colors. If enough de-saturation occurs the image will be black and
white or sepia. See also RGB.
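De-saturation can be sketched as pulling each channel toward a shared gray value. The sketch below uses the Rec. 601 luma weights as the gray; the function name and `amount` parameter are illustrative assumptions:

```python
def desaturate(r, g, b, amount=1.0):
    """Pull an RGB pixel toward gray by `amount` (0.0 = no change,
    1.0 = fully black and white)."""
    gray = 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma weights
    return tuple(c + (gray - c) * amount for c in (r, g, b))
```

Partial amounts give the muted, washed-out looks often used for flashbacks; tinting the result warm yields sepia.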

Digi-Beta Component Video but in a digital format. The master tape format for
broadcasting.

Digitize The act of transferring media from an analog system, such as magnetic tape, to a digital format on a computer. Modern DV cameras record a digital signal onto a magnetic tape, but the act of transferring that media to a computer is still called digitizing. This is also called capturing.

Director Macromedia authoring program used to create Shockwave content, primarily for web animations. This program is less popular since the advent of the newer Flash program.

Dissolve A term used in video and audio editing to describe a procedure whereby one
signal is gradually faded out while a second signal is faded in until it fully replaces
the first signal. Also known as Cross-Fade.

Dolby Formerly a noise-reduction system for analog tapes; Dolby's biggest contribution is Surround Sound for home and theater systems using 5.1 technology, which offers six channels of audio contained in a single audio stream on DVD, known as AC-3.
Dreamweaver Macromedia Web Authoring Tool featuring drag and drop features and
WYSIWYG.

Drop Shadow Effect which creates depth usually in context of lettering or graphic images.
Dual Stream Video High end editing systems such as Avid Symphony or Nitris offer dual stream
uncompressed video with real time effects. This allows for faster workflow as it
eliminates the need to render effects before viewing. See also Preview.


Duration The time measured between an In and Out point. See Time Code.
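Duration is simple arithmetic on timecodes: convert both points to frame counts and subtract. A minimal sketch, assuming non-drop Hours:Minutes:Seconds:Frames at 30 fps (the helper name is an assumption):

```python
def tc_to_frames(tc, fps=30):
    """Convert a non-drop 'HH:MM:SS:FF' timecode to a total frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Duration of an edit from its In point to its Out point, in frames.
duration = tc_to_frames("00:02:10:15") - tc_to_frames("00:02:00:00")
```

Drop-frame durations need the drop-frame counting rule instead of this plain multiplication.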

DV Digital Video in a variety of formats, including Mini-DV, according to the manufacturer.

DVD Digital Versatile Disk

DVD-Audio Audio enthusiasts can now enjoy high-resolution audio using sampling rates of 96k or higher.

DVD-RAM Use of a recordable DVD much like Hard Drive to store files

DVD-RW Recordable and Re-Writeable DVD technology


DVD+RW Competing technology for recordable DVDs, not compatible with DVD-RW

Encode (Hardware) Expensive systems use video cards (hardware) which are dedicated to “real
time” encoding tasks.

Encode (Software) Inexpensive systems use software encoders, which are slower because they do not operate in real time. Instead, the video is captured or imported and then must be rendered through processing via the CPU.

Export/Import Transfer of files into and out of one program and into the next. Importing does not
actually move the file but instead creates a link where the program can then see it
and refer to it. If the file is moved then the program will be unable to see it and
must be told again where to find it.

Fade Can be a fade to black or a fade to any other color you choose.

Feathering Hard jagged lines, such as dithering, can be softened or feathered.

Fibre A fast method of data transfer, similar in speed to SCSI, but with the advantage of being able to use longer cables.

Fill Light The fill light is just as its name implies: it fills in the details left out by the key light. This light also defines the contrast of the image; a dimmer fill light will add contrast to the picture.

Filter Just what the word suggests. Audio or video filters alter, adjust or tweak the channel with a certain desired effect.

Final Cut Pro Apple’s Desktop Editing System which is best suited for editing DV and HD
projects.

Firewire (IEEE 1394) A standardized cable used to connect video equipment to computer equipment. It allows for high-speed transfers and device control. The latest cabling technology allows bi-directional signal paths. FireWire is favored by DV cameras because it provides "machine control" from the computer

to the camera and allows the user to shuttle the videotape remotely from the
keyboard or mouse.

Flash The ultimate in web animation, now featured in many recent TV spots. Flash players are the most universal playback systems in the world.

Flatten Photoshop (.psd) files are flattened when saving as a JPEG (.jpg). This literally means the combining of various layers into one.

Font Lettering Styles which come in an almost infinite variety.

Frame Rate Film runs at 24 fps. Video is usually 30 fps, and sometimes pulled down to 29.97 fps. HD can also run at 24 fps. Video on the web often runs at 15 fps to save on file size.

Frame Size NTSC DV frame size is measured at 720 x 480 pixels. HD is 1920 x 1080 pixels.

Futz To degrade or alter a sound or image. See also Filter.

H.264 Newer Apple Codec that is highly efficient.

HDV HDV is a recording format that compresses the video before it is recorded; this extensive compression can cause motion problems in the final video, called compression artifacts. This generally only happens when shooting fast movement.

HTML Hyper Text Markup Language. The basic language of the web, offering links to other pages or programs.

Import See Export

Interlace Scan (I) Older video technologies such as NTSC use interlacing. When importing or
capturing it is sometimes necessary to “de-interlace” the picture. See also
Progressive (P) Scan.

JPEG (.jpg) A still photo image which is easily shared.

Key Frame A key frame is the beginning, middle or end marker of an effect. The key frame
also contains the information or values of the specific effect. A minimum of two
keyframes is necessary to trigger a transition.

Key Light The key light is the primary light in this setup and is usually the largest, brightest light. It provides most of the detail in the image, but details on the side of the subject away from the key light can be indistinguishable without a fill light.


Kerning Refers to spacing between each letter of text.

Layers Each video channel or Photoshop layer is combined to make a composite image.

Log A database used to keep notes and manage separate clips and source reels
using timecode and slate information. Sometimes stored as an Excel spreadsheet.

Matte Silhouette or cut-out "negative image". If the image is moving it may be referred to as a "travelling matte". Matte information is usually referred to as the "Alpha Channel".

Movie (.mov) Quicktime movie

Motion Apple’s low cost answer to After Effects. The advantage to using it in conjunction
with Final Cut Pro is the ability to make changes in nested effects without having to
re-export.

MPEG-1 See Video-CD

MPEG-2 Compression scheme used in making DVD’s.

MP3 Popular form of music file for use on the web or playing on a set-top or portable
player.

MP4 One of the latest efficient codecs used for delivering content over the web.

Nested Effect Denotes an effect combined with another effect.

Noise Analog noise is visible as scratches in the image or noticeable grain. Audible
noise can be clicks or pops in the track. Digital technology can re-create this by use
of filters or plug-ins such as Cinelook or Film Damage.

NTSC Video signal used in the U.S., Japan and most of Latin America. Usually 525 lines of resolution; it is generally inferior to PAL. See Frame Size.

PAL Video signal used in most of Europe (except Russia and France) using added
lines of resolution. See Frame Size.

Peel A transition effect similar to a wipe but usually with a curled hard edge.

Photoshop (.psd) Photoshop files usually come in layers and must be flattened before saving as JPEGs.


Pixel The basic unit of measurement for digital graphics; the smallest "dot" on a monitor or image. The pixel count is a measure of a screen's maximum resolution. Pixels on a screen work a little differently from pixels in an image: an image pixel can be any of many different colors, but a screen pixel is built from only three color components, red, green and blue. A pixel is usually square.
 HD: 1080 = 1920×1080 = 2,073,600 pixels; 720 = 1280×720 = 921,600 pixels
 SD: 480 = 720×480 = 345,600 pixels (720×480 for a computer; effectively 640×480 = 307,200 pixels on a TV; the smaller number is the height)

Plugin A small program that works alongside a larger program.

Preview Sophisticated editing systems will preview an effect in real time. If not then the
effect must be rendered before viewing. The preview function saves time and disk
space.

Progressive Scan (P) The latest hi-definition video technology or HD uses progressive scan instead of
interlace, which is similar to how computer monitors update their images. See also
Refresh Rate.

Pulldown 30 fps video is often "pulled down", or slowed, to 29.97 fps in order to accommodate transfers involving film rates. "3/2 pulldown" actually refers to the difference in frames between 24 fps and 30 fps, in which certain frames are repeated or skipped. See also Reverse Pulldown.
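The 3:2 cadence can be sketched directly: alternate film frames contribute three interlaced fields, then two, so 24 film frames become 60 fields (30 video frames). An illustrative sketch only; the function name is an assumption:

```python
def pulldown_3_2(film_frames):
    """Expand 24 fps film frames into 60i fields using the 3:2 cadence.

    Even-indexed film frames contribute 3 fields, odd-indexed frames 2,
    so every 2 film frames fill 5 fields.
    """
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields
```

Reverse pulldown undoes this cadence by discarding the repeated fields to recover the original 24 fps frames.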

Quicktime Apple’s media player for audio and video files (.mov)

Loop When timeline sequences reach the end they often start again, unless you tell them to stop. The same applies to menus on DVDs, which will play endlessly unless programmed to "time out". See also Time Out.

RAM Random Access Memory. Allows computers to have multiple programs open at
a time.

Real Time Effects Hardware cards or fast computers can often preview or play out visual effects
without having to first render. See also Preview.

47

Compiled by J. K. Githua Mount Kenya University 2015


BJL 3206 Editing for Electronic Media

Refresh Rate Computer monitors update themselves anywhere from 60 to 100 times a second, depending on the refresh rate.

Render Before a transition effect can be saved to the hard drive it must first be rendered. See also Smart Render.

Reposition (Repo) Moving an image within the frame. Resizing and repositioning is commonplace
in T.V.

Resolution (Audio) Sometimes referred to as 8-, 16- or 32-bit audio, but more likely referred to as "sample rates" of 22k, 44.1k, 48k or 96k.

Resolution (Video) D5 is the ultimate in video resolution, followed by HD Cam (8-bit or 10-bit) and Digi-Beta. DV comes shortly thereafter because it is even more compressed. See also Compression.

Reverse Pulldown The extra frames introduced by pulldown are removed from a digitized video sequence to recover the original 24 frames per second.

RGB Red, Green, Blue values expressed as three different components of the entire color spectrum. Adding or subtracting one value causes the image to shift in color. See also Component Video. See also Luminance.

RAM Random Access Memory (or simply "memory"). Allows multiple applications to be open at the same time. The greater the RAM, the more power and flexibility your box has.

ROM Read Only Memory. No re-writing or recording.

Rotation You can take an object or image and rotate it on the X, Y or Z axis.

Rotoscope A process where the action is first filmed live and later traced using an
animation process.

Scratch Disk Location of captured or rendered files on a hard disk.

Sequence A group of clips ordered in a timeline containing cuts, transitions and titles.
Scaleable Resolution Independent. See Vector Based.

SCSI A fast method of data transfer can be achieved using a "SCSI bus". Firewire and USB connections are frequently too slow for many media applications.

Scripting (Logic) Directions placed in a timeline which can trigger different events. Used in Flash.


Shockwave Macromedia Plugin Player which plays animations on the web.

Skip Frame (Stutter) An effect created by removing frames and/or repeating frames. Popular in
music videos.
Soften (De-Focus) Also known as blurring. See also Feathering.

Source Reel (Master Reel) Logs contain information such as timecode and source reel so that master tapes can be vaulted and later brought back for re-editing.

Smart Render "Smart rendering" is the ability to partially render an effect and later go back and pick up where it left off. The editor can start and stop a render, which saves time.

Speed Change Film is 24 fps and video is 30 fps. The frame rate can be changed to create an effect. As a rule, it is best to try even divisions such as 15, 12 or 10 fps.
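A speed change amounts to resampling the clip's frames: to slow down, frames are repeated; to speed up, frames are dropped. A minimal sketch (the function name is an assumption, and real systems blend frames rather than simply dropping them):

```python
def retime(frames, src_fps, dst_fps):
    """Resample a clip to a new frame rate by dropping or repeating frames."""
    count = round(len(frames) * dst_fps / src_fps)
    # Map each output frame back to the nearest earlier source frame.
    return [frames[int(i * src_fps / dst_fps)] for i in range(count)]
```

Even divisions (30 → 15, 24 → 12) produce a steady cadence, which is why they look smoother than awkward ratios.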

Storage The capacity of a drive or disk to store information.

Streaming (VOD) Video on demand. Video files are now available over the internet and can play
back in real time.

Stuffed Files (.sit) Compressed files for the mac using a utility called Stuffit.

Stutter (Skip Frame) Frames can be repeated or skipped thus creating an effect. Popular among
music videos.

Super-Imposition (Super) The layering of one image or text file over another.

Telecine (Film-to-Tape Transfer) Sound and picture are usually sync'd up at this point and transferred to tape. A log is created of all the shots and included in a database or shot list.

Time Code (Drop Frame vs. Non-Drop Frame) Hours:Minutes:Seconds:Frames. Drop Frame matches actual clock time, whereas Non-Drop actually runs longer.

Timeline Graphic display of tracks, channels and titles over time.

Time Out The maximum time before an effect or change occurs in a program or DVD.

Title Card Text over a color field or super imposed over an image.

Transition (Transition Effect) Any type of movement or change from one shot to the next. Examples: Fade, Wipe, Key, Dissolve, Blur, Reposition, etc.


USB A type of cable connection which can also use a hub. The USB 2 format is also available; it is backward compatible with USB 1 and almost as fast as FireWire.

Vector Based Vector-based graphics can be scaled to any size without loss of resolution.

Video-CD(VCD) Low resolution (near-VHS quality) video files that can be stored on a CD-Rom
or read on a DVD set-top box using MPEG-1 files. Format is very popular in
China.

Video File Formats AVI - Audio Video Interleaved. A computer graphics animation format used in Microsoft Video for Windows.
 MOV - Format used by Apple QuickTime
 WMV - Format Currently Used by Microsoft
 FLV - File format used for Flash Video

Web enabled DVD programming information which opens a browser and steers user to a
Web page.

White Balance An electronic process used in video cameras to retain true colors. White balancing is performed prior to recording a specific scene. The camera is pointed at a white object (a wall, for example) and controls on the camera are adjusted until a hairline in the viewfinder is brought to a particular point. This ensures that the tints in the videotape will be natural.
White balancing with a sheet of white paper works best; balancing with a colored paper will give the picture a hue of the inverse color.
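The correction itself can be sketched as per-channel gain: each channel is scaled so that the sampled "white" reference comes out neutral. An illustrative sketch of the principle (the function name is an assumption, and the reference values are assumed non-zero):

```python
def white_balance(pixel, white_sample):
    """Scale each RGB channel so the sampled white reference maps to
    pure white (255, 255, 255)."""
    return tuple(min(255, round(c * 255 / w))
                 for c, w in zip(pixel, white_sample))
```

If the "white" sample was actually tinted (say, colored paper), the gains over-correct and push the whole picture toward the inverse color, exactly as described above.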

Wipe Transition effect using a continuous motion with either a hard or soft edge or
border.

Windows Media Player A format sometimes known as .wmv. Also see Avi.

Wire Frame 3D modeling programs use a tracing tool which creates an object made of wires only. Later, animators fill or cover the object so it appears completely solid.

Wire Removal Visual Effects such as a person flying through the air require harnesses that are
shot in live action and therefore must be digitally erased.

X,Y,Z Values Values which measure position within the frame and make it possible to create 3D images: X = Left/Right (Horizontal), Y = Up/Down (Vertical), Z = Depth/Scaling (Size).
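Rotating an image about the Z axis moves every point within the X/Y plane, which can be sketched with the standard 2D rotation formula (an illustration of the math, not any editing program's API):

```python
import math

def rotate_2d(x, y, degrees):
    """Rotate a point about the origin in the X/Y plane (a Z-axis rotation)."""
    a = math.radians(degrees)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))
```

X- and Y-axis rotations tilt the image in and out of the frame, which is how 3D "card flip" effects are built from the same kind of math.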

Zipped Files Compressed files using the .zip extension. If you see a .zip file it is usually a PC file. WinZip is a popular utility for this. See also Stuffed Files.