Unit One

Introduction to Video Editing
Analog vs. Digital
Analog and digital signals are used to transmit information, usually through electric
signals. In both technologies, the information, such as audio or video, is transformed
into electric signals. The difference is that in analog technology, information is
translated into electric pulses of varying amplitude, while in digital technology,
information is translated into binary format (zeros and ones), where each bit is
represented by one of two distinct amplitudes.
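
To make the distinction concrete, here is a minimal Python sketch (an illustration only, not any particular device's implementation) of how a continuous analog-style signal is sampled and quantized into discrete digital values. The 440 Hz sine wave, 8 kHz sample rate and 8-bit depth are arbitrary choices for the example.

```python
import math

# Minimal sketch: turning a continuous (analog) signal into digital samples.
# The "analog" source here is an ideal 440 Hz sine wave; in a real device it
# would be a voltage coming from a microphone or a camera sensor.

SAMPLE_RATE = 8000   # samples per second (how often we measure the signal)
BIT_DEPTH = 8        # bits per sample (how finely each measurement is graded)
LEVELS = 2 ** BIT_DEPTH

def analog_value(t):
    """Continuous signal: amplitude between -1.0 and +1.0 at time t (seconds)."""
    return math.sin(2 * math.pi * 440 * t)

def digitize(duration=0.001):
    """Sample and quantize the analog signal into integer codes (0..255)."""
    samples = []
    for n in range(int(SAMPLE_RATE * duration)):
        t = n / SAMPLE_RATE                       # discrete time step
        level = round((analog_value(t) + 1) / 2 * (LEVELS - 1))
        samples.append(level)                     # each value fits in 8 bits
    return samples

if __name__ == "__main__":
    print(digitize()[:8])   # e.g. [128, 171, ...] - a short run of discrete values
```

Every measurement is rounded to one of 256 levels, which is exactly the "discrete or discontinuous values" the comparison chart below refers to.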
 
Comparison chart: Analog versus Digital

Signal
  Analog: A continuous signal that represents physical measurements.
  Digital: Discrete-time signals generated by digital modulation.

Waves
  Analog: Denoted by sine waves.
  Digital: Denoted by square waves.

Representation
  Analog: Uses a continuous range of values to represent information.
  Digital: Uses discrete or discontinuous values to represent information.

Example
  Analog: Human voice in air, analog electronic devices.
  Digital: Computers, CDs, DVDs, and other digital electronic devices.

Technology
  Analog: Records waveforms as they are.
  Digital: Samples analog waveforms into a limited set of numbers and records them.

Data transmissions
  Analog: Subject to deterioration by noise during transmission and the write/read cycle.
  Digital: Can be noise-immune, without deterioration during transmission and the write/read cycle.

Response to noise
  Analog: More likely to be affected, reducing accuracy.
  Digital: Less affected by noise.

Flexibility
  Analog: Analog hardware is not flexible.
  Digital: Digital hardware is flexible in implementation.

Uses
  Analog: Can be used in analog devices only; best suited for audio and video transmission.
  Digital: Best suited for computing and digital electronics.

Applications
  Analog: Thermometer.
  Digital: PCs, PDAs.

Bandwidth
  Analog: Analog signal processing can be done in real time and consumes less bandwidth.
  Digital: There is no guarantee that digital signal processing can be done in real time, and it consumes more bandwidth to carry the same information.

Memory
  Analog: Stored in the form of a wave signal.
  Digital: Stored in the form of binary bits.

Power
  Analog: Analog instruments draw large power.
  Digital: Digital instruments draw only negligible power.

Cost
  Analog: Low cost and portable.
  Digital: Cost is high and not easily portable.

Impedance
  Analog: Low.
  Digital: High, on the order of 100 megaohms.

Errors
  Analog: Analog instruments usually have a scale which is cramped at the lower end and give considerable observational errors.
  Digital: Digital instruments are free from observational errors like parallax and approximation errors.

Understand Digital Video


Before you jump into the world of digital video, it is important to
understand the difference between analog and digital video and why analog
video is no longer an acceptable format in modern times!

Digital vs. Analog Video


 
Analog video uses an electrical signal to capture images on magnetic
tape. The tape formats associated with analog video are VHS, VHS-C,
8mm, Hi8, Video8, Betamax and SVHS. A digital video signal is a pattern
of 1s and 0s that represents the video image. There is no variation in the
original signal once it's captured, and the image does not lose any of its
original sharpness or clarity; it's an exact copy of the original. Due to the
major advances in digital technology, StashSpace highly recommends
that you shoot with a digital video camera and not analog video.

Digital Video Formats

Digital video formats include MiniDV, Digital8, MicroMV, DVCAM tapes,
DVDs and mini DVDs, and hard-drive-based camcorders.
Here’s why we recommend Digital Video over Analog Video:

• Digital video quality begins and remains digital. No quality is lost
when you copy your video. Plus, with digital video you can capture
video directly from your digital camcorder to your computer
via FireWire. Analog degrades with each copy and requires a special
video capture card in order to import video onto your computer.

• Digital video resolution is higher than analog. Digital camcorders let
you choose how many pixels to record, and some offer 410,000 or
more pixels per CCD (charge-coupled device). No matter what digital
video format you choose, they all have excellent resolution quality,
although some digital formats have higher quality than others. Analog
quality, by contrast, can begin to deteriorate after only five years.

• Analog recordings are highly susceptible to deterioration. Although a
regular analog camcorder may be cheaper, using digital video with a
digital camcorder ensures your footage is preserved and remains in
the best condition.

Digital video cameras are easy to connect to computers, making for
easy video editing and duplication. Plus, the cameras have more
features, are lighter, and are more compact.
 
   
Difference Between All Video Formats
 
The world of video formats can be pretty confusing, but there are only a few things you really need
to know. First, it's important to note that a video format is more than just its file extension.
Extensions like AVI are not, in fact, video codecs—they're containers. Here's what you need to
know about how it all works.

What Is a Codec?

Most of the video you'll come across is compressed, meaning it's been altered to take up less space
on your computer. For example, a regular Blu-ray disc usually takes up around 30 to 50GB of
space, which is a lot for a normal person to download or store on their hard drive. So we usually
compress movies to make them more manageable, usually with some loss in video quality.

A codec compresses and decompresses data. It interprets the video file and determines how to play
it on your screen. Your computer comes with many codecs pre-installed, though you can install
codec packs for wider support, or a program like VLC or PotPlayer that has lots of codec
support built in (which we prefer).

Some examples of codecs include:

• FFmpeg, a codec library that includes formats like MPEG-2 (the format in which DVDs are
stored) and MPEG-4 (the format Apple uses for movies in the iTunes store)

• DivX, which works with a certain type of MPEG-4 file, and was often used to rip DVDs in
the pre-HD era

• XviD, an open source version of DivX, popular among movie pirates

• x264, which compresses H.264 video (also known as MPEG-4 AVC) and is very popular
for high-definition videos

There are a lot of different codecs out there, and it can get really confusing with all the different
versions of MPEG standards. These days, you really only need to concern yourself with a few—
which we'll talk about in a moment.
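
As a rough illustration of what "using a codec" looks like in practice, the sketch below shells out to the ffmpeg command-line tool (assumed to be installed and on the PATH; the file names are placeholders) and re-encodes a clip with x264, the H.264 encoder mentioned above.

```python
import subprocess

# Hedged example: re-encoding a clip with the x264 codec via ffmpeg.
# "input.mov" and "compressed.mp4" are placeholder file names.
subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mov",    # source file (any codec ffmpeg can decode)
        "-c:v", "libx264",    # video codec: x264, an H.264/MPEG-4 AVC encoder
        "-crf", "23",         # constant rate factor: higher = smaller file, lower quality
        "-c:a", "aac",        # audio codec: AAC
        "compressed.mp4",
    ],
    check=True,
)
```

The CRF value is the main quality/size trade-off: raising it shrinks the file further at the cost of more visible compression artifacts.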
What Is a Container?

A container is, essentially, a bundle of files. Usually a container consists of a video codec and an
audio codec, though it can also contain things like subtitles. Containers allow you to choose one
codec for your video and one for your audio, which is nice—that way, you can choose to use the
high-quality DTS audio, or compress your audio to something like MP3 for even more space
savings. It just gives you a bit more control over how you record your videos or rip your movies.
Popular containers include:

• AVI

• Matroska (which uses the extension MKV)

• MP4 (popularized by Apple in the iTunes Store; note that this can also
come with the M4V extension, but the container is exactly the same)

• MOV (which was created by Apple)

The main difference between containers is not only the codecs they support but the
other features they support, like subtitles or chapters. These days, MKV is an extremely popular
container, mainly because it supports nearly any video codec under the sun, as well as a ton of
extra features (plus it's open source).
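
To see the codec/container distinction in code, the hedged sketch below remuxes a file from an MP4 container into an MKV container without re-encoding anything. It assumes ffmpeg is installed, and the file names are placeholders.

```python
import subprocess

# Move the same video and audio streams from MP4 into MKV without re-encoding.
# "movie.mp4" and "movie.mkv" are placeholder file names.
subprocess.run(
    ["ffmpeg", "-i", "movie.mp4", "-c", "copy", "movie.mkv"],
    check=True,
)
# "-c copy" copies the existing codec data bit-for-bit, so only the container
# (the wrapper holding the streams, subtitles, chapters, etc.) changes.
```

Because the streams are copied rather than decoded and re-encoded, the operation is fast and lossless; only the wrapper changes.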
Understanding Linear vs Non-linear Editing

In the past, film editing was done in a linear fashion, where the film was literally cut
into long strips divided by scene and take, and then glued or taped back together to
create a film in logical sequence. This was time-consuming, tedious and highly
specialized work. While linear editing is still relevant today, there is a newer and more
user-friendly system available for editors: nonlinear editing. Curious about what these
systems can and can't do and the pros and cons each system has? Well, let's take a
look…
 
Linear Video Editing Method
Linear video editing is a process of selecting, arranging and modifying images and
sound in a pre-determined, ordered sequence, from start to finish. Linear editing is
most commonly used when working with videotape. Unlike film, videotape cannot be
physically cut into pieces to be spliced together to create a new order. Instead, the
editor must dub or record each desired video clip onto a master tape.
 
For example, let's say an editor has three source tapes: A, B and C, and he decides that
he will use tape C first, B second and A third. He would start by cueing up tape C
to the beginning of the clip he wants to use; then, as he plays tape C, the clip is
simultaneously recorded onto a master tape. When the desired clip from tape C is done,
the recording is stopped. Then the whole process is repeated with tapes B and A.
 
Pros vs Cons
There are a couple of disadvantages one would come across when using the linear
video editing method. First, it is not possible to insert or delete scenes from the master
tape without re-copying all the subsequent scenes. As each video clip must be laid
down in real time, you would not be able to go back to make a change without
re-editing everything after the change.
Secondly, because of the overdubbing that has to take place if you want to replace a
current clip with a new one, the two clips must be of the exact same length. If the new
clip is too short, the tail end of the old clip will still appear on the master tape. If it's too
long, it will roll into the next scene. The solution is either to make the new clip fit the
current one, or to rebuild the project from the edit to the end, neither of which is
very pleasant. Meanwhile, all that overdubbing also causes the image quality to
degrade.
 
However, linear editing still has some advantages:
• It is simple and inexpensive. There are very few complications with formats,
hardware conflicts, etc.

• For some jobs linear editing is better. For example, if all you want to do is add
two sections of video together, it is a lot quicker and easier to edit tape-to-tape
than to capture and edit on a hard drive.

• Learning linear editing skills increases your knowledge base and versatility.
According to many professional editors, those who learn linear editing first tend
to become better all-round editors.
 
Nonlinear Video Editing Method
The nonlinear video editing method is a form of random-access editing, which means
instant access to whatever clip you want, whenever you want it. So instead of going in a
set order, you are able to work on any segment of the project at any time, in any order
you want. In nonlinear video editing, the original source files are not lost or modified
during editing. This is done through an edit decision list (EDL), which records the
decisions of the editor and can also be interchanged with other editing tools. As such,
many variations of the original source files can exist without needing to store many
different copies, allowing for very flexible editing. It is also easy to change cuts and
undo previous decisions simply by editing the EDL, without having to duplicate the
actual film data. Loss of video quality is also avoided, since the data does not have to be
repeatedly re-encoded when different effects are applied.
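
The EDL idea is easy to picture as a data structure. The toy Python sketch below is an illustration only; the field names are invented for the example and do not follow any real EDL file format.

```python
from dataclasses import dataclass

# Toy sketch of the idea behind an edit decision list (EDL): the edit is just a
# list of references into untouched source files, so re-ordering or trimming
# never modifies (or duplicates) the original footage.

@dataclass
class Event:
    source: str        # which source clip/tape the footage comes from
    in_point: float    # start time within the source, in seconds
    out_point: float   # end time within the source, in seconds

# The whole edit: 10 s of tape C, then 5 s of tape B, then 8 s of tape A.
edl = [
    Event("tape_C.mov", 12.0, 22.0),
    Event("tape_B.mov", 40.0, 45.0),
    Event("tape_A.mov", 3.0, 11.0),
]

# Changing a cut is just editing this list -- the source files stay intact.
edl[1] = Event("tape_B.mov", 40.0, 47.5)   # extend the middle shot by 2.5 s

total = sum(e.out_point - e.in_point for e in edl)
print(f"Program length: {total:.1f} s")     # 25.5 s after the change
```

Re-cutting the program means editing this small list, never the source media, which is why nonlinear systems can offer unlimited changes without generational quality loss.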
 
Nonlinear editing differs from linear editing in several ways.
• First, video from the sources is recorded to the editing computer's hard drive or
RAID array prior to the edit session.
• Next, rather than laying video to the recorder in sequential shots, the segments
are assembled using a video editing software program. The segments can be
moved around at will in a drag-and-drop fashion.
• Transitions can be placed between the segments. Also, most video editing
programs have some sort of CG (character generator) feature built in for
lower-thirds or titles.
• The work-in-progress can be viewed at any time during the edit in real time.
Once the edit is complete, it is finally laid to video.
• Non-linear video editing removes the need to lay down video in real time. It also
allows the individual doing the editing to make changes at any point without
affecting the rest of the edit.
 
Pros vs Cons
There are many advantages to a nonlinear video editing system. First, it allows
you access to any frame, scene, or even group of scenes at any time. Also, as the
original video footage is kept intact when editing, you are able to return to the original
take whenever you like. Secondly, nonlinear video editing systems offer great
flexibility: you can change your mind a hundred times over, and changes can be
made a hundred times over, without having to start all over again with each change.
Thirdly, it is also possible to edit both standard definition (SD) and high definition (HD)
broadcast-quality video very quickly on normal PCs that do not have the power to
process the huge, full-quality, high-resolution data in real time.
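
One standard way of achieving that last point (a common workflow, not something stated above) is to edit with low-resolution proxy copies and return to the full-quality media at export time. A hedged sketch of generating such a proxy with ffmpeg (assumed to be installed; file names and settings are placeholders):

```python
import subprocess

# Make a small, easy-to-decode proxy copy of an HD clip to edit with, leaving
# the full-quality original untouched for the final export.
subprocess.run(
    [
        "ffmpeg",
        "-i", "full_quality.mov",   # original HD source (left untouched)
        "-vf", "scale=-2:540",      # downscale to 540 lines, keep aspect ratio
        "-c:v", "libx264",
        "-crf", "28",               # heavier compression is fine for a proxy
        "-c:a", "aac",
        "proxy_540p.mp4",
    ],
    check=True,
)
```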
 
The biggest downside to nonlinear video editing is the cost. While the dedicated
hardware and software don't cost much, the computers and hard drives do: often two
to five times more than the gear itself. As such, the average price for a basic nonlinear
video editing package can come in between $5,000 and $10,000. For stand-alone systems
that approach broadcast quality, the amount you pay may be twice that. However, as
nonlinear technology pushes forward, count on big gains in digital video storage and
compression, as well as lower prices on computers and hard disks, in the very near
future.
 
Making the Choice
Now that you know the differences between linear and nonlinear editing systems, you
are equipped to make a choice between the two for your editing needs. But keep
this in mind: a linear editing system may actually be more efficient for certain types of
production, and nonlinear may reign supreme on others; so don't write off either one.
Whatever you do, just make sure to do your homework before deciding.
 
   
Different Types of Video Editing
There are several different ways to edit video and each method has its pros and cons. Although most editors opt for
digital non-linear editing for most projects, it makes sense to have an understanding of how each method works.

This page provides a very brief overview of each method — we will cover them in more detail in other tutorials.

Film Splicing

Technically this isn't video editing, it's film editing. But it is worth a mention as it was the first way to edit moving pictures and
conceptually it forms the basis of all video editing.

Traditionally, film is edited by cutting sections of the film and rearranging or discarding them. The process is very
straightforward and mechanical. In theory a film could be edited with a pair of scissors and some splicing tape, although in
reality a splicing machine is the only practical solution. A splicing machine allows film footage to be lined up and held in
place while it is cut or spliced together.

Tape to Tape (Linear)


Linear editing was the original method of editing electronic video tapes, before editing computers became available in the
1990s. Although it is no longer the preferred option, it is still used in some situations.

In linear editing, video is selectively copied from one tape to another. It requires at least two video machines connected
together — one acts as the source and the other is the recorder. The basic procedure is quite simple:

1. Place the video to be edited in the source machine and a blank tape in the recorder.
2. Press play on the source machine and record on the recorder.

The idea is to record only those parts of the source tape you want to keep. In this way desired footage is copied in the
correct order from the original tape to a new tape. The new tape becomes the edited version.

This method of editing is called "linear" because it must be done in a linear fashion; that is, starting with the first shot and
working through to the last shot. If the editor changes their mind or notices a mistake, it is almost impossible to go back and
re-edit an earlier part of the video. However, with a little practice, linear editing is relatively simple and trouble-free.
Digital/Computer (Non-linear)

In this method, video footage is recorded (captured) onto a computer hard drive and then edited using specialized software.
Once the editing is complete, the finished product is recorded back to tape or optical disk.

Non-linear editing has many significant advantages over linear editing. Most notably, it is a very flexible method which allows
you to make changes to any part of the video at any time. This is why it's called "non-linear" — because you don't have to
edit in a linear fashion.

One of the most difficult aspects of non-linear digital video is the array of hardware and software options available. There are
also several common video standards which are incompatible with each other, and setting up a robust editing system can be
a challenge.

The effort is worth it. Although non-linear editing is more difficult to learn than linear, once you have mastered the basics you
will be able to do much more, much faster.
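
As a minimal, hedged sketch of that capture-edit-export flow, the Python script below trims two segments from a captured file and joins them with ffmpeg's concat demuxer. It assumes ffmpeg is installed; file names and timecodes are placeholders, and real editing software does far more than this.

```python
import subprocess

# Cut the desired segments out of the captured source file, then join them.
segments = [("capture.mp4", "00:00:05", "00:00:12"),   # (source, start, end)
            ("capture.mp4", "00:01:30", "00:01:41")]

names = []
for i, (src, start, end) in enumerate(segments):
    out = f"segment_{i}.mp4"
    subprocess.run(
        ["ffmpeg", "-i", src, "-ss", start, "-to", end,
         "-c:v", "libx264", "-c:a", "aac", out],
        check=True,
    )
    names.append(out)

# The concat demuxer reads a plain text list of the files to join.
with open("cuts.txt", "w") as f:
    for name in names:
        f.write(f"file '{name}'\n")

subprocess.run(
    ["ffmpeg", "-f", "concat", "-safe", "0", "-i", "cuts.txt",
     "-c", "copy", "edited.mp4"],
    check=True,
)
```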

Live Editing
In some situations multiple cameras and other video sources are routed through a central mixing console and edited in real
time. Live television coverage is an example of live editing.

Live editing is a fairly specialist topic and won't concern most people.

   
10 Rules for Video Editors
 
This list is based on a similar series of concepts that I picked up from Gretchen Siegchrist in an
article on video editing on About.com. I first compiled my variation when I started teaching video
editing at the New York Institute of Technology several years ago. I compiled it because I felt my
students needed somewhere to start their understanding from. Most of them had never even
attempted to think critically about what they see on the screen in front of them, and I thought this
might help in that regard. Anyway, the ideas follow.

Stay Motivated
Every cut should have a motivation. There should be a reason that you want to switch from one
shot or camera angle to another. Sometimes that motivation is as simple as: the camera shook, or
someone walked in front of the camera.
Ideally, though, your motivation for cutting should be to advance the narrative storytelling of your
video. One of the most obvious signs of amateur editing is cuts and transitions that have no
motivation behind them. Adding a cube spin transition may look cool to you, but ask yourself,
"Does this advance the narrative, or is it merely distracting?"

Match the Scene

The beauty of editing is that you can take footage shot out of order or at separate times, and cut it
together so that it appears as one continuous scene. To do this effectively, though, the elements in
the shots should match up. For example, a subject who exits frame right should enter the next
shot frame left. Otherwise, it appears they turned around and are walking in the other direction. Or,
if the subject is holding something in one shot, don't cut directly to a shot of them empty-handed. If
you don't have the right shots to make matched edits, insert some b-roll in between.

Cut on Motion
Motion distracts the eye from noticing editing cuts and is the most common way of achieving the
much sought-after match cut. Cutting on motion helps to establish a motivation for the cut. So,
when cutting from one image to another, always try to do it when the subject is in motion. If you
have a shot of your subject turning, then cutting to a shot of a door opening (or someone
approaching, etc.) at the height of the subject's motion provides motivation for the previous action
and makes the cut seem natural and seamless.

B-Roll is your friend


A-roll is your main footage, your main subject or the main elements of your narrative, while B-roll is
everything else. B-roll refers to video footage that sets the scene, reveals details, or helps illustrate
or enhance the narrative. For example, if you are editing an event like a show opening you can
use footage of the building exterior, or the attendees arriving. These clips can be used to cover
any rough cuts, or smooth transitions from one scene to another.

De Plane boss, De Plane


For this one to work, you need to keep the rule in mind while the footage is being planned and
shot. Imagine that there is a horizontal line between you and your subjects. Now, stay on your
side of the line. By observing this 180-degree plane, you keep a perspective that is more natural
for the audience. Changes in perspective that break this 180-degree plane can be jarring
because they make it impossible for the audience to establish their positional
relationship to the scene.
Whatever you do, don't Jump (unless you really need to, of course)
Usually, editors strive for match cuts: seamless changes from one scene or camera angle to the
next, editing that is completely transparent to the viewer. A jump cut occurs when you have two
consecutive shots with dramatic differences. These differences can be based on movement,
screen position, etc. Jump cuts can occur in any type of project. Often when editing interviews
you will want to cut out some words or phrases that the subject says. When the remaining clips
are placed side-by-side, the slight repositioning of the subject will be very jarring to the audience.

Cutting to b-roll can cover this jump.


By definition, jump cuts are not seamless: they create a disconnect for the audience, make the
cut very obvious, and make viewers take notice. Sometimes this is in fact the intention. Films
such as Alfred Hitchcock's Psycho and Godard's Breathless purposely use jump cuts to create a
dynamic, uncomfortable experience for the viewer.

45 Degrees above Zero

When editing scenes shot with multiple cameras, always try to use shots that look at the
subject from angles at least 45 degrees apart. Otherwise, the shots may be too similar and
appear like a jump cut to the audience. If your shots are within that 45-degree arc, you may still be
able to make use of them if the cameras were set at two different levels. A close-up can usually be
cut to a long shot without worry.

Change your Level


This requires multiple cameras to achieve but is often worth the effort. When you have multiple
shots of the same subject, it's easy to cut between them without creating a jarring experience for
the audience. So, when shooting an interview, or a lengthy event such as a wedding, it's a good
idea to occasionally change focal lengths. A wide shot and a medium close-up can be cut together,
allowing you to edit parts out and change the order of shots without obvious jump cuts.

Look for Similarity


This principle is the key to the much sought-after match cut. There's a cut in Apocalypse Now from
a rotating ceiling fan to the blades of a helicopter. There is a similar cut at the beginning of Stanley
Kubrick's 2001: A Space Odyssey, in which a scene of a bone spinning in the air is cut to a scene
of a space station in orbit around Earth. The scenes change dramatically, but the visually similar
elements make for a smooth, creative cut.
You can do the same thing in your videos. Cut from a flower on a wedding cake to the groom's
boutonniere, or tilt up to the blue sky from one scene and then down from the sky to a different
scene.

Wipe, Wipe, Wipe


There are three transitions you will see with regularity: the cut, the cross dissolve and the wipe. At
weddings, I love it when people walk in front of the camera. They are apologetic, but unless it
happened during the vows or the first dance, I am grateful for the wipe they gave me to use during
editing. When the frame fills up with one element (such as the back of a black suit jacket), it makes
it easy to cut to a completely different scene without jarring the audience. You can set wipes up
yourself during shooting, or just take advantage when they happen naturally.

   
 
 
 
 
