
Linear vs. Non-Linear Editing


In the early days of electronic video production, linear (tape-to-tape) editing
was the only way to edit video tapes. Then, in the 1990s, non-linear editing
computers became available and opened a whole new world of editing power
and flexibility.
Non-linear editing was not welcomed by everyone, and many editors resisted the
new wave. In addition, early digital video was plagued with performance issues
and uncertainty. However, the advantages of non-linear video eventually
became so overwhelming that they could not be ignored.
In the 21st century, non-linear editing gained dominance and linear editing headed towards obsolescence. During this time the description "non-linear" was slowly abandoned as it was no longer necessary: almost all editing was now digital, and the "non-linear" aspect was assumed. Linear was dead.
Until around 2008 we recommended that aspiring editors still make the effort to learn about traditional tape-to-tape editing, for reasons including the following:
1. It was simple and inexpensive. There were very few complications with
formats, hardware conflicts, etc.
2. Some simple jobs (e.g. appending one video to another) were much
quicker and easier with linear editing.
3. Interestingly, many professional editors of the time claimed that those
who learn linear editing first tend to become better all-round editors.
By 2010 we felt it was no longer necessary for most editors to know how to work with tapes, although we'll never discount it completely. If only for the sake of understanding the historical development of digital media, learning about linear editing is not a bad investment of your time.

DIFFERENCE BETWEEN NON-LINEAR & LINEAR EDITING


Non-linear editing means working in a software program such as Discreet Combustion or Adobe Premiere, where you can digitize the footage and then drag and drop it wherever you want. Linear editing means editing footage from tape A to tape B by marking in and out points and then transferring the material between the two machines. You have a tape machine and a monitor for each side, and a controller with which you log and mark the points.
Non-linear editing system
In video, a non-linear editing system (NLE) is a video editing (NLVE) or audio editing (NLAE) system that can perform random-access, non-destructive editing on the source material. It is named in contrast to 20th-century methods of linear video editing and film editing.
Non-linear editing is a video editing method which enables direct access to any
video frame in a digital video clip, without needing to play or scrub/shuttle
through adjacent footage to reach it, as was necessary with historical video tape
linear editing systems. It is the most natural approach when all assets are
available as files on hard disks rather than recordings on reels or tapes, while
linear editing is related to the need to sequentially view a film or read a tape to
edit it. On the other hand, the NLE method is similar in concept to the "cut and paste" technique used in film editing. However, with the adoption of non-linear editing systems, the destructive act of cutting film negatives is eliminated. Non-linear, non-destructive editing methods began to appear with
the introduction of digital video technology. It can also be viewed as the
audio/video equivalent of word processing, which is why it is called desktop
video editing in the consumer space.[1]

Operation
Video and audio data are first captured to hard disks, a video server, or other digital storage devices. The data are either recorded direct to disk or imported from another source. Once imported, the source material can be edited on a computer using any of a wide range of video editing applications.
In non-linear editing, the original source files are not lost or modified during editing. Professional editing software records the decisions of the editor in an edit decision list (EDL), which can be interchanged with other editing tools. Many generations and variations of the edit can exist without needing to store many different copies of the media, allowing for very flexible editing. It also makes it easy to change cuts and undo previous decisions simply by editing the edit decision list, without duplicating the actual footage. Generation loss is also controlled, since the data need not be repeatedly re-encoded as different effects are applied.
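To make the idea concrete, here is a minimal sketch in Python (the clip names, timecodes, and helper function are invented for illustration; no real product's EDL format is implied) of how an edit can exist purely as a list of references into untouched source files:

    # A toy edit decision list: the "edit" is just a list of references
    # (source file, in-point, out-point); the media itself is never modified.
    from dataclasses import dataclass

    @dataclass
    class Event:
        source: str    # path to an untouched source clip
        tc_in: float   # in-point, in seconds into the source
        tc_out: float  # out-point, in seconds into the source

    edl = [
        Event("tape01.mov", 12.0, 19.5),
        Event("tape02.mov", 3.25, 8.0),
        Event("tape01.mov", 40.0, 44.5),
    ]

    def total_duration(edl):
        """Screen duration of the cut, computed without reading any media."""
        return sum(e.tc_out - e.tc_in for e in edl)

    # Re-ordering shots or undoing a decision is a cheap list operation;
    # no footage is duplicated or re-encoded.
    edl[1], edl[2] = edl[2], edl[1]
    print(f"{total_duration(edl):.2f}s")  # 16.75s

Rendering such a list simply reads the named spans from the source files in order, which is why any number of alternative cuts can coexist at almost no storage cost.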
Compared to the linear method of tape-to-tape editing, non-linear editing offers the flexibility of film editing, with random access and easy project organization. Because the edit exists as an edit decision list, the editor can work on low-resolution copies of the video. This makes it possible to edit both standard-definition and high-definition broadcast-quality material very quickly on ordinary PCs that do not have the power to process the huge full-quality high-resolution data in real time.
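Continuing the sketch above (file names again hypothetical), this offline/online split amounts to resolving the same edit decision list against different media: low-resolution proxies while cutting, full-quality masters for the final conform:

    # The same EDL is resolved against proxy files for fast editing on a
    # modest PC, then against full-resolution masters for the final render.
    proxy_media = {"tape01.mov": "proxies/tape01_lowres.mov",
                   "tape02.mov": "proxies/tape02_lowres.mov"}
    master_media = {"tape01.mov": "masters/tape01_fullres.mov",
                    "tape02.mov": "masters/tape02_fullres.mov"}

    def resolve(edl, media_map):
        """Map each event onto actual media files without changing the edit."""
        return [(media_map[e.source], e.tc_in, e.tc_out) for e in edl]

    rough_cut = resolve(edl, proxy_media)   # offline: cheap, fast playback
    conform = resolve(edl, master_media)    # online: full-quality output

The edit decisions never change; only the media they point at does.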
The costs of editing systems have dropped such that non-linear editing tools are
now within the reach of home users. Some editing software can now be
accessed free as web applications; some, like Cinelerra (focused on the
professional market) and Blender3D, can be downloaded as free software; and
some, like Microsoft's Windows Movie Maker or Apple Inc.'s iMovie, come
included with the appropriate operating system.
A multimedia computer for non-linear editing of video will usually have a video capture card to capture analog video and/or a FireWire connection to capture digital video from a DV camera, along with video editing software. Modern web-based editing systems can take video directly from a camera phone over a GPRS or 3G mobile connection, and editing can take place through a web browser interface; so, strictly speaking, a computer for video editing requires no installed hardware or software beyond a web browser and an internet connection.
Various editing tasks can then be performed on the imported video before it is exported to another medium, or MPEG-encoded for transfer to a DVD.
History
The first truly non-linear editor, the CMX 600, was introduced in 1971 by CMX Systems, a joint venture between CBS and Memorex. It recorded and played back black-and-white analog video recorded in "skip-field" mode on modified disk pack drives the size of washing machines, which were commonly used at the time to store about half an hour of data digitally on mainframe computers. The 600 had a console with two monitors built in. The right monitor, which played the preview video, was used by the editor to make cuts and edit decisions using a light pen, selecting from options superimposed as text over the preview video. The left monitor displayed the edited video. A Digital PDP-11 computer served as a controller for the whole system. Because the video edited on the 600 was black and white and in low-resolution "skip-field" mode, the 600 was suitable only for offline editing.
Various approximations of non-linear editing systems were built in the '80s using computers coordinating multiple laserdiscs, or banks of VCRs. One example of these tape- and disc-based systems was Lucasfilm's EditDroid, which used several laserdiscs of the same raw footage to simulate random-access editing (a compatible system, SoundDroid, was developed for sound post-production by Lucasfilm, one of the earliest digital audio workstations). The LA-based post house Laser Edit (which later merged with Pacific Video as Laser-Pacific) also had an in-house system using recordable random-access laserdiscs. Another non-linear system was Ediflex, which used a bank of multiple Sony Betamax VCRs for offline editing. All were slow, cumbersome, and hampered by the limited computer horsepower of the time, but the mid-to-late 1980s saw a trend towards non-linear editing, moving away from film editing on Moviolas and the linear videotape method (usually employing 3/4" VCRs).
The term "nonlinear editing" or "non-linear editing" was formalized in 1991
with the publication of Michael Rubin's Nonlinear: A Guide to Digital Film and
Video Editing (Triad, 1991) -- which popularized this terminology over other
language common at the time, including "real time" editing, "random-access" or
"RA" editing, "virtual" editing, "electronic film" editing, and so on. The
handbook has remained in print since 1991, currently in its 4th edition (Triad,
2000).
Computer processing advanced sufficiently by the end of the '80s to enable true
digital imagery, and has progressed today to provide this capability in personal
desktop computers.
An early demonstration of computing power making non-linear editing possible was the first all-digital non-linear editing system to be released, the "Harry" effects compositing system manufactured by Quantel in 1985. Although it was more of a video effects system, it had some non-linear editing capabilities. Most importantly, it could record (and apply effects to) 80 seconds of broadcast-quality uncompressed digital video, a limit imposed by hard disk space, encoded in 8-bit CCIR 601 format on its built-in hard disk array.
Non-linear editing with computers as we know it today was first introduced by Editing Machines Corp. in 1989 with the EMC2 editor, a hard-disk-based non-linear offline editing system using half-screen-resolution video at 15 frames per second. A couple of weeks later that same year, Avid introduced the Avid/1, the first in the line of its Media Composer systems. It was based on the Apple Macintosh computer platform (Macintosh II systems were used) with special hardware and software developed and installed by Avid. The Avid/1 was not the first system to introduce modern concepts in non-linear editing such as timeline editing and clip bins; both of these were pioneered in Lucasfilm's EditDroid in the early 1980s.
The video quality of the Avid/1 (and of later Media Composer systems from the late '80s) was somewhat low (about VHS quality), due to the use of a very early version of a Motion JPEG (M-JPEG) codec. But it was sufficient for offline editing, and versatile enough to revolutionize video and film editing. The first long-form documentary edited this way was the HBO program Earth and the American Dream, which went on to win a National Primetime Emmy Award for Editing in 1993. Avid quickly became the dominant NLE platform.
The NewTek Video Toaster Flyer included non-linear editing capabilities in addition to processing live video signals. The Flyer used hard drives to store video clips and audio, and allowed complex scripted playback. It was capable of simultaneous dual-channel playback, which allowed the Toaster's video switcher to perform transitions and other effects on video clips without the need for rendering. The Flyer portion of the Video Toaster/Flyer combination was a complete computer of its own, with its own microprocessor and embedded software. Its hardware included three embedded SCSI controllers: two of these SCSI buses were used to store video data, and the third to store audio. The Flyer used a proprietary wavelet compression algorithm known as VTASC, which was well regarded at the time for offering better visual quality than comparable Motion JPEG based non-linear editing systems.
Until 1993, the Avid Media Composer could be used only for editing commercials or other short-form projects, because Apple Macintosh computers could access only 50 gigabytes of storage at one time. In 1992, this limitation was overcome by a group of industry experts led by Rick Eye, a digital video R&D team at the Disney Channel. By February 1993, this team had integrated a long-form system which gave the Avid Media Composer on the Apple Macintosh access to over 7 terabytes of digital video data. With instant access to the shot footage of an entire movie, long-form non-linear editing (motion picture editing) was now possible. The system made its debut at the NAB conference in 1993, in the booths of the three primary sub-system manufacturers: Avid, Silicon Graphics and Sony. Within a year, thousands of these systems replaced a century of 35mm film editing equipment in major motion picture studios and TV stations worldwide, making Avid the undisputed leader in non-linear editing systems for over a decade.[2]
Although M-JPEG became the standard codec for NLE during the early 1990s, it had drawbacks. Its high computational requirements ruled out software implementations, leading to the extra cost and complexity of hardware compression/playback cards. More importantly, the traditional tape workflow had involved editing from tape, often in a rented facility; when editors left the edit suite, they could take their confidential video tapes with them. But the M-JPEG data rate was too high for systems like Avid on the Mac and Lightworks on the PC to store the video on removable storage, so these used fixed hard disks instead. The tape paradigm of keeping your (confidential) content with you was not possible with these fixed disks. Editing machines were often rented from facilities houses on a per-hour basis, and some productions chose to delete their material after each edit session and recapture it the next day, in order to guarantee the security of their content. In addition, each NLE system had storage limited by its hard disk capacity.
These issues were addressed by a small UK company, Eidos plc. Eidos chose the new ARM-based computers from the UK and implemented an editing system, launched in Europe in 1990 at the International Broadcasting Convention. Because it implemented its own compression software designed specifically for non-linear editing, the Eidos system had no requirement for JPEG hardware and was cheap to produce. The software could decode multiple video and audio streams at once for real-time effects at no extra cost. But most significantly, for the first time, it allowed effectively unlimited quantities of cheap removable storage. The Eidos Edit 1, Edit 2, and later Optima systems allowed editors to use any Eidos system, rather than being tied to a particular one, and still keep their data secure. The Optima software editing system was closely tied to Acorn hardware, so when Acorn stopped manufacturing the Risc PC in the late 1990s, Eidos discontinued the Optima system.
In the early 1990s a small American company called Data Translation took what
it knew about coding and decoding pictures for the US military and large
corporate clients and threw $12m into developing a desktop editor which would
use its proprietary compression algorithms and off-the-shelf parts. Their aim
was to 'democratize' the desktop and take some of Avid's market. In August
1993 Media 100 entered the market and thousands of would-be editors had a
low-cost, high-quality platform to use.
Around the same time there were two other competitors providing non-linear systems that required special hardware, often cards that had to be added to the computer. Fast Video Machine was a PC-based system that first came out as an offline system and later became more capable of online editing. The Immix Video Cube was also a contender for media production companies; it had a control surface with faders to allow mixing and shuttle controls without the purchase of third-party controllers. Data Translation's Media 100 came with three different JPEG codecs for different types of video graphics and many resolutions. The Media 100 system kept increasing its maximum video resolution via software upgrades rather than hardware, because the Media 100 cards had enough processing power to be expanded to resolutions as high as those of systems at the upper end of the Avid product line. Cards at the time had embedded, dedicated CPUs (for example a Motorola 68000 processor), which were as powerful as the processors inside the Macintosh systems that hosted the application. These companies put tremendous downward market pressure on Avid, which was forced to continually offer lower-priced systems to compete with the Media 100 and other systems.
Inspired by the success of Media 100, members of the Premiere development
team left Adobe to start a project called "Keygrip" for Macromedia. Difficulty
raising support and money for development led the team to take their non-linear
editor to NAB. After various companies made offers, Keygrip was purchased by Apple, as Steve Jobs wanted a product to compete with Adobe Premiere in the desktop video market. At around the same time, Avid (now with Windows versions of its editing software) was considering abandoning the Macintosh platform. Apple released Final Cut Pro in 1999, and despite not being taken seriously by professionals at first, it has evolved into a serious competitor to Avid.
DV
Another leap came in the late 1990s with the launch of DV-based video formats
for consumer and professional use. With DV came IEEE 1394 (FireWire/iLink),
a simple and inexpensive way of getting video into and out of computers. The
video no longer had to be converted from an analog signal to digital data — it
was recorded as digital to start with — and FireWire offered a straightforward
way of transferring that data without the need for additional hardware or
compression. With this innovation, editing became a realistic proposition for standard computers with software-only packages. It enabled true desktop editing, producing high-quality results at a fraction of the cost of other systems.
HD
More recently the introduction of highly compressed HD formats such as HDV
has continued this trend, making it possible to edit HD material on a standard
computer running a software-only editing application.
Avid is still considered the industry standard, with the majority of major feature films, television programs, and commercials created with its NLE systems. Final Cut Pro received a Technology & Engineering Emmy Award in 2002 and continues to develop a following.
Avid has held on to its market-leading position despite the advent of cheaper software packages, notably Adobe Premiere in 1992 and Final Cut Pro in 1999. These three competing products by Avid, Adobe, and Apple are the foremost NLEs, often referred to as the A-Team.[3] With advances in raw computer processing power, new products have appeared, including NewTek's software application SpeedEdit.
Since 2000, many personal computers have included basic non-linear video editing software free of charge. This is the case with Apple iMovie for the Macintosh platform, PiTiVi for the Linux platform (installed by default on Ubuntu, the dominant desktop Linux distribution), and Windows Movie Maker for the Windows platform. This phenomenon has brought low-cost non-linear editing to consumers.
Quality
At one time, a primary concern with non-linear editing was picture and sound quality. Storage limitations at the time required that all material undergo lossy compression to reduce the amount of storage occupied. Improvements in compression techniques and disk storage capacity have mitigated these concerns, and the migration to high-definition video and audio has virtually removed them. Most professional NLEs are also able to edit uncompressed video with the appropriate hardware.
Editing controls what we see and when: what information is revealed to or hidden from the characters and the audience.
What are we looking for when analysing editing in a clip?
• Order of shots
• Juxtaposition
• Continuity?
• Transitions
• Shot duration
• Pace and rhythm
• Special effects
Continuity
Continuity editing
• Cutting shots to tell a story with narrative continuity, helping the viewer make
sense of the action by implying spatial relationships and ensuring smooth flow
from shot to shot.
• Continuity techniques:
• Establishing shot (establishes the space in which action is to happen)
• The 180º rule (ensures that the same space is described in each shot)
• Shot/reverse shot
• Eyeline match (e.g. character looks off-screen, next shot shows us what they
see)
• Match on action (character begins to move in one shot, we see continuation of
the same movement in the next shot)
Realism – the edit is invisible, so the action appears real rather than constructed.
Non-Continuity
• Montage – giving information in compressed form – can come under…
• Non-continuity editing – continuity is broken and construction is more apparent. Meaning is often created through juxtaposition and metaphorical shot inserts.
Transitions
• The process of cutting from one shot to another usually involves a simple straight cut. However, there are other means of transition available to a film editor:
• Fade to black
• Dissolve/cross fade
• Wipe
Shot duration/pace
• The duration of a shot will usually reflect the narrative context.
• Generally speaking, short shot duration conveys action and urgency (say, in a chase sequence), whilst long duration conveys intensity and intimacy within the narrative; it allows us to focus upon facial expression and other aspects of mise en scene which would otherwise be missed.
Some editing devices
• Parallel editing - crosscutting between different locations can convey the
impression that two or more events are occurring simultaneously.
• Split screen – where the frame is split into sections so that we can see different
events occurring at the same time. This technique was used on the TV series
24.
Film Editing Glossary: Cutting and transitions
• Action match: a technique used in film editing; a cut that connects two different views of the same action at the same moment in the movement. By carefully matching the movement across the two shots, filmmakers make it seem that the motion continues uninterrupted. For a true match on action, the action should begin in the first shot and end in the second shot.
• cut
A visual transition created in editing in which one shot is instantaneously
replaced on screen by another.

• continuity editing
Editing that creates action that flows smoothly across shots and scenes without
jarring visual inconsistencies. Establishes a sense of story for the viewer.

• cutaway
The interruption of a continuously filmed action by inserting a view of something else. It is usually, although not always, followed by a cutback to the first shot.

• dissolve
A gradual scene transition. The editor overlaps the end of one shot with the
beginning of the next one.
• editing
The work of selecting and joining together shots to create a finished film.

• ellipsis
Presents an action in such a way that it consumes less time on the screen than it does in the story.

• expansion of time
Usually created through overlapping editing. It is the opposite of ellipsis; it presents an action in such a way that it consumes more time on the screen than it does in the story. It contains cuts that actually repeat a previous action.

• errors of continuity
Disruptions in the flow of a scene, such as a failure to match action or the
placement of props across shots.

• establishing shot
A shot, normally taken from a great distance or from a "bird's eye view," that
establishes where the action is about to occur.
• eyeline match
The matching of eyelines between two or more characters. For example, if Sam
looks to the right in shot A, Jean will look to the left in shot B. This establishes
a relationship of proximity and continuity.
• fade (in and out)
A visual transition between shots or scenes that appears on screen as a brief
interval with no picture. The editor fades one shot to black and then fades in the
next. Often used to indicate a change in time and place.

• final cut
The finished edit of a film, approved by the director and the producer. This is
what the audience sees.

• insert
An electronic method of editing whereby the editor can freely move shots and clips around at will; the editor is not required to edit linearly (in chronological order).

• graphic match
A cut joining two shots whose compositional elements match, helping to establish strong continuity of action.

• Jump cut
A cut that creates a lack of continuity by leaving out parts of the action.
• long take
An uninterrupted shot in a film which lasts much longer than the conventional editing pace, either of the film itself or of films in general, usually lasting several minutes. It can be used for dramatic and narrative effect if done properly, and in moving shots is often accomplished through the use of a dolly or Steadicam.
• montage
Scenes whose emotional impact and visual design are achieved through the
editing together of many brief shots. The shower scene from Psycho is an
example of montage editing.

• parallel editing
In parallel editing or parallel cutting, sometimes also called cross-cutting, the sequences or scenes are intercut so as to suggest that they are taking place at the same time. Parallel cutting might show shots of a villain being villainous intercut with shots of the hero or heroine coming to the rescue. Most chases use parallel editing, switching back and forth between pursuer and pursued. Phone conversations, too, are often parallel edited.
• rough cut
The editor's first pass at assembling the shots into a film, before tightening and
polishing occurs.
• shot reverse shot cutting
Usually used for conversation scenes, this technique alternates between over-
the-shoulder shots showing each character speaking.

• slow motion
An effect in film-making whereby time appears to be slowed down.
• superimposition
The exposure of more than one image on a film strip.

• wipe
In film editing, a wipe is a gradual spatial transition from one image to another.
One image is replaced by another with a distinct edge that forms a shape. A
simple edge, an expanding circle, or the turning of a page are all examples.
• post-production visual effects
Most editing applications offer a large selection of digital transitions with various effects. There are too many to list here, but they include colour replacement, animated effects, pixilation, focus drops, lighting effects, etc.
Film Editing: what’s the idea?
• The general idea behind editing in narrative film is the coordination of one
shot with another in order to create a coherent whole.
The system of editing employed in narrative film is called continuity editing; its purpose is to create and provide efficient and artful transitions.
1. In filmmaking, the task of selecting and joining camera takes.
2. In the finished film, the set of techniques that governs the relation among
shots.
(Bordwell and Thompson)
Editing is the process of preparing language, images, or sound through
correction, condensation, organization, and other modifications in various
media... Editing is, therefore, also a practice that includes creative skills, human
relations, and a precise set of methods.
The Kuleshov Effect
• Lev Kuleshov, circa 1920: intercut an actor’s face with unrelated footage taken
later.
• Audiences interpreted emotional responses on the actor’s face based on the
juxtaposition of images.
• Whilst much of the moving image we see uses this effect, it does not usually
draw attention to it.
Comparing Approaches
• Students may be familiar with multiple-camera, non-sequential techniques
from film and television
• Hollywood productions may have 1000-2000 shots, 3000 for an action movie:
post-production editing is crucial in creating meaning
• Some film makers still favour a pared-down, single-camera, sequential
approach for particular sections of film
Definition of in-camera editing:
‘constructing a film by taking shots in sequence, with no subsequent editing’
(Burn and Durran)
Contrast ‘four main functions’ of film editing:
• ‘make sure that the production is the required length or time;
• to remove unwanted material or mistakes;
• to alter if necessary the way or the sequence in which events will be
portrayed;
• to establish the particular style and character of a production.’ (O’Sullivan,
Dutton and Rayner)
VERY IMPORTANT- Relations In Editing
There are five areas of choice and control in editing, based on five types of
relationships between shots:
Graphic Relations
Rhythmic Relations
Temporal Relations
Spatial Relations
Thematic Relations
• Graphic Relations
Although the primary focus of the film editor is to ensure continuity of the
narrative, film editors remain acutely aware that film is a visual art. Therefore,
they work to achieve visual interest by creating transitions between shots that
are graphically similar and graphically dissimilar, depending on the desired
effect.
• Graphic Continuity
• A graphic match is achieved by joining two shots that have a similarity in
terms of light/dark, line or shape, volume or depth, movement or stasis.
• A graphically discontinuous edit creates a clash of visual content by joining
two shots that are dissimilar in terms of one or more of the above visual
principles.
• Rhythmic Relations
• Film is not only a visual art, but also an auditory and even tactile art.
Therefore, editors also remain aware of the effects achieved by manipulating the
rhythms experienced by perceivers through thoughtful juxtapositions of longer
and shorter shots as well as through transitional devices that affect the
perceiver’s sense of beat or tempo.
• Rhythmic Transitional Devices
• Straight cut
• Fade-out
• Fade-in
• Dissolve
• Wipe
• Flip frame
• Jump cut
Temporal Relations
Editing is the process by which the difference between temporal duration and
screen duration is reconciled. It sounds simple, but consider this: most feature
films present in roughly two hours sufficient intersection of story and plot to
provide perceivers with everything they need in order to understand days,
weeks, months or even years in characters’ lives.
Temporal Relations: The Passage of Time
o To speed up time, editors make use of elliptical editing techniques such as:
   o Transitional devices
   o Empty frames
   o Cutaway shots
o To slow down time, editors make use of expansion editing techniques such as:
   o Overlapping
   o Repetition
Spatial Relations
o Perhaps the most important, as well as the most overlooked, principle of
editing is its function in providing perceivers a reliable sense of the physical
space that constitutes the world of the film. Editors are responsible (with
assistance from cinematographers) for relating points in space in order to
achieve narrative continuity.
Spatial Continuity
• The standard pattern for editing a scene in a narrative film includes the
following:
• Establishing shot
• Shot/Reverse-shot
• Eyeline match (POV shot)
• Re-establishing shot
More Spatial Concepts
• Multiple camera technique
• Axis of Action (180-degree line)
• Match on Action
• Cheat Cut
• The Kuleshov Effect
Thematic Relations
Editors have at their disposal two very powerful techniques for manipulating the
perceiver’s place in the hierarchy of knowledge, and therefore affecting our
thematic understanding of the film:
• Montage sequences
• Crosscut editing
FILM LANGUAGE
MACRO and MICRO elements of
film language
▲ MACRO –
▲ GENRE
▲ NARRATIVE
▲ (REPRESENTATION)
▲ MICRO –
▲ CINEMATOGRAPHY
▲ SOUND
▲ EDITING
▲ MISE EN SCENE
▲ SPECIAL EFFECTS
Cinematography
Refers to the visual aspects of a film's language.
Camera shots and movement can give us clear indications of emotion and motive, and give audiences clues as to things that may be about to happen.
Camera shots
Close-up (and extreme close-up)
Mid-shot
Long shot
Wide (long) shot (often establishing shot)
Low angle shot
High angle shot
Bird's-eye view
Camera Angles and Editing
Understand that the positioning of the camera can create and change the
meaning of the scene
Understand how the editing of different shots can also create and change the
meaning of a scene and a film
• EXTREME LONG SHOT
1. A framing in which the scale of the object shown is very small; a building,
landscape, or crowd of people will fill the screen.
2. Usually the first or last shot of a sequence; it can also function as an establishing shot.
LONG SHOT
1. A framing in which the scale of the object shown is small; a standing human
figure would appear nearly the height of the screen.
2. It makes for a relatively stable shot that can accommodate movement without reframing.
MEDIUM LONG SHOT
I. A framing in which an object four or five feet high would fill most of the screen vertically.
II. Also called plan américain, given its recurrence in the Western genre, where it was important to keep a cowboy's weapon in the image.
MEDIUM CLOSE-UP
A. A framing in which the scale of the object shown is fairly large; a human
figure seen from the chest up would fill most of the screen.
B. Another common shot scale.
CLOSE-UP
1) A framing in which the scale of the object shown is relatively large.
2) In a close-up a person's head, or some other similarly sized object, would fill
the frame.
EXTREME CLOSE-UP
a. A framing in which the scale of the object shown is very large; most
commonly, a small object or a part of the body usually shot with a zoom lens.
b. Again, faces are the most recurrent images in extreme close-ups
CRANE SHOT
i. A shot with a change in framing rendered by having the camera above the
ground and moving through the air in any direction.
ii. It is accomplished by placing the camera on a crane (basically, a large
cantilevered arm) or similar device.
HANDHELD CAMERA, STEADICAM
a. The use of the camera operator's body as a camera support, either holding it
by hand or using a gyroscopic stabilizer and a harness.
b. Used by newsreel and wartime camera operators.
PAN
a) Camera body turning to the right or left. On the screen, it produces a mobile
framing which scans the space horizontally.
b) A pan directly and immediately connects two places or characters, thus
making us aware of their proximity. The speed at which a pan occurs can be
exploited for different dramatic purposes.
TILT
i. The camera body swiveling upward or downward on a stationary support.
Scans the space vertically.
ii. A tilt usually also implies a change in the angle of framing.
iii. A high-angle view makes the subject appear inferior.
iv. A low angle makes the subject appear superior.
TRACKING SHOT
A mobile framing that travels through space forward, backward, or laterally.
It usually follows a character or object as it moves across the screen.
EDITING – WIPE
A transition between shots in which a line passes across the screen, eliminating
the first shot as it goes and replacing it with the next one.
A very dynamic and noticeable transition, it is usually employed in action or
adventure films.
DISSOLVE
• A transition between two shots during which the first image gradually disappears while the second image gradually appears; for a moment the two images blend.
• Can be used as a fairly straightforward editing device to link any two scenes, or in more creative ways, for instance to suggest hallucinatory states.
JUMP CUT
An elliptical cut that appears to be an interruption of a single shot.
Either the figures seem to change instantly against a constant background, or the background changes instantly while the figures remain constant.
CROSSCUTTING, aka PARALLEL EDITING
▲ Editing that alternates shots of two or more lines of action occurring in different places, usually simultaneously.
▲ The two actions are therefore linked, associating the characters from both lines of action.
CONTINUITY EDITING
A system of cutting to maintain continuous and clear narrative action.
Continuity editing relies upon matching screen direction, position, and temporal
relations from shot to shot.
The film supports the viewer's assumption that space and time are contiguous
between successive shots.
MONTAGE
▲ An approach to editing developed by the Soviet filmmakers of the 1920s
such as Pudovkin, Vertov and Eisenstein;
▲ It emphasizes dynamic, often discontinuous, relationships between shots and
the juxtaposition of images to create ideas not present in either shot by itself.
ELLIPTICAL EDITING
Shot transitions that omit parts of an event, causing an ellipses in plot and story
duration, (achieved with a plentiful use of jump cuts) in order to both shorten
the time and suggest the character's states.
EYELINE MATCH (MATCHES)
A cut obeying the axis of action principle, in which the first shot shows a person looking off in one direction and the second shows a nearby space containing what he or she sees.
GRAPHIC MATCH (MATCHES)
Two successive shots joined so as to create a strong similarity of compositional elements (e.g., colour, shape).
Used in transparent continuity styles to smooth the transition between two shots.
MATCH ON ACTION (MATCHES)
A cut which splices two different views of the same action together at the same
moment in the movement, making it seem to continue uninterrupted.
Quite logically, these characteristics make it one of the most common
transitions in the continuity style.
LONG TAKE, aka PLAN-SEQUENCE (DURATION)
A shot that continues for an unusually lengthy time before the transition to the
next shot.
The average length per shot differs greatly across times and places, but most contemporary films tend to have faster editing rates. As a general rule, any shot above one minute can be considered a long take.
OVERLAPPING EDITING (DURATION)
Cuts that repeat part or all of an action, thus expanding its viewing time and plot
duration.
Most commonly associated with experimental filmmaking, due to its temporally disconcerting and purely graphic nature, it is also featured in films in which action and movement take precedence over plot and dialogue.
RHYTHM (DURATION)
The perceived rate and regularity of sounds, series of shots, and movements
within the shots.
Rhythmic factors include beat (or pulse), accent (or stress), and tempo (or
pace).
Rhythm is one of the essential features of a film, for it decisively contributes to
its mood and overall impression on the spectator.
LIGHTING & COLOUR
• Is used to create mood and atmosphere
• Positioning of lights creates different effects
• High key lighting
• Low key lighting
Sound
The world of the film as we see it on the cinema screen is known as the
DIEGETIC world.
When we watch a film the sound we hear can be DIEGETIC OR NON-
DIEGETIC.
DIEGETIC SOUND is sound that is part of the film world.
NON-DIEGETIC sound is sound that is not recognised as part of the film world, e.g. voice-over or background music.
PARALLEL SOUND – sound which complements the visual image.
CONTRAPUNTAL SOUND – sound which does not fit with the image but
helps to create new meanings.
