
Multimedia and its Applications

MCA Second Year


(Elective - 1)
Paper No. E 1.3

School of Distance Education


Bharathiar University, Coimbatore - 641 046


Author: Dhiraj Sharma

Copyright © 2009, Bharathiar University


All Rights Reserved

Produced and Printed


by
EXCEL BOOKS PRIVATE LIMITED
A-45, Naraina, Phase-I,
New Delhi-110 028
for
SCHOOL OF DISTANCE EDUCATION
Bharathiar University
Coimbatore-641 046


CONTENTS

UNIT I
Lesson 1 Introduction to Multimedia
Lesson 2 Making Multimedia
UNIT II
Lesson 3 Making Instant Multimedia
Lesson 4 Multimedia Building Blocks (Using Text)
Lesson 5 Multimedia Building Blocks (Using Sound)
UNIT III
Lesson 6 Multimedia Building Blocks (Using Graphics and Images)
Lesson 7 Multimedia Building Blocks (Using Animation)
Lesson 8 Multimedia Building Blocks (Using Video)
UNIT IV
Lesson 9 Multimedia and the Internet
Lesson 10 Designing and Tools for the World Wide Web
UNIT V
Lesson 11 High Definition Television and Desktop Computing
Lesson 12 Knowledge-based Multimedia Systems
Model Question Paper


MULTIMEDIA AND ITS APPLICATIONS

SYLLABUS

UNIT I
What is Multimedia - Introduction to making multimedia - Macintosh and Windows
production platforms - Basic Software tools.
UNIT II
Making Instant Multimedia - Multimedia authoring tools - Multimedia building
blocks - Text - Sound.
UNIT III
Images - Animation - Video.
UNIT IV
Multimedia and the Internet - The Internet and how it works - Tools for the World
Wide Web - Designing for the World Wide Web.
UNIT V
High Definition Television and Desktop Computing - Knowledge based Multimedia
systems.



UNIT I


LESSON 1

INTRODUCTION TO MULTIMEDIA

CONTENTS
1.0 Aims and Objectives
1.1 Introduction
1.2 Building Blocks of Multimedia
1.2.1 Text
1.2.2 Graphics
1.2.3 Sound
1.2.4 Animation
1.2.5 Video
1.3 Interactivity of Multimedia Applications
1.4 Delivery of Multimedia
1.5 Hypertext and Hypermedia
1.5.1 Hypertext
1.5.2 Hypermedia
1.6 Hypertext/media and Our Memory
1.7 Linking
1.7.1 Structural Links
1.7.2 Associative Links
1.7.3 Referential Links
1.8 Applications of Multimedia
1.8.1 Education and Training
1.8.2 Entertainment
1.8.3 Tool for Business
1.8.4 Video Conferencing and Virtual Reality
1.8.5 Electronic Encyclopaedia
1.9 Skills Needed for Multimedia Development
1.10 Let us Sum up
1.11 Lesson End Activities
1.12 Keywords
1.13 Questions for Discussion
1.14 Suggested Readings

1.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to:
• Offer a variety of definitions of multimedia and its component parts
• Identify where multimedia can add value
• Identify a variety of roles in developing multimedia
• Describe hypertext and hypermedia concepts
• Describe how multimedia applications are influencing every aspect of life

1.1 INTRODUCTION
Multimedia is an old concept that has been given new meaning by the computer
industry through their efforts to create multimedia-capable computing platforms.
In scholarly terms, multimedia has been defined as the ‘interactive dramatisation of information’.
Multimedia technology uses the computer to combine text, graphics, animation, audio
and full-motion video under the user’s control. Although combinations of these
functions have been available for some years, it has been difficult to integrate them so
that the non-technical user can manipulate them and thereby create documents or
applications that incorporate all these features. Traditionally, putting together such a
presentation requires the skills of a computer programmer or an information
technology specialist. In the past few years there has been sporadic use made of
multimedia materials both in academia and in industry, but this has been largely
driven by skilled individuals working ad hoc and in isolation.
Multimedia systems have the power to involve users in rich interactions with diverse
information. For interactions with complex, interrelated, bodies of information,
hypermedia techniques can be employed to give the user non-linear access to the
stored material. If a multimedia application is to offer the user more than this, for
instance to provide advice, or help with a problem, then a more intensive interaction is
necessary. Exploitation of knowledge-based system techniques may be relevant and
the interaction will need to have dialogue-like characteristics.
Multimedia is a new aspect of literacy that is being recognised as technology expands
the way people communicate. The concept of literacy is increasingly a measure of the
ability to read and write. In the modern context, the word means reading and writing
at a level adequate for written communication. A more fundamental meaning is now
needed to cope with the numerous media in use, perhaps meaning a level that enables
one to function successfully at a certain status in society. Multimedia is the use of
several different media to convey information. Several different media are already a
part of the canon of global communication and publication (text, audio, graphics,
animation, video, and interactivity). Others, such as virtual reality, computer
programming and robotics are possible candidates for future inclusion. With the
widespread use of computers, the basic acts of ‘reading’ and ‘writing’ are often
performed via a computer, providing a foundation stone for more advanced levels of
multimedia literacy.
Multimedia is the use of several media (e.g. text, audio, graphics, animation, video) to
convey information. Multimedia also refers to the use of computer technology to
create, store, and experience multimedia content.

Definition
Multimedia can be defined as any combination of two or more of the following:
• text
• graphics
• sound
• animation
• video
which are integrated together and delivered by a computer. England & Finney (2002)
cite Feldman, adding the ideas of ‘seamless integration’ and ‘information
environment’.

The term ‘multimedia’ is associated with sound and video in particular. It came into
widespread use when it became technically feasible to incorporate these two elements
into software packages and applications. Multimedia need not incorporate both these
elements and just because you can incorporate a variety of individual media
components into a multimedia application, it does not mean that you have to. An ‘all-
singing all-dancing’ multimedia presentation with many special effects to dazzle the
eye may not be the most effective way of getting your message across. More often
than not, the special effects can detract from the content.

1.2 BUILDING BLOCKS OF MULTIMEDIA


The basic building blocks of multimedia are text, graphics, animation, sound and
video. All these have certain significance to the end-user. Each of these building
blocks also has business models for production and exploitation. Writers, musicians,
artists and actors have long-established norms for making a living from their talents.
Multimedia developers, on the other hand, want to be free to re-use work as they see
fit – to exploit it.
Also, in considering these building blocks, don't forget the artifacts and the processes
that give rise to them. Some may have an intrinsic value (for example, the storyboard
for a movie may later be exhibited in a gallery), and all inevitably form, or imply,
some kind of contract between the parties involved.
Also you should bear in mind that multimedia inevitably involves the creative use of
the latest technology. "New" is often confused with "Good". Although innovations
eventually make things more efficient, analysts describe a "trough of disillusionment"
that follows the "peak of inflated expectations", before the innovation climbs towards
enlightenment. It's a useful way to understand the responses of investors, consumers
and technologists to innovation and its promises.

1.2.1 Text
Text is an essential component of any multimedia application. It can be used for:
• titles and headlines (what it’s all about)
• menus (where to go)
• navigation (how to get there)
• content (what you see when you get there)

1.2.2 Graphics
Graphics can be used to:
• reinforce text
• supplement text
• create impact
Images can be:
• photographic
• drawn
Sources of graphics include:
• paint and drawing programs
• scanners
• photo CDs
• digital cameras
• digital video stills
Computer graphics can be:
• bitmaps
• vector-based
The sketch below contrasts the two representations.
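As an illustration only, here is a minimal Python sketch of the difference; the pixel values and shape records are invented for this example and follow no particular file format. A bitmap stores a colour for every pixel, while a vector image stores geometric descriptions that can be rescaled without loss:

```python
# A 2x2-pixel bitmap: every pixel's colour is stored explicitly, so
# storage grows with image size and enlarging it loses sharpness.
bitmap = [
    [(255, 0, 0), (0, 255, 0)],    # row 0: red, green
    [(0, 0, 255), (255, 255, 0)],  # row 1: blue, yellow
]

# A vector image: shapes are stored as geometric descriptions,
# so they can be redrawn crisply at any scale.
vector = [
    {"shape": "circle", "centre": (50, 50), "radius": 40},
    {"shape": "line", "start": (0, 0), "end": (100, 100), "width": 2},
]

def scale_vector(shapes, factor):
    """Rescale vector shapes losslessly by multiplying their geometry."""
    scaled = []
    for shape in shapes:
        new = dict(shape)
        for key, value in new.items():
            if isinstance(value, tuple):           # a coordinate pair
                new[key] = tuple(v * factor for v in value)
            elif isinstance(value, (int, float)):  # a length such as radius
                new[key] = value * factor
        scaled.append(new)
    return scaled

print(scale_vector(vector, 2))   # all geometry doubles; no quality loss
```

Scaling the bitmap, by contrast, would require inventing new pixels through interpolation, which is why enlarged bitmaps look blocky or blurred.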

1.2.3 Sound
Sound can be used to:
• create atmosphere
• give feedback
• give instructions
• give warnings
• supplement information on screen
Sound can be:
• MIDI – a series of instructions to a module or device to replay a note of a given pitch, volume, voice and duration
• digital audio – a digital approximation of analogue sound
The sketch below contrasts these two forms.
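The contrast can be made concrete with a small Python sketch; the note and sample-rate values are ordinary illustrative choices, not figures from this text. A MIDI-style event is one compact instruction, while digital audio must store tens of thousands of samples per second to approximate the analogue waveform:

```python
import math

# MIDI-style: one compact instruction describes the whole note.
note_on = {"pitch": 69,         # MIDI note number for A4 (440 Hz)
           "velocity": 100,     # how hard the note is struck
           "voice": "piano",    # which instrument plays it
           "duration_s": 1.0}   # how long it sounds

# Digital audio: approximate the same one-second A4 tone by sampling
# a 440 Hz sine wave 44,100 times (CD-quality rate, mono).
SAMPLE_RATE = 44_100
samples = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)
           for n in range(SAMPLE_RATE)]

print(len(note_on), "fields versus", len(samples), "samples per second")
```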

1.2.4 Animation
• Animation can be used to add visual impact to a multimedia application.
• Visual effects, such as dissolves and wipes, can be used for primitive animation.
• More complex animations are based on cel animation techniques.
• Animations can be 2D or 3D.
A minimal tweening example follows.
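As a minimal illustration of frame-based 2D animation (a sketch only; the coordinates and frame count are arbitrary), in-between frames can be generated by linearly interpolating, or "tweening", an object's position between two keyframes:

```python
def tween(start, end, frames):
    """Yield positions linearly interpolated between two keyframes."""
    (x0, y0), (x1, y1) = start, end
    for f in range(frames + 1):
        t = f / frames          # 0.0 at the first frame, 1.0 at the last
        yield (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)

# A sprite moving from the top-left towards the right in 5 steps:
for position in tween((0, 0), (100, 50), 5):
    print(position)
```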

1.2.5 Video
• Digital video is a powerful tool in multimedia and can add great impact to a multimedia presentation.
• MPEG allows computers to support FSFM (full-screen, full-motion) video, as seen on DVD, as well as the more familiar small, jerky windows of QuickTime, AVI and streamed video.

1.3 INTERACTIVITY OF MULTIMEDIA APPLICATIONS


A key associated concept is interactivity – multimedia applications, whether
developed in-house or produced commercially, tend to be interactive.
Interactive multimedia allows the user to engage in a conversation with the
application – the user can control, which elements are delivered, when they are
delivered and in what order they are delivered.
This can range from clicking a mouse to move forward to the next page, to clicking on
‘hot-spots’ to navigate forwards and backwards or to jump to another section, to
selecting a topic from a menu, or to clicking a button to start playing a video clip.
Another associated concept is hypermedia. These are interactive applications where
the user can freely navigate through an application from link to link rather than
proceeding in a linear manner.

Hypermedia applications allow different users to follow different routes through a
package. The World Wide Web is an example of a distributed hypermedia system. But
there is more to multimedia than the Web.
It is in the integration and the creation of the user's experience that multimedia adds
value. This experience can be located along a number of axes: between "lean
forward" (multimedia that you consume at a computer screen, and with which you
must actively interact) and "lean back" (multimedia you consume from a distance, such
as interactive television); with high or low levels of user interaction: a linear
experience that has been tailored to your profile through which you move forwards or
backwards, or a highly interactive experience that you may explore in any direction.

1.4 DELIVERY OF MULTIMEDIA


Multimedia can be delivered in various formats:
• standalone (CD, DVD, memory cards)
• across networks (WWW, ISDN, cable, cellular, and so on)
Increasingly, multimedia is being delivered in dual formats, e.g. networked CDs.
Remember too that multimedia is increasingly delivered on devices other than a
standard computer. Cable and satellite companies supply interactive set-top boxes.
Pocket devices such as mobile phones, the PocketPC™ or the Palm™ supply
interactive access to media content.
Multimedia may also have to be delivered across multiple channels: the interactive
text for a satellite set-top box may be very different from that of a cable company.

1.5 HYPERTEXT AND HYPERMEDIA


Any student who has used online help, for games and so on, will already be familiar
with a fundamental component of the Web: hypertext.
Hypertext is the concept whereby, instead of reading a text in a linear fashion (like a
book), you can at many points jump from one place to another, go forward or back,
get much more detail on the current topic, change direction and navigate as you
desire.

1.5.1 Hypertext
Hypertext is conceptually the same as regular text – it can be stored, read, searched, or
edited - with an important difference: hypertext is text with pointers to other text. The
browsers let you deal with the pointers in a transparent way – select the pointer, and
you are presented with the text that is pointed at.
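Conceptually, a hypertext system needs only two things: nodes of text and named pointers between them. The following minimal Python sketch (the node identifiers and texts are invented for illustration, not taken from any particular system) shows how "selecting a pointer" simply retrieves the text it points at:

```python
# Each node holds plain text plus named pointers to other nodes.
nodes = {
    "home": {"text": "Welcome. See [multimedia] or [hypertext].",
             "links": {"multimedia": "mm", "hypertext": "ht"}},
    "mm":   {"text": "Multimedia combines text, sound, images and more.",
             "links": {}},
    "ht":   {"text": "Hypertext is text with pointers to other text.",
             "links": {}},
}

def follow(node_id, anchor):
    """Select a pointer by its anchor word and return the target text."""
    target_id = nodes[node_id]["links"][anchor]
    return nodes[target_id]["text"]

print(follow("home", "hypertext"))   # -> "Hypertext is text with ..."
```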

Definitions of Hypertext
• A way of presenting information online with connections between one piece of information and another. These connections are called hypertext links. Thousands of these hypertext links enable you to explore additional or related information throughout the online documentation. See also hypertext link.
• This term describes the system that allows documents to be cross-linked in such a way that the reader can explore related documents by clicking on a highlighted word or symbol.
• A non-sequential method for reading a document displayed on a computer screen. Instead of reading the document in sequence from beginning to end, the reader can skip to topics by choosing a highlighted word or phrase embedded within the document. This activates a link, connecting the reader to another place in the same document or to another document. The resultant matrix of links is called a web.
• This is a mark-up language that allows for non-linear transfers of data. The method allows your computer to provide the computational power rather than attaching to a mainframe and waiting for it to do the work for you.
• In computing, hypertext is a user interface paradigm for displaying documents which, according to an early definition (Nelson 1970), “branch or perform on request.” The most frequently discussed form of hypertext document contains automated cross-references to other documents called hyperlinks. Selecting a hyperlink causes the computer to display the linked document within a very short period of time.

1.5.2 Hypermedia
Hypermedia is a superset of hypertext. Hypermedia documents contain links not only
to other pieces of text, but also to other forms of media – sounds, images, and movies.
Images themselves can be selected to link to sounds or documents. Hypermedia
simply combines hypertext and multimedia.
An example of hypermedia: while viewing a manufacturing plant’s floor plan, you
select a section by clicking on a room. The employee’s name and picture appear,
along with a list of their current projects. Hypertext and hypermedia are concepts,
not products, and both terms were coined by Ted Nelson.

Definitions of Hypermedia
Hypermedia is a term created by Ted Nelson in 1970. It is used as a logical extension
of the term hypertext, in which graphics, audio, video, plain text and hyperlinks
intertwine to create a generally non-linear medium of information. This contrasts with
multimedia, which, although often capable of random access in terms of the physical
medium, is essentially linear in nature. The term should also be distinguished from
hypergraphics or super-writing, a Lettrist form from the 1950s which systematises
creativity across disciplines.
A classic example of hypermedia is the World Wide Web, whereas a movie on a CD or
DVD is an example of standard multimedia. The difference between the two can (and
often does) blur depending on how a particular technological medium is implemented.
The first hypermedia system was the Aspen Movie Map.

1.6 HYPERTEXT/MEDIA AND OUR MEMORY


Humans associate pieces of information with other information and create complex
knowledge structures. Hence, it is also said that the human memory is associative. We
often remember information via association. For example, a person starts with an idea
which reminds him/her of a related idea or concept, which in turn reminds him/her of
another idea. The order in which a human associates an idea with another idea depends on the
context under which the person wants information.
When writing, an author converts his/her knowledge which exists as a complex
knowledge structure into an external representation. Information can be represented
only in a linear manner using physical media such as printed material and video tapes.
Therefore, the author has to convert his/her knowledge into a linear representation
using a linearisation process. This is not easy. So the author will provide additional
information, such as a table of contents and an index, to help the reader understand the
overall organisation of the information.
The reading process can be viewed as a transformation of external information into an
internal knowledge base combined with integration into existing knowledge

Downloaded by Wahitha Banu (wahitha19@gmail.com)


lOMoARcPSD|19697633

structures, basically a reverse operation of the writing process. For this, the reader 13
Introduction to Multimedia
breaks the information into smaller pieces and rearranges them based on the reader’s
information requirement. We rarely read a text book or a scientific paper from start to
finish. We tend to browse through the information and then follow the information
headings that are interesting to us.
Hypermedia, using computer enabled links, allows us to partially imitate writing and
reading processes as they take place inside our brain. We can create non-linear
information structures by associating pieces of information in different ways using
links. Further, we can use a combination of media comprising text, images, video,
sound and animation for value addition in the representation of information. It is not
necessary for an author to go through a linearisation process of his/her knowledge
when writing. Also, the reader can access some of the information structures the
author had when writing the information. This will help the reader create his/her own
representation of knowledge and to amalgamate that knowledge into the existing
knowledge structures.
In addition to being able to access information through association, hypermedia
applications are supported by a number of additional aspects. These include an ability
to incorporate various media, interactivity, vast data sources, distributed data sources,
and powerful search engines. All these make hypermedia an extremely powerful tool
to create, store, access and manipulate information.

1.7 LINKING
Hypermedia systems, as well as information in general, contain various types of
relationships between information elements. Examples of typical relationships
include similarity in meaning or context, similarity in logical sequence or temporal
sequence, and containment.
Hypermedia allows these relationships to be represented as links which connect the
various information elements, so that these links can be used to navigate within the
information space.
One possible classification is based on the mechanics of the links. We can look at the
number of sources and destinations for links (single-source single-destination,
multiple-source single-destination, etc.), the directionality of links (unidirectional,
bi-directional), and the anchoring mechanism (generic links, dynamic links, etc.).
A more useful link structure is based on the type of information relationships being
represented. In particular, we can divide relationships into those based on the
organisation of the information space called structural links and those related to the
content of the information space called associative and referential links.
Let us take a brief look at these links.

1.7.1 Structural Links


The information contained within the hypermedia application is typically organised in
some suitable fashion. This organisation is represented using structural links. We can
group structural links together to create different types of application structures. If we
look, for example, at a typical book, it has both a linear structure (from the beginning
of the book through to the end) and usually a hierarchical structure (the book contains
chapters, the chapters contain sections, and the sections contain sub-sections).
Typically, in a hypermedia application we try to create and utilise appropriate
structures.

1.7.2 Associative Links
An associative link is a link which is completely independent of the specific structure
of the information. For instance, we have links based on the meaning of different
information components. The most common example, with which most people would
be familiar, is cross-referencing within books: for more information on X, refer to Y.
It is these relationships – or rather the links which represent those relationships –
which provide the essence of hypermedia and in many respects can be considered
the defining characteristic of hypermedia.

1.7.3 Referential Links


A third type of link is a referential link. It is related to the associative link. Rather than
representing an association between two related concepts, a referential link provides a
link between an item of information and an explanation of that information. A simple
example would be a link from a word to a definition of that word. One simple way of
understanding the difference between associative and referential links is that the items
linked by an associative link can exist independently but are related at a conceptual
level, whereas the destination of a referential link exists primarily to explain the item
it is linked from. The three link types are sketched below.
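A hypermedia application can record which of the three relationship types each link represents, so that, for example, a contents tree is built from structural links only. Here is a minimal Python sketch; the node names are invented for illustration:

```python
# Each link records its endpoints and its relationship type.
links = [
    # Structural: the organisation of the information space.
    {"src": "book", "dst": "chapter-1", "type": "structural"},
    {"src": "chapter-1", "dst": "section-1.1", "type": "structural"},
    # Associative: independently existing items, related conceptually
    # ("for more information on X refer to Y").
    {"src": "section-1.1", "dst": "section-4.2", "type": "associative"},
    # Referential: an item linked to an explanation of that item.
    {"src": "codec", "dst": "codec-definition", "type": "referential"},
]

def links_of(kind):
    """Filter links by relationship type, e.g. to build navigation."""
    return [link for link in links if link["type"] == kind]

print(links_of("structural"))   # only the organisational links
```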

Check Your Progress 1


1. What is Multimedia?
……………………………………………………………………………….
……………………………………………………………………………….
2. What are the building blocks of Multimedia?
……………………………………………………………………………….
……………………………………………………………………………….
3. Explain the concept of Interactivity of Multimedia applications.
……………………………………………………………………………….
……………………………………………………………………………….
4. Define hypertext and hypermedia.
……………………………………………………………………………….
……………………………………………………………………………….
5. Explain the concept of hypermedia/text in terms of human memory.
……………………………………………………………………………….
……………………………………………………………………………….
6. Illustrate various links used in hypermedia.
……………………………………………………………………………….
……………………………………………………………………………….

1.8 APPLICATIONS OF MULTIMEDIA


Multimedia can be applied to all fields of human communication. It can be used
in the following areas:
• Presentations
• Entertainment
• Advertising
• Reference
• Learning
• Simulation

Application areas are wide-ranging. Nowadays, multimedia applications are only
limited by our imagination. However, in pitching for and carrying out work for a
client you will be inevitably educating them continuously as to what is possible.
A common problem is that the client has no imagination for the first 90% of a project,
and too much for the last 10%!
Some possible applications are:
• Entertainment: DVDs, video CD-ROMs, interactive music CD-ROMs, interactive encyclopaedias, games (infotainment and edutainment)
• Public information: kiosks, visitor attractions
• Public performance: screens at sporting events and concerts, handheld devices for audience interaction
• Education: interactive multimedia assisted learning
• Marketing: interactive multimedia manuals and catalogues
• Multimedia databases: audio, video and graphical assets ready for re-use in, say, a TV programme, movie, or even another multimedia package
• Personal communication: multimedia annotations for word-processing, spreadsheets, email, mobile phones
Multimedia will help spread the information age to millions of teachers/learners.
Multimedia educational computing is one of the fastest growing markets in the world
today.
Multimedia is fast emerging as a basic skill that will be as important to life in the
twenty-first century as reading is now. In fact, multimedia is changing the way people
read, interact and distribute information. Instead of limiting the reader to the linear
presentation of text as printed in books, multimedia adds a whole new dimension to
reading by giving words an important new dynamic. In addition to conveying
meaning, words in multimedia serve as triggers that readers can use to expand the
text in order to learn more about a topic. This is accomplished not only by providing
more text but by bringing it to life with audio, video and graphics.
Accelerating this growth are advances in technology and price wars that have
dramatically reduced the cost of multimedia computers. The growing number of
internet users has created a huge market for multimedia. The new tools are enabling
educators to become developers: course material that once required teams of
specialists can now be produced by an individual as a multimedia desktop video
production.
Let us discuss some of the major application areas of multimedia in detail:

1.8.1 Education and Training


Multimedia presentations are a great way to introduce new concepts or explain a new
technology. Individuals find it easy to understand and use.
Multimedia can be used for education, training, simulations, digital publications,
museum exhibits and much more. Multimedia authoring applications like Flash,
Shockwave and Director, amongst a host of other equally capable tools, are available
in the market today. Your application of multimedia is only limited by your
imagination. Training or instructional methods and advances in technology have
always gone hand in hand. For example:
• Virtual reality, where 3D experimental training can simulate real situations.
• Computer simulations of things too dangerous, expensive, offensive, or time-sensitive to experience directly.
• Interactive tutorials that teach content by selecting appropriate sequencing of material based on the ongoing entry of student responses, while keeping track of student performance.
• Electronic presentations.
• Instruction or resources provided on the Internet (World Wide Web; 24 hours a day).
• Exploratory hypertext software (i.e. encyclopaedias, databases) used for independent exploration by learners to complete research for a paper, project, or product development. They may use IMM resources to collect information on the topic or use multimedia components to create a product that blends visual, audio or textual information for effectively communicating a message.
Education courses, skills, and knowledge are sometimes taught out of context due to
a lack of real-world examples. To overcome this, educators are using multimedia to
bring real-world examples into their classrooms, providing the in-context framework
important for learning. Multimedia and tools like the Internet give faculty instant
access to millions of resources.

1.8.2 Entertainment
The field of entertainment uses multimedia extensively. One of the earliest and the
most popular applications of multimedia is for games. Multimedia made possible
innovative and interactive games that greatly enhanced the learning experience.
Games could come alive with sounds and animated graphics. These applications
attracted even those to computers, who, otherwise would never have used them for
any other application.
Games and entertainment products may be accessed on standard computer
workstations via CDs or networks, or on special-purpose game machines that connect
to television monitors for display. These products are quite complex and challenging
for the users.
These products rely on fairly simple navigational controls to enable the user to
participate. Joysticks and trackballs are often used for moving objects, pointing guns
or flying aircraft, while mouse buttons and keystrokes are used to trigger events like
firing guns or missiles.
Multimedia based entertainment and game products depend on the use of graphics,
audio, animation and video to enhance their operation. A game may include computer
graphics taking the user on a hunt on a deserted island for hidden treasures or a
princess. Audio is used for sound effects while video and animation are used for
special effects.
These types of products also offer multi-player features in which competition is
managed between two or more players.

1.8.3 Tool for Business


Even basic office applications like an MS Word document or an MS Excel
spreadsheet become powerful tools with the aid of multimedia. Pictures, animation
and sound can be added to these applications, emphasizing important points in the
documents and other business presentations.

1.8.4 Video Conferencing and Virtual Reality


Virtual reality is a truly fascinating multimedia application. In this, the computer
creates an artificial environment using hardware and software. It is presented to the
user in such a way that it appears and feels real. Three of the five senses are controlled
by the computer in virtual reality systems. Virtual reality systems require extremely
expensive hardware and software and are confined mostly to research laboratories.


Another multimedia application is video conferencing. When a conference is
conducted between two or more participants at different sites by using computer
networks to transmit audio and video data, then it is called video conferencing. A
videoconference is a set of interactive telecommunication technologies which allow
two or more locations to interact via two-way video and audio transmissions
simultaneously. It has also been called visual collaboration and is a type of groupware.

Figure 1.1: Video Conferencing System

Digital compression of audio and video streams in real time is the core technology
behind video conferencing. Codec is the hardware or software that performs
compression. Compression rates of up to 1:500 can be achieved. The resulting digital
stream of 1's and 0's is subdivided into labelled packets, which are then transmitted
through a digital network, usually ISDN or IP.
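Some rough arithmetic shows why such high compression ratios matter. A short Python sketch follows; the frame size and frame rate are illustrative assumptions, not figures from the text:

```python
# Illustrative uncompressed bit rate for modest-quality video.
width, height = 640, 480   # assumed frame size in pixels
bits_per_pixel = 24        # 8 bits each for red, green and blue
fps = 25                   # assumed frames per second

raw_bps = width * height * bits_per_pixel * fps
print(f"uncompressed: {raw_bps / 1e6:.0f} Mbit/s")        # ~184 Mbit/s

# At the 1:500 ratio quoted above, the stream shrinks to a rate
# that ISDN or ordinary IP connections can realistically carry.
print(f"compressed:   {raw_bps / 500 / 1e3:.0f} kbit/s")  # ~369 kbit/s
```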
The other components required for a VTC (Video Tele Conference) system include:
Video input: video camera or webcam
Video output: computer monitor or television
Audio input: microphones
Audio output: usually loudspeakers associated with the display device or telephone
Data transfer: analog or digital telephone network, LAN or Internet
There are basically two kinds of VTC systems:
1. Desktop systems are add-ons to normal PCs, transforming them into VTC
devices. A range of different cameras and microphones can be used with the
board, which contains the necessary codec and transmission interfaces.
2. Dedicated systems have all required components packaged into a single piece of
equipment, usually a console with a high quality remote controlled video camera.
These cameras can be controlled from a distance to move left and right, tilt up and
down, and zoom. They are known as PTZ cameras. The console contains all
electrical interfaces, the control computer, and the software or hardware-based

codec. Omnidirectional microphones are connected to the console, as well as a
TV monitor with loudspeakers and/or a video projector.
There are several types of dedicated VTC devices:
Large group VTC: non-portable, large, more expensive devices used for large rooms
and auditoriums.
Small group VTC: non-portable or portable, smaller, less expensive devices used for
small meeting rooms.
Individual VTC: usually portable devices, meant for single users, with fixed cameras,
microphones and loudspeakers integrated into the console.

1.8.5 Electronic Encyclopaedia


It is the application of multimedia for the creation of an encyclopaedia with millions
of entries and hypertext cross references covering a wide variety of research and
reference topics mainly for educational and training purposes.

1.9 SKILLS NEEDED FOR MULTIMEDIA DEVELOPMENT


Multimedia mixes sound, visual images and movement. More often than not it
involves interactivity. The creation of a single product involves a whole range of
multi-disciplinary skills.
The development of major multimedia applications is usually a team effort, involving
people with a variety of skills. Usually projects focus on the roles of each individual
involved. A person may have many skills, but it may be someone else's role to
demonstrate some of these skills. For example, if you are employed as the
Programmer, and someone else is the Graphic Artist, learn not to offer your graphic
design skills or opinions unless asked.

Typical roles include: Executive Producer, Project Manager, Creative Director,
Graphic Artist, Script-Writer, Programmer, Interface Designer, Animator, Usability
Engineer, Tester, Evaluator, Musician/Actor, Composer, Sound Designer,
Videographer and Photographer.

In most multimedia production teams, each person fills multiple roles. The larger the
project, the more specialised the roles. Also, remember that many roles may be
defined, or even validated externally. The governments of different nations publish
their skills frameworks. These define many different roles within the IT industry. In
other industries trade unions may control what roles individuals may perform, and
what those roles consist of.

Check Your Progress 2


1. What is interactive multimedia?
……………………………………………………………………………….
……………………………………………………………………………….
2. Explain the application of multimedia in education and training.
……………………………………………………………………………….
……………………………………………………………………………….
3. How does a video tele-conference system work?
……………………………………………………………………………….
……………………………………………………………………………….


1.10 LET US SUM UP

Multimedia uses several media (e.g. text, audio, graphics, animation, video) to convey
information. Multimedia also refers to the use of computer technology to create, store,
and experience multimedia content. In this lesson, we have discussed the concept of
multimedia, its applications in various fields like education, training, business,
entertainment etc.
During this lesson you are working towards being able to achieve the following
objectives.
• to understand the basic concepts and various applications of multimedia
• to identify appropriate elements for multimedia interfaces
• to evaluate architectures for multimedia applications
Some final questions to ask yourself are:
• Have you planned your work for the rest of this module?
• Can you offer a variety of definitions of multimedia and its component parts?
• Where can multimedia add value?
• What are the different roles, or jobs, involved in developing multimedia?

1.11 LESSON END ACTIVITIES


1. In groups, discuss the following issues and record your thoughts regarding
multimedia awareness:
• What is multimedia?
• Has there been a "multimedia revolution", and if so, what caused it?
• Why create multimedia?
• What are the current and potential application areas for multimedia?
• What benefits can multimedia bring?
• What are the potential pitfalls?
• Who develops it? What skills and talents are needed by a multimedia developer?
• How is multimedia developed?
• Where and when is multimedia developed? In the recording studio, on the movie set, in a creative design environment, in the training departments of large companies?

2. Examine a range of multimedia packages, devices or installations from a variety
of application areas. For each package you examine, consider the following:
• how easy is it to use?
• is it aesthetically pleasing?
• how easy is it to navigate around?
• how effectively are text, graphics, animation, sound and video used?
• how well does it meet the purpose for which it was intended?
• which package did you like the most and why?
• which package did you like the least and why?

Potential Sources for Packages
• WWW – try the websites of multimedia developers or tool makers (e.g. Macromedia, Disney)
• Help files for software packages
• your own CD-ROMs or DVDs, or those distributed with magazines
• libraries

1.12 KEYWORDS
Multimedia: Multimedia can be defined as any combination of two or more of the
following: text, graphics, sound, animation and video which are integrated together
and delivered by a computer.
Hypertext: Hypertext is conceptually the same as regular text - it can be stored, read,
searched, or edited – with an important difference: hypertext is text with pointers to
other text.
Hypermedia: Hypermedia is a superset of hypertext. Hypermedia documents contain
links not only to other pieces of text, but also to other forms of media – sounds,
images, and movies.

1.13 QUESTIONS FOR DISCUSSION


1. Discuss the applications of multimedia in detail.
2. What do you understand by links?
3. How does video conferencing system work?

Check Your Progress: Model Answers


CYP 1
1. Multimedia is the use of several media (e.g. text, audio, graphics,
animation, video) to convey information. Multimedia also refers to the use
of computer technology to create, store, and experience multimedia content.
In scholarly terms, multimedia has been defined as the ‘interactive dramatisation of
information’. Multimedia technology uses the computer to combine text,
graphics, animation, audio and full-motion video under the user’s control.
Although combinations of these functions have been available for some
years, it has been difficult to integrate them so that the non-technical user
can manipulate them and thereby create documents or applications that
incorporate all these features. Multimedia systems have the power to
involve users in rich interactions with diverse information. For interactions
with complex, interrelated, bodies of information, hypermedia techniques
can be employed to give the user non-linear access to the stored material. If
a multimedia application is to offer the user more than this, for instance to
provide advice, or help with a problem, then a more intensive interaction is
necessary. Exploitation of knowledge-based system techniques may be
relevant and the interaction will need to have dialogue-like characteristics.

Multimedia is a new aspect of literacy that is being recognised as
technology expands the way people communicate. The concept of literacy
is increasingly a measure of the ability to read and write. In the modern
context, the word means reading and writing at a level adequate for written
communication. A more fundamental meaning is now needed to cope with
the numerous media in use, perhaps meaning a level that enables one to
function successfully at a certain status in society. Multimedia is the use of
several different media to convey information. Several different media are
already a part of the canon of global communication and publication (text,
audio, graphics, animation, video, and interactivity). Others, such as virtual
reality, computer programming and robotics are possible candidates for
future inclusion. With the widespread use of computers, the basic literacy
of ‘reading’ and ‘writing’ are often done via a computer, providing a
foundation stone for more advanced levels of multimedia literacy.
2. The basic building blocks of multimedia are text, graphics, animation,
sound and video. Each of these has significance to the end-user – we have
tastes and preferences shaped by the newspapers we read and programmes
we watch. Each of these building blocks also has business models for
production and exploitation. Writers, musicians, artists and actors have
long-established norms for making a living from their talents. Multimedia
developers, on the other hand, want to be free to re-use work as they see
fit – to exploit it.
Also, in considering these building blocks, don't forget the artifacts and the
processes that give rise to them. Some may have an intrinsic value (for
example, the storyboard for a movie may later be exhibited in a gallery),
and all inevitably form, or imply, some kind of contract between the parties
involved. Also you should bear in mind that multimedia inevitably involves
the creative use of the latest technology. "New" is often confused with
"Good". Although innovations eventually make things more efficient, they
describe a "trough of disillusionment" that follows the peak of inflated
expectations, before the innovation reaches the height of enlightenment. It's
a useful way to understand the responses of investors, consumers and
technologists to innovation and its promises.
3. A key associated concept is interactivity – multimedia applications,
whether developed in-house or produced commercially, tend to be
interactive.
Interactive multimedia allows the user to engage in a conversation with the
application – the user can control, which elements are delivered, when they
are delivered and in what order they are delivered.
This can range from clicking a mouse to move forwards to the next page, to
clicking on ‘hot-spots’ to navigate forwards and backwards or to jump to
another section, to selecting a topic from a menu, to clicking a button to
start playing a video clip.
Another associated concept is hypermedia. These are interactive
applications where the user can freely navigate through an application from
link to link rather than proceeding in a linear manner.
Hypermedia applications allow different users to follow different routes
through a package. The World Wide Web is an example of a distributed
hypermedia system. It is in the integration and the creation of the user's
experience that multimedia adds value. This experience can be located
along a number of axes - between "lean forward" (multimedia that you
consume at a computer screen, and with which you must actively interact)
or "lean back" (multimedia you consume from a distance, suchContd… as

interactive television); with high or low levels of user interaction: a linear
experience that has been tailored to your profile through which you move
forwards or backwards, or a highly interactive experience that you may
explore in any direction.
4. Hypertext: Hypertext is conceptually the same as a regular text – it can be
stored, read, searched, or edited – with an important difference: hypertext is
text with pointers to other text. The browsers let you deal with the pointers
in a transparent way – select the pointer, and you are presented with the text
that is pointed to.
Hypermedia: Hypermedia is a superset of hypertext. Hypermedia
documents contain links not only to other pieces of text, but also to other
forms of media - sounds, images, and movies. Images themselves can be
selected to link to sounds or documents. Hypermedia simply combines
hypertext and multimedia.
5. Humans associate pieces of information with other information and create
complex knowledge structures. Hence, it is also said that the human
memory is associative. For example, a person starts with an idea which
reminds him/her of a related idea or a concept which in turn reminds
him/her of another idea. The order in which a human associates an idea
with another idea depends on the context under which the person wants
information. When writing, an author converts his/her knowledge which
exists as a complex knowledge structure into an external representation.
Information can be represented only in a linear manner using physical
media such as printed material and video tapes. Therefore, the author has to
convert his knowledge into a linear representation using a linearisation
process. The reading process can be viewed as a transformation of external
information into an internal knowledge representation combined with
integration into existing knowledge structures, basically a reverse operation
of the writing process. For this, the reader breaks the information into
smaller pieces and rearranges these based on the reader’s information
requirement. We rarely read a text book or a scientific paper from start to
end. Hypermedia, using computer enabled links, allows us to partially
imitate writing and reading processes as they take place inside our brain.
We can create non linear information structures by associating pieces of
information in different ways using links. Further, we can use a
combination of media consisting of text, images, video, sound and
animation for value addition in the representation of information. It is not
necessary for an author to go through a linearisation process of his/her
knowledge when writing. The reader can also access some of the
information structures the author had when writing the information. This
helps the readers create their own representation of knowledge and to gel it
into existing knowledge structures.
6. Different types of links used in hypermedia are:
Structural Links: The information contained within the hypermedia
application is typically organised in some suitable fashion. This
organisation is typically represented using structural links. We can group
structural links together to create different types of application structures. If
we look, for example, at a typical book, then this has both a linear structure
i.e. from the beginning of the book linearly to the end of the book and
usually a hierarchical structure in the form of the book that contains
chapters, the chapters contain sections, and the sections contain sub-sections
etc. Typically in a hypermedia application we try to create and utilize
appropriate structures.

Associative Links: An associative link is a link which is completely
independent of the specific structure of the information. We have
links based on the meaning of different information components. The most
common example which most people would be familiar with is
cross-referencing within books: for more information on X, refer to Y.
Referential Links: A third type of link is a referential link. It is related to
the associative link. Rather than representing an association between two
related concepts, a referential link provides a link between an item of
information and an explanation for that information.

CYP 2
1. Users can use a variety of input devices to interact with the computer, such
as a joystick, keyboard, touch screen, mouse, trackball, microphone, etc.
‘Multi’ refers to the multiple media types used in the multimedia product,
such as sound, animation, graphics, video, and text.
In multimedia, many media sources can be used as components in the
multimedia product, such as a videodisk, CDROM, videotape, scanner, CD
or other audio source, camcorder, digital camera, etc. Media may also refer
to the storage medium used to store the interactive multimedia product,
such as a videodisk or CDROM.
2. Multimedia is used in education and training fields as follows:
• Computer simulations of things too dangerous, expensive, offensive, or time-sensitive to experience directly.
• Interactive tutorials that teach content by selecting appropriate sequencing of material based on the ongoing entry of student responses, while keeping track of student performance.
• Electronic presentations.
• Instruction or resources provided on the Internet (World Wide Web; 24 hours a day).
• Exploratory hypertext software (i.e. encyclopaedias, databases) used for independent exploration by learners to complete research for a paper, project, or product development. They may use IMM resources to collect information on the topic or use multimedia components to create a product that blends visual, audio or textual information for effectively communicating a message.
Education courses, skills, and knowledge are sometimes taught out of
context due to a lack of real-world examples. To overcome this, educators
are using multimedia to bring real-world examples into their classrooms,
providing the in-context framework important for learning.
Multimedia and tools like the Internet give Faculty instant access to
millions of resources.
3. When a conference is conducted between two or more participants at
different sites by using computer networks to transmit audio and video
data, then it is known as video conferencing. A video conference is a set of
interactive telecommunication technologies which allow two or more
locations to interact via two-way video and audio transmissions
simultaneously. It has also been called visual collaboration and is a type of
groupware.


Digital compression of audio and video streams in real time is the core
technology behind video conferencing. Codec is the hardware or software
that performs compression. Compression rates of up to 1:500 can be
achieved. The resulting digital stream of 1's and 0's is subdivided into
labelled packets, which are then transmitted through a digital network
usually ISDN or IP.
The other components required for a VTC system include:
Video input: video camera or webcam
Video output: computer monitor or television
Audio input: microphones
Audio output: usually loudspeakers associated with the display device or
telephone
Data transfer: analog or digital telephone network, LAN or Internet.

1.14 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Tay Vaughan, Multimedia: Making It Work, Fifth Edition, Tata McGraw-Hill.
John F. Koegel Buford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffcoate, Multimedia in Practice: Technology and Applications, PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

LESSON 2

MAKING MULTIMEDIA

CONTENTS
2.0 Aims and Objectives
2.1 Introduction
2.2 Multimedia Development Lifecycle
2.2.1 Concept Development
2.2.2 Requirements Specification
2.2.3 Design
2.2.4 Prototype
2.2.5 Production
2.2.6 Testing and Evaluation
2.2.7 Delivery and Support
2.3 Iterative Design Techniques
2.3.1 Prototyping
2.3.2 Exploratory Programming
2.4 Selection of the Production Platform (Macintosh and Windows Production Platforms)
2.5 Macintosh vs PC
2.5.1 Macintosh Platform
2.5.2 Windows Multimedia PC Platform
2.5.3 Networking Macintosh and Windows Computers
2.6 Basic Software Tools
2.6.1 Types of Basic Tools
2.7 Let us Sum up
2.8 Lesson End Activity
2.9 Keywords
2.10 Questions for Discussion
2.11 Suggested Readings

2.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to:
• Describe typical phases in a multimedia development lifecycle
• Contrast this with other methodologies such as scenario-based design
• Select the production platform for multimedia
• Use basic software tools in multimedia
• Understand various types of basic tools

2.1 INTRODUCTION
This lesson examines the different stages of a multimedia application lifecycle from
conception to delivery. The need for a systematic approach is emphasised. The
requirements for an interactive system can never be defined precisely from the start
and thus the process will be an iterative one. Different approaches are needed,
including understanding user requirements and culture. You may even need to
recognise when customers may want to undermine your methodical approaches. For
some people, the customer is always right. Later in the module we look at user-
centred design methodologies.
Also we introduce the importance of early definition of testing methodology, test
specifications etc. The definition of a test – how we can demonstrate that we have met
an objective – will often clarify that objective and avoid misinterpretation. Also, you
should understand the importance of good project management.
The lifecycle is not something that you merely follow, it is something that you select,
monitor and drive, perhaps even modify as you gain greater understanding of what is
useful. Storyboards are an early example of documentation that creates impressions,
makes commitments, to which the developer might later be held.

2.2 MULTIMEDIA DEVELOPMENT LIFECYCLE


The exact approach taken varies from organisation to organisation and depends on the
complexity of the project, but in general all projects go through the following stages
or phases, often repeating stages:
• Concept development
• Requirements specification
• Design
• Prototype
• Production
• Testing and evaluation
• Delivery
These development stages generally overlap and involve much iteration. This can
make planning and reporting problematic.
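The iteration can be pictured with a small Python sketch; the phase names follow the list above, and the "step back to design" rule is a simplification for illustration only:

```python
PHASES = ["concept", "requirements", "design", "prototype",
          "production", "testing and evaluation", "delivery"]

def run_project(needs_rework):
    """Walk the lifecycle, looping back to design when testing fails."""
    i = 0
    while i < len(PHASES):
        print("entering:", PHASES[i])
        if PHASES[i] == "testing and evaluation" and needs_rework():
            i = PHASES.index("design")   # iterate rather than proceed
        else:
            i += 1

# One failed test round, then a clean pass through to delivery:
run_project(iter([True, False]).__next__)
```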

2.2.1 Concept Development


The starting point is an initial idea or concept. This can be further defined and refined
through brainstorming. Rough storyboards can be used as an aid to visualisation. The
topic can then be further researched. A feasibility study may be carried out and initial
estimates of costs, timescales and, where appropriate, projected revenue made. These
estimates are often based on previous work.
A production overview document called the proposal or brief can then be developed.
This document should include an outline of how the subject is to be approached, the
scope of the project and the intended audience. A history of the company’s work and a
budget can also be included.
To be successful, the proposal must be readable, concise and suited to the type of
production. The proposal is often delivered face to face, and hence effective personal
communication is required. In this situation, a storyboard may be used as a graphical
aid to communicating the idea.

A multimedia developer may then present their proposal to a potential client in a bid
to get them to commission their proposed project. Alternatively a client may ask a
number of developers to pitch for a given project they have in mind. Sometimes
relatively small projects may have a discussion/meeting lasting a day or longer to
agree the outline of the project.
Once the client and developer have agreed in principle to go ahead with the project, a
detailed content outline, schedule and budget can then be drawn up. Typically a
developer receives the right to invoice the customer for part of the budget at the outset
and several more payments will be due on acceptance of some or even all of the stages
described below.

Context of Use
However, at this stage relatively little is known about the eventual user of the software –
their preferences, needs and wants, the things they must do at the same time as use the
software, the higher-level activities in which they are engaged, the goals and
objectives they are trying to achieve. All this is what is known as the context of use.
Often the customer, the one paying for the project, has an imperfect understanding of
all of this. There remains a conflict between developer, customer and user to be
managed throughout, and there is much current research on how best to do this. One
approach is that of scenario-based design, another of task analysis, and each results in
a variety of possible lifecycles.

2.2.2 Requirements Specification


The next stage may be to produce a more detailed requirements specification. The
treatment is expanded in detail to include all the aims, objectives and subject scope
that were covered in the successful proposal. Some may resist this – some
software methodologies and approaches that seek to establish the context of use see
a fixed requirements specification as unhelpful. In many cases the requirements
specification is more helpful to the developer – who can use it to negotiate budget
and agreement of scope – than it is to the customer.
The requirement specification should cover the minimum hardware and software
configuration(s) needed to run the application, the intended audience and the aims of
the application. The content outline can also be included along with general
navigational and screen layout guidelines. The requirement specification may have to
be refined as the project develops. The requirements should also specify when the
application is to be developed and at what cost.
The requirements specification allows the client and developer to agree the scope,
deadline and deliverables of the final product. These requirements are gathered
through consultation with the client and, where possible, typical end-users. On
completion of the product, the requirements specification can be used as the basis of
the test specification to ensure that all the specified requirements have been met.
The complexity of the requirements-gathering stage depends not only on the
complexity of the application but also on the quality of the communication between
the customer and the application producer. The quality of communication is affected
not only by personal communication skills, but also by the customer’s knowledge of
what is possible, and by the producer’s understanding of the subject matter. This
‘knowledge gap’ can be the single most significant barrier to unambiguous
communications.

2.2.3 Design
The concept is further refined during this stage. A design document is drawn up. This
describes how the content will be organised and accessed. Navigation maps can help
in this. A script will be produced. This provides a complete guide to the production.
An early version of this may have been supplied to give substance to the initial
proposal.
Storyboards are used to aid visualisation of representative screens and sequences. The
design should be presented to the client and, where possible, typical users for
comment and feedback, and revisions made, if necessary.
Having developed a broad overview of the project, more detailed design on the user
interface (buttons for navigation, help access, quit access etc.) and on the layout of
individual screens can be carried out.
In addition to designing the actual application, the project should be planned and
budgeted. This involves estimating how much time should be spent on individual
components, setting milestones or deliverables, and costing the project. This should be
started in the analysis phase and refined during the design phase.

2.2.4 Prototype
A small portion of the product is completed in detail. This detailed mini-production
tests not only the design but the production process as well. A skeleton outline of the
application can also be developed to test the structure and navigation. The client then
critiques the prototype and revisions are made. Where possible, typical end users
should also be involved in evaluating the prototype. In some cases, the client takes the
prototype to distributors to help sell the larger project.

2.2.5 Production
This is traditionally the most intensive phase in the development cycle. Different
approaches may be taken. For large-scale productions, the script expands into a
lengthy document describing screens, menus, titles and captions, every word spoken
or seen. Descriptions of images, animations, audio and video may also be included.
An evolutionary approach may be taken for small-scale productions. Here, the
prototype is added to and refined until the requirements have been met and the client
is satisfied.
In each case, individual media components have to be sourced, digitised, and
processed. Copyright clearances will have to be obtained. Integration can then take
place as the elements are imported into the authoring environment. Images are then
arranged on screen, objects animated, text formatted and audio and movies
synchronised. Programming can then take place, with the software coming to life as
interactivity is added to the content. Installation routines are prepared. After this the
product is then ready for beta testing.
However, on certain projects production is an ongoing issue, even after delivery,
which may initially have involved only a small, token amount of content.
Typically this might involve a content management system. Here, the customer may
take responsibility for generating further assets to the specifications that your software
engine can then access for their end-users.
E-commerce websites are a good example of this marketplace, but education, training
and audio-visual entertainment products also have a constant flow of new products
and variations on old ones. Implementation also demands that you develop an installation
process that is capable of being performed by the required person. Typically, this
might be a member of the customer's staff, who may not even be particularly
technical.

2.2.6 Testing and Evaluation


When programmers are confident that the code is accurate, beta testers are given the
chance to show that they are wrong. The program is installed on a number of different
platforms (for example on the minimum, typical and most up-to-date hardware
specifications) and the testers are asked to review its functionality, critique it and
report bugs. Test scripts must be prepared and followed to ensure that every aspect of
the programme is tested under a variety of likely contexts. Wherever possible typical
users should be involved in the testing of the product. Corrections are made. Further
testing and debugging takes place until both developer and client are satisfied that the
product is ready for mass distribution. Regression testing requires a subset of all
known tests to be run every time an aspect of the programme is changed. Too often, a
fix to a bug breaks another aspect of the programme or reveals a previously
undetected flaw.
Testing is primarily concerned with the functionality of the application. These are
aspects which either work or do not work. The usability of the interface also needs to
be evaluated. Is the application fit for purpose? Is it effective in the user's environment?
If the customer has gone through a number of changes in their business objectives, do
we still understand what the objectives for the programme are? This can make all the
difference between getting a follow-up job and not.
Smithson & Hirschheim (1998) propose a three-layer model to evaluate the quality of
a system: Efficiency, Effectiveness and Understanding. The first two are fairly
familiar, although effectiveness can be the source of endless debate, but understanding
is a more problematic issue – dealing with political issues, personal constructs etc.

2.2.7 Delivery and Support


With CD and DVD, a master is prepared and sent to the disc duplicator for
replication. Artwork for packaging and labels might be supplied – usually as digital
files. Clearly, the master must have been tested until it is flawless. The accompanying
instructions will have been tested to ensure that no user has problems with them.
Alternatively, the product may be delivered online, in which case you should be
concerned about server security and back-up provisions.
The customer may require a contractual period during which you will correct any
faults in the software at no cost to the customer within predefined time-periods (often
related to severity of the problem). This means that you must include in your project
plan a number of developer days to provide this.

2.3 ITERATIVE DESIGN TECHNIQUES


Design and implementation form a significant portion of multimedia application
development. Many experts believe that the requirements of an interactive system cannot
be completely specified from the beginning of the lifecycle, and propose that
prototyping or exploratory programming be used in iterative design and
implementation processes. This argument can be applied to the design and
implementation of a multimedia application's navigation scheme:
z A structure of the system can be designed and implemented, for experiment and
comment from the customer.
z By repeating this process, the structure can evolve into the final shape of the
application, to which the creative presentation pieces are added.
The same iterative process can also be applied to screen layout. A drawback to this
technique is that it affords the customer many opportunities to change the application
specification. There is an additional danger that you may inadvertently commit to
something that is too difficult to deliver within budget or on time.

2.3.1 Prototyping
An initial program is developed for customer comment. The main objective of the
prototype (as shown in Figure 2.1) is to generate requirements.

Figure 2.1: Prototyping within Software Life Cycle

2.3.2 Exploratory Programming


The term exploratory programming refers to implementing an initial solution to an
outline specification and modifying the solution until the customer is satisfied. The
key to this approach is to use techniques which allow for rapid system iterations.
Suggested changes may be incorporated and demonstrated as quickly as possible. This
requires a very high-level programming language and the use of powerful integrated
software tools.
Figure 2.2 shows the iteration taking place. This model is used where it is difficult to
define the specifications in detail or where correctness is not paramount. This
technique is also known as Iterative Prototyping. Rapid Application Development is a
variation on this.

[Flowchart: develop an outline specification; write the software; use the software; if the software is adequate, deliver it; otherwise return to writing the software.]

Figure 2.2: Exploratory Programming
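The loop in Figure 2.2 is simple enough to render directly in code. Below is a toy, self-contained Python sketch of the cycle; every name in it (the build string, the simulated customer check) is an illustrative stand-in for what are really human activities, not parts of any real tool:

# A toy rendering of the exploratory-programming loop in Figure 2.2.
# The "software" is just a string and the customer is simulated by chance;
# in practice each step is a human activity, not a function call.
import random

def write_software(spec, version):
    return f"{spec} (build {version})"

def customer_is_satisfied(software):
    print(f"Customer tries: {software}")
    return random.random() > 0.6                 # simulated acceptance decision

def exploratory_programming(spec):
    version = 1
    software = write_software(spec, version)
    while not customer_is_satisfied(software):   # the "Software adequate?" test
        version += 1                             # incorporate feedback rapidly
        software = write_software(spec, version)
    print(f"Delivering: {software}")             # the "Deliver software" box

exploratory_programming("outline specification for a navigation scheme")

The point of the sketch is the shape of the process: the exit condition belongs to the customer, not the developer, which is why tools that allow rapid iterations matter so much here.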

2.4 SELECTION OF THE PRODUCTION PLATFORM (MACINTOSH AND WINDOWS PRODUCTION PLATFORMS)
Selection of the proper platform for the development of a multimedia project may be
based on your personal preference of computer, your budget constraints, project
delivery requirements, and the type of material and content in the project. Many
developers believe that multimedia project development is smoother and easier on the
Macintosh than in Windows, even though projects destined to run in Windows must
then be ported across platforms. But hardware and authoring software tools for
Windows have improved: today you can produce many multimedia projects with
equal ease in either the Windows or Macintosh environment.

2.5 MACINTOSH VS PC
Since its inception, the Macintosh has been, by definition, a multimedia computer: at
the famous rollout of the Macintosh in January 1984 at Apple's annual shareholders'
meeting, the new device actually introduced itself in a crudely synthesized voice.
Whereas the Macintosh had good built-in audio right from the start, in 1984 IBM
personal computers could not process sound without very expensive add-on
components. With its focus on business computing, the PC remained for many years
able to provide only system beeps and limited sound effects on a tiny (and tinny)
on-board speaker. Recently, due primarily to the demands of game software, lower-
cost sound boards and software have become available for PCs. Other multimedia
tools and hardware, such as video digitizers, are now readily available in PC
marketing channels.
When installed with Windows, a sound board, and SuperVGA graphics, the PC
readily challenges the Macintosh in delivering excellent audio and visual
presentations. An MPC computer, moreover, will always provide sound capability, a
CD-ROM player, access to the Media Control Interface (MCI) for extensions to video
overlay boards and other peripherals, and minimum CPU and memory configuration.
The Multimedia Personal Computer, or MPC, is an industry wide effort begun in the
late 1980s to provide a standardized and capable multimedia computing environment
for PCs.
In 1997, with the Macintosh share of the personal computer market at an all-time low
of about 3 percent, Gistics, Inc., a market research firm, studied the comparative
advantages of Macintosh versus Windows/Intel computers and discovered that
Macintosh users:
z Spend 38 fewer hours per year futzing with files
z Save $4,950 annually on support and training
z Use more tools (14.3 versus 8.3)
z Save $2,211 in three-year cost of ownership
z Earn $5.01 more per hour
z Earn $12.22 more revenue per hour of labor
z Create $14,550 more profits per year per person
z Earn 32 percent more net profit per project
z Achieve platform payback in 7.2 months (versus 13.9)

2.5.1 Macintosh Platform
All Macintoshes can play sound. And the latest generation of Macintoshes includes
hardware and software for digitizing sound without additional hardware or sound
cards. For most Macintoshes, 8-bit, 16-bit, and 24-bit graphics capability is available
‘out of the box.’ The AV series of Macintoshes can digitize video as well as sound.
Unlike the Windows environment, where users can operate an application with
keyboard input, the Macintosh requires a mouse.
Nevertheless, there is significant variation in the ways you can set up your Macintosh
hardware and software. The Macintosh computer you will need for developing a
project depends entirely upon the project's delivery requirements, its content, and the
tools you will need for production. Of course, the ideal production station is the
newest, fastest, and most flexible computer you can get your hands on, but such a
configuration may be beyond the scope of your budget. Thankfully, acceptable
performance is not limited to the top-of-the-line configuration: most Macintosh
models sold today are sufficient for multimedia development.
Apple introduced the first Power Macintosh computers based on reduced instruction-
set computing (RISC) microprocessors during 1994. RISC technology was typically
used in engineering workstations and commercial database servers designed for raw
computational power, but in an alliance with IBM and Motorola, Apple designed and
built this new line of RISC based models. They supplanted earlier models based upon
the Motorola 68000, 68030 and 68040 processors. In 1997, the G3 series was
introduced with clock speeds greater than 233 MHz and offering higher performance
than existing Pentium-based Windows machines.

2.5.2 Windows Multimedia PC Platform


The MPC computer is not a hardware unit per se, but rather a standard that includes
minimum specifications to turn Intel-microprocessor-based computers into
multimedia computers. The standard applies not only to desktop computers but also to
increasingly powerful multimedia laptops.
Because the MPC is a standard, not a computer, you can assemble your own clone
with components from various suppliers and meet the standard. Upgrade kits that
typically include a CD-ROM player and a sound board are available from many
hardware vendors. MPC-compliant systems are available in pre-packaged
configurations from most manufacturers. Vendors who sell MPC computers will
guarantee that software written to the MPC standard, usually labeled with the MPC
mark, will play on their machines.
The Multimedia PC Marketing Council was rolled out with the support of Microsoft
and major PC manufacturers in 1990. Since then, oversight of the MPC specification
has been transferred to the Software Publishers Association (SPA), and the
Multimedia PC Marketing Council is now called the Multimedia PC Working Group.
The original MPC Level 1 minimum standard workstation consisted of a 16 MHz
386SX microprocessor, at least 2MB of RAM, a 30MB hard disk, a CD-ROM drive,
VGA video (16 colors), an 8-bit audio board, speakers and/or headphones, and
Microsoft Windows software with the Multimedia Extensions package. This
minimum-configuration MPC was not powerful enough to develop serious multimedia
and was hardly powerful enough to play multimedia at all. A more realistic MPC
Level 2 minimum standard was released in 1993. This specification defined the
minimum system functionality for Level 2 compliance as a 25 MHz 486SX
(or compatible) microprocessor with at least 4MB of RAM (8MB recommended), a
3.5-inch high-density (1.44MB) floppy disk drive, a 160MB or larger hard drive, and
a CD-ROM drive capable of sustained 300 Kbps transfer rate (double speed) with
CD-DA (Red Book) outputs and volume control, 16-bit sound capability with

microphone input, and a color monitor with display resolution of at least 640x480
with 65,536 (64K) colors.
In June 1995, the Multimedia PC Working Group released the Multimedia PC Level 3
specification providing for improved sound and video performance. For content
producers, MPC3 offers a solid set of minimum performance standards for multimedia
machines. Rather than merely defining the hardware, this specification focuses on the
quality of the end user experience. This quality is assured through the creation of
validation suites that test each machine as a complete system, rather than as a series of
individual parts. And the specification offers a number of exciting new capabilities
upon which developers can depend in creating new titles, such as wavetable sound, 4x
CD-ROM drives and MPEG video.
The big upgrade in this MPC3 standard, following many months of meetings and
negotiations within the Working Group, was the requirement for MPEG video
playback compliance. The MPC3 platform consequently provides full motion video
with TV-quality and CD-quality sound (albeit in a 352x240-pixel window). If your
multimedia project will be distributed into retail consumer channels, you should
consider applying to the MPC Working Group for a license to use the MPC logo on
your packaging.

2.5.3 Networking Macintosh and Windows Computers


If you are working in a multimedia development environment consisting of a mixture
of Macintosh and Windows computers, you will want them to communicate with each
other. You will also wish to share other resources among them, such as printers.
Local Area Networks (LANs) and Wide Area Networks (WANs) can connect the
members of a workgroup. In a LAN, workstations are usually located within a short
distance of one another – on the same floor of a building, for example. WANs are
communication systems spanning great distances, typically set up and managed by
large corporations and institutions for their own use, or to share with other users.
LANs allow direct communication and sharing of peripheral resources such as file
servers, printers, scanners, and network modems. They use a variety of proprietary
technologies, most commonly Ethernet or Token Ring, to perform the connections.
They can usually be set up with twisted-pair telephone wire, but be sure to use
'data-grade Level 5' wire – it makes a real difference, even if it's a little more expensive!
Bad wiring will give you a never-ending headache of intermittent and often untraceable
crashes and failures.
WANs are expensive to install and maintain, but other methods for long-distance
communication are available without a dedicated telephone network: dial-up
connections to the Internet through an Internet Service Provider (ISP), CompuServe,
America Online, MSN, Prodigy, or MCI Mail allow messages and files to be uploaded
to private electronic (e-mail) mailbox addresses and downloaded later by the recipient.
You pay for a local telephone call and the length of time you are connected to the
service (usually at a reasonable hourly rate or flat fee). If you are working with people
in various time zones (an artist in New York, a programmer in San Francisco, and a
client in Singapore), all can communicate and share information with other locations
at any time of day or night.
If you are operating a cross-platform multimedia development shop, you should install
an Ethernet system so your PCs can talk with your Macintoshes. This is many times
more efficient than carrying removable media among your machines. Macintoshes
have Ethernet networking built in; your PCs will require Ethernet cards. Ethernet is
only a method for wiring up computers, so you still will need client/server software to
enable the computers to speak with each other and pass files back and forth. Here you
have two options: you can add software to your Macintosh to allow it to connect to a

network of Windows PCs that use the Microsoft Client TCP/IP protocols, or you can
add software to your Windows PC that allows it to connect to a network of
Macintoshes that uses AppleTalk. Both require Ethernet as the connection method.
With DAVE from Thursby Systems, software is installed on the Macintosh to enable
it to connect to the Microsoft TCP/IP network. Users can then mount
shared Macintosh hard drives and PostScript printers that are connected to the
Macintosh's AppleTalk network, so they can be used by the PCs in the network.
DAVE uses the industry-standard TCP/IP protocol with a NetBIOS driver. To connect
a PC to a network of Macintoshes, you can use MACLAN Connect from Miramar
(http://www.miramarsys.com) to share the directories and files on all your computers
using AppleTalk protocols – you do not have to install a dedicated server workstation.

Check Your Progress 1


1. What are the various stages in multimedia development lifecycle?
……………………………………………………………………………….
……………………………………………………………………………….
2. How would you select a production platform for multimedia projects?
……………………………………………………………………………….
……………………………………………………………………………….

2.6 BASIC SOFTWARE TOOLS


In this section, we will discuss various tools used in the field of multimedia. The basic
toolset for building multimedia projects contains one or more authoring systems and
various applications for text, image, sound and motion video editing. A few
additional applications are useful for capturing images from the screen, changing file
formats and moving files among computers when you are part of a team. These are
basically tools for the housekeeping tasks that improve your creativity and productivity.
The software in your multimedia toolkit, and your skill at using it, will
determine the kind of multimedia work you can do and how fine and fancy you can
render it. Keep your tools sharp by upgrading them when new software and features
become available, by thoroughly studying and learning each tool, by reading tips and
tricks in the computer magazines and FAQs (Frequently Asked Questions) files
on-line and in Internet newsgroups and by observing the practices and products of
other multimedia developers.

2.6.1 Types of Basic Tools


Various types of basic tools for creating and editing multimedia elements are:
z Painting and Drawing tools
z Image editing tools
z OCR software
z 3D Modeling and Animation tools
z Sound editing programs
z Animation, Video and Digital movies

Painting and Drawing Tools


Painting software is dedicated to producing crafted bitmapped images. Drawing
software like CorelDRAW and Canvas is dedicated to producing vector-based line art,
easily printed to paper using PostScript or another page markup system such as
QuickDraw.

Main features/criteria for selection are:
z Intuitive graphical interface with pull down menus, status bars, palette control and
dialog boxes for quick logical selection.
z Scalable dimensions for resizing, stretching and distorting.
z Paint tools to create geometric shapes.
z Ability to pour a colour, pattern or gradient.
z Ability to paint with patterns and clip arts.
z Customisable pen and brush shape and sizes.
z Eyedropper tools for colour sampling.
z Auto trace tool for converting bitmapped images into vector based outlines.
z Multiple undo capabilities.
z Support for scalable text fonts.
z Painting features with anti-aliasing, air brushing, color washing, blending,
masking etc.
z Support for third party special effects.
z Object and layering capabilities.
z Zooming for magnified pixel editing.
z All common colour depths.
z Good file importing and exporting capabilities.

Image Editing Tools


These are specialised and powerful tools for enhancing and re-touching existing
bitmapped images. These applications also provide many of the features and tools of
the painting and drawing programs, and can be used for creating images from scratch
as well as images digitised from scanners, video frame grabbers, digital cameras, clip
art files or original artwork files created with a drawing package.
Features of image editing applications are:
z Conversion of major image data types and industry standard file formats.
z Direct input from scanners etc.
z Employment of virtual memory scheme.
z Multiple window scheme.
z Image and balance control for brightness, contrast etc.
z Masking features.
z Multiple undo and restore features.
z Anti-aliasing, sharpening and smoothing controls.
z Colour mapping controls.
z Geometric transformations.
z All colour palettes.
z Support for third party special effects plug-ins.
z Ability to design in layers that can be combined, hidden and recorded.

Optical Character Recognition Software (OCR)
Often, you will have printed matter and other text to incorporate into your project, but
no electronic text file. With OCR software, a flatbed scanner and your computer, you
can save many hours of rekeying printed words and get the job done faster and more
accurately than a roomful of typists.
OCR software turns bitmapped characters into electronically recognizable ASCII text.
A scanner is typically used to create the bitmap. Then, the software breaks the bitmap
into chunks according to whether it contains text or graphics, by examining the texture
and density of areas of the bitmap and by detecting edges. The text areas of the bitmap
are then converted to ASCII characters using probability and expert system algorithms.
Most OCR applications claim 99 percent accuracy when reading 8 to 36 point
characters at 300 dpi and can reach processing speeds of about 150 characters per
second.
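As a modern, hedged illustration of the bitmap-to-text conversion described above, the short Python sketch below uses the third-party Pillow and pytesseract packages (a wrapper around the Tesseract OCR engine); these tools and the file name are assumptions for illustration only, since the text itself does not prescribe any particular product:

# OCR sketch: convert a scanned bitmap into plain text.
# Assumes the third-party Pillow and pytesseract packages are installed
# and that "scanned_page.png" (an illustrative file name) exists.
from PIL import Image
import pytesseract

page = Image.open("scanned_page.png")       # bitmap from a flatbed scanner
text = pytesseract.image_to_string(page)    # segmentation and recognition
print(text)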

3D Modeling and Animation Tools


With 3D modeling software, objects rendered in perspective appear more realistic.
One can create stunning scenes and wander through them, choosing just the right
lighting and perspective for the final rendered image. Powerful modeling packages
such as Macromedia's Extreme 3D, Autodesk's 3D Studio Max, StrataVision 3D,
Specular's LogoMotion and Infini-D, and Caligari's trueSpace are also bundled with
assortments of pre-rendered 3D clip art objects such as people, furniture, buildings,
cars, aeroplanes, trees and plants.
Features of a good 3D modeling software are:
z Multiple windows that allow you to view your model in each dimension.
z Ability to drag and drop primitive shapes into a scene.
z Create and sculpt organic objects from scratch.
z Lathe and extrude features.
z Colour and texture mapping.
z Ability to add realistic effects such as transparency, shadowing and fog.
z Ability to add spot, local and global lights, to place them anywhere and
manipulate them for special effects.
z Unlimited cameras with focal length control.

Sound Editing Programs


Sound editing tools for both digitised and MIDI sound let you see music as well as
hear it. By drawing a representation of a sound in fine increments, whether a score or
a waveform, you can cut, copy, paste and otherwise edit segments of it with great
precision – something impossible to do in real time with music playing.
System sounds are beeps used to indicate an error, warning or special user activity.
Using sound editing software, you can make your own sound effects and install them
as system beeps.
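To make the idea of editing a digitised sound 'in fine increments' concrete, here is a minimal sketch using only Python's standard wave module; it cuts a three-second segment out of an uncompressed WAV file, the non-interactive equivalent of selecting a region in a sound editor (file names are illustrative):

# Cut the segment from 2 s to 5 s out of an uncompressed WAV file.
import wave

with wave.open("speech.wav", "rb") as src:
    params = src.getparams()
    rate = src.getframerate()
    frames = src.readframes(src.getnframes())

bytes_per_frame = params.sampwidth * params.nchannels
start = 2 * rate * bytes_per_frame    # selection start, in bytes
end = 5 * rate * bytes_per_frame      # selection end, in bytes

with wave.open("clip.wav", "wb") as dst:
    dst.setparams(params)
    dst.writeframes(frames[start:end])  # write only the selected region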

Animation, Video and Digital Movies


Animations and digital video movies are sequences of bitmapped graphic scenes or
frames, rapidly played back. But animations can also be made within the authoring
system by rapidly changing the location of an object to generate an appearance of
motion. Most authoring tools adopt either a frame- or object-oriented approach to
animation, but rarely both.

Movie making tools take advantage of QuickTime and Microsoft Video for Windows
also known as AVI or Audio Video Interleaved technology and let you create, edit and
present digitised video motion segments usually in a small window in your project.
To make movies from video you need special hardware to convert the analog video
signal to digital data. Movie making tools such as Premiere, VideoShop and Media
Studio Pro let you edit and assemble video clips captured from camera, tape and other
digitised movie segments, animations, scanned images, and from digitised audio and
MIDI files. The completed clip, usually with added transitions and visual effects, can
then be played back either stand-alone or windowed within your project.
Morphing is an animation technique that allows you to dynamically blend two still
images, creating a sequence of in-between pictures that, when played back rapidly in
QuickTime, metamorphoses the first image into the second. For example, a racing car
transforms into a tiger, and a daughter's face becomes her mother's.
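The blending half of morphing can be sketched in a few lines. The following Python fragment uses the third-party Pillow library (an assumption; the chapter names no tool) to produce a cross-dissolve between two stills. A true morph would additionally warp geometry between matched feature points, which is omitted here; file names are illustrative:

# Cross-dissolve: generate in-between frames from two equal-sized stills.
from PIL import Image

first = Image.open("racing_car.png").convert("RGB")
second = Image.open("tiger.png").convert("RGB").resize(first.size)

frames = 10
for i in range(frames + 1):
    alpha = i / frames                 # 0.0 = first image, 1.0 = second
    Image.blend(first, second, alpha).save(f"morph_{i:02d}.png")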

Accessories
A Screen Grabber is an essential accessory. Bitmap images are so common in
multimedia, that it is important to have a tool for grabbing all or part of the screen
display so that you can import it into your authoring system or copy it into an image
editing application. Screen grabbing to the clipboard lets you move a bitmapped
image from one application to another without the cumbersome steps of exporting the
image to a file and then importing it back to the destination. Another useful accessory
is a Format Converter, which is also indispensable for projects in which your source
material may originate on Macintoshes, PCs, Unix workstations or even mainframes.
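A screen grabber reduces to a one-line call in many environments. As a hedged example, Pillow's ImageGrab module (available on Windows and macOS; an assumption, since the text names no specific grabber) captures the display to a bitmap that can then be saved or handed to an image editing application:

# Grab the whole screen and save it as a bitmap image.
# Pass bbox=(left, top, right, bottom) to capture only a region.
from PIL import ImageGrab

screen = ImageGrab.grab()
screen.save("screen_capture.png")   # illustrative file name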

Check Your Progress 2


1. Discuss the role of animation and digital video movies in multimedia
applications.
……………………………………………………………………………….
……………………………………………………………………………….
2. What are Image Editing Tools?
……………………………………………………………………………….
……………………………………………………………………………….
3. What is the role of Optical Character Recognition Software (OCR) in
multimedia projects?
……………………………………………………………………………….
……………………………………………………………………………….
4. What are 3D Modeling and Animation Tools?
……………………………………………………………………………….
……………………………………………………………………………….

2.7 LET US SUM UP


In this lesson, we have tried to understand the Multimedia Development Lifecycle.
The stages in the cycle are: Concept Development, Requirements Specification,
Design, Prototype, Production, Testing and Evaluation, and Delivery and Support.
Another section deals with the selection of the production platform, such as the
Macintosh and Windows production platforms. The lesson also throws light on the
basic software tools used in multimedia. The basic types of tools are: Painting and
Drawing Tools, Image Editing Tools, Optical Character Recognition (OCR) Software,
3D Modeling and Animation Tools, Animation, Video and Digital Movie Tools, and
Accessories.

2.8 LESSON END ACTIVITY
To judge the multimedia awareness, discuss the following in groups:
z What is design?
z What does a designer do?
z What factors does a designer need to consider when developing a new product?
z Is fashion design different from engineering design?
z What differentiates good design from bad?
z What does a multimedia designer design?
z Multimedia designers: are they artists or software engineers? What is the
difference?
z How can we promote good design when designing interactive systems?

2.9 KEYWORDS
Screen Grabber: It is an essential accessory. Bitmap images are so common in
multimedia that it is important to have a tool for grabbing all or part of the screen
display, so that one can import it into an authoring system or copy it into an image
editing application.
Image Editing Tools: These are specialized tools for enhancing and re-touching
existing bitmapped images. These applications also provide many of the features and
tools of the painting and drawing programs and can be used for creating images from
scratch.
LANs and WANs: Local area networks (LANs) and wide area networks (WANs) can
connect the members of a workgroup. In a LAN, workstations are usually located
within a short distance of one another, on the same floor of a building. WANs are
communication systems spanning great distances, typically set up and managed by
large corporations and institutions for their own use, or to share with other users.

2.10 QUESTIONS FOR DISCUSSION


1. Discuss the suitability of Macintosh and Windows production platforms.
2. When would it be more appropriate to follow a linear development cycle, and
when a highly iterative one?
3. How do you find out a customer's preferred development lifecycle?

Check Your Progress: Model Answers


CYP 1
1. The exact approach taken varies from organisation to organisation and
depends on the complexity of the project, but in general all projects go
through the following stages or phases, often repeating stages:
™ Concept development
™ Requirements specification
™ Design
™ Prototype
™ Production
™ Testing and evaluation
™ Delivery
These development stages generally overlap and involve iteration.
2. The selection of the proper platform for the development of multimedia
project may be based on user’s personal preference of computer, budget,
project delivery requirements, and the type of material and content in the
project. Many developers believe that multimedia project development is
smoother and easier on the Macintosh than in Windows, even though
projects destined to run in Windows must then be ported across platforms.
But hardware and authoring software tools for Windows have improved:
today one can produce many multimedia projects with equal ease in either
the Windows or Macintosh environment.
CYP 2
1. Animations and digital video movies are sequences of bitmapped graphic
scenes or frames, rapidly played back. But animations can also be made
within the authoring system by rapidly changing the location of the object
to generate an appearance of motion. Most authoring tools adapt either a
frame or object oriented approach to animation but rarely both.
2. These are specialized and powerful tools for enhancing and re-touching
existing bitmapped images. These applications also provide many of the
features and tools of the painting and drawing programs and can be used
for creating images from scratch as well as images digitised from scanners,
video frame grabbers, digital camera, clip art files or original art work files
created with a drawing package.
3. With OCR software, a flat bed scanner and computer we can save many
hours of rekeying printed words and get the job done faster and more
accurately than a roomful of typists. OCR software turns bitmapped
characters into electronically recognizable ASCII text. A scanner is
typically used to create the bitmap. Then, the software breaks the bitmap
into chunks according to whether it contains text or graphics by examining
the texture and density of areas of the bitmap and by detecting edges. The
text areas of the bitmap are then converted to ASCII characters using
probability and expert system algorithms. Most OCR applications claim 99
percent accuracy when reading 8 to 36 point characters at 300 dpi and can
reach processing speeds of about 150 characters per second.
4. With 3D modeling software, objects rendered in perspective appear more
realistic. One can create stunning scenes and wander through them,
choosing just the right lighting and perspective for your final rendered
image. Powerful modeling packages such as Macromedia's Extreme 3D,
Autodesk's 3D Studio Max, StrataVision 3D, Specular's LogoMotion and
Infini-D, and Caligari's trueSpace are also bundled with assortments of
pre-rendered 3D clip art objects such as people, furniture, buildings, cars,
aeroplanes, trees and plants.

2.11 SUGGESTED READINGS
Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
England, E. & Finney, A., (2002) Managing Multimedia Book 1 – People and Processes,
3rd Ed., Addison Wesley, Wokingham UK.
Smithson, S. and Hirschheim, R.A., (1998). "Analysing information systems evaluation:
another look at an old problem," European Journal of Information Systems, Vol. 7, No. 3.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw-Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.


UNIT II

LESSON

3

MAKING INSTANT MULTIMEDIA

CONTENTS
3.0 Aims and Objectives
3.1 Introduction
3.2 Making and Linking Multimedia Objects
3.2.1 AppleEvents
3.2.2 Dynamic Data Exchange (DDE) and Object Linking and Embedding (OLE)
3.3 Office Suites
3.3.1 MS Office
3.4 Word Processors
3.4.1 MS Word
3.4.2 WordPerfect
3.4.3 WordPro
3.5 Spreadsheets
3.5.1 Lotus 1-2-3
3.5.2 MS Excel
3.6 Databases
3.6.1 FileMaker Pro
3.6.2 MS Access
3.7 MS PowerPoint
3.8 Multimedia Authoring Tools
3.8.1 Authoring Tools versus Programming Tools
3.8.2 Types of Authoring Tools
3.9 Cross-Platform Authoring Notes
3.10 Choice of the Right Tool for the Job
3.11 Multimedia Tool Features
3.12 Let us Sum up
3.13 Lesson End Activity
3.14 Keywords
3.15 Questions for Discussion
3.16 Suggested Readings

3.0 AIMS AND OBJECTIVES
After studying this lesson, you would be able to:
z Comprehend the process of making instant multimedia
z Understand various packages and languages used in multimedia
z Understand different types of authoring tools
z Choose the appropriate tools in multimedia development
z Make a choice in selecting appropriate language in multimedia development

3.1 INTRODUCTION
There is no reason to buy a dedicated multimedia authoring package if your current
software can serve the purpose. Indeed, not only can you save money by doing
multimedia with tools that are familiar and already at hand, but you also save the time
spent on arduous and sometimes lengthy learning curves involved in mastering many
dedicated authoring systems. Common desktop presentation tools have become
multimedia powerful, while dedicated multimedia authoring systems are offering
simplified, easy-to-use versions.
Most personal computers sold today are able to produce at least the sound and
animation elements of multimedia. Manufacturers of popular software for word
processing, spreadsheets, database management, graphing, drawing, and presentation
have added capabilities for sound, image, and animation to their products. You can
call a voice annotation, picture, or QuickTime or AVI movie from most word
processing applications. You can click a cell in a spreadsheet to enhance its content
with graphic images, sounds, and animations. Your database can include pictures,
audio clips and movies. Your presentation software can easily generate interesting
titles, visual effects, and animated illustrations for your product demo. With these
multimedia-enhanced software packages, you get many more ways to effectively
convey your message than just a slide show.
To enliven your material and provide interesting illustrations, you can add multimedia
elements to familiar tools such as word-processed documents, spreadsheets,
presentation aids, and even HTML documents. But where do you get these elements?
You can either make your images, sounds, and animations from scratch or you can
import them from collections of clip media. You can also license rights to use
resources or content, such as pictures, songs and music, and video, from their owners.
Importing from stock material limits you somewhat, but it may be all you need, and
these collections can yield quick and simple multimedia productions. If you make
your multimedia elements from scratch or edit existing material, you'll need to have
special software and hardware tools to customize the images, sounds, and animations,
but the results are more spectacular and dramatic. You would need special multimedia
tools for digitizing your sounds and creating animations and movies before you can
attach these objects to your word, data, or presentation documents.
Some multimedia projects may be so simple that you can cram all the organizing,
planning, rendering, and testing stages into a single effort, making instant multimedia.
Here is an example: The topic at your weekly sales meeting is sales force
performance. You want to display your usual spreadsheet so the group can see real
names and numbers for each member of the team; then you want to show a
multicolored 3D bar graph for visual impact. Preparing for your meeting, you annotate
the cell containing the name of the most productive salesperson for the week, using
sounds of applause taken from a public domain CD-ROM, or a recording of your CEO
saying ‘Good job!’ or a colleague's ‘Wait till next week!’ At the appropriate time
during the meeting, you click that cell and play the annotation. And that's it, you have
just made and used instant multimedia. The following overviews do not include all
products in each category of software tools, but they will give you an idea of how
multimedia might be applied in your everyday life working with computers.

3.2 MAKING AND LINKING MULTIMEDIA OBJECTS


The elements of multimedia (and other digitized information) are often treated as
discrete objects that have particular characteristics or properties. With objects
described in a common format using object-oriented programming systems (OOPs),
text, bitmapped images, sounds, and video clips can be dynamically linked among
applications and documents or even embedded in them. This object-oriented approach
to information management is supported on both Macintosh and Windows platforms
and is utilized at the core of some multimedia authoring systems.

3.2.1 AppleEvents
On the Macintosh, AppleEvents lets applications communicate with each other,
sharing data and commands. Inter-Application Communication (IAC) works with
AppleEvents to automatically update documents that are linked with the 'publish-and-
subscribe' features of AppleEvents.
When you publish data and then edit it, the changes you make are
copied to all of the subscribers to that data, even across a network. Publish-and-
subscribe uses a transition file called the edition file. You can subscribe to a
spreadsheet table in a word processing document, for example, and when you change
the spreadsheet, the word processing document gets changed automatically. Or you
can embed a PICT image or QuickTime file in one application and change it in
another, and the changes will appear in both applications – the two applications talk
directly to each other.
To use publish-and-subscribe, following are the steps:
1. Select data that you want to place into another application or document.
2. From the Edit menu, choose Create Publisher. This brings up a dialog box asking
you to name the edition file that will connect the publisher to the other documents
subscribing to the data.
3. After you have created the edition file, go to the document or application where
you want to use the data and select Subscribe To... from the Edit menu.
Now you have placed a live copy of the data in your document; whenever you modify
the original publisher data, the subscriber is automatically updated, too.
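The behaviour of publish-and-subscribe can be modelled in a few lines of code. The Python sketch below is purely conceptual: the Edition class stands in for the edition file, and the refresh calls stand in for the operating-system service that updates subscribers; none of these names is a real Apple API.

# Conceptual model of publish-and-subscribe.
class Edition:
    def __init__(self, data):
        self.data = data
        self.subscribers = []

    def subscribe(self, document):
        self.subscribers.append(document)
        document.refresh(self.data)          # subscriber gets a live copy

    def publish(self, new_data):
        self.data = new_data
        for document in self.subscribers:    # all live copies update
            document.refresh(new_data)

class Document:
    def __init__(self, name):
        self.name = name

    def refresh(self, data):
        print(f"{self.name} now shows: {data}")

edition = Edition([("Q1", 100), ("Q2", 120)])        # a spreadsheet table
edition.subscribe(Document("word-processing document"))
edition.publish([("Q1", 100), ("Q2", 150)])          # edit the spreadsheet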

3.2.2 Dynamic Data Exchange (DDE) and Object Linking and Embedding
(OLE)
Dynamic Data Exchange (DDE) and Object Linking and Embedding (OLE) are two
methods for linking data objects among Windows applications. For example, let's say
you want to advertise your new mousetrap design with a flashy graphic – an illustration
showing your mousetrap compared to other mousetraps on the market – and some text
describing its extraordinary features. First, you make a colorful picture in a graphics
application such as Micrografx Designer, then you create a bar chart comparing the
number of mice in a spreadsheet program such as Excel, and finally, you paste all
your elements into a word processor such as Microsoft Word.
When two applications share data through DDE, they are in a conversation. DDE
allows data to be transmitted between a client (the application that initiates the
conversation) and a server (the application responding to the client). Data can be
transmitted as a hot link so that modifications in the server application are also

updated in the client application, or as a cold link so that data in the client application
remain independent of the server after it has been imported.
OLE lets you embed or link data objects created in different Windows or Macintosh
applications. An embedded object becomes a part of the file into which it is pasted,
independent of the original application where it was created. A linked object, on the
other hand, is changed automatically in a container file that points to the original file
when the original file is updated. Linking is a useful feature for data that may be
modified after it has been placed into other files. While using OLE, make sure that
your linked files are not moved to other directories, or the links may be broken.
OLE 2.0 (and later versions) have improved the ability to track links between
containers and objects, but the best way to ensure that your object is not lost is to
embed it in a file.
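On a modern Windows system, the linking-versus-embedding distinction can be demonstrated through COM automation. The sketch below uses the third-party pywin32 package and Word's AddOLEObject call to place a linked Excel object in a document; the package choice and file paths are assumptions for illustration, and the chapter itself does not prescribe this approach:

# Place a linked OLE object (an Excel workbook) inside a Word document.
# Requires Microsoft Word and the third-party pywin32 package on Windows.
# File paths are illustrative.
import win32com.client

word = win32com.client.gencache.EnsureDispatch("Word.Application")
doc = word.Documents.Add()

doc.InlineShapes.AddOLEObject(
    FileName=r"C:\data\mousetrap_sales.xlsx",
    LinkToFile=True,   # True = linked object; False = embedded copy
)

doc.SaveAs(r"C:\data\flyer.docx")
word.Quit()

With LinkToFile=True the Word document points at the original workbook and picks up later changes, which is exactly why moving the linked file can break the link, as noted above.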

3.3 OFFICE SUITES


Office suites integrate into a single package the various productivity tools essential to
running a business. Suites offer the convenience of a common interface with similar
menus, commands, and toolbars, and they also allow you to share data among the
applications in that suite using OLE and DDE. Suites also provide compatibility
across the Macintosh and Windows platforms. Microsoft Office, for example, contains
Word for word processing; Excel, a spreadsheet program; PowerPoint for creating
presentations; and, for Windows, a database manager (Access) and a scheduler. Each
program is a stand-alone application, but all are interlinked through OLE and DDE.
Claris's ClarisWorks for Macintosh and Windows provides a word processor,
spreadsheet program, database manager, presentation tool, drawing/painting tool, and
communications software in one application. ClarisWorks offers seamless cross-platform
file sharing between Macintosh and Windows platforms.

3.3.1 MS Office
MS Office is the most efficient suite of applications for document creation,
communication and business information analysis. For many functions, the business
platform has evolved from paper to the Web. Microsoft Office extends desktop
productivity to the web, streamlining the way you work and making it easier to share,
access and analyze information so you get better results. Office offers a multitude of
new features. Of particular importance for this release are the features that affect the
entire suite. These Office-wide, or shared features hold the key to the new realm of
functionality enabled by Office. Office offers a new Web-productivity work style that
integrates core productivity tools with the Web to streamline the process of sharing
information and working with others. It makes it easier to use an organization's
intranet to access vital business information and provides innovative analysis tools
that help users make better, timelier business decisions. Office delivers new levels of
resiliency and intelligence, enabling users and organizations to get up and running
quickly, stay working and achieve great results with fewer resources. The components
of MS Office are as follows:
z MS Word
z MS Excel
z MS PowerPoint
z MS Access

3.4 WORD PROCESSORS

Many word-processed documents are ultimately printed to paper, but many are also
delivered on a server or floppy disk, to an electronic mailbox, or as HTML documents
on the World Wide Web. If others will be viewing your document on a computer,
consider attaching multimedia voice notes, pictures, or animated illustrations to
emphasize your point or to clarify something that is difficult to express in words.

3.4.1 MS Word
MS Word is a powerful word processor that allows you to create:
z Memos
z Fax coversheets
z Web pages
z Reports
z Mailing labels
z Brochures
z Tables, and
z Many other professional and business applications.
Microsoft's Word for Macintosh and Windows provides essentially the same user
interface on both platforms and offers special multimedia features. You can make and
import various image formats, including PICT, TIFF, BMP, and EPS, to place them in
your document. You can add QuickTime movies to your document; control the
movie's playback characteristics (forward, backward, start, and stop); and perform
simple editing with cut, copy, and paste commands. In the Windows version, AVI
movies can also be played within your Word document. You can import digitized
sounds, and you can record voice comments from an internal microphone, saving the
recording (with a portion of the text or an icon as an identifier) for playback.
Annotations can be searched for sound effects and content, edited, and even saved as
separate files in any of four formats. With Word for Windows, you can also create
links to other programs using OLE.
MS Word provides easy graphics handling, calculation within data tables, the ability to
create mailing lists, list sorting and efficient file management.

Major Features of Word

Major features of Word are as follows:
1. Auto Summarize Feature: This feature automatically summarizes the key points
in the document. Word determines the most important sentences and gives a
custom summary based on the analysis.
2. Auto Complete Feature: This feature automatically offers suggestions
to complete the word or phrase that has been typed partially. To accept a
suggestion, press the Enter key and Word automatically replaces the partially
typed word with the complete word. Word automatically completes the current
date, a day of the week, a month other than the current one, your name, the
company name and AutoText entries.
3. Automatic Grammar Checking: It marks the incorrect grammar with a green
wavy line as you type.

4. Letter Wizard: This helps you format and enter the key information for the letters
to ensure that they are consistent and professional. It lets you write letters quickly
and easily.
5. Office Assistant: MS Office uses IntelliSense natural-language technology.
The Assistant anticipates the kind of help you require and suggests Help topics
relevant to the work that you are doing. The Office Assistant also provides visual
examples and step-by-step instructions for specific tasks.
6. Smart Spelling Features:
™ Recognizes your name, your organization's name and professional names
of varying ethnicity.
™ Recognizes your writing pattern and does not mark such patterns as errors in
the document.
™ Ignores Internet and file addresses as spelling errors.
7. Natural Language Grammar Checker: This provides improved syntactical
analysis, better rewrite suggestions and user-friendly grammar styles.
8. Spelling and Grammar Checking Combination Feature: This feature eliminates
the separate dialog boxes and provides an interface that lets you proofread the
document online.
9. Hyperlinks Feature: This links to Microsoft Outlook, HTML or other
files on any internal or external Web site or file server.

3.4.2 WordPerfect
In Corel's WordPerfect Macintosh version, a tool palette and drawing commands
allow you to create and edit graphics with the standard Macintosh drawing tools, as
well as create Bezier curves and polygons; there's also a free rotation tool. A color
editor lets you blend, rainbow, and complement colors. You can edit, size, scale, and
crop graphic images, and then click and drag them anywhere in your document while
text automatically reformats around them.
WordPerfect for Macintosh offers a QuickTime movie-playing facility. The movie is
represented by its poster, usually the first frame of the movie. You can represent your
movie as a character, anchor it to a page or paragraph, move it, add a caption, or put a
frame around it, just as you can with graphics. There is a movie controller that gives
you many options, such as custom playback or changing the poster.
Using DDE, WordPerfect for Windows can share data with other DDE-compatible
programs that use DDE links. If the data change in a linked program, they are
automatically updated in the WordPerfect document. A figure editor makes it easy
add graphics to your documents. You can view, retrieve, create, modify, and size
figures, and save or import them into your document. WordPerfect for Windows
works with the common graphic formats for DOS, as well as Windows metafiles and
bitmaps.

3.4.3 WordPro
With its Windows DDE and OLE capabilities, Word Pro (formerly Ami Pro) from
Lotus can link to other applications and embed objects, such as sounds and AVI
movies. Using DDE, you can paste a link in Windows bitmap or metafile format into
an empty selected frame. You can even create a macro to control another application
through DDE. With OLE, you can link or embed objects into a frame in a Word Pro
document.

3.5 SPREADSHEETS

Spreadsheets have become the backbone of many users' information management
systems. A spreadsheet organizes its data in columns and rows. Calculations are made
based on user-defined formulas for, say, analyzing the survival rates of seedlings, or
the production of glass bottles in Russia, or a household's consumption of energy in
ergs per capita. Spreadsheets can answer what-if questions, build complex graphs and
charts, and calculate a bottom line. From Alaska to Zimbabwe, spreadsheets have
become a ubiquitous computer tool.
Most spreadsheet applications provide excellent chart-making routines; some allow
you to build a series of several charts into an animation or movie, so you can
dramatically show change over time or under varying conditions. Full-color curves
that demonstrate changing annual sales, robbery and assault statistics, or birth rates
may have a far greater effect on an audience than will a column of numbers. The latest
spreadsheets let you attach special notes and drawings, including full multimedia
display of sounds, pictures, animations, and video clips.

3.5.1 Lotus 1-2-3


Lotus 1-2-3 lets you rearrange graph elements by clicking and dragging and using a
menu to access data objects from the outside world. You can place bitmapped pictures
and other objects such as QuickTime movies anywhere in your spreadsheet. There is a
complete color drawing package for placing lines, circles, arrows, and special text on
top of the spreadsheet to help illustrate its content.

3.5.2 MS Excel
MS Excel is a spreadsheet package. When you start Excel, a blank workbook appears
in the document window. The workbook is the main document Excel uses for storing
and manipulating data. A workbook holds individual worksheets, each consisting of
data. Each worksheet is made up of 256 columns and 65,536 rows. Using a special
template document, you can create a slide show with Microsoft Excel (in both the
Macintosh and Windows versions) to present worksheets, charts, and graphics. You
can apply video and audio transition effects between slides, adjusting speed and the
method of slide advance. The SLIDES.XLA file must be installed in the Windows
version, and the Slideshow Add-In file for the Macintosh. QuickTime and AVI
movies can be linked to Microsoft Excel documents.
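As a sketch of the workbook, worksheet and cell model described above, the following uses the third-party openpyxl library (an assumption on our part; the text itself describes only Excel's own interface):

```python
# Sketch of Excel's workbook/worksheet/cell model, using the third-party
# openpyxl library (assumed installed, e.g. via pip install openpyxl).
from openpyxl import Workbook

wb = Workbook()            # the workbook is the main document
ws = wb.active             # it holds individual worksheets
ws.title = "Sales"

ws["A1"], ws["B1"] = "Quarter", "Units"
for row, units in enumerate([120, 95, 143, 160], start=2):
    ws.cell(row=row, column=1, value=f"Q{row - 1}")
    ws.cell(row=row, column=2, value=units)

ws["B6"] = "=SUM(B2:B5)"   # a user-defined formula, evaluated by Excel
wb.save("sales.xlsx")
```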

Features of Excel
(a) The multiple Undo feature can Undo up to the last 16 actions.
(b) When you quit Microsoft Excel with multiple files open, you get a YES to ALL
option. You can choose this option to save all the files before exiting, instead of
being prompted to close each open file.
(c) Conditional Formats dynamically apply a different font style, pattern and border
to cells whose values fall outside or within the limits you specify. This lets you
quickly spot areas of interest without reading through tables of values.
(d) The Hyperlinks feature helps you create hyperlinks that connect to other Office
files on your system or network. A hyperlink can be text in a cell or a graphic, or
you can write a formula that creates a hyperlink (see the sketch after this list).
(e) The Web Queries feature allows you to create and run the queries to retrieve data
available on the World Wide Web.

(f) The Internet Assistant Wizard steps you through the process of saving the
worksheet data and the charts in the HTML format. You can save the data and the
chart as a complete new Web Page or add them to an existing Web Page.
(g) The new Share Workbook feature lets multiple users open a workbook on the
network and edit the document simultaneously.
(h) CellTips and ScrollTips automatically display the comments added to a cell.
(i) The Worksheet has expanded to include 65,536 rows and you can type up to
32,000 characters in a cell.
(j) Natural Language formulas allow you to create formulas that use row and
column headers instead of range references.
(k) The Auditing and Validation facility allows you to circle invalid data and to see
at a glance all the entries that don't meet your validation rules.
(l) The enhanced Get External Data features enable you to query Access and other
databases on the system, on a network, or on Internet or intranet resources.
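The formula-created hyperlink mentioned in point (d) can be illustrated with Excel's built-in HYPERLINK(link_location, friendly_name) worksheet function; the URL and file name in this sketch are made up for illustration:

```python
# Sketch of point (d): a formula that creates a hyperlink, using Excel's
# built-in HYPERLINK worksheet function. URL and file name illustrative.
from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws["A1"] = '=HYPERLINK("http://www.example.com", "Visit our site")'
wb.save("links.xlsx")
```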

3.6 DATABASES
A database program can store, sort, retrieve, and organize many types of information.
Like spreadsheets, databases can exist in a digital environment without ever needing
to be printed to paper. Images, sounds, and movies are treated as objects and can be
stored, retrieved, and played by many databases. In the coming years, it is likely that
multimedia databases will become a primary method by which corporate users interact
with multimedia elements.
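As a minimal sketch of these ideas, the following uses Python's standard sqlite3 module to store, sort and retrieve records, including a media clip held as a binary object; the table layout and the data are illustrative assumptions:

```python
# Sketch of a database that stores, sorts and retrieves records,
# including media kept as binary objects. Data is illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE clips (title TEXT, kind TEXT, data BLOB)")
con.execute("INSERT INTO clips VALUES (?, ?, ?)",
            ("Intro jingle", "sound", b"RIFF...fake wav bytes"))
con.execute("INSERT INTO clips VALUES (?, ?, ?)",
            ("Logo", "image", b"\x89PNG...fake png bytes"))

# Retrieve and sort without ever printing to paper
for title, kind, data in con.execute(
        "SELECT title, kind, data FROM clips ORDER BY title"):
    print(title, kind, len(data), "bytes")
```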

3.6.1 FileMaker Pro


Claris's FileMaker Pro, a relational database, rates high in ease of use and cross-
platform capability with both Windows and Macintosh PowerPC versions. FileMaker
Pro has a relatively simple interface, yet it is powerful enough to handle moderately
complex operations through scripting capabilities. You can use the built-in graphics
tools and record sound within the application, or you can import images, sounds, and
QuickTime movies from other applications. Although you can design layouts from
scratch, there are now 40 customizable templates from which to choose for business,
education, or home use. FileMaker Pro supports AppleEvents in the Macintosh
version and OLE/DDE in the Windows version.

3.6.2 MS Access
MS Access is the relational database application in the Microsoft Office Professional.
With Access, you can perform the following tasks:
Organize data into manageable related units.
Enter, modify and locate data.
Extract subsets of data based on the specific criteria.
Create custom forms and reports.
Automate common database tasks.
Graph data relationships.
Microsoft Access (Windows only) is a relational database application available on its
own or as part of the Microsoft Office Professional bundle of products. With a
relational database, you input and store data only once, but you can view data in
various ways. With Access, you can view the data in tables that show data from many

records at once, forms that show data from each individual record, and reports from
which to summarize and print data. The Database Wizard automatically builds tables,
queries, forms, and reports with common business and personal database templates,
including ones for asset management, order entry, and music collection tracking. It
even adds sample data to help you get started. Or, if you already have a flat-file list or
spreadsheet, use the Table Analyzer Wizard to import the data into an Access
database. Access supports OLE objects and allows you to import images into forms
and reports.

Major features of Access


Major features of Access are as follows:
The Publish to the Web Wizard converts your Access information to a dynamic
Internet or intranet site including query pages.
The Outlook Journal helps you to track when a database file was opened or closed, or
when an object was printed.
A new Hyperlink data type is supported to allow insertion of links to other objects,
documents, or Internet resources.
Improved design features include the ability to create forms with multiple tabs.
Lightweight Forms and Reports load without loading Visual Basic for Applications,
leading to faster performance.
User-Level Security Wizard creates a secured copy of the database.
The Visual Basic code for the objects has been updated with the methods, properties
and other language elements.
The multiple pages button allows you to select the number of pages to preview.
The Performance Analyzer analyses database objects and suggests ways to make
them faster.

3.7 MS POWERPOINT
MS PowerPoint is powerful presentation software used to create professional-quality
presentations. These can be reproduced on transparencies, paper, 35mm slides and
photo prints, or delivered as on-screen presentations. PowerPoint also allows the user
to publish presentations on the Internet easily.
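As a small sketch of building a presentation programmatically, the following uses the third-party python-pptx library (our assumption; PowerPoint itself is normally driven through its own interactive interface):

```python
# Sketch: creating a two-slide presentation with the third-party
# python-pptx library (assumed installed via pip install python-pptx).
from pptx import Presentation

prs = Presentation()

title = prs.slides.add_slide(prs.slide_layouts[0])    # title layout
title.shapes.title.text = "Multimedia and its Applications"
title.placeholders[1].text = "An on-screen presentation"

bullets = prs.slides.add_slide(prs.slide_layouts[1])  # title + content
bullets.shapes.title.text = "Building Blocks"
bullets.placeholders[1].text = "Text, graphics, sound, animation, video"

prs.save("demo.pptx")
```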

Features of PowerPoint
(a) PowerPoint Central: It connects you with resources such as templates, sounds
and animation clips on the CD-ROM and sites on the Internet.
(b) Slide Finder: Slide Finder allows previewing and insertion of slides from other
presentations.
(c) Quick Start Tutorial: This helps introduce the features of PowerPoint.
(d) Graphs: PowerPoint provides an improved charting module. The following are
the major features:
i. Additional Chart Types: MS PowerPoint offers new chart types such as
bubble, pie-of-pie and bar-of-pie. It also offers additional 3D and 2D chart
types such as cylinder, pyramid and cone.
ii. Chart Data Tables: Enhances the chart by adding explanatory details by
attaching the data table that contains the numbers represented
diagrammatically.

iii. Rotated Texts on the Chart Axes: To display all the necessary data
proportionately for easier viewing, the fonts can be scaled and the text rotated
along the chart axes.
iv. Picture, Text and Gradient Fills: To graphically represent data, you can fill
the chart elements such as the bars, areas and the surfaces with texture,
imported pictures or gradient fills.
(e) Multiple Undo Feature: This feature displays an Undo List on the standard tool
bar from which you can select the change you want to reverse.
(f) Active Web Service: This active Web service is used and shared by all Microsoft
Office programs to browse rich webs of presentations and documents on the
local computer, any server, an intranet or the Web.
(g) Built-in Buttons: PowerPoint has a set of built-in buttons for actions such as
Forward, Back, Home, Help, Information, Sound and Movie. Clicking on any of
these buttons can start another program.
(h) CD-Auto Play: CD-Audio tracks can be played during the presentation.
(i) AutoContent Wizard: The AutoContent Wizard guides the user in picking from a
set of pre-built templates. It also provides ideas and starter text for presentations.
(j) Summary slide: Summary slide is used to create a summary slide based on the
titles of the slides created.
(k) Office Art: This is a drawing tool shared by Microsoft Office programs and
provides:
i. AutoShapes - includes six new auto shapes.
ii. Bezier Curves - used to draw exact curves with point positions.
iii. Transparent Background - inserts a bitmap as part of the design of the slides.
(l) Multimedia Capabilities: Animation effects and Multimedia include:
i. Custom Animation - an easier way to define and preview animated effects.
ii. Voice narration - to add a presenter’s voice to a self-running presentation.
iii. Music tracks - to add background music and the sound effects to the
presentations.
iv. Animated templates - animation effects can be added to the slide master and
will be automatically added when the slides are created.

Check Your Progress 1


1. What is Microsoft Office?
……………………………………………………………………………….
……………………………………………………………………………….
2. What is MS Word? Elaborate the important features of MS Word.
……………………………………………………………………………….
……………………………………………………………………………….
3. What is MS Excel? Elaborate the important features of MS Excel.
……………………………………………………………………………….
……………………………………………………………………………….
4. Elaborate the important features of MS Access.
……………………………………………………………………………….
……………………………………………………………………………….

3.8 MULTIMEDIA AUTHORING TOOLS

Multimedia authoring tools provide the important framework you need for organizing
and editing the elements of your multimedia project, including graphics, sounds,
animations, and video clips. Authoring tools are used for designing interactivity and
the user interface, for presenting your project on screen, and for assembling
multimedia elements into a single, cohesive project.
Authoring software provides an integrated environment for binding together the
content and functions of your project. Authoring systems typically include the ability
to create, edit, and import specific types of data; assemble raw data into a playback
sequence or cue sheet; and provide a structured method or language for responding to
user input.
With multimedia authoring software, you can make:
Video productions
Animations
Games
Demo disks and interactive guided tours
Presentations
Interactive kiosk applications
Interactive training
Simulations, prototypes, and technical visualizations
The term authoring tools usually refers to computer software that helps multimedia
developers create products. Authoring tools are different from computer programming languages
in that they are supposed to reduce the amount of programming expertise required in
order to be productive. Some authoring tools use visual symbols and icons in
flowcharts to make programming easier. Others use a slide show environment.
Authoring tools help in the preparation of texts. Generally, they are facilities provided
in association with word processing, desktop publishing, and document management
systems to aid the author of documents. They typically include an on-line dictionary
and thesaurus, spell-checking, grammar-checking, style-checking, and facilities for
structuring, integrating and linking documents.
Also known generically as authorware, an authoring tool is a program that helps you
write hypertext or multimedia applications. Authoring tools usually enable you to
create a final application merely by linking together objects, such as a paragraph of
text, an illustration, or a song. By defining the objects’ relationships to each other, and
by sequencing them in an appropriate order, authors (those who use authoring tools)
can produce attractive and useful graphics applications.
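The link-and-sequence idea can be sketched in a few lines of Python; the element class and the play list below are illustrative assumptions, not the behaviour of any particular authoring product:

```python
# Sketch of the authoring idea: multimedia elements as objects, linked
# and sequenced into a play list (cue sheet). Purely illustrative.
from dataclasses import dataclass

@dataclass
class Element:
    kind: str        # "text", "image", "sound", ...
    name: str
    duration: float  # seconds on screen

    def play(self):
        print(f"Playing {self.kind} '{self.name}' for {self.duration}s")

cue_sheet = [
    Element("text", "Welcome paragraph", 5.0),
    Element("image", "Title illustration", 3.0),
    Element("sound", "Theme song", 8.0),
]

for element in cue_sheet:    # the playback sequence
    element.play()
```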

3.8.1 Authoring Tools versus Programming Tools


The distinction between authoring tools and programming tools is not clear-cut.
Typically, though, authoring tools require less technical knowledge to master and are
used exclusively for applications that present a mixture of textual, graphical, and
audio data.

3.8.2 Types of Authoring Tools


This lesson arranges the various authoring tools into groups based on the metaphor
used for sequencing or organizing multimedia elements and events:
1. Card- or page-based tools
2. Icon-based, event-driven tools
3. Time-based and presentation tools
4. Object-oriented tools
1. Card- or page-based tools: In these authoring systems, elements are organized as
pages of a book or a stack of cards. Thousands of pages or cards may be available
in the book or stack. These tools are best used when the bulk of your content
consists of elements that can be viewed individually, like the pages of a book or
cards in a card file. The authoring system lets you link these pages or cards into
organized sequences. You can jump, on command, to any page you wish in the
structured navigation pattern. Card or page-based authoring systems allow you to
play sound elements and launch animations and digital video.
2. Icon-based, event-driven tools: In these authoring systems, multimedia elements
and interaction cues (events) are organized as objects in a structural framework or
process. Icon-based, event-driven tools simplify the organization of your project
and typically display flow diagrams of activities along branching paths. In
complicated navigational structures, this charting is particularly useful during
development.
3. Time-based tools: In these authoring systems, elements and events are organized
along a timeline, with resolutions as high as 1/30 second. Time-based tools are
best to use when you have a message with a beginning and an end. Sequentially
organized graphic frames are played back at a speed that you can set. Other
elements (such as audio events) are triggered at a given time or location in the
sequence of events. The more powerful time-based tools let you program jumps to
any location in a sequence, thereby adding navigation and interactive control.
4. Object-oriented tools: In these authoring systems, multimedia elements and
events become objects that live in a hierarchical order of parent and child
relationships. Messages passed among these objects order them to do things
according to the properties or modifiers assigned to them. In this way, for
example, Suzie (a teenager object) may be programmed to take out the trash every
Friday evening, and does so when she gets a message from Dad. Spot, the puppy,
may bark and jump up and down when the postman arrives. Spot is defined by
barking and jumping modifiers; Suzie by hauling-trash modifiers or child objects
(groan, sit up, stand up, saunter slowly, get bag, carry bag, slam door, etc.).
Objects typically take care of themselves. Send them a message and they do their
thing without external procedures and programming.
Let us discuss each in detail.

Card and Page-based Authoring Tools
Card and page-based authoring tools provide a simple and easily understood metaphor
for organizing multimedia elements. Because graphic images typically form the
backbone of a project, both as navigation menus and as content, many developers first
arrange their images into logical sequences or groupings similar to the chapters and
pages of a book, or cards in a card catalog. Navigation routines become, then, simply
directives to go to a page or card that contains appropriate images and text, and
associated sounds, animations, and video clips.
The characteristics of objects are defined by properties (highlighted, bold, red, hidden,
active, locked, etc.). Each object may contain a programming script, usually a property
of that object, that is activated when an event (such as a mouse click) related to that
object occurs. Events cause messages to pass along the hierarchy of objects in your
project; for example, a mouse click message can be sent from a button to the
background, to the page and then to the project itself.
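That message hierarchy can be sketched as a simple handler chain; the object names and behaviour below are illustrative Python, not HyperCard or ToolBook syntax:

```python
# Sketch of event messages passing up a hierarchy:
# button -> background -> page -> project. Illustrative only.
class Obj:
    def __init__(self, name, parent=None, handles=False):
        self.name, self.parent, self.handles = name, parent, handles

    def send(self, message):
        if self.handles:                 # this object's script handles it
            print(f"{self.name} handles '{message}'")
        elif self.parent:                # otherwise pass it up the chain
            print(f"{self.name} passes '{message}' on")
            self.parent.send(message)

project = Obj("project", handles=True)
page = Obj("page", parent=project)
background = Obj("background", parent=page)
button = Obj("button", parent=background)

button.send("mouse click")   # bubbles up until something handles it
```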
Most card or page-based authoring systems require a special intermediate file that also
receives scripted message handlers and acts as a repository for special routines and
resources that are available to all projects being executed by the application. In
HyperCard, this file is called Home; in ToolBook, you may have one or more System
Books.

Icon-based Authoring Tools


Icon-based, event-driven tools provide a visual programming approach to organizing
and presenting multimedia. First you build a structure or flowchart of events, tasks
and decisions, by dragging appropriate icons from a library. These icons can include
menu choices, graphic images, sounds and computations. The flowchart graphically
depicts the project's logic. When the structure is built, you can add your content: text,
graphics, animation, sounds and video movies. Then, to refine your project, you edit
your logical structure by rearranging and fine-tuning the icons and the properties.
With icon-based authoring tools, non-technical multimedia authors can build
sophisticated applications without scripting. By placing icons on a flow line, you can
quickly sequence events and activities, including decisions and user interactions.
These tools are useful for storyboarding, as you can change sequences, add options
and restructure interactions by simply dragging and dropping icons. You can print out
your navigation map or flowchart, an annotated project index with or without
associated icons, design and presentation windows and a cross-reference table of
variables.

Time-based Authoring Tools


Time-based systems are popular multimedia authoring tools. Each uses its own
distinctive approach and user interface for managing events over time. Many use a
visual timeline for sequencing the events of a multimedia presentation, often
displaying layers of various media elements or events alongside the scale, in
increments as precise as one second. Others arrange long sequences of graphic frames
and add the time component by adjusting each frame's duration of play.
1. Director: Macromedia's Director is a powerful and complex multimedia authoring
tool with a broad set of features to create multimedia
presentations, animations and interactive multimedia applications. It requires a
significant learning curve, but once mastered, it is among the most powerful of
multimedia development tools. In Director, you assemble and sequence the
elements of your project using a Cast and a Score. Director movies are generally
good for cross-platform delivery. To make run-time projectors for both platforms,
however, you need to use a Mac version to make the Mac projector and a
Windows version to make a Windows projector. This can be costly because you
need a copy of Director on each platform. If you make a Shockwave projector,
however, it will launch from both Mac and Windows platforms using the
Shockwave Player.
2. Cast: The Cast is a multimedia database containing still images, sound files, text,
palettes, QuickDraw shapes, programming scripts, QuickTime movies, Flash
movies and even other Director files. Not only can you import a wide range of
multimedia element formats directly into this Cast, but you can also
create multimedia elements from scratch using Director's own tools and editors.
A full-featured painting tool lets you create bitmapped artwork in any color depth.
You can create gradients, tile patterns and animated transformations (such as
rotations and skews) of artwork. Other tools edit and create QuickDraw shapes,
text, QuickTime movies, palettes and scripts.
3. Score: Once you have imported or created the multimedia elements for your
project and placed them into your Cast, you tie these Cast members together
using the Score facility. The Score is a sequencer for displaying, animating and
playing Cast members; it is made up of frames that contain Cast members,
tempo, a palette, timing and sound information. Each frame is played back on the
stage at a rate specified in the tempo channel (see the sketch after this list). The
Score provides elaborate and complex visual effects and transitions, adjustments
of color palettes and tempo control.
Animations, for example, are made by placing a graphic or sprite onto the stage
and changing its location slightly over several or more frames. When the frames
are played back at tempo, the sprite moves. You can synchronize animations with
sound effects by highlighting a range of frames and selecting the appropriate
sound from your Cast.
4. Lingo: Director utilizes Lingo, a full-featured object-oriented scripting language,
to enable interactivity and programmed control. A built-in script editor offers
Lingo debugging facilities. Because you can attach scripts to individual elements
of the Cast, you can copy and paste complete interactive sequences. Lingo also
uses Xtras, which are special code segments used to control external sound and
video devices. Several Xtras and extensive examples of their use are shipped with
Macromedia Director. With Lingo, you can also control operations on the Internet
such as sending mail, reading documents and images and building Web pages on
the fly.
Using Lingo scripts, you can chain together separate Director documents and call
other files as subroutines. You can also import elements into your Cast using
pointers to a file. This allows you to share the same elements among many Casts;
when your Score calls for that element, it is loaded into RAM from the file.
Chaining and sharing let you create Director projects as large or complex as your
storage medium will accommodate.
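Here is the promised sketch of the Score idea: numbered frames played back at a tempo, with a sprite moved slightly in each frame. It is illustrative Python, not Lingo:

```python
# Sketch of a Score: frames containing a sprite, played at a tempo.
import time

FPS = 15                      # the tempo channel: frames per second
frames = [{"sprite": "logo", "x": 10 + 4 * i, "y": 50} for i in range(20)]

for number, frame in enumerate(frames, start=1):
    # moving the sprite slightly in each frame produces the animation
    print(f"frame {number:02d}: {frame['sprite']} at ({frame['x']}, {frame['y']})")
    time.sleep(1 / FPS)       # wait so playback matches the tempo
```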

Object-oriented Tools
Object-oriented tools are particularly useful for games, which contain many
components with many personalities, and for simulating real-life situations, events,
and their constituent objects.
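Recalling the Suzie and Spot example from the list of tool types above, the parent-and-child, message-passing idea can be sketched in Python; the classes and messages are illustrative assumptions, not any product's API:

```python
# Sketch of object-oriented authoring: objects respond to messages
# according to the behaviours (modifiers) attached to them.
class MultimediaObject:
    def __init__(self, name):
        self.name = name
        self.behaviours = {}          # message -> behaviour

    def on(self, message, behaviour):
        self.behaviours[message] = behaviour

    def send(self, message):
        # objects "take care of themselves" when sent a message
        action = self.behaviours.get(message, "ignores it")
        print(f"{self.name} receives '{message}' and {action}")

suzie = MultimediaObject("Suzie")
suzie.on("take out trash", "gets the bag, saunters out, slams the door")

spot = MultimediaObject("Spot")
spot.on("postman arrives", "barks and jumps up and down")

suzie.send("take out trash")
spot.send("postman arrives")
```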

1. QuarkImmedia: As the powerful object-oriented mTropolis tool is developed and
integrated into Quark's long-term multimedia strategies, Quark's page layout tool,
QuarkXPress, will become more presentation and network capable. Currently
available for object-oriented programming is QuarkImmedia, which works in
conjunction with QuarkXPress for building multimedia projects for CD-ROM,
Internet/intranet, or disk distribution. QuarkImmedia consists of the
QuarkImmedia Design Tool and the QuarkImmedia Viewer. The Design Tool works
as an Xtra with QuarkXPress to stitch together multimedia objects in a play list.
The QuarkImmedia Viewer then allows users to run and view projects on both Mac
and Windows platforms.
2. MediaForge: MediaForge from Strata provides a Flexible Authoring Metaphor
Environment, so developers can choose to work within an object-based authoring
system, write scripts, or use a combination of both. Like the interface for
mTropolis, MediaForge's object-based authoring system lets you drag and drop
graphic files, movies, and sound objects into your presentation. For the scripting
approach, the MediaBasic Editor is the scripting engine for Strata's Visual
MediaBasic scripting language. MediaForge's objects live in a hierarchical
metaphor, which assigns properties and behaviors to parent and child objects.

3.9 CROSS-PLATFORM AUTHORING NOTES


You face two major hurdles when you move multimedia projects across platforms;
these hurdles have to do with the different schemes Macintosh and Windows
computers use to manage text and colors.
If your projects use only bitmapped images and sounds, the text issue is moot. But if
you use text in fields or require user entry of text, you will face size and shape issues.
The Macintosh and Windows environments each use different fonts (even when the
fonts have the same name), so you may wish to experiment with your fonts before
designing or converting a project.
Each platform also uses its own character set; some special characters may appear as
different characters on the other platform. Here are some important tips for working
with text in cross-platform applications:
For text in boxes, center the text, leaving plenty of space or margin to avoid
possible word-wrap on the other platform.
Avoid outline and shadow styles on the Macintosh. They are not currently
supported in Windows and may default to boldface.
When the look of a larger-size font is extremely important, turn it into a bitmap by
screen capturing before you convert.
If you use TrueType fonts or Adobe ATM, the fonts must be installed and
available on both platforms or must be embedded in the playback application.
Colors can also be difficult to manage in cross-platform projects, because both
computer platforms employ different palette-mapping systems. The colors you use on
the Macintosh, for example, may not appear the same on the PC. When you convert a
Macintosh 256-color graphics file to Windows, all colors are mapped to their nearest
equivalents, so the results you get will depend on the color palettes used on each
platform.
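The nearest-equivalent mapping can be sketched in a few lines of Python; the tiny palette below is illustrative (real conversions map between full 256-color palettes):

```python
# Sketch of palette mapping: each color is replaced by its nearest
# equivalent in the target platform's palette (squared RGB distance).
def nearest(color, palette):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(palette, key=lambda entry: dist(color, entry))

windows_palette = [(0, 0, 0), (128, 0, 0), (0, 128, 0), (255, 255, 255)]
mac_color = (200, 30, 40)     # a red as used on the Macintosh

print(nearest(mac_color, windows_palette))   # -> (128, 0, 0)
```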

3.10 CHOICE OF THE RIGHT TOOL FOR THE JOB


Each multimedia project you undertake will have its own underlying structure and
purpose and will require different features and functions. In the best case, you must be
prepared to choose the tool that best fits the job; in the worst case, you must know
which tools will at least get the job done. Authoring tools are constantly being

improved by their makers, who add new features and increase performance with
upgrade development cycles of six months to a year. It is important that you study the
software product reviews in computer trade journals, as well as talk with current users
of these systems, before deciding on the best ones for your needs.
Delivering your project may require building a run-time version of the project using
the multimedia authoring software. A run-time version allows your project to play
back without requiring the full authoring software and all its tools and editors. Often,
the run-time version does not allow users to access or change the content, structure,
and programming of the project. If you are going to distribute your project widely,
you should distribute it in the run-time version. Because the World Wide Web has
become a significant delivery medium for multimedia, authoring systems typically
provide a means to convert their output so that it can be delivered within the context
of HTML or DHTML, either with special plug-ins or by embedding Java, JavaScript
or other code structures in the HTML document. Make sure your authored project
can be easily distributed.
It is also increasingly important to use tools that make transfer across platforms easy.
For many developers, the Macintosh remains the multimedia authoring platform of
choice, but 80 percent of that developer's target market may be Windows platforms. If
you develop on a Macintosh, look for tools that provide a compatible authoring
system for Windows or offer a run-time player for the other platform.

3.11 MULTIMEDIA TOOL FEATURES


Common to nearly all multimedia tool platforms are a number of features for
encapsulating the content, presenting the data, obtaining user input and controlling the
execution of the product.
These features include:
Page
Controls (Navigation, Input, Media Controls)
Data (Text, Graphics, Audio, Video, Live Audio/Video, Database)
Execution (Linear Sequenced, Program Controlled, Temporal Controlled,
Interactivity Controlled)

Check Your Progress 2


1. What are the basic tools of multimedia?
……………………………………………………………………………….
……………………………………………………………………………….
2. What is the selection criterion for image editing tools?
……………………………………………………………………………….
……………………………………………………………………………….
3. What are multimedia authoring tools?
……………………………………………………………………………….
……………………………………………………………………………….
4. What are the types or categories of authoring tools?
……………………………………………………………………………….
……………………………………………………………………………….

3.12 LET US SUM UP

This lesson discusses the common desktop presentation tools that have become
multimedia-powerful. Dedicated multimedia authoring systems now offer
simplified, easy-to-use versions. Some multimedia projects may be so simple that one
can cram all the organizing, planning, rendering, and testing stages into a single effort,
making instant multimedia. The lesson also discusses multimedia authoring tools and
their various types: Card and page based authoring tools, Icon-Based Authoring Tools,
Time-Based Authoring Tools and Object-oriented tools.

3.13 LESSON END ACTIVITY


Discuss among your group how to select an authoring tool for a multimedia
presentation on any topic related to your course, and then use that tool to make the
presentation.

3.14 KEYWORDS
DDE and OLE: Dynamic Data Exchange (DDE) and Object Linking and Embedding
(OLE) are two methods for linking data objects among Windows applications.
Office Suites: Office suites integrate into a single package the various productivity
tools essential to running a business. Suites offer the convenience of a common
interface with similar menus, commands, and toolbars, and they also allow you to
share data among the applications in that suite using OLE and DDE.
WordPerfect: A word processor whose tool palette and drawing commands allow you
to create and edit graphics with the standard Macintosh drawing tools, as well as
create Bezier curves and polygons; there is also a free rotation tool.
Database: A program which can store, sort, retrieve, and organize many types of
information. Like spreadsheets, databases can exist in a digital environment without
ever needing to be printed to paper.

3.15 QUESTIONS FOR DISCUSSION


1. Discuss and differentiate Authoring Tools with Programming Tools.
2. What are the Cross-Platform Authoring Notes?
3. How would you choose the right tool for a multimedia project?
4. What are Office Suites? Discuss their role.

Check Your Progress: Model Answers


CYP 1
1. MS Office is the most efficient suite of applications for document creation,
communication and business information analysis. For many functions, the
business platform has evolved from paper to the Web. Microsoft Office
extends desktop productivity to the web, streamlining the way you work
and making it easier to share, access and analyze information so you get
better results. Office offers a multitude of new features. Of particular
importance for this release are the features that affect the entire suite. These
Office-wide, or shared features hold the key to the new realm of
functionality enabled by Office. Office offers a new Web-productivity
work style that integrates core productivity tools with the Web to
streamline the process of sharing information and working with others.
It makes it easier to use an organization's intranet to access vital business
information and provides innovative analysis tools that help users make
better, timelier business decisions. Office delivers new levels of resiliency
and intelligence, enabling users and organizations to get up and running
quickly, stay working and achieve great results with fewer resources. The
components of MS Office are as follows:
MS Word
MS Excel
MS PowerPoint
MS Access
2. MS Word is a powerful word processor that allows you to create:
Memos
Fax coversheets
Web pages
Reports
Mailing labels
Brochures
Tables, and
Many other professional and business applications.
Microsoft's Word for Macintosh and Windows provides essentially the
same user interface on both platforms and offers special multimedia
features. You can make and import various image formats, including PICT,
TIFF, BMP, and EPS, to place them in your document. You can add
QuickTime movies to your document; control the movie's playback
characteristics (forward, backward, start, and stop); and perform simple
editing with cut, copy, and paste commands. In the Windows version, AVI
movies can also be played within your Word document. You can import
digitized sounds, and you can record voice comments from an internal
microphone, saving the recording (with a portion of the text or an icon as
an identifier) for playback. Annotations can be searched for sound effects
and content, edited, and even saved as separate files in any of four formats.
With Word for Windows, you can also create links to other programs using
OLE.
MS Word provides easy graphics handling, calculation of the data tables,
ability to create a mailing list, list sorting and efficient file management.
Major features of Word are as follows:
a. Auto Summarize Feature: This feature automatically summarizes the
key points in the document. Word determines the most important
sentences and gives a custom summary based on the analysis.
b. Auto Complete Feature: Auto Complete Feature automatically offers
suggestions to complete the word or the phrase that has been typed
partially. To accept suggestions, press the Enter Key and the Word
automatically replaces the partially typed word with the complete
word. Word automatically completes the current date, a day of the
week, a month other than the current one, your name and the company
name and the AutoText entries.
c. Automatic Grammar Checking: It marks the incorrect grammar with a
green wavy line as you type.
d. Letter Wizard: This helps you format and enter the key information for
letters, to ensure that they are consistent and professional. It lets you
write quickly and easily, and also add elements to your letter.
e. Office Assistant: The MS Office uses IntelliSense Natural-Language
Technology. The assistant anticipates the kind of help you require and
suggests the Help topics on the work that you are doing. This office
assistant provides the visual examples and the step-by-step instructions
for the specific tasks.
f. Smart Spelling Features: Recognizes your name, your organization’s
name and the professional names of varying ethnicity.
Recognizes your writing pattern and does not mark some patterns as
errors in the document.
Ignores Internet and file addresses rather than marking them as spelling errors.
g. Natural Language Grammar Checker: This provides improved
syntactical analysis, better rewrite suggestions and user friendly
grammar styles.
h. Spelling and Grammar Checking Combination Feature: This feature
eliminates the separate dialog boxes and provides the interface that lets
you proofread the document online.
i. Hyperlinks Feature: This links to Microsoft Outlook, HTML or other
files on any internal or external Web site or file server.
3. MS Excel is a spreadsheet package. When you start Excel, a blank
workbook appears in the document window. The workbook is the main
document Excel uses for storing and manipulating data. A workbook
holds individual worksheets, each consisting of data. Each worksheet is
made up of 256 columns and 65,536 rows. Using a special template
document, you can create a slide show with Microsoft Excel (in both the
Macintosh and Windows versions) to present worksheets, charts, and
graphics. You can apply video and audio transition effects between slides,
adjusting speed and the method of slide advance. The SLIDES.XLA file
must be installed in the Windows version, and the Slideshow Add-In file
for the Macintosh. QuickTime and AVI movies can be linked to Microsoft
Excel documents.
Features of Excel
a. The multiple Undo feature can Undo up to the last 16 actions.
b. When you quit Microsoft Excel with multiple files open, you get a
YES to ALL option. You can choose this option to save all the files
before exiting, instead of being prompted to close each open file.
c. Conditional Formats dynamically apply a different font style, pattern
and border to cells whose values fall outside or within the limits
specified by you. This lets you quickly spot areas of interest without
reading through tables of values.
d. The Hyperlinks Feature helps you to create hyperlinks that connect to
other office files on the system, your network. A hyperlink can be text
in the cell, a graphic or you can write a formula that creates a
hyperlink.
e. The Web Queries feature allows you to create and run the queries to
retrieve data available on the World Wide Web.
f. The Internet assistant wizard steps you through the process of saving
the worksheet data and the charts in the HTML format. You can save
the data and the chart as a complete new Web Page or add them to an
existing Web Page.
g. The new Share Workbook feature lets multiple users open a workbook
on the network and edit the document simultaneously.
h. CellTips and ScrollTips automatically display the comments added to
a cell.
i. The worksheet has expanded to include 65,536 rows and you can type
up to 32,000 characters in a cell.
j. Natural Language formulas allow you to create formulas that use row
and the column headers instead of the range references.
k. The Auditing and the Validation facility allow you to circle the invalid
data and to see at a glance all the entries that don’t meet your validation
rules.
l. The enhanced Get External Data features enable you to query Access
and the other databases either on the system or a network or the
Internet or intranet resources.
4. MS Access is the relational database application in the Microsoft Office
Professional. With Access, you can perform the following tasks:
Organize data into manageable related units.
Enter, modify and locate data.
Extract subsets of data based on the specific criteria.
Create custom forms and reports.
Automate common database tasks.
Graph data relationships.
Microsoft Access (Windows only) is a relational database application
available on its own or as part of the Microsoft Office Professional bundle
of products. Following are the major features of Access:
The Publish to the Web Wizard converts your Access information to a
dynamic Internet or intranet site including query pages.
The Outlook Journal helps you to track when a database file was
opened or closed, or when an object was printed.
A new Hyperlink data type is supported to allow insertion of links to
other objects, documents, or Internet resources.
Improved design features include the ability to create forms with
multiple tabs.
Lightweight Forms and Reports load without loading Visual Basic for
Applications, leading to faster performance.
User-Level Security Wizard creates a secured copy of the database.
The Visual Basic code for the objects has been updated with the
methods, properties and other language elements.
The multiple pages button allows you to select the number of pages to
preview.
The Performance Analyzer analyses database objects and suggests ways
to make them faster.

CYP 2
1. Various types of basic tools for creating and editing multimedia elements
are:
Painting and Drawing tools
Image editing tools
OCR software
3-D Modeling and Animation tools
Sound editing programs
Animation, Video and Digital movies
2. Selection criteria for image editing applications are:
Conversion of major image data types and industry standard file formats.
Direct input from scanners etc.
Employment of virtual memory scheme.
Multiple window scheme.
Image and balance control for brightness, contrast etc.
Masking undo and restore features.
Multiple video, Anti-aliasing, sharpening and smoothing controls.
Color mapping controls.
Geometric transformations.
All colour palettes.
Support for third-party special-effects plug-ins.
Ability to design in layers that can be combined, hidden and reordered.
3. Authoring tools usually refers to computer software that helps multimedia
developers create products. Authoring tools are different from computer
programming languages in that they are supposed to reduce the amount of
programming expertise required in order to be productive. Some authoring
tools use visual symbols and icons in flowcharts to make programming
easier. Others use a slide show environment.
4. Authoring tools are grouped based on the metaphor used for sequencing
or organising multimedia elements and events:
i. Card or Page Based Tools
ii. Icon Based or Event Driven Tools
iii. Time Based and Presentation Tools
iv. Object Oriented Tools

3.16 SUGGESTED READINGS
Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Tay Vaughan, Multimedia: Making It Work, Fifth Edition, Tata McGraw-Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

LESSON 4

MULTIMEDIA BUILDING BLOCKS (USING TEXT)

CONTENTS
4.0 Aims and Objectives
4.1 Introduction
4.2 Multimedia Building Blocks
4.2.1 Text
4.2.2 Graphic Images
4.2.3 Animation
4.2.4 Sound
4.2.5 Interactive Links
4.3 Text in Multimedia
4.4 Types of Fonts
4.4.1 Special Font Types
4.4.2 Font Styles
4.4.3 Font Size
4.4.4 Weight of Font
4.4.5 Tracking of Font
4.4.6 Cases of Fonts
4.4.7 More Styles of Fonts
4.4.8 Points to Remember while Choosing the Fonts
4.5 Buttons
4.6 Setting Fields
4.7 Portrait versus Landscape
4.8 HTML Documents
4.9 Symbols and Icons
4.10 Animating Text
4.11 Adobe Type Manager
4.12 Let us Sum up
4.13 Lesson End Activity
4.14 Keywords
4.15 Questions for Discussion
4.16 Suggested Readings

4.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to understand:
z Multimedia building blocks
z Types of fonts, symbols and icons
z Animating text and Adobe Type Manager

4.1 INTRODUCTION
The multimedia building blocks are text, graphic images, animation, sound and
video. This lesson discusses the role and use of text, one of the most important
multimedia building blocks, in multimedia projects. The lesson throws light on
various aspects of text usage such as types of fonts, special fonts, font styles, font
size and weight of font.

4.2 MULTIMEDIA BUILDING BLOCKS


Multimedia usually involves text, graphics, animation, video and sound. Above all, it
requires interactive links to tie the programme together.

4.2.1 Text
Similar to printed publications and other media, text is the basic element of
communication and it is essential for any multimedia programme. In fact, multimedia
packages often involve the conversion of a book to computerised form, allowing the
user to look up information quickly with built-in interactive links.

4.2.2 Graphic Images


By graphic images, we generally mean a still image such as a photograph or line
drawing. As humans, we find visual objects more interesting and easier to be
perceived than text. However, graphic files are larger than text files and consequently
require more computer storage space. This is one of the reasons that multimedia
applications require a large hard disk drive or equivalent storage capabilities such as a
CD-ROM.

4.2.3 Animation
Animation refers to moving graphic images or videos - for example, the movement of
a mechanism. Just as a photograph is a powerful communicating tool, a small
movie/video clip is even more powerful and is especially useful for illustrating
concepts that involve moving objects. As animation files require much more storage
space than ordinary graphic files involving a single image, this often necessitates the
use of a CD-ROM drive or a large hard disk drive.

4.2.4 Sound
It can substantially reinforce our understanding of information presented together with
text and graphic images. The incorporation of sound in a multimedia programme can
provide the user with information not possible using other methods. As with graphic
images and animation, sound files are very large and require lots of disk space.

4.2.5 Interactive Links


An important function of multimedia is its interactive nature. This means that the user
can manipulate screen "objects" such as clicking a button or highlighted text with a
"mouse" and cause the programme to respond in a certain predetermined way. A
"button" is a screen object with a label that indicates what action it activates. For
example, the user may click on a "Pause" or "Replay" button to control the animation
display. Or there may be a screen button indicating "Sound" that when clicked on
causes the programme to play a recording of instructions or a musical tune. It is this
interactive nature of multimedia that makes it extremely useful in providing
information to the user. Unlike a book, which is designed to be read from page to page
(sometimes called "linear" information), multimedia allows users to access

information any way they choose (sometimes called "non-linear" information access).
Because of this, multimedia is a more flexible and effective way to learn.
In this lesson, we will discuss Text in multimedia – one of the important multimedia
building blocks.

4.3 TEXT IN MULTIMEDIA


The most important aspect of working with text in word processing is its
representation: what word processing calls a font, the style in which the text is
displayed. The different styles of type are called fonts, and many of them are
available on Windows-based computers.

4.4 TYPES OF FONTS


There are mainly two types of fonts: Serif and Non-serif fonts. Have a look at the
same text in the following two paragraphs.
This text is written to display the difference between serif and non-serif fonts. The two
types do look similar but are quite different if looked at it very minutely.
This text is written to display the difference between serif and non-serif fonts.
The two types do look similar but are quite different if looked at it very
minutely.
The main difference lies in the formation of the characters. Notice the character y in
"minutely" above. In the first paragraph the y has small extending strokes at the ends
of its lines, whereas there is no such extension on the y in the second paragraph.
Which to use is a question of one's liking. You can create your multimedia text in
either of the two, depending upon your choice.
Times, New Century Schoolbook, Bookman, and Palatino are examples of serif fonts.
Helvetica, Arial, Optima, and Avant Garde are examples of non-serif fonts. Mostly
the serif fonts are used for the text and the non-serif fonts are used for headings. But,
there is no hard and fast rule for this.
Fonts are the style of type used to display text. Historically, they were artist/craftsman
designs. These days, computers make it easy for screen and report designers to choose
fonts to best communicate information.
It is usually safe to assume that Arial, Times New Roman, and Courier New are
installed on a Windows system. If the font is not installed, Word will substitute the
closest match.
Gothic: Every stroke the same width, with no special endings (serifs) on strokes;
e.g., Arial and Verdana. Verdana was designed to be exceptionally easy to read on
a screen.
Roman: Thick vertical strokes, thin horizontal strokes, with serifs at the ends; e.g.,
Times New Roman and Georgia.
Mono-space: Every character the same width; e.g., Courier New and Andale
Mono. These fonts are used by typewriters, text terminals and impact printers.
Gothic fonts are also known as "Sans Serif". Note that typical Gothic and Roman
fonts have variable-width characters and are known as proportionally spaced fonts.

4.4.1 Special Font Types
Comic Sans MS is often used on Web pages, almost never in business reports,
whether on screen or paper. Arial Black and Impact are used for special purposes
on printed output.

4.4.2 Font Styles


For most (but not all) available fonts, one can also set specific styles, including
normal, bold, italic, bold italic, superscript, subscript and reverse.

4.4.3 Font Size


Font size is the height of a capital letter, measured in "points". There are 12 points
per "pica" and 72 points per vertical inch; on typewriters, "pica" pitch also meant
10 characters per horizontal inch.
For fonts like Courier, the 12-point size means 10 CPI, formerly called "pica" pitch,
and the 10-point size means 12 characters per inch (horizontally), formerly called
"elite" pitch. For other fonts, the point size does not necessarily relate to horizontal
spacing, because each character's width is proportional to its shape. (Compare "W"
to "I".) However, numbers tend to be a fixed width.
For example, in Courier 12pt (10 CPI) the digits 1234567890 span exactly one
horizontal inch; Courier 10pt fits 12 characters per inch; and in a proportional font
such as Times Roman 12pt the same characters take up less horizontal space.
With printed output, people can easily read 12-point type, especially if it is Times
New Roman or a similar serif font, like those used in newspapers, magazines, books,
and common reading material. Smaller type sizes are difficult for many people over
the age of 40 or who are farsighted.
With displayed (screen) output, most people can read 12-point type, especially if it is
a "sans serif" font like Arial. On screen, serifs tend to blur one's vision because of the
"halation" effect on our eyes of light. (Seeing a "halo" around a street light is typical
halation). Be aware that as resolution goes up on a GUI display, the apparent font size
goes down. A 12pt font is always the same size on printed output. It varies widely in
actual size when displayed in VGA (640x480) – coarse and blocky but big – to XGA
(1024 × 768) – fine and smooth but small.
Designers like small font sizes, because we wish to get as much as possible onto the
page or screen. But eyestrain is a worse penalty than turning the page, or clicking the
"Next" button!

4.4.4 Weight of Font

By weight we mean the thickness of the font. The normal text size is 11 points,
which is what is used in this book, but you can go up to 700 points, which probably
will not fit on this page. The following example gives you some idea of the effect of
font size.

[Figure 4.1: Text Size: the letter Y shown at 12, 24, 36 and 48 points]

Each font supports sizes from 4 points up to 700 points.

4.4.5 Tracking of Font


You can use tracking to make the space between characters smaller or larger. For
example, if you type A and V next to each other, the gap between them is very
glaring, especially at higher point sizes.

AV AV

This space can be adjusted using the method called kerning. Notice the second pair
of characters above: with kerning applied, the space between the two characters
looks right.

4.4.6 Cases of Fonts


There are mainly three ways of representing the characters. Either in Caps, Normal or
Small caps. Depending upon the needs you can use either of them in your text.
Various examples of the cases of the same font are shown here:
All Caps THIS LINE IS IN ALL CAPS
Normal This line is in normal case
Small Caps THIS LINE IS IN SMALL CAPS CASE

4.4.7 More Styles of Fonts


Various styles are used while creating text. These can make a piece of text stand out
from the surrounding text. Various options are:
Normal This line is in normal case
Bold This line is in bold
Italics This line is in italics
Underline This line is underlined

4.4.8 Points to Remember while Choosing the Fonts


You must remember the following points before finalising the text:
z For small type, use the most legible font available. Decorative fonts that cannot be
read are useless.

z Use as few different faces as possible in the same work, but vary the weight and
size of your typeface using italic and bold styles where they look good.
z Using too many fonts on the same page is called ransom note typography.
z In text blocks, adjust the leading for the most pleasing line spacing. Lines too
tightly packed are difficult to read.
z Vary the size of a font in proportion to the importance of the message you are
delivering.
z In large-size headlines, adjust the spacing between letters (kerning) so that the
spacing feels right. Big gaps between large letters can make your title look
gap-toothed. You may need to kern by hand, using a bitmapped version of your
text.
z To make your type stand out or be more legible, explore the effects of different
colors and of placing the text on various backgrounds. Try reverse type for a stark,
white-on-black message.
z Use anti-aliased text where you want a gentle and blended look for titles and
headlines. This can give a more professional appearance. Anti-aliasing blends the
colors along the edges of the letters (called dithering) to create a soft transition
between the letter and its background.
z Try drop caps and initial caps to accent your words. Most word processors and
text editors will let you create drop caps and SMALL CAPS in your text. Adobe
and others make initial caps. The letters are actually carefully drawn artwork and
are available in special libraries as Encapsulated PostScript files (EPSF).
z If you are using centered type in a text block, keep the number of lines to a
minimum.
z For attention-grabbing results, try graphically altering and distorting the text.
Wrap your word onto a sphere, bend it into a wave or splash it with rainbow
colors.
z Experiment with drop shadows. Place a copy of the word on top of the original
and offset the original up and over a few pixels. Then color the original gray
(or any other color). The word may become more legible and provide much
greater impact (see the sketch after this list). At Web sites, shadowed text and
graphics on a plain white background add depth to a page.
z Surround headlines with plenty of white space. White space is a designer's term
for roomy blank areas; programmers call the invisible character made by a space
(ASCII 32) or a tab (ASCII 9) white space.
z Pick the fonts that seem right to you for getting your message across; then double-
check your choice against other opinions. Learn to accept criticism.
z Use meaningful words or phrases for links and menu items.
z Text links (anchors) on Web pages can accent your message: they stand out by
color and underlining. Use link colors consistently throughout a site and avoid
iridescent green on red or purple on puce.
z Bold or emphasize text to highlight ideas or concepts, but do not make text look
like a link or a button when it is not.
z On a Web page, put vital text elements and menus in the top 320 pixels. Studies of
surfer habits have discovered that only 10 percent to 15 percent of surfers ever
scroll any page.

4.5 BUTTONS
In most modern cultures a doorbell is recognized by its context (next to the door itself,
possibly lit); but if you grew up in a high-rise apartment, you may have seen 50 or
more buttons at the entrance. Unless you knew that yours was the third from the top
on the left, you could find your button only by reading the printed or scrawled name
beside it.
In multimedia, buttons are the objects, such as blocks of text, a pretty blue triangle, or
a photograph, that make things happen when they are clicked. They were invented for
the sole purpose of being pushed or prodded with cursor, mouse, key or finger - and to
manifest properties such as highlighting or other visual or sound effects to indicate
that you hit the target. On the Web, text and graphic art may be buttons.
Remember that the rules for proper selection of text and fonts in your projects apply to
buttons as well as headlines, bullet items and blocks of text. The automatic button-
making tools supplied with multimedia and HTML page authoring systems are
useful, but in creating the text for you, they offer little opportunity to fine-tune the
look of the text. Character- and word-wrap, highlighting and inverting are
automatically applied to your buttons as needed by the authoring system. These
default buttons and styles may seem trite, but by using common button styles, shapes,
borders and highlights, you increase the probability that users will know what to do
with them – especially when they are also labeled.
Your button fonts will need to travel with your project. The following are the most
popular fonts that should be there in your computer. These are perhaps safest for
button labeling and for page design on the Web; you can expect them to be available
on most personal computers.
Arial
Bookman Old Style
Century
Century Gothic
Courier New
Book Antiqua
Bookshelf Symbol
Comic Sans MS
Haettenschweiler
Lucida Console
Garamond
Impact
Marlett
Monotype Corsiva
Monotype Sorts
MS Outlook
MS Sans Serif
MS Serif
Symbol
Tahoma
Trebuchet MS

Times New Roman
Verdana
Webdings
WingDings
Before you can use a font, it must be recognized by the computer's operating system.
If you want to use fonts other than those installed with your operating system, you
will need to install them. TrueType fonts delivered as core fonts for Windows are
installed in four weights and styles. When you install applications, other fonts are
often added to your collection. Pick a font for buttons that is, above all, legible; then
adjust the text size of the labels to provide adequate space between the button's rim
and the text.
You can choose from many styles of buttons and several standard methodologies for
highlighting. You will want to experiment to get the right combinations of font,
spacing and colors for just the right look.
In HTML the <font> tag can be used to specify a font to be displayed (if the font is
present on the system). In a list of choices, you can include the names of Windows
fonts; if the font is not found on the local computer, the browser's default font will be
displayed:
<font face="Verdana, Arial, Helvetica">
Although in HTML 4 you can specify a base font size, color and face for displaying
text on a Web page, you have no guarantee that the font is installed on the user's system.
If it is missing, a browser will attempt to substitute a similar font, but the look is not
guaranteed to be the same as the one you have designed. Provide a way to download
the font to the end user's computer if the right look is important to you.
In most authoring platforms, it is easy to make your own buttons from bitmaps or
drawn objects. In a message-passing authoring system, where you can script activity
when the mouse button is up or down over an object, you can quickly replace one
bitmap with another highlighted or colored version of the bitmap to show that the
button has been "pushed" or the mouse is over it. Making your own buttons from
bitmaps or drawn objects gives you greater design power and creative freedom and
also insures against the missing font problem.
On the other hand, this custom work may require a good deal more time. Interesting
text and graphic buttons for the Web can be created as GIF or JPG bitmaps that, when
clicked, link to other pages. There is no easy provision in HTML for highlighting or
animating these graphic images, but typically the destination address (URL) is
displayed in the status window of the browser when the mouse is over a linked image
or text element. So users know first if the mouse is over an active button and second,
where that button will take them if they click. If you take a short step beyond vanilla
HTML with a few lines of JavaScript, you can program your button image to switch
with another when the mouse is over it or "pushed down". Whether default or
custom, treat the design and labeling of your buttons as an industrial art project:
buttons are the part of your project the user touches.
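One simple way to do this image swap is sketched below. This is only an illustration: the image file names button_up.gif and button_over.gif, and the page next.html, are hypothetical.
<A HREF="next.html"
   onMouseOver="document.images['navBtn'].src='button_over.gif'"
   onMouseOut="document.images['navBtn'].src='button_up.gif'">
<IMG NAME="navBtn" SRC="button_up.gif" ALT="Next page" BORDER="0"></A>
Because the "over" image is simply a prepared, highlighted version of the "up" image, the swap needs no plug-ins and works in any JavaScript-capable browser.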

4.6 SETTING FIELDS


You are already working uphill when you design text to be read on the screen.
Experiments have shown that reading text on a computer screen is slower and more
difficult than reading the same text in hard copy or book form. Indeed, many users, it
seems, would rather print out the reports and e-mail messages and read them on paper
than page through screens of text.

Reading hard copy is still more comfortable. Research has shown that when people
read text on a computer screen they blink only 3 to 5 times per minute, but they blink
20 to 25 times per minute when reading text on paper. This reduced blink rate may
cause dryness, fatigue and possibly damage to the eyes. Research also suggests that
monitors should be placed lower than eye level.
Unless the very purpose of your multimedia project or Web site is to display large
blocks of text, try to present to the user only a few paragraphs of text per page. Use a
font that is easy to read rather than a prettier font that is illegible. Try to display whole
paragraphs on the screen and avoid breaks where users must go back and forth between
pages to read the entire paragraph.

4.7 PORTRAIT VERSUS LANDSCAPE


Traditional hard copy and printed documents in the taller-than-wide orientation are
simply not readable on a typical monitor with a wider-than-tall aspect ratio. The taller-
than-wide orientation used for printed documents is called portrait; this is the 8.5 by
11 inch size unique to the United States, or the internationally designated standard A4
size, 8.27 by 11.69 inches. The wider-than-tall orientation normal to monitors is called
landscape. Shrinking an 11-inch-tall portrait page of text into your available monitor
height usually yields illegible chicken tracks.
There are four possible solutions if you are working with a block of text that is taller
than what will fit:
z Put the text into a scrolling field. This is the solution used by Web browsers.
z Put the text into a single field or graphic image in a project window and let the
user move the whole window up or down upon command. This is most
appropriate when you need to present text with page breaks and formatting
identical to the printed document. This is used by Adobe's popular Acrobat
Reader for displaying PDF files.
z Break the text into fields that fit on monitor-sized pages and design control
buttons to flip through these pages.
z Design your multimedia project for a special monitor that is taller than it is wide
(portrait). Such "page view" monitors are expensive; they are used for commercial
print-based typesetting and layout.

Check Your Progress 1


1. What are Multimedia building blocks? Explain each in brief.
……………………………………………………………………………….
……………………………………………………………………………….
2. What is the role of interactive links in Multimedia?
……………………………………………………………………………….
……………………………………………………………………………….
3. What are the major points to remember while choosing the fonts?
……………………………………………………………………………….
……………………………………………………………………………….

4.8 HTML DOCUMENTS


The standard document format used for pages on the Web is called Hypertext Markup
Language (HTML). In an HTML document you can specify typefaces, sizes, colors
and other properties by "marking up" the text in the document with tags. The
process of marking up documents is simple: Where you want text to be bold, surround

it with the tags <B> and </B>; the text between the tags will then be displayed by your
browser application in bold type.
Where you have a header, surround it with <H1> and </H1>; for an ordered list of
things (1, 2, 3,… or a, b, c,… etc.) surround your list with <OL> and </OL>. There are
many tags you can use to lay out a page.
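The short fragment below is an illustrative sketch (the content is invented) combining the tags just mentioned:
<H1>Project Checklist</H1>
<OL>
<LI>Record the narration</LI>
<LI>Edit the sound files</LI>
<LI>Test the buttons</LI>
</OL>
<B>Remember:</B> test the page in more than one browser.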
The remarkable growth of the Web is straining the old designs for displaying text on
computers. Indeed, while marked-up text files (HTML documents) remain at the
foundation of Web activity, when you visit a well-designed Web site, you often
discover graphic images, animations and interactive workarounds contrived to avoid
displaying text. The neat paragraphs, indented lists and formats for text documents for
which HTML was originally intended are evolving into multimedia documents, not
text documents and the HTML method and standard is consequently suffering great
stress.
As features and tags and plug-ins and special scripts are tacked onto or embedded into
HTML to satisfy the demand for multimedia interfaces, at some point HTML will
need to be redesigned from the ground up - as a multimedia delivery tool, not just a
text display tool with assorted attachments. Indeed, this redesign is currently under
way in the form of Dynamic HTML (DHTML).
HTML doesn’t provide you much flexibility to make pretty text elements but you may
be able to lay out pleasing documents using block quote indents, tables, frames and
horizontal rules. Pretty text in HTML documents is typically done as graphical
bitmaps that are placed within the HTML document's layout with image tags, <IMG>.
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<HTML>
<HEAD>
<TITLE>This is a test page</TITLE>
</HEAD>
<BODY>
<P>Study carefully during your studies</P>
</BODY>
</HTML>
The default display font is a preference that can be set in the browser and is one
known to be available on the viewer's machine. So some viewers may read your
words in serif Times Roman, others in sans serif Helvetica or Arial. Dynamic HTML
however, uses cascading style sheets (CSS) to define choices ranging from line height
to margin width to font face. A font face, if not found, is degraded to the next best
match. Indeed, using CSS, you can define the following text properties: font-weight,
font-family, font-size, font-size-adjust, font-variant, font-style, font-stretch, text-
decoration, text-transform, text-shadow, letter-spacing, word-spacing, line-height,
vertical-align, text-indent, text-align and direction.
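As a brief sketch of how a few of these properties might appear in a style sheet (the font choices here are merely examples, not recommendations):
<STYLE TYPE="text/css">
H1 { font-family: Verdana, Arial, sans-serif;
     font-size: 24pt;
     font-weight: bold;
     letter-spacing: 0.1em; }
P  { font-family: Garamond, "Times New Roman", serif;
     line-height: 150%;
     text-align: justify; }
</STYLE>
If the first font in each font-family list is not found on the viewer's machine, the browser falls back to the next name in the list.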

4.9 SYMBOLS AND ICONS


Symbols are concentrated text in the form of stand-alone graphic constructs. Symbols
convey meaningful messages. The Windows hourglass cursor tells you to wait while
the computer is processing. Though you may think of symbols as belonging strictly to
the realm of graphic art, in multimedia you should treat them as text – or visual
words – because they carry meaning. Symbols such as the familiar trash can and
hourglass are – more properly called icons; these are symbolic representations of
objects and processes common to the graphical user interfaces of many computer
operating systems. Certainly text is more efficient than imagery and pictures for
delivering a precise message to users.

On the other hand, pictures, icons, moving images and sounds are most easily recalled
and remembered by viewers. With multimedia, you have the power to blend both text
and icons (as well as colors, sounds, images and motion video) to enhance the overall
impact and value of your message. Word meanings are shared by millions of people,
but the special symbols you design for multimedia project are not; these symbols must
be learned before they can be useful message carriers. Some symbols are more widely
used and understood than others, but readers of even these common symbols had to
grow accustomed to their meanings. Learning a system of symbols can be as difficult
as lessons in any foreign language.
Here are some symbols you may already know:

Figure 4.2: Symbols

Nonetheless, a few symbols have emerged in the interactive multimedia world as an
accepted lexicon of navigation cues that do not need text. These symbols are by no
means universal, but some of them have roots from the days of teletypewriters, others
from early HyperCard and videodisc development and yet others from the consumer
electronics world. Even for these common symbols, text labels are often added to the
graphic icons to avoid uncertainty.

4.10 ANIMATING TEXT


There are plenty of ways to retain a viewer's attention when displaying text. For
example, you can animate bulleted text and have it "fly" onto the screen. You can
"grow" a headline a character at a time. For speakers, simply highlighting the
important text works well as a pointing device. When there are several points to be
made, you can stack keywords and flash them past the viewer in a timed automated
sequence.
You might fly in some keywords, dissolve others, rotate or spin others and so forth,
until you have a dynamic bulleted list of words that is interesting to watch. But be
careful: don't overdo the special effects, or they will become boring.
Powerful but inexpensive applications such as Xaos Tools’ TypeCaster let you create
3D text using both TrueType and Type 1 Adobe fonts. You can also use Illustrator or
FreeHand EPS (Encapsulated PostScript) outline files to create still images in 3D and
then animate the results to create QuickTime movies with broadcast-quality rendering.

4.11 ADOBE TYPE MANAGER
Adobe Type Manager (ATM) is required to display Type 1 PostScript fonts at all sizes
without jaggies. This software is available for Windows. Once installed, ATM works
automatically with word processing, page layout, spreadsheet and graphics
applications, including multimedia authoring systems.

Figure 4.3: Adobe Type Manager

Check Your Progress 2


1. What are HTML Documents?
……………………………………………………………………………….
……………………………………………………………………………….
2. How can text be animated? Explain.
……………………………………………………………………………….
……………………………………………………………………………….

4.12 LET US SUM UP


The Multimedia building blocks are Text, Graphic Images, Animation, Sound and
video.
This lesson discusses the role and use of Text in multimedia projects. Text is one of
the important multimedia building blocks. Similar to printed publications and other
media, text is the basic element of communication and it is essential for any
multimedia programme. In fact, multimedia packages often involve the conversion of
a book to computerised form, allowing the user to look up information quickly with
built-in interactive links. The lesson throws light on various aspects of text usage such
as types of fonts, special fonts, font styles, font size, weight of font, points to
remember while choosing the fonts etc.

4.13 LESSON END ACTIVITY


Take any of your typed assignments and apply the font styles and special printable
fonts discussed above to make it a multimedia assignment.

4.14 KEYWORDS
Animation: Animation refers to moving graphic images or videos.
Interactive Link: This means that the user can manipulate screen objects such as
clicking a button or highlighted text with a mouse and cause the programme to
respond in a certain predetermined way.
Button: A button is a screen object with a label that indicates what action it activates.

4.15 QUESTIONS FOR DISCUSSION


1. What is the role of Text in Multimedia applications?
2. Explain different types of text fonts.
3. How can Symbols and Icons add value in Multimedia applications?

Check Your Progress: Model Answers


CYP 1
1. Multimedia usually involves text, graphics, animation, video and sound.
Text: Text is the basic element of communication and it is essential for any
multimedia programme. In fact, multimedia packages often involve the
conversion of a book to computerised form, allowing the user to look up
information quickly with built-in interactive links.
Graphic Images: By graphic images, we generally mean a still image such
as a photograph or line drawing. However, graphic files are larger than text
files and consequently require more computer storage space.
Animation: Animation refers to moving graphic images or videos - for
example, the movement of a mechanism. Just as a photograph is a powerful
communicating tool, a small movie/video clip is even more powerful and is
especially useful for illustrating concepts that involve moving objects.
Sound: It can substantially reinforce our understanding of information
presented together with text and graphic images. The incorporation of
sound in a multimedia programme can provide the user with information
not possible using other methods.
2. An important function of multimedia is its interactive nature. This means
that the user can manipulate screen "objects" such as clicking a button or
highlighted text with a “mouse” and cause the programme to respond in a
certain predetermined way. A "button" is a screen object with a label that
indicates what action it activates. For example, the user may click on a
"Pause" or "Replay" button to control the animation display. Or there may
be a screen button indicating "Sound" that when clicked on causes the
programme to play a recording of instructions or a musical tune. It is this
interactive nature of multimedia that makes it extremely useful in providing
information to the user. Unlike a book, which is designed to be read from
page to page (sometimes called "linear" information), multimedia allows
users to access information any way they choose (sometimes called
"non-linear" information access). Because of this, multimedia is a more
flexible and effective way to learn.
3. You must remember the following points before finalising the text.
™ Use as few different faces as possible in the same work, but vary the
weight and size of your typeface using italic and bold styles where they
look good.
Contd….
™ Using too many fonts on the same page is called ransom note typography.
™ In text blocks, adjust the leading for the most pleasing line spacing.
Lines too tightly packed are difficult to read.
™ Vary the size of a font in a proportion to the importance of the message
you are delivering.
™ In large-size headlines, adjust the spacing between letters (kerning) so
that the spacing feels right.
™ To make your type stand out or be more legible, explore the effects of
different colors and of placing the text on various backgrounds.
™ Use anti-aliased text where you want a gentle and blended look for
titles and headlines. This can give a more professional appearance.
™ Try drop caps and initial caps to accent your words.
™ If you are using centered type in a text blocks, keep the number of lines
to a minimum.
™ For attention-grabbing results, try graphically altering and distorting
the text. Wrap your word onto a sphere, bend it into a wave or splash it
with rainbow colors.
™ Use meaningful words or phrases for links and menu items.

CYP 2
1. The standard document format used for pages on the Web is called
Hypertext Markup Language (HTML). In an HTML document you can
specify typefaces, sizes, colors and other properties by "marking up" the
text in the document with tags. The process of marking up documents is
simple: Where you want text to be bold, surround it with the tags <B>; the
text between the tags will then be displayed by your browser application in
bold type. The remarkable growth of the Web is straining the old designs
for displaying text on computers. Indeed, while marked-up text files
(HTML documents) remain at the foundation of Web activity, when you
visit a well-designed Web site, you often discover graphic images,
animations and interactive workarounds contrived to avoid displaying text.
The neat paragraphs, indented lists and formats for text documents for
which HTML was originally intended are evolving into multimedia
documents, not text documents and the HTML method and standard is
consequently suffering great stress.
2. There are plenty of ways to retain a viewer's attention when displaying text.
For example, you can animate bulleted text and have it "fly" onto the
screen. You can "grow" a headline a character at a time. For speakers,
simply highlighting the important text works well as a pointing device.
When there are several points to be made, you can stack keywords and
flash them past the viewer in a timed automated sequence. You might fly in
some keywords, dissolve others, rotate or spin others and so forth, until you
have a dynamic bulleted list of words that is interesting to watch. But be
careful- don't overdo the special effects, or they will become boring.

4.16 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw-Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

LESSON

5

MULTIMEDIA BUILDING BLOCKS (USING SOUND)

CONTENTS
5.0 Aims and Objectives
5.1 Introduction
5.2 Sound in Multimedia
5.2.1 Sampling Rate
5.2.2 Sound Capturing and Delivery
5.3 Audio Manipulation Techniques
5.3.1 Trimming
5.4 Types of Sound/Audio Files
5.4.1 MIDI Files
5.4.2 Digital Sound
5.5 Digitisation and File Sizes
5.5.1 Bit Resolution
5.5.2 Compression Method (Codec)
5.5.3 Channels
5.6 Management of Audio Files in Multimedia Projects
5.6.1 Streaming
5.6.2 Major Audio File Formats
5.7 Adding Sound to a Multimedia Project
5.7.1 Process of Adding Sound
5.8 Let us Sum up
5.9 Lesson End Activities
5.10 Keywords
5.11 Questions for Discussion
5.12 Suggested Readings

5.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to understand:
z Audio Manipulation Techniques
z Management of Audio Files in Multimedia Projects
z The major audio file formats

5.1 INTRODUCTION
The Multimedia building blocks are Text, Graphic Images, Animation, Sound and
video. This lesson discusses the role and use of sound in multimedia projects. Sound

is one of the important multimedia building blocks. The lesson throws light on various
aspects of sound usage such as Sampling rate, Sound Capturing and Delivery, Audio
manipulation techniques, Trimming, Types of Sound/Audio Files etc.

5.2 SOUND IN MULTIMEDIA


As discussed in the previous lesson, multimedia usually involves text, graphics,
animation, video and sound. Sound is perhaps the most important part of a multimedia
project. It is one of the basics of the project. Sound can be very distinctive: it can make
for pleasant listening, or it can be disturbing if made at a little higher pitch. So, while
making a multimedia project, it is imperative that the sound be right: it should be
pleasant to listen to and should not disturb the user.
In this lesson, we would learn how the sound is created and embedded into the
multimedia project. Sound is like waves. When something vibrates in the air by
moving back and forth, it creates waves of pressure. These waves spread like the
ripples tossed into a still pool, and when they reach your eardrums, you experience the
changes of pressure, or vibrations, as sound.
Sound waves differ in pressure level (amplitude) and in frequency and pitch.
Acoustics is the branch of science that studies sound. Sound pressure levels are
measured in decibels; a decibel measurement is actually the ratio between a chosen
reference point on a logarithmic scale and the level that is actually experienced. When
you quadruple the sound output power, there is only a 6 dB increase; when you make
the sound 100 times more intense, the increase in dB is not hundredfold, but only
20 dB. The decibel scale, with some examples, is shown below:
Sound pressure levels (loudness or volume) can be measured. Sound pressure is
measured in decibels (dB):
z Typical voice conversation is about 70 dB, a soft whisper is 30 dB and a
jackhammer is 120 dB
z Decibels measure the energy required to make sounds on a logarithmic scale
z A jackhammer's noise is about 1 watt, but voice conversation is 1/100,000 watt.
Sound/audio files are in some ways less complex than moving image formats, since
they are essentially a moving waveform that can be represented more easily than
moving images. Digital sound represents the analogue waveforms using sampling
methods that convert the shape of the wave over time into a series of numbers. The
most common method for digitising sound in this way is Pulse Code
Modulation, a technique that has existed since 1937.

5.2.1 Sampling Rate


The sampling rate is the number of times per second the audio signal is measured. It has a
direct effect on sound quality. CD sound is sampled at 44,100 times per second
(44.1 kHz), while DAT (digital audio tape) supports rates of 32, 44.1 and 48 kHz.
Sampling rates of 96 kHz and 192 kHz are becoming more common with the
increasing use of 24 bit signals.

5.2.2 Sound Capturing and Delivery


Audio is captured using a microphone. Sound travels through the air in waves with a
particular amplitude, wavelength and frequency. Editing a sound wave involves
changing these characteristics. A short piece of sound is often referred to as a
sequence. Some common file formats for audio include WAV, MP3 and WMA.

Figure 5.1: Sound Properties

5.3 AUDIO MANIPULATION TECHNIQUES


Digital media involves the processing of digital data. Processing involves editing the
data using manipulation techniques for each data type.
Audio signals from a computer are converted into analog sound waves for
transmission through speakers. Each sound wave has an amplitude, wavelength and
frequency. The amplitude is the height of the wave. It gives the sound its volume.

z Amplify increases or decreases the volume of sound by changing the height of the
wave, ensuring the audio will be clearly heard (or not heard)
z Mute does not play the audio
z Equalisers or filters are used to make adjustments to the strength of sounds at
different frequencies
z Stretch changes the frequency (pitch) and duration of the audio signal. For
example, you can use stretch to change a song to a higher key
z Noise removal reduces background noise with minimal reduction in sound quality.
It is often used to reduce the background noise of the microphone
z Delete silence is used to remove periods of silence between words or other sounds
z Echo is used to add an echo to a sound. For example, you can create the echo
'Hello-ello-llo-lo-o'
z Fading is used to change from one audio to the next. Fade-in gradually increases
the audio volume and fade-out gradually decreases the audio volume

The wavelength is the distance between the ends of one complete cycle of a wave. It
gives the sound its pitch or note. The frequency of the wave is the number of

wavelengths in one second and is measured in hertz (Hz). Sounds are edited in many
different ways and affect the amplitude, wavelength and frequency of the wave.

Figure 5.2: Waves (original waveform, sampling frequency, sampled data and
reconstructed waveform)

5.3.1 Trimming
Removing blank space or dead air, as it is called, from the front of a recording and any
unnecessary extra time off the end is your first sound editing task. Trimming even a
few seconds here and there might make a big difference in your file size. Trimming is
typically accomplished by dragging the mouse cursor over a graphical representation
of your recording and choosing a menu command such as Cut, Clear, Erase, or
Silence.

5.4 TYPES OF SOUND/AUDIO FILES


Two major ways to create and deliver sounds are MIDI and digital audio. MIDI
sequence is a file containing the note information and not the details of the sound
wave.

5.4.1 MIDI Files


MIDI (Musical Instrument Digital Interface, pronounced ‘middy’) is a standard
connection for computers and electronic music instruments. MIDI allows up to sixteen
instruments to be played simultaneously. A musician uses a MIDI instrument to play
music and the computer to store and edit the music. MIDI sequences require less
storage than audio files as they only contain the note information and are easier to
edit. The quality of the sound in MIDI sequence is dependent on the synthesiser used
to play it.
z MIDI (Musical Instrument Digital Interface) is to audio what vector-based
graphics are to images and PostScript is to text, while digital audio (such as WAVE
files) is analogous to bitmaps.
z MIDI is a notation (similar to a musical score) and communications standard for
describing how electronic instruments and synthesizers play musical sound.
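To make this concrete, a single MIDI "note on" message is just three bytes: a status byte (90 hex for note-on on channel 1), a note number and a velocity. So the message 90 3C 64 (hex) means "start playing middle C (note 60) on channel 1 at velocity 100"; no waveform data is transmitted at all, which is why MIDI files are so small.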

Advantages
z MIDI files tend to be much smaller than digitized wave form files.
z They can be stretched or edited more easily.
z They may sound better if the playback quality of the instruments is better.

Disadvantages
z Doesn't provide reliable playback; depends on what's available

z They can’t play speech


z Windows 3.1 introduced standards for MIDI playback
z Authorware and other multimedia authoring tools support MIDI
Sound is best described as rapid changes in air pressure (vibrations), or as longitudinal
waves that propagate through a medium (such as air). We hear a sound because the
changes in air pressure (or the longitudinal waves) interact with our eardrums which
vibrate in response, and the vibrations are interpreted by the brain as sound. Sound
waves are usually measured in terms of their frequency and amplitude, although they
have other characteristics which affect the sound. Frequency is a measurement, in
Hertz (Hz), of the number of vibrations that occur per second. Optimally, people can
hear from 20Hz to 20000Hz (20kHz) although this range decreases with age.
Amplitude is a measure of how many air molecules are affected by the vibration of a
sound (see Figure 5.3). In analogue audio equipment sound waveforms are
represented by voltage variations.

Figure 5.3: Amplitude and Wavelength

Essentially digital sound consists of a digital reproduction of the vibrating waves that
make up sound.

Check Your Progress 1


1. What is Sampling rate of sound?
……………………………………………………………………………….
……………………………………………………………………………….
2. How can sound be captured and delivered?
……………………………………………………………………………….
……………………………………………………………………………….
3. What is Sound Trimming?
……………………………………………………………………………….
……………………………………………………………………………….

5.4.2 Digital Sound


Digital sound files can be created entirely within the digital domain, or they can be
created through digitisation of existing analogue sound material. But however the
sound is created, it can only be experienced by us as an analogue signal, not a digital

one. Digital sound represents waveforms as a series of samples, taken at specific time
intervals, whose values are given as binary numbers. The sample rate is the number of
times per second that the analogue signal is measured. The sampling rate influences
the quality of digital sound. The standard encoding (sampling method) for digital
audio and digital video is Pulse Code Modulation (PCM).
The difference between analogue audio signals and digital sound is best shown by the
following diagram:

Figure 5.4: Quantisation of Analogue Sound

Clearly, the greater the sampling rate the smaller the size of each ‘stair’ and the closer
the digital sample is to the original analogue waveform.
The conversion of analogue data to digital and back to analogue is accomplished by
special chips. A chip that converts analogue to digital is called an ADC – (an
Analogue to Digital Converter). The ADC measures the amount of current at each
sampling interval and converts it to a binary number. This is the process by which
analogue sound is digitised. At the other end of the process is a chip called a DAC –
(a Digital to Analogue Converter). This chip takes the binary numbers which represent
the sampled sound and converts it to an output voltage which can then be transmitted
through audio speakers as analogue waveforms which our ears can hear.
Digital audio is more common for multimedia applications. Digitized sound is
sampled sound
z Every nth fraction of a second, a sample of analog sound is taken and stored in
binary form
z Sampling rate: how often the sound sample is taken
z Sampling size: how much information is stored for each sample
z The more often you sample and more data per sample, the higher your quality and
resolution
z The value of each sample is rounded off to the nearest integer (this is called
quantization)
z An 8-bit sample size provides 256 values to describe the dynamic range of amplitude
z A 16-bit sample size provides over 65 thousand values for dynamic range, but
significantly increases space requirements
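A small worked example of quantization: with 8-bit samples on a -1 to +1 scale, an instantaneous amplitude of 0.3718 is stored as round(0.3718 × 127) = 47; played back, 47/127 ≈ 0.3701, so a rounding error of about 0.0017 has been introduced. This per-sample rounding is exactly the quantization error discussed later in this lesson.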

z If the amplitude is greater than the intervals available, clipping at the top and
bottom of the waves occurs:
™ Produces background hissing noise or other distortions.
The three most common sampling frequencies are 44.1 kHz (kiloHertz, CD quality),
22.05 kHz and 11 kHz.
Formula for determining the size (in bytes) of a digital recording:
z Sampling rate x duration in seconds x (bit resolution / 8) x 1 (mono) or 2 (stereo)
E.g., for a 10 second recording at 22.05 kHz, 8-bit resolution, monophonic (good
for speech):
22050 x 10 x 8/8 x 1 = 220,500 bytes
z For good music quality at 44.1 kHz, 16-bit resolution, stereo:
44100 x 10 x 16/8 x 2 = 1,764,000 bytes
z This is the “Red Book” standard for CD-quality audio – but it’s expensive for
multimedia!
z Note that most but not all PCs have 16 bit sound cards (not all have sound cards!)
Compression can help a lot:
z Macromedia introduced Shockwave Audio (SWA) in Authorware 5
(Authorware Xtras->Other->Convert WAV to SWA)
z Authorware 5 also supports Voxware (VOX) compression (supposedly better for
voice)
z Xtras needed (Wavread.x32 for WAV in Windows95/98/NT, swaread.x32)
z ReadPlayer’s ReadAudio is another popular compression format for the Web
z Flash converts the wave files to 128kbit mp3 format, altering sampling rates
™ Flash 5 lets you import MP3 files, so it’s better to convert wave files to MP3
outside of Flash—any idea why? (Saves space in .fla Flash source code files)
z Downside: compression may further degrade sound, with more clipping

Advantages of Digital Audio


There have been ongoing debates for years concerning the merits of digital audio
versus high-end analogue systems. Some listeners question whether digital sound
quality is quite as good as analogue sound, but it can be very good indeed, and most
people are unable to tell the difference in sound quality. The obvious advantages of
digital audio are discussed below, briefly.
Wider dynamic range: Digital sound recorded at a bit depth of 16 bits has a dynamic
range of 96 dB, compared to the dynamic range of most analogue systems of around
80 dB. Higher bit depths provide higher dynamic ranges, although the practical limit is
about 110 dB, reached with 24-bit sound.
Less equipment noise: Analogue sound equipment introduces electromagnetic
frequency interference as audio signals travel through the physical circuits. This sort
of signal degradation is unlikely in digital systems.
Faithful copying/reproduction: Every time analogue sound is copied data losses
occur and noise is introduced (see above). Over analogue sound generations quality
decays noticeably. These copying/ reproduction problems do not occur in digital audio
systems.

Error Correction
Error correction in digital audio playback equipment is both a blessing and a curse.
CD and DVD players both contain error correcting mechanisms to allow uninterrupted
audio (and video) playback in spite of bit level errors. Unfortunately, this makes
playback of such media on such equipment an unreliable method of determining
integrity of the digital signals. Indeed by the time errors can be heard the bitstream has
decayed to such an extent that recovery through re-copying is impossible.

Issues with Digital Audio


Quantization error: Quantization in digital audio refers to the sampling representation
of the analogue waveform by integers, the range of possible integer values being
denoted by the bit level of the sampling. Thus, the 16-bit sampling of audio CDs
allows for a possible 65,536 integer values (2^16). Quantization errors are introduced in
digital audio because the waveform sampling of analogue signals needs to represent
the value of each sample as a whole number (integers). The converter selects a whole
number that is closest to the signal level at the instant of sampling. This produces
small rounding errors in the sampling that eventually cause noticeable distortion in the
digital signal.
Dithering: Dithering is the introduction of 'erroneous' signal or random noise which is
added to an audio signal for the purpose of minimizing quantization error. Dither is
routinely used in processing of both digital audio and digital video data. Many
‘audiophiles’ are horrified at the notion of introducing deliberate errors in digital
audio signals and believe that this lowers the sound quality of digital audio. In
practice, most human ears are unable to tell the difference between analogue and
digital sound, although they could hear the loss in quality of digital sound that had not
been dithered.
Encoding/Compression: Encoding is the process of converting uncompressed digital
audio to a compressed format such as MP3. The algorithm used in the encoding (and
decoding) software is referred to as a codec—as in coding/decoding. There is often
more than one codec for a particular format, and different codecs can vary widely in
quality and speed, even for the same format.
Compression is necessary in digital audio because digitisation of analogue sound or
creation of digital sound files quickly produces files of very large size. File size is
related to sampling rate, bit rate, number of channels, and time. So, 3 minutes of
sound sampled at 44.1kHz, using 16-bits and in stereo (the most frequently used
quality) will result in a file size of around 31 Megabytes (uncompressed), or
approximately 10 MB per minute of sound. MP3 compresses files up to about 11:1
without noticeable loss of quality, so our 31Mb file will be reduced to around 3 Mb
using MP3. Newer codecs such as AAC offer the possibility of even greater
compression ratios while supposedly retaining better sound quality.
Lossy vs Lossless Compression: Lossless compression for audio files is a very
specialized area, as lossless techniques used for text and image files do not work with
sound. Generally, lossless compression for sound works by encoding or representing
repeating data in a way that takes up less space than the original but allows the full
reconstruction of the original bitstream. The highest compression ratios for lossless
compression of audio are about 2:1, i.e. an uncompressed file of 30 Mb would be
compressed using a lossless method to about 15Mb.
Lossy compression removes unnecessary or redundant information completely, so will
always lead to a loss of quality, although the differences may be undetectable to the
human ear. Lossy compression ratios of up to 10:1 result in generally insignificant
quality loss, although many people can tell the difference between an MP3 file at
128 bits (a compression of about 10:1) and original CD audio. Newer compressed

formats, such as AAC, can achieve similar or better compression ratios with much less
loss of quality.

File Formats for Preservation of Digital Sound


For digital audio, how one preserves depends on many factors, over most of which
those responsible for preservation have no control. In the best case, we would have
raw unedited master source files or sound streams to preserve, but it is unlikely that
such objects will be offered for preservation in most cases. Digital audio is the product
of particular creation or digitisation processes, and the objects available for
preservation are dependent to a large extent on the purpose for which the digital audio
was created, the specific processes involved in its creation, and end user requirements.
As with moving image preservation, there are two issues to consider: bitstream
encoding and file format.
All other things being equal, the best object to preserve is the one using an encoding
that retains the highest bit depth and sampling rate, and with the least amount of
compression. Thus, we would preserve a sound stream in LPCM format in preference
to the same data stream encoded in MP3 format. A file in Ogg Vorbis or AAC format
would be preferred over the same sound file encoded with MP3. We would preserve
an MP3 file encoded at a rate of 192 kbits in preference to the same MP3 file encoded
at a rate of 128 kbits. Thus, for bitstream encoding the following requirements are
recommended:
z Higher sampling rate (usually expressed as kHz, e.g., 96kHz) preferred over lower
sampling rate.
z 24-bit sample word-length preferred over shorter.
z Linear PCM (uncompressed) preferred over compressed (lossy or lossless).
z Higher data rate (e.g. 128 kilobits per second) preferred over lower data rate for
same compression scheme and sampling rate.
z AAC compression preferred over MPEG-1 Layer 3 (MP3)
z Surround sound (5.1 or 7.1) encoding only necessary if essential to creator's
intent. In other cases, uncompressed encoding in stereo is preferred
It is recommended that where possible digital audio be saved either in broadcast wave
format without compression, and using LPCM encoding, or as AIFF files, again using
LPCM encoding.

5.5 DIGITISATION AND FILE SIZES


Digitising is the process of generating digital data. Digital data is represented using
the binary number system, in which each digit (bit) is one of two values, 0 or 1. The fact
that all data is represented as a series of bits means that a computer can organise and
transmit data of any type. It deals with data as 0s and 1s irrespective of the original
format of the data.
There is a different process to digitise each data type.
Sound is digitised using a method called sampling, which converts a sound wave to
digital audio. It has three important characteristics called the sampling rate, bit resolution
and the number of channels:

Sampling Rate
Sampling rate is the number of times a sample (slice) is taken from the sound wave.
During a sample the amplitude of the wave is measured and converted to a number.

The higher the sampling rate the better the sound, but the larger the file size (see the
Table below).

Sampling Rate Bit Resolution Best use for Audio Files


11.025 kHz 8-bit Low quality—acceptable for voice
22.05 kHz 8-bit Acceptable quality with low file sizes
44.1 kHz 16-bit CD quality—minimum for serious audio production
48 kHz 16-bit Standard for most digital systems such as DAT recorders
96 kHz 24-bit Serious music production—present recording standard
192 kHz 32-bit Standard under development

5.5.1 Bit Resolution


Bit resolution (or sample size) is the number of bits per sample. The most common
sampling sizes are 8-bit, 16-bit or 24-bit sound. The larger the bit resolution the better
the quality sound. Voice is often produced using 8-bit sound. CD quality stereo sound
requires at least 16-bit resolution and larger file sizes. Bit depth is the number of bits
used to represent each sample of the audio signal. Bit depth determines the
dynamic range of sound. CDs use 16 bits which gives a dynamic range of 96 decibels,
while higher quality audio currently uses 24 bits, giving a dynamic range of around
110 decibels.
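A useful rule of thumb connects these figures: each bit of resolution adds about 6 dB of dynamic range (20 × log10(2) ≈ 6.02 dB). So 16 bits gives 16 × 6 ≈ 96 dB, matching the CD figure above, while 24 bits gives about 144 dB in theory, though around 110 dB is the practical limit noted earlier.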

5.5.2 Compression Method (Codec)


The manner in which the digital audio file has been compressed for transmission and
storage. Lossy audio compression results in loss of audio quality, and compression can
render sound files less robust (i.e. more prone to bit corruption).

5.5.3 Channels
The number of channels greatly influences the size of the resulting file. Stereo files
are usually twice the size of mono files, while surround sound, which has five or more
channels, can be much larger still.
Mono uses one channel and stereo uses two channels (left and right) of sound. Stereo
results in better, more spatial sound.
To determine the file size of audio in kilobytes we use the following formula:
File size (KB) = (Sample rate × Bit resolution × Time (s) × Channels) / (8 × 1024)
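For example, 10 seconds of CD-quality stereo sound (44.1 kHz, 16-bit, 2 channels):
File size = (44100 × 16 × 10 × 2) / (8 × 1024) ≈ 1,723 KB
which agrees with the 1,764,000-byte figure calculated earlier in this lesson.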

5.6 MANAGEMENT OF AUDIO FILES IN MULTIMEDIA


PROJECTS
Following are the major considerations involved in managing audio files and
integrating them into multimedia projects:
z Because sounds are time based, you may need to consider what happens to sounds
that are playing in your project when the user goes to a different location
z Appropriate use of sound requires technical considerations of disk space or
bandwidth as well as the authoring system to use various file formats and
compression algorithms
z Do not use equipment and standards that exceed what your project requires
z Keep track of your audio files, and be sure to back them up
z Regularly test the sound and image synchronization of your project

z Evaluate your sound's RAM requirements as well as your users' playback setup
z Be sure you understand the implications of using copyrighted material. You are
breaking the law if you record and use copyrighted material without first securing
the appropriate rights from the owner or publisher
z You can purchase and use digitised clip sounds with an unlimited-use, royalty free
licence.

5.6.1 Streaming
Streaming is the process that allows video or audio frames to be loaded and played
before later frames are loaded. As the frames are loaded, some are being played,
others are being discarded from memory and still others are being loaded into RAM.
The user believes the whole video/music file is loaded but the memory only retains a
few frames at a time. The method depends heavily on the computer's memory and the
connection speed.
This process enables audio and video files to be displayed as quickly as possible and
avoids the user having to wait long periods for the full file to download. i.e. the file is
not permanently stored on the computer.
Streaming uses the bandwidth efficiently as it sends data at the speed the computer
can play the content. The bandwidth is the quantity of information that can be sent
through a communication medium. Streaming is widely used on the Web.

5.6.2 Major Audio File Formats


Following are the most common types of audio file formats:

Advanced Audio Coding (.aac; .m4a)


An open standard developed by the Motion Picture Experts Group (MPEG) as a high
quality encoding for audio streams, originally part of the MPEG-2 standard. AAC
uses perceptual audio encoding. The AAC compression approach adopted as a part of
MPEG-2 has been further refined within MPEG-4, where the format is referred to as
M4A. MPEG hopes that AAC/M4A will eventually replace MP3. Widely used for
QuickTime audio files, notably in Apple’s iTunes service.

Compact Disc Audio (CDDA)


The encoding format using pulse code modulation (PCM) for audio bitstreams
recorded onto commercial Compact Discs, also known as the Redbook standard.
Audio in this format cannot be directly stored on computer hard disks but needs to be
converted to another format first.

Digital Audio Compression (AC-3; Dolby Digital)


A lossy format for the encoding of 'surround sound', developed by Dolby
Laboratories principally for cinemas and 'home theatres'. AC-3 was standardised by
the US Advanced Television Systems Committee as standard A/52. AC-3 is widely
used in home cinema systems, commercial cinemas, DVDs, and has now been
adopted as the audio component for High Definition Television (HDTV).

Linear Pulse Code Modulated Audio (LPCM)


Pulse code modulation (PCM) with linear quantization. A bitstream encoding for
digital audio streams. Linear PCM is essentially an uncompressed format resulting
from sampling of the source audio. Widely used on audio CDs, digital audio tape, and
is the default encoding format for audio streams in AIFF and WAVE wrapper formats.
It is unusual to find LPCM files existing independently, i.e. outside wrapper formats.

MPEG-1 Layer-3 (.mp3)
A compressed format developed as the encoding for audio streams within MPEG-1
files. MP3 provides a representation of audio data encoded using PCM. File sizes vary
depending on sampling and bit rate. Typical compression of 10:1. Widely used across
the Internet, for non-commercial copying and recording of audio files to CD, and
interchange of audio files.

Real Audio (.ra; .rm; .ram)


A proprietary file format developed by RealNetworks for audio encoding. The Real
Audio format provides for a number of codecs, including a lossless compression. One
of the most widely used streaming formats especially in web applications. Compresses
sound up to 10:1. Sound quality is passable, but not high.

Standard Musical Instrument Digital Interface (MIDI) File (.smf; .mid)


A bitstream encoding format used for MIDI "messages", i.e. the instructions that tell a
digital ‘instrument’ (eg. a synthesiser) how to play a piece of music. MIDI is a very
good encoding for instrumental music. MIDI was widely adopted but requires some
form of General MIDI player to reproduce the sounds.

Wave (.wav)
A proprietary Microsoft audio wrapper format that is the standard for storing audio on
Windows PCs. A subtype of the RIFF format, so it is similar to Apple's AIFF
format. The most common encoding used with Wave files is PCM. Wave
is an uncompressed format, so file sizes are large. WAVE is the basis for the European
Broadcasting Union's Broadcast Wave (BWF) format.

Windows Media Audio Format (.wma)


A proprietary compressed audio format developed by Microsoft to compete with
MP3. WMA uses one of the Windows Media Audio codecs for encoding audio data.
WMA files are often encapsulated within an ASF wrapper file. Used for streaming
audio.

Extensible Media Format (.xmf)


An audio wrapper format developed by the MIDI Manufacturers’ Association. An
XMF file can contain audio encoded as standard MIDI Files, DLS instrument files,
WAV or other digital audio files. The purpose of the XMF format is to encapsulate all
the resources needed to present a musical piece, an interactive webpage soundtrack, or
any other piece of media using pre-produced sound elements.

Liquid Audio Secure Download (.la1; .lqt)


Liquid Audio is an audio player that has its own proprietary encoder and file format.
Similar to MP3, it compresses files for ease of delivery over the Internet. Liquid Audio
files are compressed but purport to be of the same quality as commercially recorded
CDs. It is fairly widely used for music files.

Music Module Formats (MODS)


File formats created by a variety of software applications called ‘trackers’
(e.g. Soundtracker, the first such application). Module formats store musical data in a
form similar to a spreadsheet. A MOD file contains a set of instruments in the form of
samples, a number of patterns indicating how and when the samples are to be played,
and a list of what patterns to play in what order. The format allows for up to 32
channels of music playback and 31 instruments can be represented. There are many

types of MODS formats, each with their own file extensions, e.g., .669, .amf, .dmf, .far,
.it, .med, .mod, etc.

Ogg Vorbis
An open audio encoding format designed for efficient streaming and compression of
audio streams, associated with the Ogg wrapper format. The term "ogg" is often used
to refer just to the audio file format Ogg Vorbis, i.e. Vorbis-encoded audio in an Ogg
container. Quite widely adopted as a format for sound in more recent computer games,
and by the open source community more generally. Ogg Vorbis can be used within
other wrapper formats.

PARIS (Professional Audio Recording Integrated System) (.paf)


A file format specific to the Ensoniq PARIS digital audio editing system and so
hardware dependent. Not widely used now but at one period it was a very common
audio format. PARIS files can contain 8, 16 or 24 bit audio data.

SUN Audio (.au, .snd)


An older format developed by SUN Microsystems and still used on Unix systems.
Specifies an arbitrary sampling rate. Can contain 8, 16, 24 & 32 bit data. In
comparison to other 8 bit samples it has a larger dynamic range. Slow decompression
rates because of large file size, but audio is of high quality.

5.7 ADDING SOUND TO A MULTIMEDIA PROJECT


Apple was the first company to have sound and video interleaved into the computer. It
outpaced Microsoft's Audio/Video Interleaved technology, with the release of
QuickTime. QuickTime has become a standard file format for displaying digitized
motion video from hard disk or CD-ROM without special hardware.
Digital audio data is interleaved with video information in the file and when it is
played back the audio stays synchronized to the motion picture. You can use
QuickTime just to play stereo sounds and MIDI; the video part of QuickTime is not
required. QuickTime will display many graphic image formats as well.
Whether you're working on a Macintosh or in Windows, you will need to follow
certain steps to bring an audio recording into your multimedia project:

5.7.1 Process of Adding Sound


Here is a brief overview of the process:
1. Decide what kind of sound is needed (such as background music, special sound
effects and spoken dialog). Decide where these audio events will occur in the flow
of your project. Fit the sound cues into your storyboard or make up a cue sheet.
2. Decide where and when you want to use either digital audio or MIDI data.
3. Acquire source material by creating it from scratch or purchasing it.
4. Edit the sounds to fit your project.
5. Test the sounds to be sure they are timed properly with the project's images. This
may involve repeating steps 1 through 4 until everything is in sync.
When it's time to import your compiled and edited sounds into your project, you'll
need to know how your particular multimedia software environment handles sound
data. Each program handles it a bit differently, but the process is usually fairly
straightforward: just tell your software which file you want to play and when to play
it. This is usually handled by an importing or linking process during which you
identify the files.


Scripting languages such as OpenScript (ToolBook), Lingo (Director) and
ActionScript (Flash) provide a greater level of control over audio playback, but you'll
need to know about the programming language and environment. In authoring
environments, it is usually a simple matter to play a sound when the user clicks a
button, but this may not be enough.
If the user changes screens while a long file is playing, for example, you may need to
program the sound to stop before leaving the current screen. If the file to be played
cannot be found on the hard disk, you may need to code an entire section for error
handling and file location. Sample code is generally provided in both printed and
online documentation for software that includes sound playback.
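As an illustration of the kind of playback and error-handling logic described above, here is a minimal sketch in Python using the pygame library (rather than one of the scripting languages named above); the file name and handler functions are hypothetical.

import pygame

pygame.mixer.init()  # initialise the audio subsystem

def load_sound(path):
    # Error handling: return None instead of crashing if the file is missing.
    try:
        return pygame.mixer.Sound(path)
    except (pygame.error, FileNotFoundError):
        return None

click = load_sound("click.wav")   # hypothetical sound cue for a button

def on_button_click():
    if click is not None:
        click.play()              # play the cue when the user clicks the button

def on_leave_screen():
    pygame.mixer.stop()           # stop any long sound before changing screens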

Check Your Progress 2


1. Differentiate between Lossy and Lossless compression.
……………………………………………………………………………….
……………………………………………………………………………….
2. What is Bit resolution?
……………………………………………………………………………….
……………………………………………………………………………….

5.8 LET US SUM UP


In this lesson we have learnt how the sound is created and embedded into the
multimedia project. Sound is like waves. When something vibrates in the air by
moving back and forth, it creates waves of pressure. These waves spread like the
ripples tossed into a still pool, and when they reach your eardrums, you experience the
changes of pressure, or vibrations, as sound. Sound is perhaps the most important part
of a multimedia project. It is one of the basics of the project. The sound can be very
distinctive. Any sound can make for pleasant listening, or it can sound disturbing if
made at a little higher pitch. So, while making a multimedia project, it is imperative
that the sound should be perfect: it should not disturb, but should make for pleasant
listening. This lesson discusses the role and use of sound in multimedia projects.
Sound is one of the important multimedia building blocks. The lesson throws light on
various aspects of sound usage such as Sampling rate, Sound Capturing and Delivery,
Audio manipulation techniques, Trimming, Types of Sound/Audio Files etc.

5.9 LESSON END ACTIVITIES


1. Take any of your course assignment and attach any sound file to make it a
multimedia assignment.
2. Discuss among group members how you can use the power of sound to make a
difference between an ordinary multimedia presentation and a professionally
spectacular one.

5.10 KEYWORDS
Amplitude: The amplitude is the height of the wave. It gives the sound its volume.
Wavelength: The wavelength is the distance between the ends of one complete cycle
of a wave. It gives the sound its pitch or note. The frequency of the wave is the
number of wavelengths in one second and is measured in hertz (Hz).
MIDI: MIDI (Musical Instrument Digital Interface) is a standard connection for
computers and electronic music instruments.

5.11 QUESTIONS FOR DISCUSSION
1. What is Digital sound? What are the advantages of Digital Audio?
2. What is Error correction?
3. Explain the major issues with Digital Audio.
4. What are file formats for preservation of digital sound?
5. What is digitisation and file size?

Check Your Progress: Model Answers


CYP 1
1. The number of times per second the audio signal is measured. Sampling
rate has a direct effect on sound quality. CD sound is sampled at 44,100
times per second (44.1 kHz), while DAT (digital audio tape) supports rates
of 32, 44.1 and 48 kHz. Sampling rates of 96 kHz and
192 kHz are becoming more common with the increasing use of 24 bit
signals.
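As a rough worked example of what these rates imply for file size: an
uncompressed recording occupies (sampling rate) × (bytes per sample) ×
(number of channels) bytes per second, so one minute of CD-quality stereo
comes to about 44,100 × 2 × 2 × 60 ≈ 10.5 MB.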
2. Audio is captured using a microphone. Sound travels through the air in
waves with a particular amplitude, wavelength and frequency. Editing a
sound wave involves changing these characteristics. A short piece of sound
is often referred to as a sequence. Some common file formats for audio
include WAV, MP3 and WMA.
3. Removing blank space or dead air, as it is called, from the front of a
recording and any unnecessary extra time off the end is your first sound
editing task. Trimming even a few seconds here and there might make a
big difference in your file size. Trimming is typically accomplished by
dragging the mouse cursor over a graphical representation of your
recording and choosing a menu command such as Cut, Clear, Erase, or
Silence.
CYP 2
1. Lossless compression for audio files is a very specialized area, as lossless
techniques used for text and image files do not work with sound. Generally,
lossless compression for sound works by encoding or representing
repeating data in a way that takes up less space than the original but allows
the full reconstruction of the original bitstream. The highest compression
ratios for lossless compression of audio are about 2:1, i.e. an uncompressed
file of 30 Mb would be compressed using a lossless method to about 15Mb.
Lossy compression removes unnecessary or redundant information
completely, so will always lead to a loss of quality, although the
differences may be undetectable to the human ear. Lossy compression
ratios of up to 10:1 result in generally insignificant quality loss, although
many people can tell the difference between an MP3 file at 128 bits (a
compression of about 10:1) and original CD audio. Newer compressed
formats, such as AAC, can achieve similar or better compression ratios
with much less loss of quality.
2. Bit resolution (or sample size) is the number of bits per sample. The most
common sampling sizes are 8-bit, 16-bit or 24-bit sound. The larger the bit
resolution the better the quality sound. Voice is often produced using 8-bit
sound. CD quality stereo sound requires at least 16-bit resolution and larger
file sizes. Bit depth is the number of bits used to represent each sample of
the audio signal. Bit depth determines the dynamic range of
sound. CDs use 16 bits, which gives a theoretical dynamic range of
96 decibels, while higher quality audio currently uses 24 bits, giving a
theoretical range of about 144 decibels (practical converters typically
achieve around 110 decibels).
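These decibel figures follow from the rule that each bit adds about 6 dB of theoretical dynamic range, which a few lines of Python can verify:

import math

def dynamic_range_db(bits):
    # Theoretical dynamic range of n-bit quantisation: 20 * log10(2**n), about 6.02 * n dB
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 96.3 dB  (CD audio)
print(round(dynamic_range_db(24), 1))  # 144.5 dB (theoretical maximum for 24 bit)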

5.12 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw-Hill.
John F. Koegel Buford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffcoate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.



UNIT III

LESSON

6
MULTIMEDIA BUILDING BLOCKS (USING GRAPHICS AND IMAGES)

CONTENTS
6.0 Aims and Objectives
6.1 Introduction
6.2 What is Graphics?
6.3 Classification of Graphics and Images
6.4 Applications of Graphics and Images
6.4.1 Presentation Graphics
6.4.2 Painting and Drawing
6.4.3 Photo Editing
6.4.4 Scientific Visualisation
6.4.5 Image Processing
6.4.6 Education, Training, Entertainment and Computer Aided Design (CAD)
6.4.7 Simulations
6.4.8 Animation and Games
6.5 Let us Sum up
6.6 Lesson End Activities
6.7 Keywords
6.8 Questions for Discussion
6.9 Suggested Readings

6.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to:
z Describe graphics and images, their features and characteristics
z Discuss applications of graphics in various fields
z Describe various types of tools required to work with graphic systems

6.1 INTRODUCTION
It is said that a picture is worth a thousand words and in the era of computers we can
add on to it or we may as well revise the saying to ‘a computer is worth a million
pictures!’ So, you can estimate the power of a computer as a communication system.
Now, with the advances in computer hardware and software, graphics and images
have come full circle and more and more people are teaching and learning,
communicating and sharing their ideas through the medium of graphics and images.
By graphics, we mean any sketch, drawing, special artwork or other material that
pictorially depict an object or a process or otherwise conveys information, as a

supplement to or instead of written descriptions, and the utilization of computers to
accomplish such tasks leads to a new discipline of computer graphics.
Traditionally, graphics has referred to engineering drawings of buildings, bridges,
machine parts etc. and scientific drawings such as x-y curves, network and process
flowcharts. In recent decades, graphics has ventured into industrial design, advertising
and other artistic endeavours. During the last few years, even newspapers and
periodicals aimed at the common man have begun to utilise graphics to present
quantitative news such as election results and production statistics. Computer
graphics can do all this and more. In fact, the power and easy availability of computer
graphics have increased the use of pictures to replace and augment words to describe,
educate, or inform a wide variety of audiences, on a wide variety of subjects.
In this lesson, we shall concentrate on the graphic capabilities and potential of the
digital computer plus we will discuss the meaning of the term graphic images and its
types, in addition to which, we will also discuss the hardware used for practical
application of graphics in different streams of life.

6.2 WHAT IS GRAPHICS?


Every image or picture is in fact a graph and when different mathematical tricks are
used to manipulate some change in its properties like shape, size, motion etc., through
the help of computers then, the representation is nothing but computer graphics, so we
can say that “Computer Graphics (CG) is the field of visual computing, where one
utilizes computers both to generate visual images synthetically and to integrate or alter
visual and spatial information sampled from the real world.” or “Computer Graphics
is the pictorial representation manipulation of data by a computer” or “Computer
Graphics refers to any sketch, drawing, special artwork or other material generated
with the help of computer to pictorially depict an object or a process or otherwise
convey information, as a supplement to or instead of written descriptions”. Computer
Graphics is a complex and diversified field.
A Picture is a fundamental cohesive concept in Computer Graphics. Each picture
consists of points called pixels (picture elements). A complex picture requires a
complex database for its pixels, and hence complex algorithms to access them. These
complex databases contain data organised in various data structures such as ring
structures, B-trees, etc.
There are many algorithms which can be implemented to produce graphical effects on
the screen through several graphical tools, based on the different languages that are
available in the market.

6.3 CLASSIFICATION OF GRAPHICS AND IMAGES


Computer graphics can be broadly divided into the following classes:
z Business Graphics or the broader category of Presentation Graphics, which refers
to graphics, such as bar-charts (also called histograms), pie-charts, pictograms
(i.e., scaled symbols), x-y charts, etc. used to present quantitative information to
inform and convince the audience.
z Scientific Graphics, such as x-y plots, curve-fitting, contour plots, system or
program flowcharts etc.
z Scaled Drawings, such as architectural representations, drawings of buildings,
bridges, and machines.
z Cartoons and artwork, including advertisements.

z Graphics User Interfaces (GUIs) which are the images that appear on almost all
computer screens these days, designed to help the user utilise the software without
having to refer to manuals or read a lot of text on the monitor.
We will discuss the various classes of computer graphics mentioned above in the
following sections of this lesson.
The most familiar and useful class of computer graphics involves movies and video
games. Movies generally need graphics that are indistinguishable from physical
reality, whereas video games need graphics that can be generated quickly enough to
be perceived as smooth motion. These two needs are incompatible, but they define
two different modes of communication between users and computations. In video games, the
subject matter of computations is generally characters chasing and shooting at each
other.
A more familiar use of computer graphics exists for interacting with scientific
computations apart from movies and games. This familiarisation of the use of
computer graphics has influenced our life, through simulations, virtual reality,
animation, we can extend the scope of education, entertainment, analysis etc. So, in
global terms Computer graphics can be categorised in two ways:
Interactive Computer Graphics, which the user interacts with, e.g., games.
Passive Computer Graphics, which gives the user no option to interact with the
graphics, e.g., movies.

6.4 APPLICATIONS OF GRAPHICS AND IMAGES


Research in computer graphics covers a broad range of application including both
photorealistic and non-photorealistic image synthesis, image-based modeling and
rendering and other multi-resolution methods, curve and surface design, range
scanning, surface reconstruction and modeling, motion capture, motion editing,
physics-based modeling, animation, interactive 3D user interfaces, image editing and
colour reproduction. Work is going on in various fields but computer vision is a hot
topic, where research tackles the general problem of estimating properties of an object
or scene through the processing of images, both 2D photographs and 3D range maps.
Within this broad scope, we investigate efficient ways to model, capture, manipulate,
retrieve, and visualise real-world objects and environments.
Once you get into computer graphics, you’ll hear about all kinds of applications that
do all kinds of things. This section will discuss not only the applications but also the
software suitable for that type of application, so it is necessary to give you an
understanding of what various applications do. While working on a project, you may
need images, brochures, a newsletter, a PowerPoint presentation, poster, DVD etc.
Thus, the question arises: what software do I need to get my job done? The section will
help to straighten all of that out in your head. Hopefully, if you know what does what,
you won’t waste money duplicating purchases, and when other designers or
co-workers are talking shop, you’ll know what is going on.
Graphic design applications are basically broken down on the basis of a few
considerations. The first is: “Is your project for print, or for the web?”
When I say web, what I really mean is monitor based publishing. This means that you
are going to see your work on a computer screen, and television, or a big screen
projector. So, as you read through this section, whenever we say “web based”, we
mean monitor based. Beyond print and web, here are the various categories that we
can think of that various applications would fit into: Image Manipulation; Vector
Graphics; Page Layout; Web site development; Presentation Software; Video
Editing; DVD Production; Animation and Interactivity etc. If you are creating, or
learning to create graphic design, computer art, or maybe “Digital Media” is the term
that we should use, then it’s a good thing to understand the function of each

application. There are many applications in the market and most of them are
expensive.
A few of the various application areas that are influenced by Computer graphics are:
z Presentation Graphics
z Painting and Drawing
z Photo Editing
z Scientific Visualisation
z Image Processing
z Education, Training, Entertainment and CAD
z Simulations
z Animation and Games
Let us discuss these fields one by one.

6.4.1 Presentation Graphics


The moment you are going to represent yourself or your company or product or
research paper etc. simply standing and speaking is quite ineffective. Now, in such a
situation where no one stands with you, your ultimate companions are the slides
which have some information either in the form of text, charts, graphs etc., which
make your presentation effective. If you think more deeply, these are nothing but
curves (text/graphs/charts) used in some sequence, and graphics is the ultimate option
for creating such things; this application of graphics is known as presentation
graphics, which can be done very effectively through computers now-a-days. There
are softwares which help you present effectively. Such application softwares are
known as Presentation Graphics softwares – presentation graphics is
software that shows information in the form of a slideshow (A slideshow is a display
of a series of chosen images, which is done for artistic or instructional purposes.
Slideshows are conducted by a presenter using an apparatus which could be a
computer or a projector).
Three major functions of presentation graphics are:
z an editor that allows text to be inserted and formatted,
z a method for inserting and manipulating graphic images, and
z a slide show system to display the content.
Programs that help users to create presentations such as visual aids, handouts, and
overhead slides – processing artwork, graphics and text to produce a series of ‘slides’
that help speakers get their message across – are presentation graphics softwares.
Example programs include some softwares like Apple’s Keynote, Openoffice’s (Star
Office-by Sun microsystems) Impress, Microsoft Powerpoint and (for multimedia
presentations, incorporating moving pictures, and sounds) Macromedia Director.
Custom graphics can also be created in other programs such as Adobe Photoshop or
Adobe Illustrator and then imported. With the growth of video and digital
photography, many programs that handle these types of media also include
presentation functions for displaying them in a similar “slide show” format.
Similar to programming extensions for an Operating system or web browser, “add
ons” or plug-ins for presentation programs can be used to enhance their capabilities.
For example, it would be useful to export a PowerPoint presentation as a Flash
animation or PDF document. This would make delivery through removable media or
sharing over the Internet easier. Since PDF files are designed to be shared regardless

of platform and most web browsers already have the plug-in to view Flash files, these
formats would allow presentations to be more widely accessible.
We may say that presentation graphics is more than just PowerPoint presentations,
because it includes any type of slide presentation, bar chart, pie chart, graph and
multimedia presentation. The key advantage of this software is that it helps you show
abstract representations of your work.
There are some softwares, like Canvas, that improve the presentations created through
PowerPoint or Keynote. Although these software packages contain a lot of
handy features, they lack many vector and image creation capabilities, therefore,
creating a need for a graphic/illustration program. Scientists, engineers, and other
technically-oriented professionals often call upon Canvas and its host of vector, image
editing, and text features to create the exciting visual components for their
presentation projects.
General questions that strike many graphic designers, students, and engineers rushing
to import their illustrations and images into presentations are:
z What resolution should be used?
z Which file format is best?
z How do I keep the file size down?
Let us discuss in brief the suitability of the technique (Vector or Bitmap), and the file
format appropriate to the creation of a better presentation.
Resolution
Graphic illustrations are used in presentations to help convey an idea or express a
mood, two kinds of illustration graphics are:
1. Vector, and
2. Bitmap.
You may wonder which of these is the better format when exporting to software such
as PowerPoint, Keynote or Impress. The truth is that there are different
situations that call for different methods, but here are some things to look out for. For
instance, vectors are objects that are defined by anchor points and paths, while
bitmapped graphics are digital images composed of pixels. The advantage of using
vector graphics is that they are size independent, meaning that they could be resized
with no loss in quality. Bitmapped graphics, on the other hand, provide a richer depth
of colour but are size dependent and appear at the stated 72 dpi size.

Image File Formats


Say we want an image of a fly. The wings are partially transparent, and representing
that in our presentation would be problematic without the proper file format. The
choice of file format depends on the software you may be using. Two cases for
the same situation are discussed below:
z The right file format that will allow us to create a transparent background in
Keynote presentation. Even though Keynote could import all common file formats
such as GIF, JPG, and BMP, there is one format that will work particularly well
which is .PSD. Using .PSD (Photoshop format) we are able to easily place a
transparent image, even partially transparent sections of the image, such as the
wings of the fly, as well as retain their properties.
z The right file format that will allow us to create a transparent background in
PowerPoint. Even though PowerPoint could import all common file formats such
as GIF, JPG, and BMP, there are two particular file formats that will work

exceptionally well: TIFF and PNG. Using TIFF (Tagged-Image File Format) or
PNG (Portable Network Graphic), we could easily remove the unwanted
background quickly and easily in PowerPoint, a feature not available to the other
mentioned file formats.
TIFF or PNG: TIFF has been around longer than PNG, which was originally
designed to replace GIF on the Web. PowerPoint works well with both these files
when creating transparent backgrounds but generally PNG creates smaller file sizes
with no loss of quality.
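As a small illustration of why PNG suits transparent backgrounds, here is a sketch in Python using the Pillow imaging library; the file names are hypothetical, and the white-pixel rule is a deliberately crude stand-in for real background removal.

from PIL import Image

img = Image.open("fly.gif").convert("RGBA")   # add an alpha (transparency) channel

# Crude background removal: make pure-white pixels fully transparent.
cleaned = [(r, g, b, 0) if (r, g, b) == (255, 255, 255) else (r, g, b, a)
           for (r, g, b, a) in img.getdata()]
img.putdata(cleaned)

img.save("fly.png")   # PNG stores the per-pixel alpha channel; JPG and BMP would discard it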

6.4.2 Painting and Drawing


When we talk about graphics, we mean pictures, and pictures can be either
illustrations or photographs. If you want to get graphics into a Web page or
multimedia presentation, you either have to create them in some kind of graphics
application by drawing or painting them right there in the application, or bringing
them into the application via a digital camera or scanner, and then editing and saving
them in a form suitable to your medium. Many software applications offer a variety of
features for creating and editing pictures on the computer. Even multimedia authoring
and word processing programs include some simple features for drawing on the
computer.
So, painting and drawing applications in computer graphics allow the user to pick and
edit any object at any time. The basic difference is as follows:
Drawing in a software application means using tools that create “objects,” such as
squares, circles, lines or text, which the program treats as discrete units. If you draw a
square in PowerPoint, for example, you can click anywhere on the square and move it
around or resize it. It’s an object, just like typing the letter “e” in a word processor.
That is, a drawing program allows a user to position standard shapes (also called
symbols, templates, or objects), which can be edited by translation, rotation and
scaling operations.
Painting functions, on the other hand, don’t create objects. If you look at a computer
screen, you’ll see that it’s made up of millions of tiny dots called pixels. You’ll see
the same thing in a simpler form if you look at the colour comics in the Sunday
newspaper — lots of dots of different colour ink that form a picture. Unlike a drawing
function, a paint function changes the colour of individual pixels based on the tools
you choose. In a photograph of a person’s face, for example, the colours change
gradually because of light, shadow and complexion. You need a paint function to
create this kind of effect; there’s no object that you can select or move the way you
can with the drawn square. That is, a painting program allows the user to paint
arbitrary swaths using brushes of various sizes, shapes, colours and patterns. Most
painting programs also allow placement of predefined shapes, such as rectangles and
polygons, on the canvas. Any part of the canvas can be edited at pixel level.
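The distinction can be sketched in a few lines of Python (an illustrative toy model, not how any particular package is implemented): painting edits a grid of pixels, while drawing keeps a list of editable objects.

import numpy as np

# "Painting": the canvas is just pixels; editing means overwriting pixel values.
canvas = np.zeros((100, 100, 3), dtype=np.uint8)   # a black 100 x 100 RGB canvas
canvas[40:60, 40:60] = (255, 0, 0)                 # paint a red square pixel by pixel
# Afterwards there is no "square" object left, only coloured pixels.

# "Drawing": the document is a list of objects that stay selectable and editable.
class Square:
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size
    def move(self, dx, dy):        # the object survives, so it can be moved...
        self.x += dx
        self.y += dy
    def scale(self, factor):       # ...or resized with no loss of information
        self.size *= factor

shapes = [Square(40, 40, 20)]
shapes[0].move(10, 0)              # still one square, now at a new position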
The reason why the differences are important is that, as noted earlier, many different
kinds of programs offer different kinds of graphics features at different levels of
sophistication, but they tend to specialise in one or the other. For example:
1. Many word processors, like Word, offer a handful of simple drawing functions.
They aren’t that powerful, but if all you need is a basic illustration made up of
simple shapes to clarify a point, they’re fine.
2. Some programs specialise in graphics creation. Of these, some are all-purpose
programs, like KidPix, which offers both drawing and painting functions. KidPix
is targeted specifically at children; it has a simplified interface and lacks the
sophisticated functions a professional artist might want.

Other programs, like Adobe PhotoShop, specialise in painting functions, even
though they may include drawing functions as well. Painter is a paint-oriented
program that offers highly sophisticated, “natural media” functions that
approximate the effects of watercolours or drawing with charcoal on textured
paper. Other graphics programs, such as Adobe Illustrator, specialise in drawing
for professional artists and designers; AutoCAD is used mainly for technical and
engineering drawing.
3. Page layout, presentation, multimedia authoring and Web development programs
usually contain a variety of graphics functions ranging from the simple to the
complex, but their main purpose is composition, not image creation or editing.
That is, they allow you to create or import text and graphics and, perhaps, sound,
animation and video.
Most of the graphics features in these types of programs are limited to drawing
functions because they assume that you will do more complex work in a program
dedicated to other functions (e.g., writing in a word processor, editing photos in a
paint program), then import your work to arrange the different pieces in the
composition program. (Some multimedia authoring systems, however, also offer
painting and drawing functions.)
The differences in composition programs are mainly in the form of their output:
Page layout programs, such as PageMaker and QuarkXPress, are for composing
printed pages; presentation and multimedia authoring programs, such as
PowerPoint and HyperStudio, are for slide shows and computer displays; and
Web development applications, like Netscape Composer, are for, well, Web
pages.
4. What if you are going to make a magazine, newspaper, book or maybe a
multipage menu for a restaurant. In that case, we need a page layout program. The
well known softwares in page layout are:
(a) QuarkXPress
(b) PageMaker (Adobe)
(c) InDesign (Adobe)
(d) Publisher (Microsoft)
The Queen of Page Layout is QuarkXPress, owned by Quark; InDesign, owned by
Adobe, is the King; and finally there is Microsoft Publisher, which is very easy
to use.
5. To create posters, brochures, business cards, stationery, coffee mug designs, cereal
boxes, candy wrappers, half gallon jugs of orange juice, cups, or anything else
you see in print, most designers are going to use vectorised programs to make
these things come to life. Vectors are wonderful because they print extremely
well, and you can scale them up to make them large, or scale them down to make
them small, and there is no distortion. Adobe Illustrator is the King of Vector
Programs, hands down. In Adobe Illustrator, you can create a 12 foot, by 12 foot
document. If we are going to make anything that is going to be printed, we are
doing it in Illustrator. Anything that you create in Illustrator, and the text you use,
will come out great. The thing is, Illustrator is hard to learn. It is not an intuitive
program at all. This is because vectors use control points called paths and anchor
points. To someone new, they are hard to understand, find, and control. That’s
another story. If you are making a poster, you would make your logo, artwork and
text in Illustrator. You would still manipulate your images in Photoshop, and then,
“place” them to the Illustrator.

Check Your Progress 1
1. What are the application areas of Computer Graphics? Write short notes on
each.
………………………………………………………………………………
……………………………………………………………………………
2. What are the file formats available for Presentation Graphics?
………………………………………………………………………………
………………………………………………………………………………
3. Write the full form of (1) TIFF (2) PNG.
………………………………………………………………………………
………………………………………………………………………………

6.4.3 Photo Editing


Photo editing programs are paint programs—it’s just that they include many
sophisticated functions for altering images and for controlling aspects of the image,
like light and colour balance. For the most part, any paint program can open and
display a digital photo image, but it will probably not offer the range and depth of
features that a true photo-editing program like PhotoShop does. KidPix is a general
graphics program and PhotoShop is an image-editing program. PhotoShop is the
standard used by almost all professional artists and image editors. Key graphic
application involves image editing or manipulation, no matter what type of design you
are creating, you are going to manipulate some images. You might change the content
of the images, crop, resize, touchup, falsify, fade in, fade out, and/or whatever.
Anything that you are going to do to change an image will get done in an image
editing or image manipulation application.
There are three big players in image manipulation:
1. PhotoShop (Adobe)
2. FireWorks (Macromedia)
3. Corel (owned by Corel)
Almost everything you see in print or on the web has gone through PhotoShop. It is
the king of image manipulation. With PhotoShop you can make anything look real.
Photoshop comes bundled with a program called “ImageReady”.
ImageReady helps you create animated GIFs, web site rollover effects, image maps
and more. Most people that own PhotoShop use less than 10 per cent of its powerful
tools. Fireworks is a super image manipulation application. The thing is, if you open
the program, many of the icons, the tool bar, the panels and many of the options in the
drop down menus look just like options in PhotoShop. It kind of looks like somebody
copied the other person’s application.
z Video editing is in a new and revolutionary stage. Computers really weren’t ready
to edit video affordably until right now. Right now, if you have a fast computer
and a lot of storage memory, you can create video segments just like anything you
see on TV. And, it works well. I would say that the most popular video editing
programs are:
iMovie (Apple)
Adobe Premiere (Adobe)
Final Cut Pro (Apple)
Studio Version 9 (Pinnacle Systems)

z Web Design and Editing
To make and edit a website, the leading softwares are:

1. DreamWeaver (MacroMedia)
2. Frontpage (MicroSoft)
3. Go Live (Adobe)
4. Netscape Composer (Netscape).
Most web developers use DreamWeaver. It is a super tool. It will write your HTML,
CSS and JavaScript and create your forms. Frontpage is known for writing lots of code that you
don’t need.

6.4.4 Scientific Visualisation


It is difficult for the human brain to make sense out of the large volume of numbers
produced by a scientific computation. Numerical and statistical methods are useful for
solving this problem. Visualisation techniques are another approach for interpreting
large data sets, providing insights that might be missed by statistical methods. The
pictures they provide are a vehicle for thinking about the data.
As the volume of data accumulated from computations or from recorded
measurements increases, it becomes more important that we be able to make sense out
of such data quickly. Scientific visualisation, using computer graphics, is one way to
do this. Scientific visualisation involves interdisciplinary research into robust and
effective computer science and visualisation tools for solving problems in biology,
aeronautics, medical imaging, and other disciplines.
The profound impact of scientific computing upon virtually every area of science and
engineering has been well established. The increasing complexity of the underlying
mathematical models has also highlighted the critical role to be played by Scientific
visualisation. It, therefore, comes as no surprise that Scientific visualisation is one of
the most active and exciting areas of Mathematics and Computing Science, and indeed
one which is only beginning to mature.
Scientific visualisation is a technology which helps to explore and understand
scientific phenomena visually, objectively and quantitatively. Scientific visualisation
allows scientists to think about the unthinkable and visualise the unviewable. Through this
we are seeking to understand data. We can generate beautiful pictures and graphs; we
can add scientific information (temperature, exhaust emission or velocity) to an
existing object thus becoming a scientific visualisation product.
Thus, we can say scientific visualisation is a scientist's tool kit, which helps to
stimulate insight into and understanding of any scientific issue, thus helping not only
in solving or analysing it but also in producing appropriate presentations of it. This
concept of scientific visualisation fits well with modeling and simulation.
Visualisation of a scientific problem proceeds through a series of steps, which are
followed recursively to visualise any complex situation.
Hence, computer graphics has become an important part of scientific computing. A
large number of software packages now exist to aid the scientist in developing
graphical representations of their data. Some of the tools or packages used to express
the graphical result for modeling and simulation of any scientific visualisation are:
z Matlab (by The Math Works Inc.)
z Mathematica or Maple (graphical computer algebra system)
z Stella (models dynamic systems)

z IDL (Interactive Data Language) by Research Systems Inc.
z AVS (Application Visualisation System) by Advanced Visual Systems Inc.
z Excel.
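Alongside the packages listed above, the free Python library matplotlib can produce the same kind of visualisation; a minimal sketch follows (the "temperature" field here is invented data standing in for simulation output).

import numpy as np
import matplotlib.pyplot as plt

# Visualise a computed 2D field as a filled contour plot.
x, y = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-2, 2, 200))
temperature = np.exp(-(x**2 + y**2))          # stand-in for simulation output

plt.contourf(x, y, temperature, levels=20)
plt.colorbar(label="temperature (arbitrary units)")
plt.title("Contour plot of a 2D scalar field")
plt.show()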

6.4.5 Image Processing


Modern digital technology has made it possible to manipulate multi-dimensional
signals with systems that range from simple digital circuits to advanced parallel
computers. The goal of this manipulation can be divided into three categories:
1. Image Processing: image in -> image out
2. Image Analysis: image in -> measurements out
3. Image Understanding: image in -> high-level description out
Image understanding requires an approach that differs fundamentally from the theme
of this section. Further, we will restrict ourselves to two-dimensional (2D) image
processing although, most of the concepts and techniques that are to be described can
be extended easily to three or more dimensions.
We begin with certain basic definitions. An image defined in the “real world” is
considered to be a function of two real variables, for example, a(x,y) with a as the
amplitude (e.g., brightness) of the image at the real coordinate position (x,y). An
image may be considered to contain sub-images sometimes referred to as regions-of-
interest, ROIs, or simply regions. This concept reflects the fact that images frequently
contain collections of objects each of which can be the basis for a region. In a
sophisticated image processing system it should be possible to apply specific image
processing operations to selected regions.
Thus, one part of an image (region) might be processed to suppress motion blur while
another part might be processed to improve colour rendition. The amplitudes of a
given image will almost always be either real numbers or integer numbers. The latter
is usually a result of a quantisation process that converts a continuous range (say,
between 0 and 100%) to a discrete number of levels. In certain image-forming
processes, however, the signal may involve photon counting which implies that the
amplitude would be inherently quantised. In other image forming procedures, such as
magnetic resonance imaging, the direct physical measurement yields a complex
number in the form of a real magnitude and a real phase.
A digital image a[m,n] described in a 2D discrete space is derived from an analog
image a(x,y) in a 2D continuous space through a sampling process that is frequently
referred to as digitisation. Let us discuss the details of digitisation.
The 2D continuous image a(x,y) is divided into N rows and M columns. The
intersection of a row and a column is termed a pixel. The value assigned to the integer
coordinates [m,n] with {m=0,1,2,...,M –1} and {n=0,1,2,...,N –1} is a[m,n]. In fact, in
most cases a(x,y) – which we might consider to be the physical signal that impinges
on the face of a 2D sensor – is actually a function of many variables including depth
(z), colour (λ), and time (t). The effect of digitisation is shown in Figure 6.1.


Figure 6.1: Effects of Digitization

The image shown in Figure 6.1 has been divided into N = 30 rows and M = 30
columns for digitisation of a continuous image. The value assigned to every pixel
(e.g., the pixel at coordinates [m=10, n=3]) is the average brightness in the pixel rounded to
the nearest integer value. The process of representing the amplitude of the 2D signal at
a given coordinate as an integer value with L different grey levels is usually referred
to as amplitude quantisation or simply quantisation.
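The sampling and quantisation steps just described can be sketched in a few lines of Python with NumPy; the "continuous" image a(x, y) here is an invented test function.

import numpy as np

def a(x, y):
    # A stand-in continuous image a(x, y) with brightness values in [0, 1].
    return 0.5 * (np.sin(3 * x) * np.cos(3 * y) + 1.0)

N, M, L = 30, 30, 256   # 30 rows, 30 columns, 256 grey levels (8 bit quantisation)

# Sampling: evaluate a(x, y) only at the N x M grid positions.
ys, xs = np.meshgrid(np.linspace(0, 1, N), np.linspace(0, 1, M), indexing="ij")

# Quantisation: map each sampled amplitude to one of L integer grey levels.
digital = np.round(a(xs, ys) * (L - 1)).astype(np.uint8)   # this is a[m, n]
print(digital.shape, digital.min(), digital.max())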

6.4.6 Education, Training, Entertainment and Computer Aided


Design (CAD)
CAD (or CADD) is an acronym that, depending on who you ask, can stand for:
z Computer Aided Design.
z Computer Aided Drafting.
z Computer Assisted Design.
z Computer Assisted Drafting.
z Computer Assisted Design and Drafting.
In general, CAD stands for Computer-Aided Design. In CAD, interactive graphics
is used to design components and systems of mechanical, electrical, and electronic
devices. Actually CAD system is a combination of hardware and software that enables
engineers and architects to design everything from furniture to airplanes. In addition
to the software, CAD systems require a high-quality graphics monitor; a mouse, light
pen or digitised tablets for drawing; and a special printer or plotter for printing design
specifications.
CAD systems allow an engineer to view a design from any angle with the push of a
button and to zoom in or out for close-ups and long-distance views. In addition, the
computer keeps track of design dependencies so that when the engineer changes one
value, all other values that depend on it are automatically changed accordingly.
Generally we use CAD as a tool for imparting education and training to the engineers,
so that, they can produce beautifully carved and engineered pieces in bulk with the
same amount of finishing and perfection. Generally a few terms are used repeatedly
with CAD and they are CAM and CNC. Let us discuss “What are CAD/CAM and
CAD/CNC(or NC)”? The term CAD/CAM is a shortening of Computer-Aided Design
(CAD) and Computer-Aided Manufacturing (CAM). The term CAD/NC (Numerical
Control) is equivalent in some industries.
CAD/CAM software uses CAD drawing tools to describe geometries used by the
CAM portion of the program to define a tool path that will direct the motion of a
machine tool to machine the exact shape that was drawn on the computer.

Let us discuss the terms in brief.
z Numerically-Controlled Machines: Before the development of Computer-aided
design, the manufacturing world adopted tools controlled by numbers and letters
to fill the need for manufacturing complex shapes in an accurate and repeatable
manner. During the 1950s these Numerically-Controlled machines used the
existing technology of paper tapes with regularly spaced holes punched in them
(think of the paper roll that makes an old-fashioned player piano work, but only
one inch wide) to feed numbers into controller machines that were wired to the
motors positioning the work on machine tools. The electro-mechanical nature of
the controllers allowed digital technologies to be easily incorporated as they were
developed. NC tools immediately raised automation of manufacturing to a new
level once feedback loops were incorporated (the tool tells the computer where it
is, while the computer tells it where it should be).
What finally made NC technology enormously successful was the development of
the universal NC programming language called APT (Automatically Programmed
Tools). Announced at MIT in 1962, APT allowed programmers to develop
postprocessors specific to each type of NC tool so that, the output from the APT
program could be shared among different parties with different manufacturing
capabilities. Now-a-days many new machine tools incorporate CNC technologies.
These tools are used in every conceivable manufacturing sector. CNC
technology is related to Computer Integrated Manufacturing (CIM), Computer
Aided Process Planning (CAPP) and other technologies such as Group
Technology (GT) and Cellular Manufacturing. Flexible Manufacturing Systems
(FMS) and Just-In-Time Production (JIT) are made possible by Numerically-
Controlled Machines.
z CAD and CAM: The development of Computer-aided design had little effect on
CNC initially due to the different capabilities and file formats used by drawing
and machining programs. However, as CAD applications such as SolidWorks and
AutoCad incorporate CAM intelligence, and as CAM applications such as
MasterCam adopt sophisticated CAD tools, both designers and manufacturers are
now enjoying an increasing variety of capable CAD/CAM software. Most
CAD/CAM software was developed for product development and the design and
manufacturing of components and moulds, but they are being used by architects
with greater frequency. Thus, a CAD program introduces the concept of
real-world measurement. For example, a car or building can be drawn as if it were
life-size, and later arranged into sheets and printed on paper at any desired scale.

6.4.7 Simulations
Computer simulation is the discipline of designing a model of an actual or theoretical
physical system, executing the model on a digital computer, and analysing the
execution output. Simulation embodies the principle of “learning by doing” – to learn
about the system we must first build a model of some sort and then operate the model.
The use of simulation is an activity that is as natural as a child who role plays.
Children understand the world around them by simulating (with toys and figures)
most of their interactions with other people, animals and objects. As adults, we lose
some of this childlike behaviour but recapture it later on through computer simulation.
To understand reality and all of its complexity, we must build artificial objects and
dynamically act our roles with them. Computer simulation is the electronic equivalent
of this type of role playing and it serves to drive synthetic environments and virtual
world. Within the overall task of simulation, there are three primary sub-fields: model
design, model execution and model analysis.

To simulate something physical, you will first need to create a mathematical model,
which represents that physical object. Models can take many forms including
declarative, functional, constraint, spatial or multimodel. A multimodel is a model
containing multiple integrated models each of which represents a level of granularity
for the physical system. The next task, once a model has been developed, is to execute
the model on a computer – that is, you need to create a computer program which steps
through time while updating the state and event variables in your mathematical model.
There are many ways to “step through time”. You can, for instance, leap through time
using event scheduling or you can employ small time increments using time slicing.
You can also execute (i.e., simulate) the program on a massively parallel computer.
This is called parallel and distributed simulation. For many large-scale models, this is
the only feasible way of getting answers back in a reasonable amount of time.
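A minimal sketch in Python of the "time slicing" approach described above, using an invented model of a falling ball:

# State variables of the model: time, height and velocity of a falling ball.
dt = 0.01                  # small time increment ("time slicing")
g = -9.81                  # gravitational acceleration, m/s^2
t, height, velocity = 0.0, 100.0, 0.0

while height > 0.0:        # step through time, updating the state each slice
    velocity += g * dt
    height += velocity * dt
    t += dt

print(f"the ball reaches the ground after about {t:.2f} s")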
You may want to know why we do simulation. Is there any other way to do these
tasks? To discuss these issues, let us briefly consider the cases in which simulation is
essential. There are many methods of modeling systems which do not involve simulation but
which involve the solution of a closed-form system (such as a system of linear
equations).

6.4.8 Animation and Games


In our childhood, we have all seen the flip books of cricketers which came free along
with some soft drink, where several pictures of the same person in different batting or
bowling actions are sequentially arranged on separate pages, such that when we flip
the pages of the book the picture appears to be in motion. This was a flipbook (several
papers of the same size with an individual drawing on each paper so the viewer could
flip through them). It is a simple application of the basic principle of physics called
persistence of vision. This low tech animation was quite popular in the 1800s when
the persistence of vision (which is 1/16th of a second) was discovered. This discovery
led to some more interesting low tech animation devices like the zoetrope, wheel of
life, etc. Later, depending on many basic mathematics and physics principles, several
researches were conducted which allowed us to generate 2d/3d animations.
The difference between animation and graphics is that animation adds to graphics the
dimension of time which vastly increases the amount of information to be transmitted,
so some methods are used to handle this vast information and these methods are
known as animation methods which are classified as:
First Method: In this method, the artist creates a succession of cartoon frames, which
are then combined into a film.
Second Method: Here, the physical models are positioned to the image to be recorded.
On completion, the model moves to the next image for recording and this process is
continued. Thus the historical approach of animation has classified computer
animation into two main categories:
(a) Computer-Assisted Animation usually refers to 2D systems that computerise the
traditional animation process. Here, the technique used is interpolation between
key shapes, which is the only algorithmic use of the computer in the production of
this type of animation, e.g., curve morphing (key frames, interpolation,
velocity control) and image morphing.
(b) Computer Generated Animation is the animation presented via film or video,
which is again based on the concept of persistence of vision because the eye-brain
assembles a sequence of images and interprets them as a continuous movement
and if the rate of change of pictures is quite fast then it induces the sensation of
continuous motion.

This motion specification for computer-generated animation is further divided into
two categories:
Low Level Techniques (Motion Specific) are used to control the motion of any
graphic object in any animation scene fully. Such techniques are also referred to as
motion specific techniques because we can specify the motion of any graphic object in
the scene. Techniques such as interpolation, approximation etc., are used in motion
specification of any graphic object. Low level techniques are used when the animator
has a fairly specific idea of the exact motion that s/he wants.
High Level Techniques (Motion Generalised) are techniques used to describe the
general motion behaviour of any graphic object. These techniques are algorithms or
models used to generate motion using a set of rules or constraints. The animator sets
up the rules of the model, or chooses an appropriate algorithm, and selects initial
values or boundary values. The system is then set into motion and the motion of the
objects is controlled by the algorithm or model. This approach often relies on fairly
sophisticated computation such as, vector algebra and numerical techniques among
others.
So, the animation concept can be defined as: A time based phenomenon for imparting
visual changes in any scene according to any time sequence. The visual changes could
be incorporated through translation of the object, scaling of the object, or change
in colour, transparency, surface texture etc.
It is to be noted that computer animation can also be generated by changing camera
parameters such as its position, orientation, focal length etc. plus changes in the light
effects and other parameters associated with illumination and rendering can produce
computer animation too.
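Interpolation between key frames, mentioned above as the core low level technique, can be sketched in a few lines of Python (the key-frame positions here are invented):

def lerp(a, b, u):
    # Linear interpolation: blend a into b as u runs from 0.0 to 1.0.
    return a + (b - a) * u

key0 = (0.0, 0.0)      # object position at the first key frame
key1 = (100.0, 50.0)   # object position at the second key frame
frames = 24            # number of in-between frames to generate

for f in range(frames + 1):
    u = f / frames
    x = lerp(key0[0], key1[0], u)
    y = lerp(key0[1], key1[1], u)
    print(f"frame {f}: object at ({x:.1f}, {y:.1f})")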

Check Your Progress 2


1. What is Photo Editing? What are the softwares used for image editing?
………………………………………………………………………………
………………………………………………………………………………
2. What do you understand by term scientific visualization? Name some
software used in this area.
………………………………………………………………………………
………………………………………………………………………………
3. What is image processing? Give some areas of importance in which the
concept of image processing is used.
………………………………………………………………………………
………………………………………………………………………………

6.5 LET US SUM UP


In this lesson, we have discussed the conceptual meaning of graphics and images, with
its application in various fields right from presentation graphics to animation and
games. We have also discussed the variety of software and their respective file
formats used in various applications of computer graphics. In the end, we have
discussed the working of various input and output devices. In this lesson, we have
concentrated on the graphic capabilities and its potential in multimedia applications.

6.6 LESSON END ACTIVITIES


1. Take any of your course assignment and attach image files such as JPG, GIF,
BMP to make it a multimedia assignment.

2. Discuss among group members how you can use the power of imaging to make a
difference between an ordinary multimedia presentation and a professionally
spectacular one.

6.7 KEYWORDS
Computer Graphics: Computer Graphics is the pictorial representation and
manipulation of data by a computer.
CAD: CAD is an acronym that stands for Computer Aided Design.
CAM: CAM is an acronym that stands for Computer-Aided Manufacturing (CAM).
Computer Simulation: Computer simulation is the discipline of designing a model of
an actual or theoretical physical system, executing the model on a digital computer,
and analysing the execution output.

6.8 QUESTIONS FOR DISCUSSION


1. Differentiate between Drawing and Painting.
2. What is Adobe Illustrator used for?
3. Give some softwares that are suitable for Page Layout generators.
4. Is Powerpoint the only presentation graphics software available? If no, then, name
some softwares and their developers otherwise provide features of the Powerpoint
softwares.
5. Is CAD useful only for designing drawings?

Check Your Progress: Model Answers


CYP 1
1. A few of the various applications areas which are influenced by Computer
graphics are:
™ Presentation Graphics
™ Painting and Drawing
™ Photo Editing
™ Scientific Visualisation
™ Image Processing
™ Digital Art
™ Education, Training, Entertainment and CAD
™ Simulation
™ Animation and games.
2. GIF, JPG, BMP, PSD, TIFF, PNG etc.
3. TIFF or PNG: TIFF has been around longer than PNG, which was
originally designed to replace GIF on the Web. PowerPoint works well
with both of these files when creating transparent backgrounds but
generally PNG creates smaller file sizes with no loss of quality.

CYP 2
1. Photo-editing involves programs which are not just paint
programs but include many sophisticated functions for altering
images and for controlling aspects of the image, like light and colour
balance. Some of the professionally used software for photo editing are

PhotoShop (Adobe), FireWorks (Macromedia), Corel (owned by Corel)
etc.
2. Scientific visualisation involves interdisciplinary research into robust and
effective computer science and visualisation tools for solving problems in
biology, aeronautics, medical imaging, and other disciplines. The profound
impact of scientific computing upon virtually every area of science and
engineering has been well established. Some examples of the software used
in this field are:
Matlab (by The Math Works Inc.)
Mathematica or Maple (graphical computer algebra system)
Stella (models dynamic systems)
IDL (Interactive Data Language) by Research Systems Inc.
AVS (Application Visualisation System) by Advance visual System Inc.
3. Image Processing means: image in, processed image out. Images are the
final products of most processes in computer graphics. The ISO
(International Standards Organisation) defines computer graphics as the
sum total of methods and techniques for converting data for a graphics
device by a computer. In summary, computer graphics converts data into
images, which is known as visualisation.

6.9 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw-Hill.
John F. Koegel Buford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffcoate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

LESSON

7
MULTIMEDIA BUILDING BLOCKS (USING ANIMATION)

CONTENTS
7.0 Aims and Objectives
7.1 Introduction
7.2 Basics of Animation
7.2.1 Traditional and Historical Methods for Production of Animation
7.2.2 Traditional Animation Techniques
7.2.3 Sequencing of Animation Design
7.2.4 Types of Animation Systems
7.3 Types of Animations
7.4 Computer Animation Tools
7.5 Let us Sum up
7.6 Lesson End Activity
7.7 Keywords
7.8 Questions for Discussion
7.9 Suggested Readings

7.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to:
• Describe the basic properties of animation
• Classify animation and its types
• Discuss how to impart acceleration in animation
• Give examples of different animation tools and applications

7.1 INTRODUCTION
The word Animation is derived from 'animate', which literally means 'to give life to'.
'Animating' a thing means to impart movement to something which can't move on its
own. In order to animate something, the animator should be able to specify, either
directly or indirectly, how the 'thing' is to move through time and space.
Before dealing with the complexities of animation, let us look at some basic
concepts. In our childhood, many of us saw the flip books of cricketers that came
free with some soft drink: several pictures of the same person in different batting
or bowling actions are bound sequentially on separate pages, such that when we flip
the pages of the book the picture appears to be in motion. This was a flipbook
(several papers of the same size with an individual drawing on each paper, so the
viewer could flip through them). It is a simple application of the basic principle
of physics called persistence of vision. This low-tech animation was quite popular
in the 1800s, when persistence of vision (which lasts about 1/16th of a second) was
discovered. This discovery led to some more interesting low-tech animation devices
like the zoetrope, the wheel of life, etc. Later, depending on many basic
mathematics and physics principles, several researches were conducted which allowed
us to generate 2D/3D animations.

7.2 BASICS OF ANIMATION


7.2.1 Traditional and Historical Methods for Production of Animation
The transformations involved in computer graphics are related to space and not to
time. Here lies the basic difference between animation and graphics: animation adds
to graphics the dimension of time, which vastly increases the amount of information
to be transmitted, so special methods are used to handle this vast information, and
these methods are known as animation methods.

First Method
Here, the artist creates a succession of cartoon frames, which are then combined into
a film.

Second Method
Here, physical models are positioned and the image is recorded; the model is then
moved for the next image to be recorded, and this process is continued.
Thus, the historical approach of animation has classified computer animation into two
main categories:
1. Computer-assisted animation usually refers to 2D systems that computerise the
traditional animation process. Here, the main technique is interpolation between
key shapes, which is typically the only algorithmic use of the computer in the
production of this type of animation; examples include curve morphing (key
frames, interpolation, velocity control) and image morphing.
2. Computer generated animation is the animation presented via film or video. It is
again based on the concept of persistence of vision, because the eye-brain system
assembles a sequence of images and interprets them as a continuous movement;
if the rate of change of pictures is fast enough, it induces the sensation of
continuous motion.

7.2.2 Traditional Animation Techniques


Before the advent of computer animation, all animation was done by hand, which
involves an enormous amount of work. To get an idea of this work, consider that each
second of animated film contains 24 frames; one can then imagine the labour involved
in creating even the shortest of animated films. Before creating any animation, the
first step is to design the storyboard, which gives the first sight of what a cartoon
or a piece of animation is going to look like. It appears as a series of comic strips,
with individual drawings of story lines, scenes, characters, their emotions and other
major parts of the movie. Now, let us discuss a couple of different techniques, which
were developed for creating animation by hand.

Key Frames
After a storyboard has been laid out, the senior artists go and draw the major frames of
the animation. These major frames are frames in which a lot of change takes place.
They are the key points of the animation. Later, a bunch of junior artists draw in the
frames in between. This way, the workload is distributed and controlled by the key
frames. By doing work this way, the time in which an animation can be produced is
cut dramatically, depending on the number of people working on the project. Work
can be done simultaneously by many people, thus cutting down the time needed to get
a final product out.

Cel Animation
When creating an animation using this method, each character is drawn on a separate
piece of transparent paper. A background is also drawn on a separate piece of opaque
paper. Then, when it comes to shooting the animation, the different characters are
overlaid on top of the background in each frame. This method also saves time in that
the artists do not have to draw in entire frames, but rather just the parts that need to
change such as individual characters, even separate parts of a character's body are
placed on separate pieces of transparent paper. For further understanding, let us have
an example. Say you want to show that an aeroplane is flying. You can draw an
aeroplane on a transparent sheet and on another opaque sheet you can have clouds.
Now, with the opaque sheet as background you can move the transparent sheet over it,
which gives the feeling of a flying aeroplane. These traditional techniques were
extended to the era of computer animation, and hence different animation
systems have evolved. We cannot say which technique is better because different
techniques are used in different situations. In fact all these animation techniques are
great, but they are most useful when all of them are used together.
Cel animation by itself would not help out much if it wasn't for key frames and being
able to distribute the workload across many people. Now, let us also discuss the
computer animation methods which are in wide use; two of the typical computer
animation methods are ‘frame’ animation and sprite animation.

Frame Animation: Non-interactive, Rectangular (Cartoon Movies)


This is an “internal” animation method, i.e., it is animation inside a rectangular frame.
It is similar to cartoon movies: a sequence of frames that follow each other at a fast
rate, fast enough to convey fluent motion. It is typically pre-compiled and non-
interactive. The frame is typically rectangular and non-transparent. Frame animation
with transparency information is also referred to as “cel” animation. In traditional
animation, a cel is a sheet of transparent acetate on which a single object
(or character) is drawn.

Sprite Animation: Interactive, may be Non-rectangular (Computer Games)


In its simplest form it is a 2D graphic object that moves across the display. Sprites
often can have transparent areas. Sprites are not restricted to rectangular shapes. Sprite
animation lends itself well to be interactive. The position of each sprite is controlled
by the user or by an application program (or by both). It is called “external”
animation. We refer to animated objects (sprites or movies) as “animobs”. In games
and in many multimedia applications, the animations should adapt themselves to the
environment, the program status or the user activity. That is, animation should be
interactive. To make the animations more event driven, one can embed a script, a
small executable program, in every animob. Every time an animob touches another
animob or when an animob gets clicked, the script is activated. The script then decides
how to react to the event (if at all). The script file itself is written by the animator or
by a programmer.
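To make the animob idea concrete, here is a minimal sketch in Python (all names here, such as Animob and ball_script, are hypothetical, invented purely for illustration): each sprite carries a small script that is run whenever an event reaches it.

class Animob:
    def __init__(self, x, y, script):
        self.x, self.y = x, y        # sprite position, set by user or program
        self.script = script         # callable run when an event arrives

    def handle_event(self, event):
        self.script(self, event)     # the script decides how to react, if at all

def ball_script(sprite, event):
    # animator-written behaviour: a click knocks the ball to the right
    if event == "clicked":
        sprite.x += 10

ball = Animob(0, 0, ball_script)
ball.handle_event("clicked")         # ball.x is now 10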

Check Your Progress 1

1. What do you mean by animation? What are the different ways to produce it?
…………………………………………………………………………….
…………………………………………………………………………….
2. What do you mean by computer generated and computer assisted
animations?
…………………………………………………………………………….
…………………………………………………………………………….
3. Differentiate between:
(a) Low level and high-level animation techniques
(b) Key frame and Cel animation
…………………………………………………………………………….
…………………………………………………………………………….
4. Which animation technique is better, Key frame or Cel animation?
…………………………………………………………………………….
…………………………………………………………………………….

7.2.3 Sequencing of Animation Design


Till now we have discussed a lot about the traditional and current trends of computer
generated animation but now it is time to practically discuss the necessary sequencing
of animation steps which works behind the scenes of any animation. This sequencing
is a standard approach for animated cartoons and can be applied to other animation
applications as well.
General Steps of designing the animation sequence are as follows:
1. Layout of Storyboard: Storyboard layout is the action outline used to define the
motion sequence as a set of basic events that are to take place. It is the type of
animation to be produced which decides the storyboard layout. Thus, the
storyboard consists of a set of rough sketches or a list of basic ideas for the
motion.
2. Definition of Object: The object definition is given for each participant object in
action. The objects can be defined in terms of basic shapes, associated movements
or movement along with shapes.
3. Specification of Key Frame: It is the detailed drawing of the scene at a certain
time in the animation sequence. Within each key frame, each object is positioned
according to time for that frame. Some key frames are chosen at the extreme
positions in the action; others are spaced so that the time interval between key
frames is not too great. More key frames are specified for intricate motion than for
simple, slowly varying motions.
4. In-between Frames Generation: In-between frames are the intermediate frames
between the key frames. The number of in between frames is dependent on the
media to be used to display the animation. In general, film requires 24 frames per
second, and graphic terminals are refreshed at the rate of 30 to 60 frames per
second. Typically, the time intervals for the motion are set up so that there are 3 to
5 in-betweens for each pair of key frames. Depending upon the speed specified for
the motion, some key frames can be duplicated.
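As a rough worked example of the frame counts these steps imply (all numbers below are illustrative assumptions, not a specification): ten seconds of film at 24 frames per second with 4 in-betweens per key-frame pair gives the following.

fps = 24                               # film rate, frames per second
seconds = 10
total_frames = fps * seconds           # 240 frames must exist on film
inbetweens = 4                         # assumed in-betweens per key pair
# each key frame plus its in-betweens accounts for 5 frames of screen time,
# so roughly total_frames / 5 key frames are needed:
key_frames = total_frames // (inbetweens + 1)   # 48 key frames
print(total_frames, key_frames)                 # 240 48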

There are many applications that do not follow this sequence; real-time computer
animations produced by vehicle-driving or flight simulators, for instance, display
motion sequences in response to settings on the vehicle or aircraft controls, and
visualisation applications are generated by the solution of numerical models. For
frame-by-frame animation, each frame of the scene is separately generated and stored.
Later the frames can be recorded on film or they can be consecutively displayed in
"real time playback" mode.

7.2.4 Types of Animation Systems


We have discussed that the sequencing of animation is useful in developing any
animation. This sequencing is more or less the same in all animation systems. Before
proceeding to the types of animation in the next section, let us study the types of
Animation Systems. So let us discuss a few animation systems, which are generally
used:

Key Frame Systems


This technique is for low-level motion control. Actually these systems include
languages which are designed simply to generate the in-betweens from the user-
specified key frames.
Usually, each object in the scene is defined as a set of rigid bodies connected at the
joints and with a limited number of degrees of freedom. Key frame systems were
developed by classical animators such as Walt Disney. An expert animator would
design (choreograph) an animation by drawing certain intermediate frames, called
Key frames. Then other animators would draw the in-between frames.
The sequence of steps to produce a full animation would be as follows:
1. Develop a script or story for the animation.
2. Layout a storyboard, that is a sequence of informal drawings that shows the form,
structure, and story of the animation.
3. Record a soundtrack.
4. Produce a detailed layout of the action.
5. Correlate the layout with the soundtrack.
6. Create the "key frames" of the animation. The key frames are those where the
entities to be animated are in positions such that intermediate positions can be
easily inferred.
7. Fill in the intermediate frames (called “in-betweening” or “tweening”).
8. Make a trial “film” called a “pencil test”.
9. Transfer the pencil test frames to sheets of acetate film, called “cels”. These may
have multiple planes, e.g., a static background with an animated foreground.
10. The cels are then assembled into a sequence and filmed.
With computers, the animator would specify the key frames and the computer would
draw the in-between frames (“tweening”). Many different parameters can be
interpolated but care must be taken in such interpolations if the motion is to look
“real”. For example, in the rotation of a line, the angle should be interpolated rather
than the 2D position of the line endpoint. The simplest type of interpolation is linear,
i.e., the computer interpolates points along a straight line. A better method is to use
cubic splines for interpolation. Here, the animator can interactively construct the
spline and then view the animation.
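A minimal sketch of the rotation example just mentioned, in Python (hypothetical illustrative code, not from any particular animation package): tweening the angle keeps the line's length constant, while naively tweening the 2D endpoint makes the line shrink mid-motion.

import math

def tween_angle(t):
    # linear interpolation of the ANGLE, from 0 to 90 degrees
    a = t * (math.pi / 2)
    return (math.cos(a), math.sin(a))

def tween_endpoint(t):
    # linear interpolation of the ENDPOINT, from (1, 0) to (0, 1)
    return (1 - t, t)

for t in (0.0, 0.5, 1.0):
    print(t, math.hypot(*tween_angle(t)), math.hypot(*tween_endpoint(t)))
# at t = 0.5 the angle tween still has length 1.0; the endpoint tween ~0.71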

From the above discussion, it is clear that in key frame systems the in-between frames
can be generated from the specification of two or more key frames, and among them
we can set the motion path of the object under consideration by describing its
kinematics description as a set of spline curves. For complex scenes we can separate
the frames into individual components or objects called cels (Celluloid
transparencies). In these complex scenes, we can interpolate the position of individual
objects between any two times. In this interval the complex objects in the scene
may undergo various transformations: the shape or size of an object may change
over time, or the entire object may change into some other object. These
transformations in a key frame system lead to Morphing, Zooming, Partial motion,
Panning (i.e., shifting of background/foreground to give the illusion that the camera
seems to follow the moving object, so that the background/foreground seems to be in
motion), etc.

Morphing
Transformation of object shapes from one form to another is called morphing (short
form of metamorphism). Morphing methods can be applied to any motion or transition
involving a change in shape.
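A minimal sketch of shape morphing (illustrative Python; it assumes the two shapes have already been given the same number of corresponding vertices, a common precondition for this kind of morph):

square  = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
diamond = [(0.5, -0.5), (1.5, 0.5), (0.5, 1.5), (-0.5, 0.5)]

def morph(src, dst, t):
    # t = 0 gives src, t = 1 gives dst; values in between are the in-betweens
    return [((1 - t) * sx + t * dx, (1 - t) * sy + t * dy)
            for (sx, sy), (dx, dy) in zip(src, dst)]

halfway = morph(square, diamond, 0.5)   # the shape midway through the morph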

Scripting Systems
Scripting Systems are the earliest type of motion control systems. Scripting systems
allow object specifications and animation sequence to be defined with a user input
script, and from this script, a variety of objects and motions can be
constructed. So, to write the script the animator uses any of the scripting languages.
Thus, the user must learn this language and the system. Some scripting systems are
PAWN (an embedded scripting language formerly called Small), with syntax similar
to C, and ASAS (Actor/Scriptor Animation System), which has a syntax similar to LISP.
ASAS introduced the concept of an actor, i.e., a complex object which has its own
animation rules. For example, in animating a bicycle, the wheels will rotate in their
own coordinate system and the animator doesn't have to worry about this detail.
Actors can communicate with other actors by sending messages and so can
synchronize their movements. This is similar to the behavior of objects in object-
oriented languages.

Parameterised Systems
These are the systems that allow object motion characteristics to be specified as part
of the object definitions. The adjustable parameters control such object
characteristics as degrees of freedom, motion limitations, and allowable shape changes.

7.3 TYPES OF ANIMATIONS


Procedural Animation
This type of animation is used to generate real-time animation, which allows a more
diverse series of actions to happen. These actions could otherwise be tedious to
create by hand, so predefined animation procedures are used to define movement over
time. There might be procedures that use the laws of physics (i.e., physically based
modelling) or animator-generated methods. One example of procedural animation is a
collision, an activity that is the result of some other action (this is called a
"secondary action"), for example throwing a ball which hits another object and
causes the second object to move. Others are simulating particle systems (smoke,
water, etc.), hair, and cloth dynamics. In computer video games it is often used for
simple things like a player's head rotating to look around, etc.
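A minimal sketch of the procedural idea (illustrative Python with assumed constants): each frame of a bouncing ball is generated by a physics rule, gravity plus an energy-losing bounce, rather than drawn by hand.

dt, g = 1 / 24, -9.8        # 24 frames/second; gravity in metres/second^2
y, vy = 5.0, 0.0            # ball starts 5 m up, at rest

frames = []
for _ in range(48):         # generate two seconds of motion
    vy += g * dt            # the physics rule accelerates the ball downward
    y += vy * dt
    if y < 0:               # ground hit: bounce back, losing 20% of the speed
        y, vy = 0.0, -vy * 0.8
    frames.append(y)        # the y value IS the animation for that frame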

Representational Animation
This technique allows an object to change its shape during the animation. There are
three sub-categories to this. The first is the animation of articulated objects, i.e.,
complex objects composed of connected rigid segments. The second is soft object
animation used for deforming and animating the deformation of objects, e.g., skin
over a body or facial muscles. The third is morphing which is the changing of one
shape into another quite different shape. This can be done in two or three dimensions.

Stochastic Animation
This uses stochastic processes (a stochastic process can be considered as a random
function). The randomness could be in the time or space variable of the function; the
randomness in time leads to stochastic animation used to control groups of objects,
such as in particle systems. Examples are fireworks, fire, waterfalls, etc., or a
speech audio signal, medical data (ECG, BP, etc.), or a random walk.
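A minimal sketch of stochastic animation (illustrative Python): a firework-style particle burst where the randomness sits in each particle's initial velocity.

import random

particles = [{"x": 0.0, "y": 0.0,                  # all sparks start together
              "vx": random.uniform(-1.0, 1.0),     # random sideways speed
              "vy": random.uniform(0.5, 2.0)}      # random upward speed
             for _ in range(100)]

dt, g = 1 / 24, -9.8
for _ in range(24):                                # one second of motion
    for p in particles:
        p["vy"] += g * dt                          # gravity pulls sparks down
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt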

Behavioural Animation
Used to control the motion of many objects automatically. Objects or "actors" are
given rules about how they react to their environment. The primary difference lies in
the objects being animated: instead of simply procedurally controlling the position of
tiny objects, each object follows its own behavioural rules. This type of animation is
generally used to animate flocks, schools, herds and crowds. Examples are schools of
fish or flocks of birds where each individual behaves according to a set of rules
defined by the animator (a minimal rule of this kind is sketched after the list
below). To generate these types of animations, we need to be familiar with some
general functions which every animation software package is supposed to have. In
general, animation functions include a graphic editor, a key frame generator, an
in-between generator, and standard graphic routines. The graphic editor allows us to
design and modify object shapes using spline surfaces, Constructive Solid Geometry
(CSG) methods and other representational schemes. In the development of an animation
sequence some steps are well suited for computer solutions; these include object
manipulations, rendering, camera motions and the generation of in-betweens. Animation
packages such as Wavefront provide special functions for designing the animation and
processing individual objects.
Some general functions available in animation packages are:
• Functions to store and manage the object database, where the object shapes
and associated parameters are stored and updated in the database.
• Functions for motion generation and object rendering. Motions can be
generated according to specified constraints using 2D and 3D transformations.
Standard functions can then be applied to identify visible surfaces and apply the
rendering algorithms.
• Functions to simulate camera movements and standard motions like zooming,
panning, tilting, etc. Finally, given the specification for the key frames, the
in-between frames can be automatically generated.
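And here is the behavioural-animation rule promised above, as a minimal illustrative Python sketch: each actor steers slightly toward the centre of its flock (one of the classic flocking rules), so the group motion emerges from the rule rather than from key frames.

flock = [{"x": float(i), "y": 0.0} for i in range(5)]   # five scattered actors

def step(flock, pull=0.05):
    cx = sum(b["x"] for b in flock) / len(flock)        # flock centre
    cy = sum(b["y"] for b in flock) / len(flock)
    for b in flock:
        b["x"] += (cx - b["x"]) * pull                  # rule: drift toward centre
        b["y"] += (cy - b["y"]) * pull

for _ in range(10):
    step(flock)       # repeated application draws the flock together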

7.4 COMPUTER ANIMATION TOOLS


To create the different types of animation discussed above, we need special
software and hardware too. Now, the basic question is the choice of proper
hardware and software out of the many available in the market. Thus, the basic
problem is to select or design animation tools which are expressive enough for the
animator to specify what s/he wants to specify and which, at the same time, are
powerful or automatic enough that the animator doesn't have to specify details s/he
is not interested in. Obviously, there is no single tool that is going to be right for every
animator, or for every animation, or even for every scene in a single animation. The
appropriateness of a particular animation tool depends on the effect desired by the
animator.
An artistic piece of animation will probably require different tools (both software
and hardware) than one meant to simulate reality. Along with software, we need some
special hardware to work with the concerned software.
Here is a short list of some 3D animation software:
• Softimage (Microsoft)
• Alias/Wavefront (SGI)
• 3D Studio MAX (Autodesk)
• Lightwave 3D (NewTek)
• Prism 3D Animation Software (Side Effects Software)
• HOUDINI (Side Effects Software)
• Apple's Toolkit for game developers
• Digimation, etc.
Computer animation can be done on a variety of computers. Simple cel animation
requires nothing more than a computer system capable of simple graphics with proper
animation software, whereas most of the computer animation you see on television
and in other areas is done on extremely sophisticated workstations. Only the most
popular and well-known software have been mentioned here; a huge number of
packages are available in the market, so it would be practically impossible to name
every computer animation program.
Check Your Progress 2
1. When do we need to use computer graphics in computer animation?
…………………………………………………………………………….
…………………………………………………………………………….
2. Which type of animation system do you think will be suitable for
generating cartoon films, and which one for generating computer
games?
…………………………………………………………………………….
…………………………………………………………………………….
3. What are animobs? In which system of animation are they used?
…………………………………………………………………………….
…………………………………………………………………………….
4. What do we mean by Morphing and Panning? What is their significance
in animation?
…………………………………………………………………………….
…………………………………………………………………………….

7.5 LET US SUM UP


In this lesson, we have discussed the meaning of animation, with its application in
various fields. Computer animation is a time based phenomenon of imparting visual
changes in a scene according to any time sequence, the visual changes could be
incorporated through positional changes, in object size, color, transparency, or surface
texture, etc. We have also discussed the methods and software used in various
applications of computer animation. In this lesson, we have concentrated on
computer animation capabilities and their potential in multimedia applications.

7.6 LESSON END ACTIVITY


Discuss among group members how you can use the power of animation to make a
difference between an ordinary multimedia presentation and a professionally
spectacular one.

7.7 KEYWORDS
Computer animation: Computer animation is a time based phenomenon of imparting
visual changes to a scene according to any time sequence. The visual changes could
be incorporated through positional changes, in object size, colour, transparency, or
surface texture, etc.
Morphing: Transformation of object shapes from one form to another is called
morphing.
Panning: It is the shifting of background/foreground to give the illusion that the
camera seems to follow the moving object, so that the background/ foreground seems
to be in motion.

7.8 QUESTIONS FOR DISCUSSION


1. Discuss the need and significance of Animation.
2. What are the major steps to create an animation?
3. What are different types of Animation Systems?

Check Your Progress: Model Answers


CYP 1
1. Computer animation is a time based phenomenon of imparting visual
changes to a scene according to any time sequence. The visual changes
could be incorporated through positional changes, in object size, colour,
transparency, or surface texture, etc.
Production of animation is done by two methods: First method is by
artists creating a succession of cartoon frames, which are then combined
into a film. Second method is by using physical models which are
positioned to the image, where the image is recorded; then the model is
moved to the next image for its recording, and this process is continued.
2. Two main categories of computer animation:
(a) Computer-assisted animation which usually refers to two dimensional
systems that computerize the traditional animation process.
Interpolation between key shapes is typically the only algorithmic use
of the computer in the production of this type of animation.
(b) Computer generated animation is the animation presented via film or
video. This is possible because the eye-brain assembles a sequence of
images and interprets them as a continuous movement. Persistence of
motion is created by presenting a sequence of still images at a fast
enough rate to induce the sensation of continuous motion. Motion
specification for computer-generated animation is divided into two
categories: Low level techniques (techniques that aid the animator in
precisely specifying motion) and High level techniques (techniques
used to describe general motion behaviour).
3. (a) Low level techniques: Low level techniques provide aid to the
animator in precisely specifying the motion. It involves techniques
such as shape interpolation, algorithms which help the animator fill in
the details of the motion. Here the animator usually has a fairly
specific idea of the exact motion that he or she wants.
High level techniques: High level techniques are used to describe
general motion behavior. These techniques are algorithms or models
used to generate a motion using a set of rules or constraints. The
animator sets up the rules of the model, or chooses an appropriate
algorithm, and selects initial values or boundary values. The system is
then set into motion and the motion of the objects is controlled by the
algorithm or model. This approach often relies on fairly sophisticated
computation such as vector algebra and numerical techniques and
others.
(b) Cel Animation: When creating an animation using this method, each
character is drawn on a separate piece of transparent paper. A
background is also drawn on a separate piece of opaque paper. Then,
when it comes to shooting the animation, the different characters are
overlaid on top of the background in each frame. This method also
saves time in that the artists do not have to draw in entire frames, but
rather just the parts that need to change such as individual characters.
Even separate parts of a character's body are placed on separate
pieces of transparent paper.
Key Frames: After a storyboard has been laid out, the senior artists
go and draw the major frames of the animation. These major frames
are frames in which a lot of change takes place. They are the key
points of the animation. Later, a bunch of junior artists draw in the
frames in between. This way, the workload is distributed and
controlled by the key frames. By doing work this way, the time in
which an animation can be produced is cut dramatically, depending
on the number of people working on the project. Work can be done
simultaneously by many people, thus cutting down on the time
needed to get the final product out.
4. We cannot say which technique is better because different techniques are
used in different situations. In fact, all these animation techniques are
great, but they are most useful when they are all used together. Cel
animation by itself would not help out much if it wasn’t for key frames
and being able to distribute the workload across many people.

CYP 2
1. We need computer graphics whenever a realistic display is required in
applications of computer animation, like the accurate representation of
the shapes of sea waves, thunderstorms or other natural phenomena
which can be described with some numerical model; the accuracy of the
realistic display of the scene measures the reliability of the model.
Computer graphics are used to create realistic elements which are
intermixed with live action to produce animation. But in many fields
realism is not the goal; for example, physical quantities are often
displayed with pseudo-colours or abstract shapes that change over time.
2. Frame animation is an "internal" animation method, i.e., it is an
animation inside a rectangular frame where a sequence of frames follow
each other at a fast rate, fast enough to convey fluent motion. And it is
best suited for cartoon movies. Sprite animation is an interactive –
external animation where the animated object interaction script is written
by the programmer; every time an animob touches another animob or
when an animob gets clicked, the script is activated and decides what is to
be done. These features are useful in the gaming systems.
3. Animated objects (sprites or movies) are referred to as "animobs", which are
used in gaming applications designed using sprite animation. That is,
these are programmable animated objects, which can respond to the
interactive environment according to the scripts written by the
programmers.
4. Morphing is short form of metamorphism which means transformation of
object shapes from one form to another. Morphing methods can be
applied to any motion or transition involving a change in shape. Panning
means shifting of background/foreground to give the illusion of camera in
motion following a moving object, so that the background/foreground
seem to be in motion. Both techniques are widely used in animation
applications.

7.9 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw-Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

LESSON

8
MULTIMEDIA BUILDING BLOCKS (USING VIDEO)

CONTENTS
8.0 Aims and Objectives
8.1 Introduction
8.2 Basics of Video
8.3 The Raster
8.4 How the Colours Work in a Video?
8.5 International Standards
8.6 Digital Video
8.7 Digital Editing/Non-Linear Editing
8.8 Video Codecs
8.9 DV Formats
8.10 Video Standards
8.11 Let us Sum up
8.12 Lesson End Activity
8.13 Keywords
8.14 Questions for Discussion
8.15 Suggested Readings

8.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to:
• Define the term video and video standards
• Understand the types of format used in video production
• Understand what technology is available
• Differentiate between standards
• List out the characteristics/features of video formats

8.1 INTRODUCTION
The term "multimedia" was first used in the 1960s to describe presentations combining
photographic slides and audiotape. At that time, all systems were based on analog
technology. A major development came in 1978 with the launch of the Philips
Laservision videodisc. Laservision led to the first interactive video applications
and gave birth to the digital compact disc. From this point, analog systems were
gradually phased out by digital systems. Multimedia is defined as the integration of
text, audio, graphics/images, animation, video, and control elements. More precisely,
multimedia is computer-based system integration and presentation control of different
media data types, i.e., text, graphics, still images, video, animation, and audio in
one or more combinations.

8.2 BASICS OF VIDEO


A visual image is really a pattern of light reflected off the objects we see. The pick-up
device in a video camera reads these patterns of light and describes the picture by
creating variations in the electric signal passing through it that correspond to the
variations in brightness and hue in the image. That is, the signal produced by the pick-
up is an Analog of the light patterns reflected from the subject. A television, or video
monitor, reverses the process. It reads the variations in an electric signal and creates a
corresponding pattern of light in the picture tube. A picture tube and the pick-up
devices in video cameras are transducers: they translate information from one form of
energy (light) to another (electricity). In audio, microphones and speakers are also
transducers — they translate sound into electricity and vice versa.

8.3 THE RASTER


The camera turns the image on the surface of the pickup into an electric signal by
scanning the surface of the pickup. It reads the surface much like you read this page. It
starts in the upper left hand corner, reads across a line horizontally, then when it gets
to the right hand edge, it drops down and starts reading another line from left to right.
The entire pattern of scanned lines is called the raster. As the scan passes each point
on the pickup device, it modulates the output video signal according to the amount of
light striking that spot. After the scan finishes the last line on the bottom, the camera
adds some timing information called a sync pulse to the outgoing signal, then it goes
back to the top left and starts over. Thus, the video picture is translated into a
continuous single electric signal which fluctuates according to the brightness changes
at each point in the raster. The picture tube reverses the process. The surface of the
picture tube is covered with phosphors that give off light when they are struck by
electrons. The tube radiates an electron beam that recreates the raster by scanning over
the phosphor covered surface. The strength of the beam varies with the fluctuations in
the video signal, causing the phosphors at each point in the raster to glow more or less
brightly.

Thus, a representation of the original image is recreated. Note, that at any one exact
moment, the video system is only reading or reproducing one tiny dot of information
in the raster. We see the video as a complete picture due to the phenomenon of
persistence of vision. This means our brain tends to see something for a brief moment
even after it is gone. Thus the video picture is quicker than the eye, scanning over the
entire raster and replacing each individual dot with another before we can register that
the first one is gone. The video system used for television broadcasting in the U.S.
(called NTSC) uses interlace scanning. It breaks the image down into 525 horizontal
scan lines, but it doesn’t scan them in order from top to bottom. Instead of going to
line 2 after it finishes line 1, it skips a line and goes to line three, then lines five,
seven and so on. After the scan gets to the bottom doing the odd numbered lines, it
goes back to the top and scans the even numbered lines. Thus, two scans down the
raster are required to make a complete image. NTSC video scans one complete image,
525 lines, 30 times each second. This is called a video frame. Each frame is made up
of the two odd and even half-scans. These are called video fields. Since there are 30
frames a second in NTSC video, there are 60 fields per second. Interlacing is done to
aid the persistence of vision effect and reduce the sensation of flicker. The video
systems in many computers, for instance, are not interlaced. A non-interlaced screen is
less apt to show the visible horizontal scan lines common to NTSC TV images, but if
you sit back a ways from a computer, you will probably notice an irritating flicker
effect.
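The interlace order is easy to see in a short sketch (illustrative Python; line numbering is assumed to start at 1):

lines = list(range(1, 526))     # the 525 NTSC scan lines
field_1 = lines[0::2]           # odd field: lines 1, 3, 5, ...
field_2 = lines[1::2]           # even field: lines 2, 4, 6, ...
frame = field_1 + field_2       # two fields make one complete frame

frames_per_second = 30
fields_per_second = frames_per_second * 2    # hence 60 fields per second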
After a camera has translated an image into a video signal, the signal can be displayed
on a waveform monitor, which shows its electronic characteristics. Remember, the
camera translates a visual image into an electric signal by registering each spot on the
raster as more or less illuminant. The peaks on the waveform represent the brightest
areas of the picture. TV signals must keep both whites and blacks within a certain
maximum range — if the whites are too strong (peaks too high) or the base level of
black is set improperly, a variety of technical problems in picture transmission can
occur. The waveform monitor is used to adjust cameras and studio lighting to produce
the optimum quality signal. Cameras have two different controls to adjust the
waveform. The pedestal control sets the black level. It is almost always adjusted so
that the darkest parts of the picture register at the 7% level on the waveform monitor
scale. The gain control sets the overall brightness of the picture. It must be set so the
highest white peaks remain under 100%, and it is usually adjusted so these peaks fall
just under this level.

8.4 HOW THE COLOURS WORK IN A VIDEO?


What we have been describing so far is a monochrome (black and white) video
system. Color devices are more complicated. Color video works by breaking down
light into its primary component colors: red, green and blue (the term RGB refers to
video systems that maintain separate signal paths for each primary color). A color
picture tube is covered with a pattern of phosphors that glow red, green and blue.
Most color tubes have three, rather than one, electron beams, one for each primary
color. The beams must pass through a shadow mask to reach the phosphors. The
beams are aligned so that as they pass through the mask, the beam carrying the red
signal will only fall on the red phosphors, and so on.
[Figure: the three RGB electron beams pass through the holes in the shadow mask to reach the screen phosphors]

There are several different systems in use in color cameras. Cameras designed for
amateur use employ a single pickup system that more or less reverses the process of
the picture tube. Professional cameras use a three pickup system, in which the light
coming through the lens is separated into the primary colors optically by the use of

Downloaded by Wahitha Banu (wahitha19@gmail.com)


lOMoARcPSD|19697633

prisms or dichroic mirrors. Each primary color is then directed to a separate 127
Multimedia Building Blocks
monochrome pickup, and the camera electronics combines the signals from the three (Using Video)
pickups into a single output. This system is more costly, but produces higher
resolution images.
Regardless of how the signal is created in the camera, broadcast TV signals are always
combined and/or broken down into two components for processing:
1. luminance, how bright a pixel is, and
2. chrominance — what hue it is.
This is not the only way a color signal could be created. Computers, for example,
generally use the RGB method. This defines each pixel in three values: so much red +
so much green + so much blue. The luminance/chrominance system was created to
make color signals compatible with black-and-white TV sets. All the B&W TV needs,
of course, is the luminance information, and the luminance portion of the color signal
is exactly the same as black and white signals were before color TV was invented.
Broadcast signals combine luminance and chrominance information into a single
signal by laying the chrominance information over the luminance information in a
way that black-and-white sets cannot read. This is called composite video.
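As a sketch of the split for a single pixel (illustrative Python using the classic BT.601 luminance weights; real encoders differ in gamma handling and scaling details):

def rgb_to_yuv(r, g, b):
    # r, g, b in the range 0.0 to 1.0
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance: all a B&W set needs
    u = 0.492 * (b - y)                     # chrominance: coded colour-
    v = 0.877 * (r - y)                     #   difference signals
    return y, u, v

print(rgb_to_yuv(1.0, 1.0, 1.0))   # pure white: y = 1.0, u = v = 0 (no colour)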
Non-broadcast signals can be transmitted in other ways. Some video recording
systems make connections through YC video, which keeps the luminance (Y) and
chrominance (C) signals separate, requiring more complex cables with more
conductors. This system is also called S-Video, because it was first introduced widely
on S-VHS VCRs. S-Video is still used in better quality small format equipment such
as S-VHS and Hi-8, and the analog inputs and outputs of DV VCRs. Professional
video equipment typically processes video in YUV format, also known as component
video. Like RGB, this requires three signals — again Y stands for luminance, and U
and V are coded combinations of the color channels. This more complicated system
has some advantages from a transmission and processing standpoint (though we will
leave the matter of why this is so to expert video engineers). YUV was the standard
format for Betacam VCRs, the professional standard before the introduction of digital
video, and digital gear made for broadcasters now generally employs YUV
connections for input and output to maintain compatibility with older equipment.
Either YUV or YC are clearly superior to composite video, even at the level of home
equipment. In composite video, the Y and C parts of the signal tend to bleed into one
another creating more video noise (visible as grain or fuzziness) and a phenomenon
called "dot-crawl" easily visible when displaying color bars, that makes edges less
distinct. In order to get all the available quality out of a YC signal, though, the parts
must be kept separate all the way from the source to the screen. Thus, for example, the
YC signal from a DV VCR will probably look better on a TV with an S-Video (Y/C)
input than on a set that has only a composite input.

8.5 INTERNATIONAL STANDARDS


Throughout the world, there are several different standards for video signals, which
differ in how many scanning lines the image is divided into, how often the scan
repeats itself, how color is processed, and other technical refinements. Three main
standards are used for television broadcasting and video recording: PAL, SECAM and
NTSC, though each has a couple of sub-variants from country to country. In the United
States all broadcast TV and home video recorders use NTSC. The NTSC system
contains 525 scanning lines, and scans 29.97 frames a second. PAL and SECAM scan
25 frames a second and contain 625 scanning lines. The color system used by PAL
and SECAM is more advanced, and produces more standardized results (a joke among
video engineers is that NTSC stands for Never The Same Color).

In general, PAL and SECAM sets do not have tint controls as US sets do, because they
don't need them. The different standards are completely incompatible, requiring
special equipment to convert one sort of signal to another. European VCRs will not
play U.S. tapes and vice versa, even though the same VHS tape is used in both
machines. A U.S. TV won’t work in Europe, even with a voltage adaptor, and so on.
Cameras see differently than you do. The human eye is a much more sensitive
instrument than a video camera. It has much greater resolution, and a much greater
contrast range – the ability to discern different shades of gray between the lightest and
darkest areas of an image. Never assume that a video image is going to look anything
like what you see with your naked eye. You may see something in fine detail when the
camera can’t. A scene that looks fine to the eye might well be too high contrast to
register without dark areas going completely black, or light areas being overexposed.
Video gives you a viewfinder that shows you what you’re actually getting, more or
less. So remember to discount how things look to your eye, and judge them by how
they look in the viewfinder, because if they don’t look good in the viewfinder, they
sure aren’t going to look good on the screen.

Check Your Progress 1


1. What do you mean by raster?
………………………………………………………………………………
………………………………………………………………………………
2. How do the colours work in a video?
………………………………………………………………………………
………………………………………………………………………………
3. What are the International standards for video?
………………………………………………………………………………
………………………………………………………………………………

8.6 DIGITAL VIDEO


Video is an electrical signal that has minute fluctuations corresponding to the changes
in luminance and chrominance as the transducer scans the raster. The signal is an
ANALOG of the details of the image. Whenever an analog signal is passed through a
piece of equipment, and especially when it is recorded or played back, the equipment
makes little errors and/or adds extraneous stuff. Thus, the signal degrades. Every time
you copy an analog program it gets worse. You've seen this if you've watched VHS
copies of a tape made from other VHS copies, or heard it if you've listened to cassette
audio tapes made from other cassette tapes.
A digital signal works differently. Instead of tracing out the changes along the raster,
it measures each pixel, assigns it a numerical value, and records this in binary code —
a series of pulses representing ones and zeros. Even though we talk about video
images as a series of pixels — which might correspond to the dot grid on the picture
tube or the grid on the pick-up device, an analog video signal records this information
as continually fluctuating waves. A digital signal substitutes a regular measurement of
this continual fluctuation for the complete pattern itself. If the measurement is taken
often enough, there are enough points to describe the original waveform very
precisely.
The benefit of a digital signal is that it is not affected by degradation in copying or
transmission. As long as the distortion along the way is not so large as to obscure the
difference between a zero space and a one pulse, what comes out at the end of a
digital transfer or copy is exactly the same as what goes in. Imagine you put a piece of
thin paper over the picture and traced the outline of the wave as exactly and precisely
as you could in one smooth motion. That's something like what each stage in an
analog system does. Except each new stage doesn't have the original model to follow,
only the tracing made by the previous stage. There's a lot of fine detail in that
waveform, and as it gets copied lots of little mistakes can occur. What's worse, they
compound themselves as the process goes from tracing to tracing and so on. The
weakest link in an analog system is the tape recorder used to store audio and/or video
signals. Analog recording needs lots of physical space on the tape to make an accurate
copy. Thus, high quality analog tape machines have always used wider tape stock, and
run the tape past the heads at a higher speed (meaning that, for, say, any given second
of signal, more tape runs past the head, allowing the recording of that section of signal
to be spread out over a wider surface). As such, high quality analog devices have
always been very expensive.
It is not true that a digital signal is inherently higher quality than an analog one, just
that it's easier to make good quality digital devices smaller and cheaper. The common
analog formats you are familiar with – VHS and 8 mm videotape, cassette audio tape –
are highly compromised in order to make them small enough and cheap enough for
the consumer market. They don't do a very good tracing job.
Digital signals don't trace, they record periodic measurements. This picture represents
the process of digitizing our original segment of audio signal. Imagine at exact
intervals you lay a ruler on the graph and draw a straight line from the baseline up or
down to the waveform edge, as we have begun to do at the left here. As you draw
each line, though, you use a square to read across to the vertical hash marks on the left
and write down the number of the mark the line is closest to in height. In digital
electronics, making these measurements is called sampling. Then what the system
passes along and records is the series of numbers that results. These numbers are
expressed in binary code, a series of ones and zeroes. In order to pass along the coded
signal without error, the system only has to be precise enough to tell the difference
between one and zero, between 'on' and 'off'.

8.7 DIGITAL EDITING/NON-LINEAR EDITING


Thus, digital's superiority over analog comes with multiple copies and surviving
difficult transmission stages. This means digital video can be edited (which is
basically a process of selective copying) without degradation. Another key advantage
to digital video is that the data can be stored and processed on a computer system in a
manner that allows for nonlinear editing (NLE). Analog signals can only be stored on
videotape, and videotape cannot be edited physically (unlike film, which is actually
cut into pieces and reassembled in editing). Video editing can only be accomplished
by laying one section after another onto the tape, in linear order, like making a party
tape on cassette by copying songs from different CDs. If, after you've recorded six
songs, you decide you don't like the second one and want to remove it, you can't just
snip it out and close up the gap, you have to go back to that point and do everything
over again. In contrast a computer has all the video you've digitized available as more
or less randomly accessible bits of data, and the editing program can instruct the
computer to play chunks of data in any order – thus "non-linear". You can make any
kind of change in that second entry, without having to redo anything else past that
point, no matter how many more chunks you've placed in the timeline after it. (Some
of you may have done non-linear editing of sorts by re-arranging play lists of MP3
files stored on your computer.)
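A minimal sketch of why this is cheap for a computer (illustrative Python): the edit is just a list of references into randomly accessible media, so removing an entry never forces the later material to be redone.

timeline = [("song1.wav", 0, 180),   # (media file, start second, end second)
            ("song2.wav", 0, 200),
            ("song3.wav", 0, 150)]

del timeline[1]   # drop the second song: nothing is re-copied; playback
                  # simply follows the shortened list of references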
The process of sampling an analog signal to create a digital bit stream is called AD
(analog-to-digital) conversion. The process of turning digital data streams back into
analog signals that can be used by picture tubes and speakers is called DA
(digital-to-analog) conversion.

The quality of a digital signal varies, first of all, by two key factors:
1. How often the measurement is taken, which is called the sampling rate, and
2. How many possible steps exist on the scale of measurement, called bit depth.
Here is another view of the sampling scale used in the last example. Each possible
level measurement number is represented by a horizontal line. Each moment of
measurement is represented by a vertical line. The digital signal will be created, in
effect, by completely filling in cells in the grid. In this case, while the result will
generally resemble the original, as you look closely, you'll see that there are a number
of small changes that occur within the space of one cell. If we digitize on this scheme,
all that detail will be lost. But this is a pretty coarse grid. It only has 32 possible
vertical steps, and it only measures the wave about 8400 times per second. Bit depths
are measured in bits – the number of binary digits used to record a number — and each
additional bit increases the number of possible steps in the scale by a power of two:
1 bit = 2 steps, 2 bits = 4 steps, 3 bits = 8 steps, 4 bits = 16 steps, etc. So our audio
wave here was sampled on a 5-bit scale (2^5 = 32 steps).
The smallest bit depth used for lo-fi computer audio is 8 bits (256 steps); the standard
for DV audio is 16 bits, which would put over 65,000 gradations in the same space
where our example has only 32! The standard sampling rate for DV is 48 kHz
(KiloHertz — 'kilo-' means 'thousand'), or 48,000 samples per second.
Each increase in bit depth or sampling rate adds a lot more ones and zeroes to the
stream. The more often you sample the signal, and the greater the detail in the
measuring scale, the more data you create.
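A minimal sketch of sampling and quantisation together (illustrative Python, using the DV audio figures just mentioned): a 1 kHz sine wave is measured 48,000 times a second, and each measurement is rounded to one of the 65,536 steps of a 16-bit scale.

import math

sample_rate = 48_000            # samples per second (the DV audio rate)
bit_depth = 16
steps = 2 ** bit_depth          # 65,536 possible levels

samples = []
for n in range(sample_rate // 1000):             # one millisecond of signal
    t = n / sample_rate
    level = math.sin(2 * math.pi * 1000 * t)     # the "analog" value, -1 to 1
    code = round((level + 1) / 2 * (steps - 1))  # nearest of the 65,536 steps
    samples.append(code)                         # these integers are what is stored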
High quality video generates enough data that it's difficult to cram it all through wires
or radio transmissions or onto recording tape. Thus, any digital format outside of a
big-time professional studio employs compression after the signal is sampled. That is,
the data goes through a microprocessor which uses some mathematical scheme to
throw away a certain number of the original bits without losing the overall shape of
the subject. At the other end of the transmission (or at playback if compression is
applied during recording), the electronics decompresses the signal, trying to restore
the full original pattern based on a knowledge of the rules by which it was
abbreviated, applied in reverse. There are two general categories of compression:
lossless compression, which uses a more limited set of abbreviations of a sort that the
original can always be reconstructed exactly, and lossy compression, which shrinks the
data to a much greater extent but sacrifices some fidelity to the original.
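Some back-of-envelope arithmetic shows why compression is unavoidable (the numbers below are illustrative assumptions, not a specification): even modest uncompressed video produces tens of megabytes every second.

width, height, fps = 720, 480, 30      # assumed frame size and rate
bytes_per_pixel = 2                    # assumed colour depth
raw = width * height * fps * bytes_per_pixel    # about 20.7 million bytes/second

compression_ratio = 5                  # e.g., a roughly 5:1 lossy codec
compressed = raw / compression_ratio   # about 4.1 million bytes/second
print(raw, compressed)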
The particular scheme by which a signal is compressed and decompressed is called a
codec. Any codec involved in digital video or audio and most digital still images that
reaches the consumer level is lossy. Different codecs for digital media offer different
levels of quality and different characters to the distortions they may introduce. Most
codecs are scalable — meaning that when the signal is encoded it can be given more
or less compression. At a higher data rate, that is with less compression applied, the
loss may be virtually imperceptible, while at a lower data rate, with higher
compression, degradation may be obvious and objectionable. Some of you may be
familiar with the audio codec known as MP3 (actual name MPEG-1 Audio Layer 3). This is
a lossy codec that applies a high degree of compression by means of a complex
"psycho-acoustic" model. At the higher MP3 bitrates, the resulting sound
file is several times smaller than the original (CD audio is uncompressed), but may
pass as "CD quality" to the casual listener. On the other hand, you've probably heard
streaming RealAudio files on the net which don't sound CD clear. These are created
with a similar lossy codec but at a much higher compression rate, so the resulting files
will be many times smaller, the better to fit through the narrow Internet bandwidth.

A similar issue exists with video equipment — we can only get so much data onto a
hard drive, pass so much at once through the CPU or the bus connecting the drive, get
so many bits onto a finite strip of videotape.

8.8 VIDEO CODECS


There are three main video codecs used in video production. Motion-JPEG, or
MJPEG for short, has traditionally been used in computer-based digital video editing
systems, designed to work with footage that has been recorded with analog gear, and will
be output back to analog. MJPEG is a relatively simple and inefficient compression
scheme. While it is capable of creating very high quality digitizations of source
material, the compression rate must be kept very low (2:1 - 5:1), thus the data rate
very high. At higher compression rates, the image degrades considerably. Top-of-the-
line professional systems like Avid, Discrete Edit and Media 100 employ versions of
MJPEG. These systems are expensive in part because they must include a lot of
computing power to process and store the high data rates necessary. Before the
development of the DV codec, consumer and 'prosumer' computer editors also used
MJPEG — at higher compression rates, and lower quality. The key component of an
MJPEG editing system is the digitizing card inside the computer. Analog signals are
input to the card, which does an AD conversion and compresses the data, which is
then written to the computer's hard drive system (in professional systems, an array of
multiple drives). On play back, the process is reversed, and the card does the
decompression and DA conversion to recreate the picture.
MJPEG has two important advantages for non-linear editing. It compresses each video
frame individually, maintaining easy random access for editing and playback
(it doesn't need to know what was in frame 121 in order to understand frame 122 —
this is called intraframe compression). Also, the fact that it is scalable allows
professional editors to work in a low quality, high compression "draft" mode (20:1 or
higher) that conserves bandwidth. The editing system reads and records time-code
from the tape as the footage is digitized, and the software records all the editing
choices made, creating an Edit Decision List (EDL) — a complete set of instructions
about how the program is assembled relating time code from the source tapes to time
positions in the finished product noting all transitions used. Then the editor can set up
the system to automatically redigitize the video at a high-quality rate, converting just
the pieces needed for the finished product, and automatically reassemble them
according to the plan defined in the low-resolution version. Say the hard drive on your
MJPEG system only holds 2 hours of footage at the quality you need to use. You're
cutting a one hour show, but you've got 20 hours of raw footage. You eliminate a
certain amount of that by reviewing and logging the tapes, deciding which parts are so
useless it's not worth transferring them to the computer. But you still may have 5-10
hours of potentially useful stuff you need to play with in the process of cutting down
to your final hour. So you digitize the material at a higher compression rate so you can
get all 10 hours on the drive, while still leaving working space. The image looks pretty crappy, but it's decent enough to make decisions about what footage to use, where to
make your cuts and so on. This is called an off-line edit — meaning it's not intended
for final viewing. After you finish the offline edit, the computer has a complete record
of how to assemble the production, stored in a program file, a document saved by your
editing application. This is a small file because it only contains instructions about how
pictures and sound should be assembled, not the pictures and sound themselves. The
pictures and sound are stored in separate media files (these take up a lot of space).
Once the off-line edit is finished and the program file is saved, you erase all of the
"draft" media files from the hard drive, and the computer guides you and your VCR
through the (mostly) automated process of redigitizing — putting just the hour or so
of clips you need in the best quality possible onto the now empty space on the hard
drive. This reconstructed high-resolution version is called an online edit — meaning it's what you will send out for the audience to see.
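To make the idea concrete, here is a toy Python sketch of the kind of record an EDL holds: for each event, the source tape, the in and out timecodes on that tape, and where the clip lands in the finished program. The field names and timecode layout are illustrative assumptions, not any real EDL file format.

from dataclasses import dataclass

@dataclass
class EdlEvent:
    source_tape: str       # reel/tape identifier read during digitizing
    source_in: str         # timecode where the clip starts on the source tape
    source_out: str        # timecode where the clip ends on the source tape
    program_in: str        # where the clip begins in the finished program
    transition: str = "cut"

edl = [
    EdlEvent("TAPE_03", "00:12:10:00", "00:12:42:15", "01:00:00:00"),
    EdlEvent("TAPE_07", "00:03:05:10", "00:03:21:00", "01:00:32:15", "dissolve"),
]

# Redigitizing for the online edit only needs the spans named in the EDL,
# not the full tapes:
for e in edl:
    print(f"Capture {e.source_tape} from {e.source_in} to {e.source_out}")

Because the program file stores only such instructions, it stays small; the bulky media files can be erased and recaptured at full quality.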
MPEG2 is a more complex and efficient codec, also scalable. It's capable of achieving
similar visual qualities to typical MJPEG digitizations but at a higher compression
rate, using less bandwidth. There are many varieties of MPEG2, most of which
achieve higher efficiency by the use of interframe compression. Instead of making a
complete record of each frame, this method only records a complete key frame every
so often, say every fifth frame. For the intermediate frames, it records only how they
differ from the key frame — sort of like using ditto marks instead of repeating
everything when you move a line down in a paper with repetitive information. This
takes up a lot less space in terms of the data generated, but requires a lot more
processing power to make the comparisons.
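The key-frame-plus-differences idea can be sketched in a few lines of Python. Real MPEG2 works on motion-compensated blocks rather than raw pixel differences, so treat this purely as a toy illustration of interframe recording.

KEY_INTERVAL = 5  # store a complete key frame every fifth frame

def encode(frames):
    # frames: list of lists of pixel values; returns (kind, payload) pairs
    encoded, key = [], None
    for i, frame in enumerate(frames):
        if i % KEY_INTERVAL == 0:
            key = frame
            encoded.append(("key", list(frame)))   # complete record
        else:
            # record only the pixels that differ from the last key frame
            diffs = [(j, p) for j, (p, k) in enumerate(zip(frame, key)) if p != k]
            encoded.append(("delta", diffs))
    return encoded

def decode(encoded):
    frames, key = [], None
    for kind, payload in encoded:
        if kind == "key":
            key = list(payload)
            frames.append(list(key))
        else:
            frame = list(key)
            for j, p in payload:       # reapply the recorded differences
                frame[j] = p
            frames.append(frame)
    return frames

# Mostly static footage: a single moving dot on a 100-pixel frame.
frames = [[0] * 100 for _ in range(10)]
for i, f in enumerate(frames):
    f[i] = 255
assert decode(encode(frames)) == frames  # reconstruction is exact here

Each delta here is tiny compared with a full frame, which is exactly the saving interframe compression buys; the cost is that decoding a given frame first requires its key frame.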
DVDs and Digital TV employ a variant of MPEG2. MPEG2 has primarily been a
distribution format, not used much in editing because of the extra processing power
required, and the fact that interframe compression makes random access of different
points in the program more difficult. "DV" does not mean any form of Digital Video
as the initials might suggest, but rather a specific codec. The DV codec is used in
consumer and low-to-mid-grade professional digital VCRs and camcorders. In a
camcorder, it digitizes and compresses the video signal right in the camera stage, and
records the resulting bitstream of ones and zeros on the tape instead of an analog
signal. Unlike MJPEG and MPEG, DV is not scalable. It digitizes at a fixed data rate (about 25 Mbit/s for the video stream, not that you need to know that), which amounts to about 5:1
compression on average. This lack of flexibility is one factor that allows the codec to
be engineered with less complicated hardware, making it cost effective. It is also more
efficient than MJPEG — though not as efficient as common MPEG variants —
meaning that a DV image compressed at 5:1 generally looks better overall than an
MJPEG image compressed the same amount. DV achieves this by sacrificing some bit depth in recording color information while preserving the full bit depth of luminance information. Most scenes don't have a wide range of colors varying through small incremental steps — a continuous gradient, in other words — so the reduced ability to record intermediate hues doesn't matter. The only time you may see the effect of the DV codec is in an image that does have a continuous gradient — for example a sky that fades gradually from bright light to deep blue. In this case, the codec may produce a
banding effect. This happens very rarely though. (MJPEG compression of the same
image at a similar data rate, in contrast, would probably look fuzzier or muddier, but
wouldn't show the banding.)
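A quick back-of-the-envelope calculation shows why even DV's fixed rate demands serious storage. The 25 Mbit/s figure below is the commonly cited rate for the DV video stream alone; audio and overhead push a real capture closer to 13 GB per hour.

VIDEO_BITRATE = 25_000_000  # bits per second, DV video stream (approx.)

def gb_per_hour(bitrate_bps):
    return bitrate_bps * 3600 / 8 / 1_000_000_000

print(f"DV video alone: about {gb_per_hour(VIDEO_BITRATE):.1f} GB per hour")
# => about 11.2 GB/hour before audio; hence the large (often multi-drive)
#    storage in editing systems.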
DV is a godsend to film/video education programs and all low-budget media makers
because it produces very high quality images at reasonable cost, and it also makes good-quality computer video editing practical at a very reasonable cost. Compared to
the previous 'prosumer' formats, S-VHS and Hi-8, DV really narrows the gap between
"broadcast quality" and what individuals, small businesses and educational institutions
can afford.

8.9 DV FORMATS
This brings us to an area of possible confusion. While "DV" refers primarily to a
codec, it also is the name of a format — meaning a particular system of recording
something to tape. For example, VHS and 8mm video are different formats - using
different size and shape tape cassettes, and recording information on the tape in
slightly different ways. There are no fewer than four different formats using the DV
codec. There is, first of all, straight DV, designed as a high end consumer format. Its
spec includes two possible tape cassette sizes — the "mini" DV tape you've probably
seen or used, and a mid-size tape, just a bit bigger than an audio cassette. Both
Panasonic and Sony decided the DV spec wasn't quite "professional" enough for their
"industrial" video products, and created variations, each of course incompatible with the other. Sony's is called DVCAM and Panasonic's is called DVCPRO. These
formats use exactly the same codec as standard DV, and the same tape cassettes.
Where they differ is how they record the data on the tape. Both DVCAM and
DVCPRO take up more of the tape to record a given amount of data than does normal
DV.
While a mini cassette holds 60 minutes of footage, the same tape recorded with
DVCAM or DVCPRO holds only 42 minutes of footage. This is supposed to make the
tape more "robust", less subject to wear or errors. Frankly, I have yet to meet anyone
who has a complaint about the robustness of regular DV, so the variants seem mainly
to have economic importance, and serve as a nuisance for people trying to put systems
together with pieces from different manufacturers. Other manufacturers, such as JVC,
use standard DV in both their consumer and industrial lines. In fact, Panasonic's
industrial line includes some less-expensive models that are standard DV, not
DVCPRO. It gets very confusing. Standard DV is often called "mini-DV" because
most standard DV format gear will only accept the mini-size tapes and DVCAM and
DVCPRO machines will generally accept the larger tapes (except for small
camcorders like our pd100a). This is a bit of a misnomer, because neither tape size is
specific to the format. There are a few DV decks that accept the bigger tapes and all
DV, DVCAM and DVCPRO gear can use the "mini" tapes. (Though some tapes are
labeled "DV" by the manufacturer and some "DVCAM" or "DVCPRO", they're all
basically the same and interchangeable. The biggest difference is that the duration
listed on the packaging is calculated for the format in question — the same tape
labeled as a DV-60 sold by a consumer-market vendor would be labeled a DVCAM-
42 sold by a professional-market vendor.)
Sony offers several products similar in design in both their consumer line using DV,
and in their professional line, using DVCAM. For example, our pd100a DVCAM
camcorders are identical to the TRV-900 DV camcorder except for the format, and the
'consumer' DHR-1000 desktop VCR is the DV version of the 'professional' DSR-30
DVCAM VCR. DVCAM and DVCPRO machines will also play standard DV tapes.
Panasonic claims some DVCPRO decks will also play back DVCAM. DVCAM gear will not play back DVCPRO. Sony consumer DV gear will play back DVCAM tapes,
but standard DV gear from other manufacturers usually plays back ONLY DV. With
the exception of two new pieces from Sony, which can record in either DV or
DVCAM, all DV-codec gear records only in its specified format. To offer a relevant
example: say you have a Canon digital camcorder. It records in DV format. That tape
will play just fine in the Sony DVCAM VCRs in our editing systems. Once you
compile your program and record it back to the Sony VCR though, you'll wind up
with a tape in DVCAM format — which won't play back in your Canon camcorder.
Not content with this level of confusion, Sony introduced yet another format using the
DV codec for their less-expensive consumer digital camcorders. This is Digital-8
which records DV codec data onto a Hi-8 format tape. Since the tape is a different
shape entirely, this is completely incompatible with the other digital formats (though
Digital-8 camcorders will also play back old analog 8mm tapes!). Again, the
differences noted here are all physical. Each of these systems uses exactly the same
codec, and thus — at least as far as the recorder stage of the system is concerned —
produces identical quality results.

8.10 VIDEO STANDARDS


A standard is a set of rules or pre-set specifications agreed by consensus between vendors and manufacturers, and accepted by the industry and the people involved in video production. All standards are documented by an authorized body such as the ITU-T or ISO. Video standards mainly fall into two categories:

z PAL - Phase Alternating Line.
z NTSC - National Television System Committee.
The PAL standard is widely used in the UK and Europe. In North and South America and in Japan, the NTSC standard is followed. There is also a third standard, SECAM, which is used in France. PAL uses 625 lines with a frame rate of 25 frames/s. Under the NTSC standard, a single frame consists of 525 horizontal scan lines, and the picture is laid down on the screen in two passes. Using two passes to prevent flicker is known as interlacing. High Definition Television (HDTV) uses around 1080 active lines, with a 16:9 aspect ratio (wide screen) rather than the 4:3 ratio.
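A short calculation makes clear why the codecs discussed earlier are unavoidable. Using the common digitized frame sizes for NTSC and PAL, assumed here at 24 bits per pixel, uncompressed standard-definition video runs to roughly 250 Mbit/s:

def raw_mbit_per_sec(width, height, fps, bytes_per_pixel=3):
    # 3 bytes per pixel = 24-bit color (an assumption for illustration)
    return width * height * bytes_per_pixel * 8 * fps / 1_000_000

print(f"NTSC-size (720x480 @ 30 fps): {raw_mbit_per_sec(720, 480, 30):.0f} Mbit/s")
print(f"PAL-size  (720x576 @ 25 fps): {raw_mbit_per_sec(720, 576, 25):.0f} Mbit/s")
# Both work out to about 249 Mbit/s uncompressed, far beyond what consumer
# drives, buses, or tapes of the era could sustain.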
A video is a presentation of a continuous stream of images, usually incorporating digital audio as well. Video consists of a series of frames, each slightly different from the last, that when played rapidly in sequence make objects appear to move, just as in animation. Multimedia projects may use individually cut frames and other image files or digital video frames, usually within a small window of the whole project.
Among all the media elements, video is the most demanding where additional speed-up hardware (video cards, video compression cards, fast disk arrays) and storage are concerned.
Digital video comes in several compressed formats, usually associated with the platform for which the format was originally designed. QuickTime, AVI and MPEG are the most common formats.

QuickTime
QuickTime is Apple Computer’s cross-platform technology for displaying temporal
information dynamically. The information can be stored as a variety of data types, including digitized video, audio, text, sprites, music data and time code. The data in these various media tracks may or may not be compressed, and the tracks are synchronized for consistent playback. The QuickTime architecture is scalable and
extensible, allowing for many different compression schemes.

AVI
AVI stands for Audio Video Interleave. Defined by Microsoft, it is a special case of the RIFF (Resource Interchange File Format) and the most common format for audio/video data on the PC. AVI belongs to Video for Windows, Microsoft's alternative to Apple's QuickTime technology. Like QuickTime, Video for Windows files (.AVI format) display temporal information dynamically – primarily digitized video and audio.
Video for Windows is an entire system for handling video in Microsoft Windows.
It was part of MS Windows 3.1. The original Video for Windows is a collection of
16-bit Windows utilities, dynamic link libraries, and other components. The AVI file format is a central part of Video for Windows.

MPEG
MPEG, an acronym for Moving Picture Experts Group, is a standard for video compression and playback. MPEG is a highly asymmetrical format; that is, it takes much longer to compress the video than it does to play it back. In addition, the
compression algorithms that produce MPEG files compress both interframe and
intraframe. The result is a highly efficient, but inflexible medium for digital video that
is ideal for playback over computer networks.

MPEG4
During the last decade, a spectrum of standards in digital video and multimedia has
emerged for different applications. These standards include the ISO JPEG for still
images [JPEG-90]; ITU-T H.261 for video conferencing from 64 kilobits per second
(kbps) to 2 Megabits per second (Mbps) [H261-91]; ITU-T H.263 for PSTN-based
video telephony [H263-95]; ISO MPEG-1 for CD-ROM and storage at VHS quality
[MPEG1-92]; the ISO MPEG-2 standard for digital TV [MPEG2-94]; and the recently
completed ISO/MPEG4 international standard for multimedia representation and
integration [MPEG4-98]. Two new ISO standards are under development to address
the next-generation still image coding (JPEG2000) and content-based multimedia
information description (MPEG7).
The successful convergence and implementation of MPEG1 and MPEG2 have
become a catalyst for propelling the new digital consumer markets such as Video CD,
Digital TV, DVD, and DBS. While the MPEG1 and MPEG2 standards were primarily
targeted at providing high compression efficiency for storage and transmission of
pixel-based video and audio, MPEG4 envisions supporting a wide variety of multimedia applications and new functionalities for object-based audio-visual (AV) content. The recent completion of MPEG4 version 1 is expected to provide a stimulus to the emerging multimedia applications in wireless networks, the Internet, and content creation.
The MPEG4 effort was originally conceived in late 1992 to address very low bit rate
video (VLBR) applications at below 64 kbps such as PSTN-based videophone, video
email, security applications, and video over cellular networks. The main motivations
for focusing MPEG-4 at VLBR applications were:
z Applications such as PSTN videophone and remote monitoring were important,
but not adequately addressed by established or emerging standards. In fact, new
products were introduced to the market with proprietary schemes. The need for a
standard at rates below 64 kbps was evident;
z Research activities had intensified in VLBR video coding, some of which have
gone beyond the boundary of the traditional statistical-based and pixel-oriented
methodology;
z It was felt that a new breakthrough in video compression was possible within a five-year time window. This "quantum leap" would likely make compressed video quality at below 64 kbps adequate for many applications such as videophone.
Based on the above assumptions, a work plan was generated to have the MPEG-4
Committee Draft (CD) completed in 1997 to provide a generic audiovisual coding
standard at very low bit rates. Several MPEG-4 seminars were held in parallel with the
WG11 meetings, many workshops and special sessions have been organized, and
several special issues have been devoted to such topics. However, as of the July 1994 WG11 meeting in Norway, there was still no clear evidence that a "quantum leap" in compression technology was going to happen within the MPEG-4 timeframe. On the other hand, the ITU-T had embarked on an effort to define the H.263 standard for videophone applications in PSTN and mobile networks. The need for defining a pure compression standard at very low bitrates was, therefore, not entirely justified.
In light of the situation, a change of direction was called for, to refocus on new or improved functionalities and applications that are not addressed by existing and emerging standards. Examples include object-oriented features for content-based multimedia databases, error-robust communication in wireless networks, and hybrid natural and synthetic image authoring and rendering. With the technological convergence of digital video, computer graphics, and the Internet, MPEG-4 aims at
providing an audiovisual coding standard allowing for interactivity, high compression, and/or universal accessibility, with a high degree of flexibility and extensibility.
In particular, MPEG-4 intends to establish a flexible content-based audio-visual
environment that can be customized for specific applications and that can be adapted
in the future to take advantage of new technological advances. It is foreseen that this
environment will be capable of addressing new application areas ranging from
conventional storage and transmission of audio and video to truly interactive AV
services requiring content-based AV database access, e.g. video games or AV content
creation. Efficient coding, manipulation and delivery of AV information over the Internet
will be key features of the standard.
Check Your Progress 2
1. What is Digital Video?
………………………………………………………………………………
………………………………………………………………………………
2. What do you mean by Digital Editing?
………………………………………………………………………………
………………………………………………………………………………
3. What are the Video Codecs? What is their role?
………………………………………………………………………………
………………………………………………………………………………

8.11 LET US SUM UP


In this lesson, we have discussed the meaning of video, with its various formats and
application. Multimedia is computer-based system integration and presentation control
of different media data types, i.e., text, graphics, still images, video, animation, and
audio in one or more combinations. Video is an electrical signal that has minute
fluctuations corresponding to the changes in luminance and chrominence as the
transducer scans the raster. We have also discussed the standards and software formats
used in various video technologies. In this lesson, we have concentrated on the video
capabilities and its potential in multimedia applications.

8.12 LESSON END ACTIVITY


Discuss among group members how you can capture and use a video of any format to
make a difference between an ordinary multimedia presentation and a professionally
spectacular one.

8.13 KEYWORDS
Video: Video is an electrical signal that has minute fluctuations corresponding to the changes in luminance and chrominance as the transducer scans the raster.
DV: Refers primarily to a codec, but it is also the name of a format — meaning a particular system of recording something to tape.
PAL: Phase Alternating Line, a video standard.
NTSC: National Television System Committee, a video standard.

8.14 QUESTIONS FOR DISCUSSION
1. Why is it said that video is a continuous stream of images? Discuss.
2. What is the latest video standard?
3. What are DV Formats?
4. Discuss the major video standards.

Check Your Progress: Model Answers


CYP 1
1. The camera turns the image on the surface of the pickup into an electric
signal by scanning the surface of the pickup. It reads the surface much
like you read this page. It starts in the upper left hand corner, reads across
a line horizontally, then when it gets to the right hand edge, it drops down
and starts reading another line from left to right. The entire pattern of
scanned lines is called the raster. As the scan passes each point on the
pickup device, it modulates the output video signal according to the
amount of light striking that spot. After the scan finishes the last line on
the bottom, the camera adds some timing information called a sync pulse
to the outgoing signal, then it goes back to the top left and starts over.
Thus, the video picture is translated into a continuous single electric
signal which fluctuates according to the brightness changes at each point
in the raster. The picture tube reverses the process. The surface of the
picture tube is covered with phosphors that give off light when they are
struck by electrons. The tube radiates an electron beam that recreates the
raster by scanning over the phosphor covered surface. The strength of the
beam varies with the fluctuations in the video signal, causing the
phosphors at each point in the raster to glow more or less brightly.
2. What we have been describing so far is a monochrome (black and white)
video system. Color devices are more complicated. Color video works by
breaking down light into its primary component colors: red, green and
blue (the term RGB refers to video systems that maintain separate signal
paths for each primary color). A color picture tube is covered with a
pattern of phosphors that glow red, green and blue. Most color tubes have
three, rather than one, electron beams, one for each primary color. The
beams must pass through a shadow mask to reach the phosphors. The
beams are aligned so that as they pass through the mask, the beam
carrying the red signal will only fall on the red phosphors, and so on.
3. Throughout the world, there are several different standards for video
signals, which differ in how many scanning lines the image is divided
into, how often the scan repeats itself, how color is processed, and other
technical refinements. Three main standards are used for television
broadcasting and video recording: PAL, SECAM and NTSC, though each
has a couple of sub-variants from country to country. In the United States all
broadcast TV and home video recorders use NTSC. The NTSC system
contains 525 scanning lines, and scans 29.97 frames a second. PAL and
SECAM scan 25 frames a second and contain 625 scanning lines. The
color system used by PAL and SECAM is more advanced, and produces
more standardized results.

CYP 2
1. Video is an electrical signal that has minute fluctuations corresponding to
the changes in luminance and chrominence as the transducer scans the
raster. The signal is an ANALOG of the details of the image. Whenever
an analog signal is passed through piece of equipment, and especially
when it is recorded or played back, the equipment makes little errors
and/or adds extraneous stuff. Thus, the signal degrades. Every time you
copy an analog program it gets worse. You've seen this if you've watched
VHS copies of a tape made from other VHS copies, or heard it if you've
listened to cassette audio tapes made from other cassette tapes.
A digital signal works differently. Instead of tracing out the changes along
the raster, it measures each pixel, assigns it a numerical value, and records
this in binary code — a series of pulses representing ones and zeros. Even
though we talk about video images as a series of pixels — which might
correspond to the dot grid on the picture tube or the grid on the pick-up
device — an analog video signal records this information as continually
fluctuating waves. A digital signal substitutes a regular measurement of
this continual fluctuation for the complete pattern itself. If the
measurement is taken often enough, there are enough points to describe
the original waveform very precisely.
2. Thus, digital's superiority over analog shows when making multiple copies and surviving difficult transmission stages. This means digital video can be
edited (which is basically a process of selective copying) without
degradation. Another key advantage to digital video is that the data can be
stored and processed on a computer system in a manner that allows for
nonlinear editing (NLE). Analog signals can only be stored on videotape,
and videotape cannot be edited physically (unlike film, which is actually
cut into pieces and reassembled in editing). Video editing can only be
accomplished by laying one section after another onto the tape, in linear
order, like making a party tape on cassette by copying songs from
different CDs. If, after you've recorded six songs, you decide you don't
like the second one and want to remove it, you can't just snip it out and
close up the gap, you have to go back to that point and do everything over
again. In contrast a computer has all the video you've digitized available
as more or less randomly accessible bits of data, and the editing program
can instruct the computer to play chunks of data in any order — thus,
"non-linear", you can make any kind of change in that second entry,
without having to redo anything else past that point, no matter how many
more chunks you've placed in the timeline after it. (Some of you may
have done non-linear editing of sorts by re-arranging play lists of MP3
files stored on your computer.)
3. There are three main video codecs used in video production. Motion-
JPEG, or MJPEG for short, has traditionally been used in computer-based
digital video editing systems, designed to work with footage that has been
recorded with analog gear, and will be output back to analog. MJPEG is a
relatively simple and inefficient compression scheme. While it is capable
of creating very high quality digitizations of source material, the
compression rate must be kept very low (2:1 - 5:1), and thus the data rate very high. At higher compression rates, the image degrades considerably.
Top-of-the-line professional systems like Avid, Discrete Edit and Media
100 employ versions of MJPEG. These systems are expensive in part
because they must include a lot of computing power to process and store
the high data rates necessary. Before the development of the DV codec,
consumer and 'prosumer' computer editors also used MJPEG — at higher
compression rates, and lower quality. The key component of an MJPEG editing system is the digitizing card inside the computer. Analog signals
are input to the card, which does an AD conversion and compresses the
data, which is then written to the computer's hard drive system (in
professional systems, an array of multiple drives). On play back, the
process is reversed, and the card does the decompression and DA
conversion to recreate the picture.

8.15 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.


UNIT IV

LESSON

9
MULTIMEDIA AND THE INTERNET

CONTENTS
9.0 Aims and Objectives
9.1 Introduction
9.2 History of the Internet
9.3 The Internet
9.4 Who Owns the Internet?
9.5 Size of the Internet
9.6 Services on the Internet
9.7 How the Internet Works?
9.8 A Network of Networks
9.9 Data Communication
9.10 Tools for the Web
9.11 How does TCP/IP Work?
9.12 Worldwide Networking
9.13 Enables Communication
9.14 E-mail
9.15 Modems
9.16 Communication Software
9.17 Let us Sum up
9.18 Lesson End Activity
9.19 Keywords
9.20 Questions for Discussion
9.21 Suggested Readings

9.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to understand:
z The concept of Multimedia and the Internet
z History of the Internet
z The Internet and its technology
z Services on the Internet
z How the Internet works
z Tools for the Web

9.1 INTRODUCTION
Multimedia with the aid of the Internet allows revolutionary new ways to provide a variety of services such as video-on-demand, interactive TV, access to digital libraries, distance training, collaborative work, videoconferencing, and many others. The multimedia and information superhighway technologies, including the World Wide Web, have already created many benefits, but we can still only guess at the many benefits these liberating new technologies will create in the future. Multimedia, the Internet, and the Web are changing our lives!
In this lesson, the fundamental technical concepts and principles of Internet and Web
are presented. Special emphasis is given to the enabling techniques that allow video
over IP. The synergy between the Internet and multimedia promises to bring a
tremendous explosion in application possibilities. The second part is on present and
future interactive multimedia applications on information superhighways.
Although the development and use of Internet multimedia applications are increasing,
the ability to manipulate and process multimedia information on the Internet is
missing. The Internet's rich connectivity makes it an ideal blueprint for a high-
performance computing system: millions and millions of heterogeneous computing
nodes that can support open-standards protocols for communication and exchange of
information. The implementation of high-performance multimedia computing on the
Internet requires mapping the application computation onto a set of networked
processing resources. Fortunately, multimedia processing exhibits a high degree of
parallelism that can benefit from the Internet architecture's concurrent nature.

9.2 HISTORY OF THE INTERNET


The Internet is not a recent innovation; it traces its history back more than forty years
to a project called ARPANET. The Internet emerged from the ARPANET (the
network of the Pentagon’s Advanced Research Project Agency), which was sponsored
by the U.S. Department of Defence, to link the enormous number of private sector and
university-based researchers working on Defence-funded projects. Instead of
performing its own research, ARPA (a branch of the Department of Defence), which
became DARPA (Defence Advanced Research Projects Agency) in 1972, regularly
funded research projects related to technological development or military problems.
At first this interconnection of experimental and production networks was called the
‘DARPA Internet’, but later the name was shortened to just ‘The Internet.’
In the 1960s, ARPA became interested in developing a way for computers to
communicate with each other and began to fund programs at universities and
corporations. A network would both advance American technological development
and provide a secure command and control over information during wartime. In the
1960s, the cold war between the western nations and the Warsaw Pact countries,
principally the Soviet Union, almost resulted in the beginning of World War III. In
1962, the Soviet Union placed nuclear-armed missiles on the island of Cuba, just
ninety miles off the coast of the United States. The U.S. military realized that a
nuclear attack on major communication centers would paralyze the U.S. defence
system completely. The ARPANET program was developed to devise a system of
decentralized computer linkages that would reduce dependence on major
communication centers to link national security sites. The result was a technology that
reduced data transmitted over a network into packets that were transmitted
independently from one another through connected computers throughout the country.
The packets were received from all over the country by the target computer and
reassembled into coherent messages.
Access to the ARPANET in the early years was limited to the military and universities
doing defence research. Cooperative, decentralized networks such as UUCP, a

worldwide Unix communications network, and USENET (User's Network) came into
being in the late 1970s, initially serving the university community and, later,
commercial organisations. In the early 1980s, more-coordinated networks, such as the
Computer Science Network (CSNET) and BITNET (Because It is There Network),
began providing nationwide networking to the academic and research communities.
These networks were not part of the Internet, but later special connections were made
to allow the exchange of information between the various communities.
In the early 1990s, a coding scheme called HTML, the HyperText Markup Language, was developed. HTML is composed of a set of formatting indicators that can be embedded in information. These formatting indicators are interpreted by software called a browser. An early browser that popularized this format was called Mosaic. Documents formatted this way make up what is called the World Wide Web, or Web. The Internet is not the same thing as the Web: the World Wide Web is the name applied to the files available on the Internet that use HTML tags for formatting. Other types of traffic also travel the Internet, including FTP, Gopher, IRC, digital video and others. The
WWW is a distributed hypermedia environment within the Internet which was
originally developed by the European Particle Physics Laboratory (CERN). The
World Wide Web allows multimedia information to be located on a network of
servers around the world which are interconnected, allowing one to travel through the
information by clicking on hyperlinks. Any hyperlink (text, icon or image in a
document) can point to any document anywhere on the Internet.
In 1986, the National Science Foundation Network (NSFNET), which linked
researchers across the country with five supercomputer centers, came into existence.
Soon expanded to include the mid-level and statewide academic networks that
connected universities and research consortiums, the NSFNET began to replace the
ARPANET for research networking. The ARPANET was honorably discharged
(and dismantled) in March 1990. CSNET soon found that many of its early members
(computer science departments) were connected via the NSFNET, so it ceased to exist
in 1991.
Most of the growth in the WWW (World Wide Web) as an information and commerce
channel has occurred since 1992. The number of connections to the WWW has been
growing at an accelerating rate ever since. Although the computers connected to the WWW were primarily at universities and at a very few businesses at first, as more and more businesses connected to the net, the net as a general business tool was born. The present popularity of the WWW as a commercial medium (in contrast to
other networks on the Internet) is due to its ability to facilitate global sharing of
information and resources, and its potential to provide an efficient channel for
advertising, marketing, and even direct distribution of certain goods and information
services. In a fairly short period of time, the movement in Net connectivity began to
shift from business-to-business linkages to focus on home connections.
Therefore, initially the Internet had a humble mission, to explore experimental
networking technologies that would link researchers with remote resources such as
large computer systems and databases. The success of ARPANET helped cultivate
numerous other networking initiatives, which grew up intertwined; years later, these have evolved into an ever-expanding, complex organism comprising tens of
millions of people and tens of thousands of networks.

9.3 THE INTERNET


The Internet is a network of thousands of computer networks connecting millions of
people all over the world. From the communication perspective, the Internet can be
defined as the community of people and organisations that communicate with the help
of technology. The Internet is a vast information system connecting millions of
computers worldwide allowing people to communicate, conduct business, research,
sell, purchase, etc. Every piece of information we would ever need is available on the
Internet; we just need to learn how to access it. The Internet has rapidly changed from
a network connecting scientists and engineers to a network connecting everyone.
E-mail addresses are as common as telephone numbers with the advent of the
‘Information Superhighway.’ The designers of the original Internet could not possibly
have foreseen the current success of their creation or the myriad of purposes that it is
being used for now and will be used in the future. Although its original purpose was
to provide researchers with access to expensive hardware resources, the Internet has
demonstrated such speed and effectiveness as a communication medium that it has
transcended the original purpose. It has, in recent years, grown so large and powerful
that it is now an information and communication tool one cannot afford to ignore.
Today, the Internet is being used by all sorts of people and organizations –
newspapers, publishers, TV stations, celebrities, teachers, librarians, hobbyists, and
business people—for a variety of purposes, from communicating with one another to
accessing valuable services and resources. You can hardly pick up a newspaper or
magazine without reading about how the Internet is playing a part in someone’s life or
project or discovery. It is interesting to understand the significance of the Internet’s
growth and popularity. The speed, convenience, and very low cost of data
transmission result in greater immediacy in human communication and have unleashed
an extraordinary explosion of new ideas and services. Vast quantities of inter-personal
communication are undertaken using it, and large numbers of documents are
accessible from servers located throughout the world. Initially, access to data on the
Internet required considerable technical capability; but the days of obscure interfaces
are quickly receding. Most important among the internet services are electronic mail;
menu-based search capabilities to discover and access text documents anywhere in the
world; and ‘world-wide web’ (WWW) servers, which provide access to compound
documents containing several different formats (including image, voice, video etc.),
whose elements may be stored on different machines scattered throughout the world.
With professional-quality services now available, the kinds of documents accessible via the Internet are no longer limited to research papers, discussions on
varied topics, films and life on the net, games software and university course material.
For example, university library catalogues are accessible over it; specialist collections
of scientific papers and data can be located and copied using it; government
committees make their discussion papers available over it; and government agencies
publish reports, submissions and proceedings on servers connected to it.
The services are maturing beyond structured data and text. Sound, graphics and
images can also be transmitted. Text messages can be delivered directly to fax
machines. Synchronous conversations are being supported, particularly in text, but
also using sound. Video transmission is being experimented with in many locations, and
the ‘video-phone’ may well become widely available on the Internet in the near
future.
The Internet is commonly described as a ‘network of networks.’ It does not just
connect one computer with another; it connects the computer with all other Internet-
connected computers. The Internet is not just a bunch of computers, rather it is a
perpetually expanding universe with its own geography, weather, and dynamic
cultures. The Internet will continue to flourish. It is, however, in transition from a
multi-national campus-based family, to a multi-purpose community which services
research, educational, corporate and governmental needs quite generally. In this
cyberspace, people communicate across time zones without ever seeing each other,
and information is available 24 hours a day from thousands of places. The Internet is
already the largest computer network in the world and, in terms of connected
networks, people, and resources; it is getting larger every minute.

9.4 WHO OWNS THE INTERNET?

No one owns the Internet. No single person, company, university, or government funded it. Every person who makes a connection, every group whose local area
network (LAN) becomes connected, owns a slice of the Internet. It is a global
collection of networks, both big and small. These networks connect together in many
different ways to form the single entity that we know as the Internet. In fact, the very
name comes from this idea of interconnected networks. To date, there is no single
agency or organisation totally responsible for controlling the Internet. However, it
does not mean it is not monitored and maintained in different ways. The Internet
Society, a non-profit group established in 1992, oversees the formation of the policies
and protocols that define how we use and interact with the Internet.

9.5 SIZE OF THE INTERNET


Nobody really knows how many computers and networks actually make up the
Internet. It is estimated that there are now as many as 30,000 networks connecting
more than 5 million computers and around 70 million users logging on to it in around
90 countries. Whatever is the actual number, however, it is clear that this number is
increasing day by day. In India also, the Planning Commission, in its Ninth Plan
projections, has predicted a sharp increase in the use of the Internet. There will be more than 1.6 million Internet users by the end of the year 2002. The number of e-mail users is 20,000 at present, and they are expected to rise to 50 lakh (a 250-times increase) by the end of the Plan period.
The types of resources accessible via the Internet are growing at an astounding rate.
The term resource describes anything you can access on the Internet, no matter where
it is physically located. Examples of some Internet resources are a database of
regularly updated weather information, an online magazine, a case study, and an
archive of daily newspaper articles. A resource can also be a mailing list or a
newsgroup that brings together people from all over the world to discuss shared
interests such as cricket, cooking, cinema, poetry, romance etc. In short, there are
literally tens of thousands of servers, archive sites, mailing lists, newsgroups, and
databases available on the Internet.
The networks on the Internet are not controlled by a central brain, such as a powerful
supercomputer; rather, all the networks and computers act as peers in the exchange of
information and communication. The technology that makes it happen is known as
internetworking; it creates universality among disparate systems, enabling the
networks and computers to communicate. This method of networking is very flexible
and robust. It allows diverse computers and systems to communicate by means of
networking software, not proprietary hardware. If a network goes down—meaning it
is not available to transfer information—the packets can be rerouted to other networks
in many cases. This dynamic alternate routing of information creates a very persistent
means of communication.
While most people do not care about these standards and technical details, an
understanding of the underlying infrastructure will help in learning to use the Internet
properly and in taking full advantage of its powerful capabilities.

9.6 SERVICES ON THE INTERNET


The primary purpose of the Internet is information sharing. For almost any information search, it has become common to say 'it will be on the Internet.' Almost all information is available on the Internet. Besides information services, the Internet offers the following capabilities and uses:
z Information resources
z Free or shareware software
z Virtual colleges and classes
z Customer services and information by commercial organisations
z Customer feedback and support
z Books, papers, case studies and course materials
z Games, music and other sort of entertainment
z E-mail
z Chat
z Library catalogues
z Usenet newsgroup and electronic mailing lists
z Sale and purchase services
z Virtual payments
z Stocks monitoring
z Electronic newspapers, magazines, journals etc.
z Airline and railway reservations
z Video conferencing
z Job search, interviews etc.

9.7 HOW THE INTERNET WORKS?


If we ask an Internet wizard what this network is all about, most probably we will get
a long and technical lecture. Fortunately, the most important principle of all is that
you do not have to fully understand how the Internet works to use it. Most of the users
are pounding away at keyboards and communicating merrily, with no knowledge of
how the Internet works. Now, we will discuss some of the basic principles that
underlie the Internet.
The Internet is a huge collection of millions of computers, all linked together on a
computer network. The network allows all of the computers to communicate with one
another. A home computer is usually linked to the Internet using a normal phone line
and a modem that communicates to an Internet Service Provider (ISP). A computer in
a business or office has a Network Interface Card (NIC) that directly connects it to a
Local Area Network (LAN) inside the business. The business then connects its LAN
to an ISP using a high-speed phone line like a T1 line. A T1 line can handle
approximately 1.5 million bits per second, while a normal phone line using a modem
can usually handle 30,000 to 50,000 bits per second.
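A quick calculation puts these line speeds in perspective; for instance, transferring a 1 MB file:

FILE_BITS = 1_000_000 * 8  # a 1 MB file expressed in bits

for label, bps in [("T1 line", 1_500_000), ("dial-up modem", 50_000)]:
    print(f"{label}: about {FILE_BITS / bps:.0f} seconds for 1 MB")
# => roughly 5 seconds over a T1 versus more than two and a half minutes
#    over a 50,000 bit/s modem connection.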
ISPs then connect to larger ISPs, and the largest ISPs maintain fiber-optic backbones
for an entire nation or region. Backbones around the world are connected through
fiber-optic lines, undersea cables, or satellite links. In this way, every computer on the
Internet is connected to every other computer on the Internet.

9.8 A NETWORK OF NETWORKS


The Internet is an incalculably massive network of networks. The Internet is made up
of little Local Area Networks (LANs), citywide Metropolitan Area Networks
(MANs), and huge Wide Area Networks (WANs) that connect computers for
organisations all over the world. These networks are hooked together with everything
from regular dial-up phone lines to high-speed dedicated leased lines, satellites,
microwave links, and fiber optic links. And the fact that they are on the Internet means
that all these networks are interconnected. This network web extends all over the

world. There are so many networks interconnected within the Internet that it is
impossible to show an accurate, up-to-date picture. Every day, new computers and
links are being added. It is estimated that a new network is added every 10 minutes.

9.9 DATA COMMUNICATION


The computers on a network communicate with one another. For this they must use the same communication protocols. There are many protocol standards, such as DECnet, SNA, IPX, and AppleTalk, but to communicate, two computers are required to use the same protocol at the same time. Researchers at DARPA worked on a protocol that would be able to handle larger numbers of users, and the resultant TCP/IP was born in the 1970s.
The U.S. government in 1978 accepted this more sophisticated technology, and
TCP/IP became the preferred networking tool.
TCP/IP, which stands for Transmission Control Protocol/Internet Protocol, is the
language of the Internet. You may speak French and I may speak Hindi, but if we both
speak English, we can communicate. So, any computer that wants to communicate on
the Internet must speak TCP/IP.
TCP/IP was a part of an experiment in internetworking, that is, connecting different
types of networks and computer systems. First used ubiquitously on the ARPANET in
1983, it was also implemented and made available at no cost for computers running
the Berkeley Software Distribution (BSD) of the Unix operating system. TCP/IP,
developed with public funds, is considered an open, non-proprietary protocol, and
there are now implementations of it for almost every type of computer on the planet.
Non-proprietary means that no one company has exclusive rights to the products
needed to connect to the Internet. Any number of companies make the hardware and
software necessary for the network connection.
TCP/IP is not the only protocol suite that is considered open. Since the early 1980s,
the International Organisation for Standardization (ISO) has been developing the
Open Systems Interconnection (OSI) protocols. While many of the OSI protocols and
applications are still evolving, a few are actually being used in some networks on the
Internet, and more are planned. So even though most of the computers speak TCP/IP,
the Internet is officially considered a multi-protocol network.

Check Your Progress 1


1. What is the Internet?
………………………………………………………………………………
………………………………………………………………………………
2. What is the size of the Internet?
………………………………………………………………………………
………………………………………………………………………………
3. What are the services provided over the Internet?
………………………………………………………………………………
………………………………………………………………………………

9.10 TOOLS FOR THE WEB


Electronic mail, remote login, and file transfer are the basic tools of the Web. These are TCP/IP applications, and there are plenty of applications using variations on or combinations of these basic tools. The three basic Internet services are:

Electronic mail is the most commonly available and most frequently used service on
the Internet. Through e-mail a text message can be sent to another person or to a
whole group of people.
Remote login is an interactive tool that allows the user to access the programs and
applications available on another computer.
File transfer protocol allows users to transfer files from one computer to another. A
file can be a text document, graphics, software, sound etc.
There are quite a few applications available today that use a combination or variation
of these three tools to hide details even further. These operate on a client/server
model, that is, you use the client on your computer, and it contacts servers for
directions and information. Clients and servers do not have to be located in the same
geographical area. This technology is very flexible; during one session, your client
may access servers all over the world to help you find information. As the Internet has
grown larger, locating the information we need has become difficult unless we are
using information discovery and retrieval tools. The major resource-browsing
applications, which operate on the client/server concept, include Internet Explorer,
Netscape Navigator, Eudora, Gopher, and Wide Area Information Servers (WAIS).
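The client/server pattern itself is easy to demonstrate. The sketch below, using Python's standard socket module, runs a toy server in a background thread and has a client connect, send a request, and read the reply. The host, port, and request text are made-up values for illustration, not any real protocol.

import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9090  # made-up local address for the demo

def server():
    # Server side: wait for one connection, read a request, send a reply.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind((HOST, PORT))
        s.listen(1)
        conn, _ = s.accept()
        with conn:
            request = conn.recv(1024).decode()
            conn.sendall(f"server reply to: {request}".encode())

threading.Thread(target=server, daemon=True).start()
time.sleep(0.5)  # crude pause so the server is listening before we connect

# Client side: what a mail reader, FTP client, or browser does at the
# socket level, just with a richer protocol layered on top.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
    c.connect((HOST, PORT))
    c.sendall(b"GET /some-resource")
    print(c.recv(1024).decode())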

9.11 HOW DOES TCP/IP WORK?


When you are actually using the above-mentioned tools, information of various types
is being transferred from one computer to another. TCP/IP breaks this information
into pieces called packets. Each packet contains a piece of the information or
document plus some ID tags, such as the addresses of the sending and receiving
computers.
Each packet can travel independently. Because of all the network interconnections,
there are often multiple paths to a destination. The packets may travel different
networks to get to the destination computer. The packets may arrive out of order, but
that is not a problem, because each packet also contains sequence information about
where the data it is carrying goes in the document, and the receiving computer can
reconstruct the whole document again. That is why the Internet is known as a packet-
switched network.
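The packetize-shuffle-reassemble cycle can be modelled in a few lines of Python. Real IP packets carry much richer headers; this toy keeps only the fields described above (addresses and sequence information), and the IP addresses shown are documentation-range placeholders.

import random

def packetize(document, size, src, dst):
    # split the document into numbered packets of at most `size` characters
    return [
        {"src": src, "dst": dst, "seq": i, "data": document[i:i + size]}
        for i in range(0, len(document), size)
    ]

message = "TCP/IP breaks information into packets that may travel different routes."
packets = packetize(message, size=10, src="203.0.113.5", dst="198.51.100.7")

random.shuffle(packets)  # packets may arrive out of order...

# ...but the receiver orders them by sequence number and rebuilds the document.
reassembled = "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))
assert reassembled == message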

9.12 WORLDWIDE NETWORKING


Once all the networks are in place, the Internet, which is actually tens of thousands of
networks, looks seamless to the user. By connecting networks together to enable
communication and information exchange, all the details are hidden from the user: the
packets, the routers, and all those interconnections. Despite different computers and disparate networks, the whole web somehow works, and any computer directly
connected to the Internet can communicate to all the other computers on the Internet.
So, working on a computer in your office in New Delhi or in your spare bedroom in Punjab, you can communicate with a colleague in Canada or a friend in England. It is as
if you are directly connected by one wire.

9.13 ENABLES COMMUNICATION


Every day, hundreds of thousands of people are communicating through the Internet—
conversing, collaborating, working, playing, and studying. Even marriages are made
and broken on the Internet. Clubs are formed. Problems are solved. Jobs are found.
Handicaps and disabilities make no difference. Through e-mail and the other methods
of online communication, people have become best friends without ever seeing or
talking to each other. It is not uncommon for people to turn to the Net for answers; a
question posted to online communities—mailing lists and conferences—can yield

dozens of invaluable tales of experiences and testimonials within hours. Online
communication eliminates all the barriers of physical distance and place.
On the Internet, people can communicate asynchronously and in real time.
Asynchronous (from the Greek for 'not at the same time') communication
means that someone can type in a message and send it off, but the recipient does not
have to be around to receive it. This type of communication has some real benefits.
You can send messages whenever you want to, they reach their destination quickly,
and the recipients can read and respond when they want to. Answering machines and
voice mail are everyday examples of asynchronous communication. Real-time,
interactive communication, in contrast, means that as someone is talking – that is,
typing – you see it on your screen as it is typed. Real-time audio and video
conferencing is starting to become more prevalent on the Internet too.

9.14 E-MAIL
Electronic mail is the most popular application on the Internet today. It is a very
powerful tool that is simple to use and easy to understand. Using e-mail can give you
a real feeling for the energy and reach of the Net. It is hard to imagine any other form
of communication that can be so intimate and yet so wide reaching, so focused, or so
expansive. You can communicate as easily with someone across twelve time zones as
with someone in the same building. Your message can be limited to just one person,
or it can reach hundreds of people.

How to send e-mail?


Postal mail is often called snail mail in comparison to e-mail. E-mail is really fast—it
is sent and received in seconds, minutes at the most. Sending e-mail is easy, too. All
you need is access to the Internet, an e-mail program, and the e-mail address of the
person with whom you wish to communicate.

Internet Access
The first requirement for e-mail is the Internet access. The Internet connection can be
a dial-up connection or it can be through a leased line.

E-mail Programs
You will need an e-mail program that will run on your own computer. Most large
systems and public-access computers offer several e-mail programs. Some
commercial Internet service providers will supply programs to load on your
computer. A common characteristic of e-mail programs is that they let you
compose and send e-mail, and then read and organize the e-mail you receive. There
are many different e-mail programs available.
z E-mail Address: In order to send someone e-mail, you need to know the
recipient’s address. An e-mail address, like a postal mail address, contains all the
necessary information needed to deliver a message to someone. Internet e-mail
addresses are, in fact, very simple. They consist of a local part and a host part. The
username refers to the mailbox, login name, or user id of the recipient on that
computer. For example, if your friend Dhiraj logs in on his computer as
dhiraj2001, then that is his username. The host part of the address should be
recognizable to you—a series of words separated by dots. The local part and host
part of an e-mail address are separated by an ‘@’ sign:
username@hostname
The username is the unique name of the user whereas hostname is actually the
address of the host computer to which the message is sent. Suppose
that you know that Sanyam’s username is sanyam2001; you could send e-mail to
him using his address:
sanyam2001@indiatimes.com
Now indiatimes.com is the host name. The host name provides the Internet
location of the mailbox, usually the name of the computer owned by a company or
Internet service.
z Sending Mail: Once you have an e-mail program and know the recipient’s e-mail
address, you are ready to send a message. Each e-mail program is different, so if
you are not familiar with yours, you may have to fumble around a bit or actually
read the manual or online documentation. You will need to specify that you want
to send a message, either by typing send, clicking a send button, or by performing
some other computer function. The e-mail program will prompt you for
information, asking for the recipient’s e-mail address, the key piece of
information the program needs to send the message to the recipient. It will also
ask for the subject of your message—usually a summary, title, or brief
description. The subject is optional, but you should get into the practice of
including it. A good subject description makes the person to whom you are
sending aware of the nature of your message, whether it is important or
whimsical. The program may give you the option of sending a ‘carbon copy’ (cc)
message. If there is someone else you think would be interested in the message,
here is a chance to include his or her address. You can send carbon copies to more
than one recipient. If you have the disk space, it is a good idea to send a copy to
yourself so you will have a record of your outgoing messages. After you have
answered all the e-mail program prompts, you can compose your message, using
your e-mail program’s editor. It is important to make your message easy to read
and understand.
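The exact prompts differ from program to program, but a composed message typically carries a small set of standard header lines. A hedged sketch (the addresses and subject here are invented for illustration):
To: sanyam2001@indiatimes.com
Cc: dhiraj2001@indiatimes.com
Subject: Friday meeting agenda
The body of the message follows these headers.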

9.15 MODEMS
Today’s modems are complex and offer so many functions and features that the user
manuals accompanying them are sometimes hundreds of pages long. For example,
most contemporary modems include functions such as support for multiple
transmission rates, standard telephone operations, connection negotiation,
compression, error correction, facsimile transmission, security, and loop-back testing.
Modems can also be physically characterized by whether they are internal or external
models and whether they are suitable for use with a laptop computer. Modems are
computer appliances that convert the digital signal from your computer into an analog
sound wave that can be transmitted over telephone lines. A modem at the other end
converts the analog signal back into a digital signal that is understood by the computer
you are talking to. Exciting advances are being made in modem technology, with
faster speeds and more error-free data transmission. High-speed modems can reduce
errors from line noise and even do data compression. As with any computer-related
purchase, you should buy the very best modem you can afford. Technology changes
fast, and one year from now, today’s high-speed modem will be as obsolete as that
ancient modem, the 300bps acoustic coupler.
If you have already got a slower modem, do not despair just yet. Many individuals are
still using 28.8Kbps modems that they have had for several years to access the
Internet and other services. All of the access and information systems support them,
and, for the occasional user, the difference in online and/or long-distance charges may
not be too significant. The higher your modem speed, of course, the less time it takes
you to get information. Using a 28.8Kbps modem, you can access electronic mail,
Telnet, FTP, and the terminal-client Gopher application. However, the bigger the
message or file, the longer it will take to show on your screen or transfer to your
computer. If you plan to spend a lot of time online and if you need quick, error-free
access, go for a high-speed modem with error correction and data compression. Most
of the Internet applications incorporate graphics and multimedia, and require a fast
modem. 56Kbps modems are common now, with prices falling every day. The
faster your modem, the faster you can access information.
The ideal modem for telecommunications not only communicates at high speeds but
also has error correction and data compression features. Error correction protocols
help filter out line noise and they ensure an error-free transmission. Most file transfer
programs also have a mechanism to ensure accurate file transfers. Data compression,
while a useful feature, may not help you much on some bulletin boards and
information services that have already compressed their files because your modem
cannot compress them any further. Shopping for a modem gets you into a complexity
of feature combinations: speed, modulation protocols, data compression, and more.

9.16 COMMUNICATION SOFTWARE


The second required component is software that will enable communication.
Communication software, which is installed on your personal computer, sets up the
three-way conversation between your computer, the modem, and the remote computer
or terminal server. Since you are dialing into the Internet, there are many types of
communication packages available, enabling three different kinds of connections.
These are terminal emulation, offline access, and SLIP/PPP. All of these are
commonly used and available from a large number of Internet providers. The best
choice for you depends on your existing equipment situation and how much you are
willing to spend. The different types of connections are explained below:
1. Terminal emulation: Terminal emulation is the easiest type of dial-up Internet
access. Using your modem and communications software, you can dial into an
Internet-connected computer or communications server; in other words, you
will dial in to your Internet service provider. Each user has an account on the
ISP’s Internet Services host computer on the Internet. Your computer will use
terminal emulation software and not TCP/IP to talk to the Internet host computer,
whereas the Internet host computer will use TCP/IP to communicate with the rest of
the network. You can get communication software from a number of places. Some
modems come bundled with communications software. You can also buy it from
any software store. Once connected, everything you do is from the perspective of
the remote computer into which you have dialed. When you read e-mail or news,
you are using e-mail and news applications that reside on the remote computer.
Similarly, remote logins, file transfers, and other tools, are all executed on the
remote computer. Your computer provides only the display.
2. Offline access: Offline access brings some of the Internet functions, such as
electronic mail, USENET news, and file transfer, straight to your computer, but
lets you work offline. This means that you are not actively dialed-in while you are
working, only when the need arises. When that happens, the software makes the
connection, performs the required functions, such as transferring e-mail back and
forth, and then disconnects. Internet Providers or services supply you with special
software, called client or agent software. Although you are not interactively using
the Internet, you can still do a lot of useful things, such as downloading electronic
mail and news and reading messages at your convenience on your computer,
rather than tying up a phone line or running up connection charges. But this way,
not all of the Internet applications (particularly remote login, Gopher, and
Mosaic) are available to you, because you cannot issue commands and receive
information interactively when you are not connected. Despite this limited
functionality, these client connections are recommended for novice users, because
they are more user-friendly.
3. SLIP/PPP (Full-access dial-up connection): The use of SLIP (Serial Line
Internet Protocol) or PPP (Point-to-Point Protocol) means the user’s computer has
full Internet connectivity, permitting full client/server operation and graphical user
interface applications with access to all Internet services. It is a more advanced
client connection that uses client networking software and a high-speed modem to
actually become a directly connected computer on the Internet. This happens
because of a fast modem and with the help of software that conforms to SLIP or
PPP. Either of these, used in conjunction with graphical Internet client
applications like Gopher, Internet Explorer, Netscape Navigator, and Eudora
brings the power and flexibility of the Internet straight to your computer over an
ordinary telephone line. SLIP and PPP are different, but each performs essentially
the same function – that is, they make your computer a peer computer on the
Internet. A SLIP or PPP connection is a way to connect to the Internet. Through
modem, you dial into another computer or terminal server that is running SLIP
(if your computer is running SLIP) or PPP (if your computer is running PPP) to
make this connection. These remote ends are known as SLIP or PPP servers. They
help you get set up at the beginning of the connection, but they are invisible after
connection. You will also need a unique Internet Protocol (IP) address, because
your computer must be identified on the network. Your Internet provider will
assign you an IP address, or the remote SLIP/PPP server will assign you a number
to use when you make the connection.
When you use this type of connection, you are actually executing Internet
applications on your own computer, not on an Internet-connected computer that
you have dialed into. For example, if you want to transfer a file using FTP from a
public-access site, you transfer that file straight to your computer instead of
working with the terminal-emulation middle link.
Check Your Progress 2
1. What are the major tools for the web?
………………………………………………………………………………
………………………………………………………………………………
2. How does TCP/IP work?
………………………………………………………………………………
………………………………………………………………………………
3. What is E-mail? How can it be used?
………………………………………………………………………………
………………………………………………………………………………

9.17 LET US SUM UP


Multimedia with the aid of the Internet allows revolutionary new ways to provide a
variety of services such as video-on-demand, interactive TV, access to digital
libraries, distance training, collaborative work, videoconferencing, and many others.
The multimedia and information superhighway technologies, including the World Wide
Web, have already created many benefits, but we can still only guess at the many benefits
these liberating new technologies will create in the future. In this lesson, the
fundamental technical concepts and principles of the Internet and the Web have been
discussed. The lesson highlights the concept of multimedia and the Internet, the
Internet and its technology, services on the Internet, the working of the Internet,
major tools for the Web, etc.

9.18 LESSON END ACTIVITY
Access various informative web sites on the internet which extensively use
multimedia features and save them on your computer.

9.19 KEYWORDS
ARPA and DARPA: ARPA was a branch of the US Department of Defence which
became DARPA (Defence Advanced Research Projects Agency) in 1972.
ISP: Internet Service Provider.
TCP/IP: Transmission Control Protocol/Internet Protocol.
Remote Login: An interactive tool that allows the user to access the programs and
applications available on another computer.
File Transfer Protocol: Allows users to transfer files from one computer to another.
A file can be a text document, graphics, software, sound etc.

9.20 QUESTIONS FOR DISCUSSION


1. What is the technology behind Internet?
2. What is the role of a Modem?
3. What is Terminal emulation?

Check Your Progress: Model Answers


CYP 1
1. The Internet is a network of thousands of computer networks connecting
millions of people all over the world. In the communication perspective,
the Internet can be defined as the community of people and organisations
that communicate with the help of technology. The Internet is a vast
information system connecting millions of computers worldwide allowing
people to communicate, conduct business, research, sale, purchase, etc.
Every piece of information we would ever need is available on the
Internet, we just need to learn how to access it. The Internet has rapidly
changed from a network connecting scientists and engineers to a network
connecting everyone. E-mail addresses are as common as telephone
numbers with the advent of the ‘Information Superhighway.’ The
designers of the original Internet could not possibly have foreseen the
current success of their creation or the myriad of purposes that it is being
used for now and will be used in the future. Although, its original purpose
was to provide researchers with access to expensive hardware resources,
the Internet has demonstrated such speed and effectiveness as a
communication medium that it has transcended the original purpose. It
has, in recent years, grown so large and powerful that it is now an
information and communication tool one cannot afford to ignore.
2. Nobody really knows how many computers and networks actually make
up the Internet. It is estimated that there are now as many as 30,000
networks connecting more than 5 million computers and around 70
million users logging on to it in around 90 countries. Whatever the
actual number, however, it is clear that this number is increasing day by
day. In India also, the Planning Commission, in its Ninth Plan projections,
has predicted a sharp increase in the use of the Internet. There will be
more than 1.6 million Internet users by the end of year 2002. The number
of e-mail users is 20,000 at present and is expected to rise to 50
lakh (a 250-fold increase) by the end of the Plan period.
3. The primary purpose of the Internet is information sharing. For almost any
information search, it has become common to say ‘it will be on the
Internet.’ Almost every kind of information is available on the Internet. Besides
providing information services, the Internet has the following capabilities and
uses for the users:
™ Information resources
™ Free or shareware software
™ Virtual colleges and classes
™ Customer services and information by commercial organisations
™ Customer feedback and support
™ Books, papers, case studies and course materials
™ Games, music and other sort of entertainment
™ E-mail
™ Chat
™ Library catalogues
™ Usenet newsgroup and electronic mailing lists
™ Sale and purchase services
™ Virtual payments
™ Stocks monitoring
™ Electronic newspapers, magazines, journals etc.
™ Airline and railway reservations
™ Video conferencing
™ Job search, interviews etc.

CYP 2
1. Electronic mail, remote login, and file transfer protocol are basically the
web tools. These are TCP/IP applications. There are plenty of applications
using variations on or combinations of these basic tools. The three basic
Internet services are:
Electronic mail is the most commonly available and most frequently used
service on the Internet. Through e-mail a text message can be sent to
another person or to a whole group of people.
Remote login is an interactive tool that allows the user to access the
programs and applications available on another computer.
File transfer protocol allows users to transfer files from one computer to
another. A file can be a text document, graphics, software, sound etc.
There are quite a few applications available today that use a combination
or variation of these three tools to hide details even further. These operate
on a client/server model, that is, you use the client on your computer, and
it contacts servers for directions and information. Clients and servers do
not have to be located in the same geographical area. This technology is
very flexible; during one session, your client may access servers all over
the world to help you find information. As the Internet has grown larger,
locating the information we need has become difficult unless we are using
information discovery and retrieval tools. The major resource-browsing
applications, which operate on the client/server concept, include Internet
Explorer, Netscape Navigator, Eudora, Gopher, and Wide Area
Information Servers (WAIS).
2. When you are actually using the above-mentioned tools, information of
various types is being transferred from one computer to another. TCP/IP
breaks this information into pieces called packets. Each packet contains a
piece of the information or document plus some ID tags, such as the
addresses of the sending and receiving computers. Each packet can travel
independently. Because of all the network interconnections, there are
often multiple paths to a destination. The packets may travel different
networks to get to the destination computer. The packets may arrive out
of order, but that is not a problem, because each packet also contains
sequence information about where the data it is carrying goes in the
document, and the receiving computer can reconstruct the whole
document again. That is why, the Internet is known as a packet-switched
network.
3. Electronic mail is the most popular application on the Internet today. It is
a very powerful tool that is simple to use and easy to understand. Using e-
mail can give you a real feeling for the energy and reach of the Net. It is
hard to imagine any other form of communication that can be so intimate
and yet so wide reaching, so focused, or so expansive. You can
communicate as easily with someone across twelve time zones as with
someone in the same building. Your message can be limited to just one
person, or it can reach hundreds of people. Postal mail is often called snail
mail in comparison to e-mail. E-mail is really fast—it is sent and received
in seconds, minutes at the most. Sending e-mail is easy, too. All you need
is access to the Internet, an e-mail program, and the e-mail address of the
person with whom you wish to communicate.

9.21 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

LESSON 10
DESIGNING AND TOOLS FOR THE WORLD WIDE WEB

CONTENTS
10.0 Aims and Objectives
10.1 Introduction
10.2 Why Multimedia over Internet?
10.3 Problems of Multimedia on the Internet
10.4 Codecs
10.5 Multimedia Requirements
10.6 Browser Support
10.7 Multimedia Formats
10.8 Designing for the World Wide Web
10.8.1 Display and Graphics
10.8.2 Nibbling
10.8.3 HTML and Multimedia
10.9 Multimedia Networking
10.10 Let us Sum up
10.11 Lesson End Activity
10.12 Keywords
10.13 Questions for Discussion
10.14 Suggested Readings

10.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to understand:
z The concept of Multimedia and the Web
z Tools for the Web
z Multimedia over Internet
z Problems of multimedia on the internet
z Multimedia requirements
z Multimedia Formats
z Designing for the World Wide Web

10.1 INTRODUCTION
A protocol is basically a set of rules which governs transmission from one point to
another. It is software that is required to use the physical connection. It is
responsible for establishing the connection, and for sending and receiving the data in packets.
The software is called a protocol because there must be compatible software on each
end, but they do not have to be written by the same vendor. Instead, a protocol for the
proper exchange of data is defined and released as a standard (such as TCP/IP). As
long as the vendor on each end adheres to the protocol, a connection can be sustained
which will support an application.
The Internet is more of a phenomenon than a network but is important when
discussing multimedia because a popular Internet application, the WWW, is capable of
accessing and displaying multimedia formats such as pictures, audio and video.
current Internet has thrived and grown due to the existence of TCP implementations
for a wide variety of classes of host computers. These various TCP implementations
achieve robust interoperability. TCP is used for transmitting data from one computer
to another by means of connection oriented methods. TCP is used for e-mail, FTP,
Telnet, www and various other Internet services.
In early IP networks, a packet could be sent either to one receiver (point to point) or
to all receivers (broadcast). A single transmission, which could reach a specific
group of receivers, was not possible. The idea of IP multicasting was developed to
promote the audio and video transmissions in real-time through the Internet. The
system of connected networks, which comprise the Internet, has also been used to
carry live audio and video. Extensions to the TCP/IP protocols currently used have
been proposed, such as the Real-time Transport Protocol (RTP). Broadcast of audio and
place on the Multicast Backbone (MBONE), by allocating higher priority to audio and
video information from within routers.

10.2 WHY MULTIMEDIA OVER INTERNET?


Following are the reasons:
z The Internet is a shared network of networks.
z The concept of integrated data and multimedia services over single network.
z There are millions of recipients over the internet.

Figure 10.1: Multimedia over Internet

There are a number of multimedia applications/services available over the internet:
z Online-Music
z Online-Video
z TV-Streaming
z Videoconferencing
z Internet-Telephony
z Online-Gaming
z Online-Software
z Online-Learning
z Distributed emulation
z Remote virtual Reality

10.3 PROBLEMS OF MULTIMEDIA ON THE INTERNET


Multimedia applications require steady bandwidth over the internet. Just increasing
the bandwidth will not solve the whole problem. For most multimedia applications,
the receiver has a limited buffer. If no measure is taken to smooth the data stream, it
may overflow or underflow the application buffer. When data arrives too fast, the
buffer will overflow and some data packets will be lost, resulting in poor quality.
When data arrives too slowly, the buffer will underflow and the application will
starve.
Then there is the problem of synchronization of streams. Real-time data becomes obsolete
and will be dropped if it doesn't arrive in time. If proper action is not taken, the
retransmission of lost packets would aggravate the situation and jam the network.
A common feature of all solutions is increased bandwidth, at least from the network to
the user. Applications (e.g. interactive video applications), which require high speed
links to the user are not only constrained by the bandwidth available on the
downstream channel. The limiting factor may be the return path because, if the
services run on top of TCP, packets need to be acknowledged and the ratio of the
bandwidths required is around 1/30.
Satellite distribution systems do not allow users to send multimedia information in
real-time to other users or to the multimedia server. However, such systems can
support asymmetric applications that do not need a broadband return path and provide
a narrowband return path for control, selection and navigation. They cannot support
highly interactive multimedia applications, such as virtual reality, because of the
delays on satellite links and are only suitable for applications that deliver a lot of data
to the user in response to a small amount of return path data. Good candidates are
multicasting and broadcasting applications, such as tele-shopping and software
download.
A digital broadband core network, based on ATM technology, guarantees no
restriction of bandwidth or degradation of transport services. Network evolution plans
need to include upgrades of backbone link capacity and extension of regional
exchange points to optimise local traffic support and to reduce interconnection traffic
between different geographical areas.
Because today's network was designed for telephony, it is not well suited to future
traffic patterns. Network performance, which works on a best-effort basis, is also
inadequate. The problem will get worse as the penetration of interactive services
increases.

10.4 CODECS
Codec is an abbreviation for compression/decompression. A codec can be either a
software application or a piece of hardware that processes video through complex
algorithms, which compress the file and then decompress it for playback. Unlike other
kinds of file-compression packages that require you to decompress a file before
viewing, video codecs decompress the video on the fly, allowing the client to view the
file from its compressed original. Following are the types:

Temporal Compression
This method of compression looks for information that is not necessary for continuity
to the human eye or ear (remember that videotape plays back sound as well as
pictures). It looks at the video information on a frame-by-frame basis for changes
between frames. For example, if you're working with video of a talking head (a clip of
a person sitting or standing with little motion), there's a lot of redundant information
in the recording. The background rarely changes, and most of the motion involved is
simple head movements and the movement of the area around the mouth. The
compression algorithm compares the first frame (known as a key frame) with the next
(called a delta frame) to find anything that changes. After the key frame, it only keeps
the information that does change, thus deleting a large portion of your file. It does this
for each frame until it reaches the end of the file. If there is a scene change, it tags the
first frame of the new scene as the next key frame and continues comparing the
following frames with this new key frame. As the number of key frames increases, so
does the file size.

Spatial Compression
Spatial compression uses a different method to delete information that is common to
the entire file or an entire sequence within the file. It also looks for redundant
information, but instead of specifying each pixel in an area, it defines that area using
coordinates.

Hardware Codecs
Hardware codecs are the most efficient way to compress and decompress video files.
They are faster and require fewer CPU resources than their software counterparts. In
order to capture clean raw video, most machines require a hardware codec that allows
the video file to be fragmented and distributed rapidly on your hard drive. These
hardware codecs are expensive, but deliver high-quality results. Using a hardware-
compression device will deliver high-quality source video footage, but requires
viewers to have the same decompression device in order to watch it. Hardware codecs
are used often in video conferencing, where the equipment of the audience and the
broadcaster are configured in the same way.

10.5 MULTIMEDIA REQUIREMENTS


The basic idea of "multimedia" is to manage and co-ordinate various devices of
communications and entertainment electronics with the PC as a central controller.
Using a few additional pieces of hardware, most standard PCs can control multimedia
applications. The appropriate multimedia software is required to combine and
integrate the individual components (video, music, animation, graphics and text)
according to user requirements. An IBM-compatible PC has several features specific
to multimedia display.
A multimedia system is a combination of mainly three components:
z hardware devices that serve as an interface to digital computers and analog
devices.

z computer hardware that enables the processing and management of multimedia
information.
z software to process and integrate the information, for example, drivers, editors,
processing software and authoring software.
Not all multimedia systems require all of this hardware and software. By itself,
multimedia is nothing new. Recorded sound, movies and pictures have been around
for years. The new part is the way computers can intertwine these things.

10.6 BROWSER SUPPORT


The first Internet browsers had support for text only, and even the text support was
limited to a single font in a single color, and little or nothing else. Then came web
browsers with support for colors, fonts and text styles, and the support for pictures
was added. The support for sounds, animations and videos is handled in different
ways by different browsers. Some elements can be handled inline, some require a
plug-in and some require an ActiveX control.

10.7 MULTIMEDIA FORMATS


Multimedia elements (like sounds or videos) are stored in media files. The most
common way to discover the media type is to look at the file extension. When a
browser sees the file extensions .htm or .html, it will assume that the file is an HTML
page. The .xml extension indicates an XML file, and the .css extension indicates a
style sheet. Picture formats are recognized by extensions like .gif and .jpg. Multimedia
elements also have their own file formats with different extensions.
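As a small, hedged illustration (the file name is hypothetical), a plain hyperlink is the simplest way to offer a media file; the browser uses the extension to decide whether to handle the file inline, hand it to a plug-in, or launch a helper application:
<a href="theme.wav">Listen to our theme song</a>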
Check Your Progress 1
1. Why Multimedia over Internet?
………………………………………………………………………………..
………………………………………………………………………………..
2. What are the problems of multimedia on the internet?
………………………………………………………………………………..
………………………………………………………………………………..
3. What are the multimedia requirements for internet?
………………………………………………………………………………..
………………………………………………………………………………..

10.8 DESIGNING FOR THE WORLD WIDE WEB


Multimedia design and development involves a lot of creative activity. While
developing multimedia, you are often provided with many unrelated
multimedia elements. Even in finished multimedia, difficulties often exist,
such as navigating through large amounts of text, non-existent online help, poor
documentation, and uncoordinated video and audio sequences. Therefore, to
create good multimedia, one must try to follow the process of multimedia
development. In addition, one should also be exposed to basic principles of learning,
as that contributes to a good, presentable academic session through multimedia.

10.8.1 Display and Graphics


Make your Web pages look good on a 640x480-pixel VGA monitor showing
256 colors (8-bit graphics). Your working space on this monitor is actually about
600 pixels wide by 300 tall, because browsers include controls and slider bars. This is

where you must place the eye-catchers that will be first loaded and viewed by visitors
without scrolling.

Figure 10.2: Display and Graphics for World Wide Web

10.8.2 Nibbling
The principle you must always keep in mind when designing and making multimedia
elements for the Web should be called nibbling. At a serious metal-working supply
store you can buy a power tool called a nibbler; it wantonly devours the edges of
sheet metal in an ear-damaging staccato of rapid tiny bites. You must apply this tool
to the elegant bitmapped logo you created in Photoshop to trim it from 24- to 8- to
4-bit color depth and resize it from 96 pixels square to 64 pixels square. Nibble the
audio clip of your client's theme song from 44.1 kHz to 11 kHz, and see if it's
acceptable at 8-bit sample size. Text as HTML is cheap: nibble your page design and
throw away the pretty shadowed GIF graphic headers and image maps; re-create your
text in HTML headers or emphasized text, and try coloring it. For every image
referenced in an HTML document, a separate Internet HTTP connection must be
made between your computer and that image's server before the image itself is
downloaded; so using many tiny images (such as graphic images as bullets) may not
be efficient.
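As a brief sketch of nibbling in practice (the file name, dimensions and color here are made up), a decorative GIF header such as
<IMG SRC="welcome-header.gif" WIDTH=400 HEIGHT=60 ALT="Welcome">
can often be replaced by plain, colored HTML text, which costs almost nothing to download and saves the extra HTTP connection:
<H2><FONT COLOR="#002452">Welcome</FONT></H2>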

10.8.3 HTML and Multimedia


You should have a basic understanding of HTML before you begin developing
multimedia for the Web. HTML-coded documents are the fundamental vehicles for all
types of information delivered on the World Wide Web.
HTML was formalized in 1992 and due to the tremendous interest in it, there soon
was a need to establish a standards organization to set recommended practices. The
World Wide Web Consortium (W3C) was founded in 1994 to deal with setting
standards for markup languages. Even though HTML has become almost entirely
standardized in HTML 4.0, there are still markup elements that one browser
recognizes while another does not. For example, Internet Explorer recognizes the
<MARQUEE> property, which allows scrolling text. Netscape does not recognize it.

Netscape also has markup tags that are unique, such as the <BLINK> property
that Internet Explorer does not recognize.
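A minimal sketch of these two browser-specific tags; each renders as ordinary static text in a browser that does not recognize it:
<MARQUEE>This line scrolls in Internet Explorer</MARQUEE>
<BLINK>This line blinks in Netscape</BLINK>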

Basic HTML Document


A web page is a basic text document that can be written in any text editor (Notepad,
Wordpad, WordPerfect, etc.). HTML is a composition of tags that your web browser
reads and interprets, then displays the content of the pages appropriately. Think of the
older word processors that displayed all of your document’s code (where paragraphs
begin, how many spaces are between each word, etc.). The only difference is now you
have to write the codes to determine the presentation (unless you use a What You
See Is What You Get, or WYSIWYG, editor such as MS FrontPage or Macromedia’s
Dreamweaver).
Luckily, computer innovation moves quite rapidly, so there are a number of tools
you can use to create web pages without writing a single tag. However, it is good to
know what the tags are so that, if the format isn’t exactly to your specifications, you
can manually manipulate the tags.
Every tag must have a beginning and an ending tag, with the exception of meta tags.
The basic skeleton of the document is as follows:
<html>
<head>
<meta name="description" content="Brief html example">
<meta name="keywords" content="html, web, web publishing">
<title>Academic Computing at Fullerton College</title>
</head>
<body>

Contents of the page. Basic HTML example.


</body>
</html>
The HTTP (HyperText Transfer Protocol) item most often requested by the Web
client is the HTML document. HTML (HyperText Markup Language) is an SGML
(Standard Generalized Markup Language) DTD (Document Type Definition). Tim
Berners-Lee invented HTML while at CERN, the European Laboratory for Particle
Physics in Geneva.
HTML has two major functions. First, it allows the user, as a web page author, to format
his pages in a common language that can be read by all browsers. The browser
interprets the HTML code and displays a formatted web page within the browser
window. In practical terms, HTML is a collection of platform-independent styles
that define the various components of a World Wide Web (WWW) document. HTML is not a
true computer language; it can better be called a “browser language”. The second
function is that it allows users to link their web pages to all other pages throughout the
world. HTML documents are ASCII files that contain the text to be displayed on a
Web page, along with formatting codes or tags that define attributes of the page
elements. We can also use word-processing software if we save the file as “text only with
line breaks”.
Any simple text editor can be used to generate HTML files; .htm or .html is
the appropriate suffix. Some WYSIWYG editors and Web page creation tools are
also available. An HTML document is an unformatted, linked document that a browser
reads for interpretation. It can be created using specific editors or word processors,
but it has to be saved in text file format.

HTML Tags
HTML tags are used to mark the elements of a file for the browser. Elements, which
are a fundamental component of the structure of a text document, can contain plain
text, other elements, or both. The various elements in an HTML document are
denoted by HTML tags, which are surrounded by the symbols “<” and “>”. Tags
are usually paired to start and end the tag instruction; the end tag is like the start tag
except that a slash “/” precedes the text within the brackets. Every HTML document
should contain certain standard HTML tags. Each document consists of head and body
text. The head contains the title, and the body contains the actual text, which is made up
of paragraphs, lists and other elements. Browsers expect specific information because
they are programmed according to HTML and SGML specifications.
Below shows the required elements in a sample bare-bones document:
<html>
<head>
<TITLE>A Simple HTML Example</TITLE>
</head>
<body>
<H1>HTML is easy to learn</H1>
<P>This is a paragraph.</P>
</body>
</html>
The required elements are the <html>, <head>,
<title>, and <body> tags. A template file needs to be created to include these tags in
each file:

HTML
This element tells the browser that the file contains HTML-coded information. The
file extension .html also indicates that the file is an HTML document.

HEAD
The head element identifies the first part of our HTML coded document that contains
the title.

TITLE
The title element contains our document title and identifies its content in a global
context. The title is displayed somewhere on the browser window, but not within the
text area.

BODY
The second and largest part of the HTML document, which contains the content of the
document.

Headings
HTML has six levels of headings, numbered one through six, with one being
the most prominent. Headings are displayed in larger and/or bolder fonts than normal
body text. Do not skip heading levels in the document.

Paragraph
We must indicate paragraphs with <P> elements. A browser ignores any indentations
or blank lines in the source text. Without <P> elements, the document becomes one
large paragraph.
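For example, a fragment using two heading levels and <P> elements might look like this:
<H1>Multimedia on the Web</H1>
<P>This is the first paragraph under the main heading.</P>
<H2>Sound</H2>
<P>This is a paragraph under a second-level heading.</P>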

Cascading Style Sheets


Cascading Style Sheets (CSS) are a way of formatting your html documents. They let
the publisher describe how text, backgrounds, colors, etc., will be displayed in a
browser. Although CSS are powerful, not all browsers support the use of CSS. The
best way to determine whether you should use CSS is to test the document by opening
it in a few different types of browsers.
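A minimal sketch of an embedded style sheet, placed between the <head> tags (the colors and font are illustrative, reusing the palette mentioned later in this lesson):
<style type="text/css">
body { font-family: Arial; background-color: #FFFFFF; color: #002452; }
h1 { color: #D8A400; }
</style>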

Meta-Tags
Meta-tags contain information about the document. They provide information to
search engines and users about your page. Who designed the page? What information
is on the page? What organization is posting the page? Meta tags have two key
attributes to them: name (name of the information) and content (the actual data). The
most commonly used Meta information contains the “description” of the html
document and “keywords” for searches.
The following are two examples of such tags:
<meta name="description" content="Biology department at Fullerton College">
<meta name="keywords" content="[department name], college, California, fullerton,
alumni, student, academics, education, research, hornets, learning, distance, online,
courses, news, teaching, public, service, outreach, giving, admit, admissions,
undergraduate, visitor, information, center, centers, institutes, institute">
These tags are located in the documents <head> tags.

Graphics
Graphics are your preference. To develop the main site, we used the same colors as
described earlier: blue (#002452) and yellow (#D8A400) on a white (#FFFFFF)
background. The font used was Arial, 12 point. If you’re designing your own
graphics, it is best to keep them under 100KB. This allows surfers with slow
connections to view your pages relatively quickly. If a graphic becomes too large,
surfers will not wait an hour to see what your page looks like, so it is best to keep your
graphics small. Using programs like Adobe Photoshop and Macromedia’s Fireworks,
it is fairly easy to optimize your graphics so they don’t exceed 100KB.
HTML provides tags for inserting media into HTML documents: the <IMG> tag for
inline images; the <INSERT> tag for multimedia objects, including audio, video, and
programming tools such as Java applets, Microsoft's Component Object Model
(COM) objects, OLE Controls, and OLE; the <EMBED> tag for compound document
embedding; and Sun's <APP> and <APPLET> tags for code. The <INSERT> tag and
some optional attributes to play a QuickTime movie in HTML 3.0 might look
something like,
<insert data=lizzie.mov type="application/quicktime">
<param name=loop value=infinite>
<img src=soccer.jpeg alt="The Match">
</insert>
where the browser would play the QuickTime movie named "lizzie" if it supported the
.mov format and that MIME-type, and it would loop and play forever. If not
supported, the browser would show a substituted JPEG image. Using the <IMG>

element here provides backward compatibility with older browsers; if the browser has
trouble with the JPEG image, it would display an alternate text label, in this case,
"The Match."
The <EMBED> tag, first used by Netscape to enable the many multimedia plug-ins,
may soon be superseded by the more capable <INSERT> tag. If you develop
multimedia for the Internet, budget time and effort for keeping current in this rapidly
changing environment; staying at the leading edge takes effort. It will be some years
before multimedia delivery tools and techniques for the Web stabilize.
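For reference, a hedged sketch of the Netscape-style <EMBED> syntax for playing a movie through a plug-in (the dimensions are hypothetical, and the exact attribute names vary from plug-in to plug-in):
<EMBED SRC="lizzie.mov" WIDTH=160 HEIGHT=120 AUTOSTART=true LOOP=true>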

Text for the Web


Viewers of your Web site may not be displaying the same "preferred" font that you
used to design your page because user preferences in the browser may alter the way
text in your document looks and flows. Many developers design their documents in
Times Roman for the proportional font and Courier as the monospaced font. These
fonts readily move across platforms and are the default fonts users typically see if they
do not set their own preferences. Although you can specify a font using the <FONT>
tag, browsers can only attempt to find a substitute when that font is not installed on
the end user's computer.
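For example, the <FONT> tag accepts a comma-separated list of typefaces to try in order; if none is installed, the browser falls back to its default:
<FONT FACE="Arial, Helvetica" SIZE=3>Body text goes here.</FONT>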
Great efforts are being made to define standard methods for displaying typefaces on
the Web, but neither of the two rival camps, TrueDoc (from Bitstream) and OpenType
(supported by Adobe, Microsoft, Agfa, and others), seem able to get their technology
properly implemented and launched. The font problem will eventually be solved in
one of two ways, or in a combination of both: either many more fonts will become
readily and cheaply available and will be commonly installed by end users or a
method of embedding a font’s character shapes into an HTML document will allow
fonts to ‘ship’ with the document.

Images for the Web


Theoretically, the Web can support any graphics format the client and server have in
common. Practically, even though Web standards do not specify a graphics format
you must use, browsers typically recognize two image formats, GIF and JPEG,
without resorting to special plug-ins. Both formats use built-in compression
algorithms to reduce file size. For other graphics formats, such as CGM, CMX, DXF,
and fractal- and wavelet-compressed images, special proprietary creation software and
browser plug-ins may be required.

GIF and PNG Images


GIF images are limited to 8 bits of color depth (256 colors). This is a commercial
image format developed by CompuServe Information Services, an on-line company
presently folded into America Online. In late 1994, Unisys, which held a patent on the
compression scheme used in GIF, announced a patent fee charge to all software
developers who use the GIF format. In an angry industry-wide response, a new "open"
format called PNG (for Portable Network Graphics Specification, not requiring fees)
was developed to replace GIF.

Sound for the Web


In the beginning, when the Internet was primarily a collection of Unix machines,
sound files were sent from machine to machine in AU format and when downloaded,
were played back using a sound application. As the Web has developed, sound has
become more important, and plug-ins currently allow embedding of sounds into
documents.
Browsers have become sound capable: Microsoft Explorer offers the <BGSOUND>
tag to play an AU, WAV or MIDI sound track in a document background. Netscape
and Internet Explorer offer the QuickTime plug-in for playing AIFF, MIDI, WAV
and AU formats. If your browser does not support this sound plug-in, you can still
launch an external "viewer" or helper application.
Streaming audio is more useful for the Web, where a sound file can start playing as
soon as data begins coming in – streaming is provided by the LiveAudio and
QuickTime plug-ins. Plug-ins such as StreamWorks, VocalTec and RealAudio feature
streaming capability and high compression/good fidelity, but they also require special
software at the server.
The QuickTime plug-in offers a fast-start feature to allow playing before a movie has
been completely downloaded: MIDI files and digitized sound embedded in a
QuickTime movie will play even without video, text or image tracks. You can either
embed HTML commands in a document to play a QuickTime movie automatically in
the background, or you can display a controller providing stop, fat-forward and
rewind for the user. With the Crescendo MIDI plug-in (which downloads and plays
standard MIDI files of MIME-type .mid or midi), you can automatically launch a
background sound when a page opens by using the <EMBED> tag.
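Two hedged sketches of background sound, one per approach (the file name is hypothetical, and <EMBED> attribute names vary by plug-in): the Internet Explorer-only <BGSOUND> tag, and the <EMBED> tag used with plug-ins such as Crescendo or LiveAudio:
<BGSOUND SRC="theme.mid" LOOP=1>
<EMBED SRC="theme.mid" AUTOSTART=true HIDDEN=true>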

Check Your Progress 2


1. What is Nibbling?
………………………………………………………………………………
………………………………………………………………………………
2. What are HTML Documents?
………………………………………………………………………………
………………………………………………………………………………
3. How text can be created for the web?
………………………………………………………………………………
………………………………………………………………………………

10.9 MULTIMEDIA NETWORKING


Today, networking is playing a dominant role in integrating technology into our lives.
Networks connect computers in our offices and labs; they also can link us to other
computers across the nation and around the world. Using networks, computer systems
can share resources and information. That many forms of information can be
exchanged instantaneously over long distances has changed the way we work and
play. For example, employees in many corporations and other organizations rely more
on electronic mail than conventional mail for communication with coworkers. Indeed,
networks have created a new habitat, commonly called cyberspace. These new
opportunities have a profound effect on ways we work and interact with one another.
Not only is information more readily available, it is also richer and strikingly more
dynamic. In cyberspace, you are immersed in and engaged by information rather than
merely possessing it. Communication technologies have existed for a long time. So,
what is special about data communications over computer networks? First, the
technology driving data communications offers far greater capacity and speed than
any other previous form. For example, we now have the capability to store and send
whole libraries of information much more quickly than sending a simple message by
ordinary postal service. Because this data is represented digitally, we can combine and
communicate various forms of information simultaneously, for example, text, audio,
and images. Connectivity means more than simply people communicating with other
people. Computer networks also make it possible for individuals to communicate with
other computers over long distances. For instance, from your home computer, you can
easily borrow both software and processing power from a computer system far away.

You can also ask the distant computer system to supply you with the latest stock
market quotes, college basketball scores, or international news. When traveling, you
can telephone your computer system at home or office to check for electronic mail or
messages. In the past, computers were isolated and largely incompatible. Today,
networks support communication from computer to computer as well. Computer
systems can request and receive services from other computers automatically and
invisibly to the user. This offers a number of advantages. Borrowing processing from
a remote system extends the capabilities of your own computer. It also means that
computers of different scale and performance can exchange the results of processing
almost seamlessly. Distributing the work of processing among cooperating computer
systems is still in its infancy. We can expect that it will have a profound effect on
computing in the future.
A computer system is not merely a tool but rather itself a medium for representing,
storing, manipulating, and communicating different forms of information: text,
numbers, graphics, images, sounds, and video. The common denominator for these
differing forms of information is that they all can be digitized for use by our
computers. This data can be studied, combined, transformed, and transmitted with
utmost ease. The computer is also a medium for ideas and information. Computers can
be used not only to express these ideas but to communicate them as well. They can store
knowledge and facts but more importantly, computer systems can store and
manipulate information in many different forms. Informational media include text,
illustrations, photographs, animation, video, sounds, voice, and music. The modern
computer is an all-purpose medium for informational media. Regardless of the media,
the computer system represents, stores, and transmits all in its native digital form.
Multimedia refers to the integration of various forms of information such as text,
graphics, sound, and images. The modern computer system is a multimedia machine;
that is, it is capable of integrating two or more conventional forms of informational
media in a single electronic document. Because we can express and combine various
forms of information using a computer, we can interact, explore, and learn even more
from that information. In this way, the computer becomes a vehicle for knowledge
rather than just a tool that stores, distributes, and displays information.

10.10 LET US SUM UP


Multimedia is pictures, sounds, music, animations and videos. Modern web browsers
have support for many multimedia formats. The basic idea of "multimedia" is to
manage and co-ordinate various devices of communications and entertainment
electronics with the PC as a central controller. The MIDI (Musical Instrument Digital
Interface) is a format for sending music information between electronic music devices
like synthesizers and PC sound cards. The lesson discusses in detail the concept of
Multimedia and the Web, various tools for the Web, multimedia over Internet,
problems of multimedia on the internet, multimedia requirements, multimedia
Formats and designing multimedia for the World Wide Web.

10.11 LESSON END ACTIVITY


Discuss among the group how to design a web page with sound and video features.

10.12 KEYWORDS
Codec: Codec is an abbreviation for compression/decompression. A codec can be
either a software application or a piece of hardware that processes video through
complex algorithms.
HTML-coded Documents: HTML-coded documents are the fundamental vehicles for
all types of information delivered on the World Wide Web.

Web Page: A web page is a basic text document that can be written in any text editor -
Notepad, Wordpad, WordPerfect, etc. HTML is a composition of tags that your web
browser reads and interprets, then displays the content of the pages appropriately.

10.13 QUESTIONS FOR DISCUSSION


1. Explain in brief about Multimedia on the Web.
2. Give the sound & video formats.
3. Write about speech recognition and synthesis.
4. How to play sound on web pages?
5. How to play videos on web pages?
6. What are the requirements of multimedia?

Check Your Progress: Model Answers


CYP 1
1. Following are the reasons for Multimedia over Internet:
™ The Internet is a shared network of networks.
™ The concept of integrated data and multimedia services over single
network
™ There are millions of recipients over the internet
2. Multimedia applications require steady bandwidth over the internet. Just
increasing the bandwidth will not solve the whole problem. For most
multimedia applications, the receiver has a limited buffer. If no measure
is taken to smooth the data stream, it may overflow or underflow the
application buffer. When data arrives too fast, the buffer will overflow
and some data packets will be lost, resulting in poor quality. When
data arrives too slowly, the buffer will underflow and the application will
starve.
Then there is the problem of synchronization of streams. Real-time data
becomes obsolete and will be dropped if it doesn't arrive in time. If proper
action is not taken, the retransmission of lost packets would aggravate the
situation and jam the network. A common feature of all solutions is
increased bandwidth, at least from the network to the user. Applications
(e.g. interactive video applications), which require high speed links to the
user are not only constrained by the bandwidth available on the
downstream channel. The limiting factor may be the return path because,
if the services run on top of TCP, packets need to be acknowledged and
the ratio of the bandwidths required is around 1/30. Satellite distribution
systems do not allow users to send multimedia information in real-time to
other users or to the multimedia server. However, such systems can
support asymmetric applications that do not need a broadband return path
and provide a narrowband return path for control, selection and
navigation. They cannot support highly interactive multimedia
applications, such as virtual reality, because of the delays on satellite links
and are only suitable for applications that deliver a lot of data to the user
in response to a small amount of return path data. Good candidates are
multicasting and broadcasting applications, such as tele-shopping and
software download.
3. The basic idea of "multimedia" is to manage and co-ordinate various
devices of communications and entertainment electronics with the PC as a
central controller. Using a few additional pieces of hardware most
standard PCs can control multimedia applications. The appropriate
multimedia software is required to combine and integrate the individual
components (video, music, animation, graphics and text) according to user
requirements. An IBM-compatible PC has several features specific to
multimedia display.
A multimedia system is a combination of mainly three components:
• hardware devices that serve as an interface between digital computers and
analog devices.
• computer hardware that enables the processing and management of
multimedia information.
• software to process and integrate the information, for example drivers,
editors, processing software and authoring software.

CYP 2
1. The principle that must always be kept in mind when designing and
making multimedia elements for the Web should be called nibbling. At a
serious metal-working supply store you can buy a power tool called a
nibbler; it wantonly devours the edges of sheet metal in an ear-damaging
staccato of rapid tiny bites. You must apply this tool to the elegant
bitmapped logo you created in Photoshop to trim it from 24- to 8- to 4-bit
color depth and resize it from 96 pixels square to 64 pixels square. Nibble
the audio clip of your client's theme song from 44.1 kHz to 11 kHz, and
see if it's acceptable at 8-bit sample size. Text as HTML is cheap: nibble
your page design and throw away the pretty shadowed GIF graphic
headers and image maps; re-create your text in HTML headers or
emphasized text, and try coloring it. For every image referenced in an
HTML document, a separate Internet HTTP connection must be made
between your computer and that image's server before the image itself is
downloaded; so using many tiny images (such as graphic images as
bullets) may not be efficient.
2. A web page is a basic text document that can be written in any text editor
(Notepad, Wordpad, WordPerfect, etc.). HTML is a composition of tags
that your web browser reads and interprets, then displays the content of
the pages appropriately. Think of the older word processors that displayed
all of your document’s code (where paragraphs begin, how many spaces
are between each word, etc.). The only difference is now you have to
write the codes to determine the presentation (unless you use a What You
See Is What You Get/WYSIWYG editor such as: MS FrontPage,
Macromedia’s Dreamweaver).
Luckily computer innovation moves quite rapidly, thus there are a number
of tools you can use to create web pages without writing a single tag.
However, it is good to know what the tags are so that, in case the format
isn't exactly to your specifications, you can manually manipulate the tags.
3. Viewers of the Web site may not be displaying the same "preferred" font
that you used to design your page because user preferences in the browser
may alter the way text in your document looks and flows. Many
developers design their documents in Times Roman for the proportional
font and Courier as the monospaced font. These fonts readily move across
platforms and are the default fonts users typically see if they do not set
their own preferences. Although you can specify a font using the
<FONT> tag, browsers can only attempt to find a substitute when that
font is not installed on the end user's computer.
Great efforts are being made to define standard methods for displaying
typefaces on the Web, but neither of the two rival camps, TrueDoc (from
Bitstream) and OpenType (supported by Adobe, Microsoft, Agfa, and
others), seem able to get their technology properly implemented and
launched. The font problem will eventually be solved in one of two ways,
or in a combination of both: either many more fonts will become readily
and cheaply available and will be commonly installed by end users or a
method of embedding a font’s character shapes into an HTML document
will allow fonts to ‘ship’ with the document.

10.14 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.


UNIT V

LESSON 11

HIGH DEFINITION TELEVISION AND DESKTOP COMPUTING

CONTENTS
11.0 Aims and Objectives
11.1 Introduction
11.2 High Definition Television (HDTV) and Desktop Computing
11.2.1 Resolution for High Definition Television and Desktop Computing
11.2.2 HDTV, ATV, EDTV, and IDTV
11.3 HDTV Standards
11.3.1 SMPTE 240M: 1125/60 Production Standard
11.3.2 1125/60 Studio Equipment
11.3.3 HDTV Exchange Standards
11.3.4 ATV Transmission Standards
11.4 HDTV and Computing
11.5 Square Pixels
11.6 Display Refresh Rate and Interlace
11.7 Let us Sum up
11.8 Lesson End Activity
11.9 Keywords
11.10 Questions for Discussion
11.11 Suggested Readings

11.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to understand:
• The background and potential of HDTV
• The basic parameters of the 1125/60 production standard
• The features of proposed ATV systems
• The potential application areas

11.1 INTRODUCTION
Multimedia seeks to engage people through interfaces that provide visual and aural
richness. Providing a rich visual environment demands a lot of pixels and a high
refresh rate. HDTV represents the state of the art of image quality in motion image
capture, recording, processing and electronic distribution, and so will be important to
high-end multimedia applications.

Advanced Television (ATV) systems for the distribution of entertainment to
consumers are being standardized and will be deployed in about 1995. ATV
technology will be used to distribute entertainment that will be produced primarily
using film or HDTV studio technology. Consumer ATV receivers — and eventually
VCRs and camcorders — will incorporate high performance digital compression and
decompression technology, and ATV equipment will benefit from the economy of
scale of manufacture in consumer volumes. Cheap consumer hardware and
components, and the wide availability of program material in compressed digital form,
will cause ATV to be a significant influence on multimedia. Several ATV proposals
have elements in common with the ISO MPEG standards being developed in
the computing and communications communities, and so there is a possibility that
these standards could converge to the extent that common components could be used,
much to the benefit of the computer and communications industries.
This lesson outlines the background of HDTV, describes the basic parameters of the
1125/60 production standard, discusses the features of proposed ATV systems,
outlines potential application areas, and briefly describes standardization issues
currently under discussion.

11.2 HIGH DEFINITION TELEVISION (HDTV) AND DESKTOP COMPUTING
High definition television (HDTV) is defined as having twice the vertical and twice
the horizontal resolution of conventional television, a picture aspect ratio of 16:9, a
frame rate of at least 24 Hz, and at least two channels of CD-quality sound. HDTV
studio equipment commercially available at the moment has about two megapixels
per frame (in a 1920x1035 format), six times the number of pixels of conventional
television. The data rate of current studio-quality HDTV is about 120 megabytes per
second. The parameters of HDTV are optimized for a viewing distance of about three
times picture height. This enables a horizontal picture viewing angle of about thirty
degrees, three times that of conventional television.
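To see where the figure of about 120 megabytes per second comes from, the
following Python sketch multiplies out the parameters quoted above; the
assumption of two bytes per pixel (8-bit components with 2:1 chroma
subsampling) is ours, for illustration:

# Approximate studio HDTV data rate from its raster parameters.
samples_per_line = 1920
active_lines = 1035
frames_per_second = 30
bytes_per_pixel = 2      # assumed: 8-bit luma plus shared 8-bit chroma

pixels_per_frame = samples_per_line * active_lines   # 1987200, about 2 Mpx
rate = pixels_per_frame * frames_per_second * bytes_per_pixel
print(rate / 1e6, "megabytes per second")            # about 119.2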
This gives the viewer a much greater sense of involvement in the picture. Advanced
Television (ATV) refers to transmission systems designed for the delivery of
entertainment to consumers, at quality levels substantially improved over
conventional television. ATV transmission systems have been deployed in Japan, and
a number of systems have been proposed for adoption in the United States. The
parameters of these systems vary widely, although the data rate of each proposed
system is about 20 megabits per second.
Standards for film, video and entertainment have historically developed in a three-
level hierarchy: production, exchange and distribution. Production refers to the
shooting and editing of material. Exchange refers to the buying and selling of
programs (often on 35 mm film). Distribution refers to the delivery of programs to
consumers, which can take place using transmission as in conventional broadcasting,
or using physical media such as videotapes and videodiscs. Transmission can be
through conventional terrestrial broadcasting (VHF/UHF), cable television (CATV),
direct broadcast by satellite (DBS), or potentially through telecommunications
(e.g. “fiber to the home”, FTTH).
The development of HDTV and ATV standards is currently in disarray. The standards
process is confused among production, exchange and distribution. Certain
organizations have much to lose if the adoption of standards accelerates the
introduction of HDTV; these organizations have deterred standards development.
The U.S. Federal Communications Commission has jurisdiction only over standards
for terrestrial VHF/UHF broadcast, but has no jurisdiction and no official position on
other delivery standards, and no mandate to discuss production or exchange standards.

Political, technonationalist and industrial policy concerns are frequently evident in
standards discussions. The lack of formal standards for HDTV and ATV could
encourage introduction of consumer ATV using physical media, since this would
bypass the need for formal standards.
The television industry has reacted with alarm at the possibility of the introduction of
ATV broadcasting. Networks and television stations face the prospect of making huge
capital outlays to upgrade their plants for ATV, with no corresponding increase of
revenue. Traditional consumer equipment manufacturers and traditional broadcasting
interests have proposed ATV systems with parameters closely tied to existing
practice, in the hope of minimizing new investment. For example, many U.S. interests
hope to retain the same troublesome 59.94 Hz field rate as NTSC just to be
“compatible”.
Unfortunately their European counterparts have also chosen to maintain their field rate
of 50 Hz, and the discrepancy between these two rates makes it unlikely that a
common distribution standard will emerge. Despite the chaos in distribution
standards, the Society of Motion Picture and Television Engineers (SMPTE) has
adopted SMPTE Standard 240M for 1125/60 studio equipment. The emergence of this
standard has encouraged equipment manufacturers to invest in tooling to bring studio
production equipment to the market. Much commercial studio production equipment
that conforms to this standard is available for acquisition, recording, processing,
transmission and display.
HDTV is different from video. It has greater resolution (2 Mpx vs. 300 Kpx) and
improved color accuracy. Its color is coded with the component system (instead of the
composite system, which suffers color quality impairments). In its digital form,
HDTV is poised to exploit the emerging digital infrastructure in a way that NTSC and
PAL cannot. HDTV is different from film: it has no judder, no weave, and no
scratches! Being electronic instead of photochemical, HDTV offers consistent,
reliable color reproduction compared to film, and has all the advantages of electronic,
digital post-production. The image quality of HDTV is good enough for Hollywood:
many feature productions are exploiting HDTV technology in effects sequences and
synthetic computer graphics sequences.
HDTV is different from computer graphics. Commercial equipment is available today
for real-time acquisition, recording, processing and transmission. HDTV technology is
poised to bring motion and real-life images to computer graphics.
HDTV interchange standards have been designed to deliver convincing, emotive,
artifact-free pictures. The coding of pictures into HDTV signals takes into account the
perceptual characteristics of human vision to make the most effective use of the
available bandwidth, and a standardized set of interchange parameters has been
adopted, so that HDTV images maintain their quality (including color accuracy) when
exchanged.
Low cost HDTV and ATV equipment is inevitable as HDTV moves towards the high
unit volumes that will come from consumer acceptance. Large consumer electronic
companies recognize the difficulty of the consumer’s encountering the “home theater
experience” on a direct-view CRT display of at most 32 inches diagonal, the largest
practical size of tube for the home. So despite the immense technological difficulties,
flat panel displays are likely to emerge for HDTV. The computer industry will be the
first beneficiaries of these products — at a high price — but the goal of the
manufacturers is to reap profits from consumer unit volumes, not from the
comparatively small volumes of the computer industry.
At the moment there are several challenges to the integration of HDTV and computer
graphics. Since HDTV was derived from television technology, it utilizes the 60
fields per second refresh rate and the interlace technique of conventional television.

Computer applications generally require a refresh rate of 70 Hz or greater because of
the high ambient light levels typical of computer use in the office.
The computer community has been working toward standards for the compression of
digital moving pictures and audio in the Moving Pictures Experts Group (MPEG) of
ISO. Several ATV proponents have ATV transmission systems that have a close
resemblance to MPEG, but it is not yet clear whether the resemblance is sufficiently
close to allow huge investments in silicon to be exploited in both arenas.
Although political interests have apparently again split the world’s television
distribution into three domains — Japan, North America and Europe, in a 1990’s
version of the NTSC, PAL and SECAM trichotomy — many HDTV enthusiasts hope
that a common production standard will emerge. Some of us also hope that ATV
standards will have a deep-rooted commonality with MPEG and other compression
standards emerging from computing and communications, in order to facilitate the
exchange of images among the people of the world.

11.2.1 Resolution for High Definition Television and Desktop Computing


Resolution refers to the capability of an imaging system to reproduce fine detail. As
picture detail increases in frequency, the response of an imaging system generally
deteriorates. In film, resolution is measured as the finest pattern of straight, parallel
lines that can be reproduced, expressed in line pairs per millimeter (lp/mm). A line
pair contains a black region and a white region.
In video, resolution refers to the number of line pairs (cycles) resolved on the face of
the display screen, expressed in cycles per picture height (C/PH) or cycles per picture
width (C/PW). A cycle is equivalent to a line pair of film. In a digital system, it takes
at least two samples — or pixels or scanning lines — to represent a line pair.
However, resolution may be substantially less than the number of pixel pairs due to
optical, electro-optical and electrical filtering effects. Limiting resolution is reached at
the frequency where detail is recorded with just 10% of the system’s low-frequency
response. In consumer television, horizontal resolution is expressed in terms of the TV
lines that make up the picture height. Resolution in TVL is twice the resolution in
C/PW, divided by the aspect ratio of the picture.
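For example, applying this relation to a hypothetical conventional receiver
that resolves 330 C/PW on a 4:3 picture (the figures are illustrative, not
taken from the text):

# TV lines per picture height from cycles per picture width.
def tvl(c_per_pw, aspect_ratio):
    return 2 * c_per_pw / aspect_ratio

print(tvl(330, 4 / 3))   # 495.0 TV lines on a 4:3 picture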
In computer graphics, resolution refers simply to the number of discrete horizontal
and vertical picture elements — or pixels — that are employed to represent an image
in digital form. For example, a 1152 × 900 system has a total of about one million
pixels (one megapixel, or 1Mpx). Computer graphics has not traditionally been
concerned with whether individual pixels can be discerned on the face of the display.
In most color computer systems, an image comprising a one-pixel black-and-white
checkerboard displays as a uniform gray. Computer graphics often treats each pixel as
representing an idealized rectangular area independent of all other pixels. This notion
discounts the correlation among pixels that is an inherent and necessary aspect of
image acquisition, processing, compression, display and perception. In fact the rather
large spot produced by the electron beam of a CRT and the arrangement of phosphor
triads on the screen produce an image of a pixel on the screen that bears little
resemblance to a rectangle. If pixels are viewed at a sufficient distance, these artifacts
are of little importance. However, we tend to view CRTs at close viewing distances,
where these distortions need to be compensated in order to achieve good image
quality. HDTV systems are optimized taking these distortions into account.
In the living room, television viewing is best when the viewer is located at a distance
of about seven picture heights from the screen. Cinema offers a wide range of viewing
distances, but about three times picture height is optimum. In the long term, consumer
HDTV will bring the cinema viewing experience to the living room. A workstation

user typically sits very close to his screen, but for a large display surface his viewing
angle approximates that of HDTV.
Resolution refers to the capability of a system to reproduce spatial detail. Another
important aspect of digital picture representation is the capability to reproduce
intensity values. HDTV uses eight bits for each of three components — red, green and
blue — but the transfer function and color interpretation are established with careful
attention to the needs of human visual perception.

Nomenclature: Video vs Computing


A video system is denoted by the total number of lines in its raster (frame), and its
field rate — in hertz, or fields per second — separated by a slash. Broadcast television
in the North America and Japan is denoted 525/59.94; the system used in Europe is
denoted 625/50. These systems are colloquially called NTSC and PAL but those terms
properly denote color coding and not raster structure. Conventional television has a
4:3 picture aspect ratio, and employs interlaced scanning (to be discussed later), which
is implicit in the 525/59.94 notation.
The total number of lines in a raster is of less concern to the viewer than the number
of lines that contain useful picture information. The number of lines per picture height
(L/PH) is about eight percent less than the total number of lines in the frame, because
of vertical blanking interval overhead. For example, a 525/59.94 system has about 483
picture lines.
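The picture-line counts quoted in this lesson all follow from the roughly
eight percent blanking overhead, as this quick sketch shows (0.92 is an
approximation, not an exact standard value):

# Picture lines remaining after about 8% vertical blanking overhead.
for total_lines in (525, 625, 1125):
    print(total_lines, "total ->", round(total_lines * 0.92), "picture lines")
# 525 -> 483, 625 -> 575, 1125 -> 1035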
Computer users denote scanning structures by their horizontal and vertical pixel
counts only, and generally do not indicate field or frame rate. For example, a
1152x900 system may have 937 total lines and a frame rate of 65.95 Hz. Sometimes,
scanning parameters are implicit in acronyms; for example, VGA implicitly has a
“resolution” of 640x480.

11.2.2 HDTV, ATV, EDTV, and IDTV


HDTV has about twice the horizontal and twice the vertical (linear) resolution of
conventional television, a 16:9 picture aspect ratio, and at least 24 Hz frame rate.
Under this definition, HDTV has approximately double the number of lines of current
broadcast television, at approximately the same field rate. The doubled line count,
combined with the doubled horizontal resolution and the increase in aspect ratio,
causes an HDTV signal to have about six times the luma (Y) bandwidth of
conventional television.
Advanced Television (ATV) refers to delivery of entertainment television to
consumers at a quality level substantially improved over conventional television.
Terrestrial (VHF/UHF) ATV requires a change in FCC broadcasting regulations.
HDTV studio equipment will be used to produce programming for ATV distribution,
but the standards used for these two areas need not be identical. My definition of ATV
reflects the wide latitude of choices available in setting ATV standards. For example,
the ATV proposals of Zenith and ATVA/MIT offer only 900 Kpx, substantially short
of the two megapixels required for twice the vertical and twice the horizontal
resolution of NTSC.
Enhanced Definition Television (EDTV) describes a 525/59.94 or 625/50 broadcast
television signal that is originated with altered or augmented signal content, requiring
broadcast regulation changes, that makes possible higher quality at consumer
receivers.
Improved Definition Television (IDTV) describes receiver techniques that improve
the quality of standard NTSC or PAL broadcast signals but require no emission
regulation changes. A receiver is considered IDTV if it employs frame-rate doubling
to eliminate inter-line twitter, although additional techniques such as noise reduction

may also be employed. IDTV does not require changes in signal transmission
standards, and consequently can be implemented entirely at the receiver.

Psychophysics
The fundamental development work for HDTV was done at the Japan Broadcasting
Corporation (NHK), after extensive psychophysical and perceptual research led by
Dr. Takashi Fujio. Human viewers tend to position themselves relative to a scene such
that the smallest detail of interest in the scene subtends an angle of about one minute
of arc, which is approximately the limit of angular discrimination for normal vision.
For the 483 picture lines of 525-line television, the corresponding viewing distance is
about seven times picture height, and the horizontal viewing angle is about 11°. For
the 1035 visible lines of 1125-line HDTV, the corresponding viewing distance is 3.3
times screen height and the horizontal viewing angle is almost tripled to 28°.
Using an acuity of one minute of arc (1/60 of a degree), viewing distance in terms of
picture height should be about 3400 divided by the number of picture lines. A
computer user tends to position herself closer than this — about 60% to 75% of this
distance — but at this closer distance individual pixels are discernible.
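The rule of thumb is easily checked against the distances quoted earlier in
this section:

# Optimum viewing distance (in picture heights) from the picture-line count.
def viewing_distance(picture_lines):
    return 3400 / picture_lines

print(round(viewing_distance(483), 1))    # about 7.0 PH for 525-line television
print(round(viewing_distance(1035), 1))   # about 3.3 PH for 1125-line HDTV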
The viewer of HDTV consequently does not perceive increased “definition”
(resolution) for the same size picture compared to conventional television, but rather
moves closer to the screen. Psychophysical research has shown that a viewer’s
emotional involvement in a motion picture is increased when the picture subtends a
large viewing angle. Consumer HDTV should be called wide screen television, and
this designation would probably be more appropriate to consumer marketing and
product differentiation than “HDTV”. It is unnecessary to increase the vertical angle
of view as much as the horizontal, and the aspect ratio of 16:9 has been standardized
for HDTV, compared to 4:3 for conventional television. The HDTV aspect ratio,
about 1.78:1, is almost the same as the most common cinema aspect ratio of 1.85:1.
NHK research has revealed that in combination with large viewing angles,
high-quality stereo sound impacts the psychophysical response of the viewer to the
picture. In particular, the viewer’s eye-tracking response is dramatically different from
conventional television. Most HDTV proposals include CD-quality stereo audio.

Quality
The picture quality of HDTV is superior to that of 35 mm motion picture film, but less
than the quality of 35mm still film. Motion picture film is conveyed vertically through
the camera and projector, so the width — not the height — of the film is 35 mm.
Cinema usually has an aspect ratio of 1.85:1, so the projected film area is about 21
mm x 11 mm, only three tenths of the 36 mm x 24 mm projected area of 35 mm still
film. In any case the limit to the resolution of motion picture film is not the static
response of the film, but judder and weave in the camera and the projector.
The colorimetry obtainable with the color separation filters and CRT phosphors of a
video system is greatly superior to that possible with the photochemical processes of a
color film system. There are other issues related to the subjective impressions that a
viewer obtains from viewing motion picture film — the film look — that are still
being explored in HDTV. For example, specular highlights captured on film have an
appearance that is subjectively more pleasing than when captured in video.

Check Your Progress 1
1. Define High Definition Television (HDTV).

……………………………………………………………………………..
……………………………………………………………………………..
2. Discuss the Resolution for High Definition Television and Desktop
Computing.
……………………………………………………………………………..
……………………………………………………………………………..

11.3 HDTV STANDARDS


Standards for motion pictures and video exist in three tiers: production, exchange, and
distribution. Production is the shooting and assembling of program material.
Exchange of programs takes place among program producers and distributors.
Distribution to the consumer may take place using physical media such as videotape
or videodisc, or through one of four transmission media: terrestrial VHF/UHF
broadcast, cable television (CATV), direct broadcast from satellite (DBS) or
telecommunications.
The film production community will produce material in the electronic domain only if
it can be assured of the access to international markets afforded by 24 Hz, which
translates easily to both 29.97 Hz and 25 Hz with minimal artifacts. Electronic
origination at either 25 Hz or 30 Hz would introduce serious artifacts upon conversion
to the other. It is my contention that a single worldwide standard for
HDTV production is feasible only if it accommodates distribution of material
originated at 24 Hz.
Broadcasters have recently made proposals to produce HDTV material in Widescreen-
525 or Widescreen-625 production formats with a 16:9 aspect ratio. These proposals
would use conventional video production equipment, modified for wide aspect ratio.
This technique would allow broadcasters and perhaps even local stations to originate
wide-aspect-ratio programming with minimum expenditure.
However, programs would contain approximately the same amount of picture detail as
conventional television, therefore the viewer could not take advantage of the wide
viewing angle and the increased sense of involvement which in my opinion is the key
to consumer differentiation of HDTV.

11.3.1 SMPTE 240M: 1125/60 Production Standard


The technical parameters of the 1125/60 production system were standardized in
SMPTE Standard 240M, adopted in February, 1989. Disclaimers on this document
indicate that it is applicable to HDTV production only, although the MUSE
broadcasting system in use in Japan is based on 1125/60 parameters. SMPTE 240M
applies to the 1125/60 analog signal. A digital representation of 1125/60 having a
sampling structure of 1920x1035 and a sampling rate of 74.25 MHz has recently been
adopted in SMPTE Standard 260M. SMPTE 240M specifies RGB or YPbPr color
components, with carefully-specified colorimetry and transfer functions. Luma (Y)
bandwidth is specified as 30 MHz, about five or six times the bandwidth of current
broadcast television. Not all currently-available HDTV equipment meets this
bandwidth, and most of the proposed transmission systems do not come close to the
performance of the studio production equipment.
Although the field rate of the SMPTE 240M system is exactly 60 Hz — emphasized by
the 60.00 notation in the document — certain organizations propose operating at a

field rate of 59.94 Hz to maximize compatibility with existing NTSC equipment and
production processes. Some current HDTV studio equipment is configurable for
operation at either rate. There are several system problems with 59.94 Hz field rate. In
59.94 Hz operation with standard digital audio sampling frequencies of either 44.1
kHz or 48 kHz, there is not an integer number of audio samples in each frame. This,
and the requirement for dropframe timecode, imposes a penalty on operation at that
rate. However the production procedures for 59.94 Hz are of course well established
for conventional 525/59.94 video, troublesome though they may be.
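The audio objection to 59.94 Hz can be verified directly: at 30 frames per
second both standard audio sampling frequencies give an integer number of
samples per frame, while at 30/1.001 frames per second neither does.

# Audio samples per video frame at 60.00 Hz and 59.94 Hz field rates.
for audio_rate in (44100, 48000):
    per_frame_60 = audio_rate / 30               # 60.00 Hz fields
    per_frame_5994 = audio_rate / (30 / 1.001)   # 59.94 Hz fields
    print(audio_rate, per_frame_60, round(per_frame_5994, 2))
# 44100 1470.0 1471.47
# 48000 1600.0 1601.6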

11.3.2 1125/60 Studio Equipment


Commercial hardware operating with the 1125-line system is now widely available.
Professional studio equipment that is commercially available for purchase at the time
of writing (in January 1993), includes:
• Cameras
• Videotape recorder (analog, 1 inch open reel)
• Videotape recorder (digital, 1 inch open reel)
• Videotape recorder (analog “Uni-Hi”, 19 mm cassette)
• Videodisc
• Telecine (film-to-video)
• Film recorders
• Video monitors
• Video projectors
• Still and sequence stores
• Up-converters
• Line doublers
• Cross-converters
• Down-converters
• Production switchers
• Graphics and paint systems
• Blue-screen matte (Ultimatte)
• Test equipment
Sony has demonstrated a fourth-generation 1125/60 CCD camera that has resolution,
sensitivity and noise performance comparable to the best film cameras and motion
picture film.
HDTV is being used for the production of material to be released on theatrical
(cinema) film. Its acceptance as a production medium for cinema awaits the wider
availability of HDTV production facilities and more knowledge of HDTV production
techniques on the part of the film production community.

11.3.3 HDTV Exchange Standards


The de facto international television program distribution standard has been for the
last 40 years, and continues to be, 35 mm motion picture film. In North America, film
is transferred to video using 3-2 pulldown, which involves scanning successive film
frames alternately to form first three then two video fields. The film is run 0.1%

slower than 24 frames per second, to result in the required 59.94 Hz field rate. In
Europe, film is run four percent fast with 2-2 pulldown to result in a 50 Hz field rate.
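The 3-2 pulldown pattern can be sketched in a few lines of Python (a
minimal illustration: successive film frames are mapped alternately to
three video fields, then two):

# 3-2 pulldown: map 24 Hz film frames onto interlaced video fields.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))   # 3, 2, 3, 2, ...
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -- 4 frames become 10
# fields, so 24 film frames per second become 60 fields per second
# (59.94 with the 0.1% slowdown).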
Discussions of exchange standards are in an early stage, but there is general
agreement that film “friendliness” will be important for ATV: it is certain that the
primary origination medium for consumer ATV in any form will initially be 35 mm
motion picture film, due to the vast amount of existing program material in that
medium.

11.3.4 ATV Transmission Standards


All proposed transmission standards involve the reduction of transmission bandwidth
by exploiting the statistical properties of “typical” images and the perceptual
properties of the human visual system.
Terrestrial and satellite broadcasting requires spectrum allocation, which is subject to
domestic and international political concerns. Broadcasting standards are agreed by
the International Radio Consultative Committee (CCIR), a treaty organization that is
part of the United Nations. CCIR Recommendations — which you and I might call
standards — are agreed unanimously and internationally. The CCIR started setting
broadcasting standards well before the introduction of video recording, and the CCIR
has inherited video production and exchange standards even though they do not
strictly speaking involve the radio spectrum.
The CCIR has adopted Recommendation 709 for an HDTV production system.
HDTV colorimetry has been agreed, but the recommendation is in a half-finished state
reflecting the lack of international agreement on remaining parameters, particularly
frame rate and raster structure.
Broadcasters in Europe and the U.S. have proposed transmission systems based on
50 Hz and 59.94 Hz field rates respectively, citing requirements for “compatibility”
with local broadcast standards. No commercial equipment, and very little
experimental equipment, exists for either of these standards. It is now evident that
there will be no single worldwide transmission standard for ATV, mainly for political
and national industrial policy reasons. In the United States, adoption of SMPTE 240M
as an ANSI standard was blocked by a legal challenge by the ABC television network,
which cited “lack of industry consensus”. This was the first time in history that a
SMPTE standard was not endorsed by ANSI.

11.4 HDTV AND COMPUTING


Use of the HDTV production standard by the computer industry will open access to
equipment for image capture, recording, transmission, distribution and display.
Interface to 525-line video equipment has been difficult due to the disparity in
interface standards between computer graphics equipment and video equipment.
Further, poor detail and color resolution in NTSC have precluded its use in application
areas such as medicine and the graphic arts. HDTV remedies those deficiencies by
adopting component color coding (instead of composite coding as in NTSC), zero
setup for accurate reproduction of blacks (instead of 7.5 percent setup as in NTSC and
in EIA-343-A), a single well-characterized colorimetry standard (as opposed to the
wide variety of phosphor chromaticity and white point values currently in use in
computer graphics) and a well-defined transfer function that will allow accurate
gamma correction.
In the past, many application areas have been forced to adopt proprietary display
interface standards because the resolution or color accuracy available from standard
workstation platforms has been inadequate. HDTV has a display quality that will meet
the requirements of even the most exacting users and this will allow the use of

platform technology in place of proprietary solutions in applications such as printing
and publishing.
Quantel’s HDTV Graphic Paintbox is optimized for printing and publishing
applications, and includes interfaces to prepress equipment. The Rebo Research
ReStore offers access to HDTV through a Macintosh computer, and thereby allows
the use of HDTV imagery and equipment with commercially-available Macintosh
programs for retouching, presentation, color separation, and many other applications.
Obtaining motion in computer graphics has in the past required either very expensive
graphics accelerator hardware or painstaking frame-by-frame non-realtime animation.
HDTV will allow easy access to video equipment designed to handle motion video
and therefore will bring motion to the workstation world.

11.5 SQUARE PIXELS


Current digital 1125/60 HDTV production equipment conforms to SMPTE 260M,
which has 1920 samples per active line (S/AL), 1035 lines per picture height (L/PH)
and conforms to the 16:9 aspect ratio of SMPTE 240M. This combination of
parameters yields samples spaced about 4% closer horizontally than vertically, that is,
a sample aspect ratio of about 0.96. This situation came about due to lack of a
cohesive input from the computer industry during the standards development process.
Some U.S. interests were buoyant at this development, perceiving that non-square
pixels would deter the deployment of non-American HDTV equipment. Others were
dismayed that since no American equipment was available, the effect of the standard
would be to deter the computer industry from exploiting HDTV.
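The figure of about 0.96 follows directly from the raster parameters, as
this short calculation shows:

# Sample (pixel) aspect ratio of the 1920x1035, 16:9 SMPTE 260M raster.
samples_per_line = 1920
picture_lines = 1035
image_aspect = 16 / 9

print(round(image_aspect * picture_lines / samples_per_line, 3))
# 0.958: samples are spaced about 4% closer horizontally than vertically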
Unequal vertical and horizontal sample spacing — or non-square pixels — is very
inconvenient to computer users. Although many rendering systems can utilize any
pixel aspect ratio and geometric calculations are only moderately inconvenienced by
unequal spacing, interchange of raster data is severely compromised by non-square
pixels. If raster data at a sample aspect ratio of unity is to be utilized in a system with
a different sample aspect ratio, spatial re-sampling is necessary. Re-sampling requires
substantial computation. This computation takes time in non-realtime applications or
requires dedicated arithmetic hardware in realtime applications. Also, re-sampling
introduces picture impairments that are unacceptable in certain applications, such as in
the graphic arts where 90 degree picture rotation is a common operation that must be
accomplished with no impairments.
Square pixels have a number of important properties that make geometric calculation
straightforward. There exists a huge volume of image data that is already scanned and
stored in square-pixel form and adoption of a standard with square pixels assures easy
access to this data. Many imaging devices (such as CCD sensors) and display devices
(such as LCD and plasma screens) have inherently fixed geometry and having these
devices use square pixels would allow the same devices to be used for television and
industrial applications.
The term Common Image Format (CIF) refers to an attempt to standardize the spatial
sampling structure of an HDTV image, independent of its frame rate. Square pixels
can be accommodated in a Common Image Format of 1920 samples per line and 1080
picture lines. This would result in just slightly less than two megapixels per frame, an
arrangement that results in optimum utilization of DRAM and VRAM devices. There
have been proposals for a 2048x1152 common image format, but its total storage
requirement of 2.25 Mpx has poor utilization of power-of-two memory and
multiplexer components. Unfortunately it is in the interest of some organizations in
the United States to delay the adoption of any HDTV standard, and despite ATSC’s
endorsement of the 1920x1080 common image format, the United States took no
official position on the matter at the 1990 international CCIR standards discussions.
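The memory-utilization argument can be checked numerically: 1920x1080
almost exactly fills a power-of-two framebuffer, while 2048x1152 wastes
nearly half of the next power-of-two size. A sketch:

# Pixel counts of the two CIF proposals against power-of-two memory sizes.
import math

for w, h in ((1920, 1080), (2048, 1152)):
    px = w * h
    budget = 1 << math.ceil(math.log2(px))   # next power-of-two pixel count
    print(w, "x", h, "=", px, "px,", round(100 * px / budget), "% of", budget)
# 1920 x 1080 = 2073600 px, 99 % of 2097152
# 2048 x 1152 = 2359296 px, 56 % of 4194304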

11.6 DISPLAY REFRESH RATE AND INTERLACE
A scanned display must be operated at a field rate sufficient to overcome wide-area
flicker, which is a strong function of ambient brightness level. Although 48 Hz is an
adequate refresh rate in the dark environment of a movie theater, and 60 Hz is
adequate for the average North American living room, a refresh rate of at least 70 Hz
is necessary for the high-ambient-brightness environments typical of computer
displays. All 525/59.94, 625/50 and 1125/60.00 television systems currently utilize
interlaced scanning. Interlace is a mechanism of reducing transmission bandwidth by
half, for a given wide-area flicker rate, by transmitting a single frame as two fields
whose scan lines intertwine. Interlaced systems reduce transmission bandwidth at the
expense of introducing inter-line twitter in pictures with a large amount of vertical
detail.
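Interlaced scanning is easy to picture as a sketch: a frame's lines are
split into two fields that are transmitted alternately, halving the data
sent per vertical scan.

# Split a frame's scan lines into two interlaced fields.
def interlace(frame_lines):
    return frame_lines[0::2], frame_lines[1::2]   # odd field, even field

first, second = interlace(["line%d" % n for n in range(6)])
print(first)    # ['line0', 'line2', 'line4'] -- half the lines per field
print(second)   # ['line1', 'line3', 'line5']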
Interlace works reasonably well in television because the electro-optical filtering that
is inherent in television image sensors (such as camera tubes) reduces vertical detail
and consequently reduces inter-line twitter. Interlace causes objectionable twitter in
pictures that have not been electro-optically or otherwise filtered, such as in synthetic
computer graphic pictures that have large amounts of vertical detail or contain spatial
aliasing components.
Aside from issues of inter-line twitter, interlace is undesirable for television
production because of its inherent confusion of vertical detail and motion. Interlace is
now generally seen by the HDTV production community as an expedient way to
achieve a 2-to-1 bandwidth compression in order to permit economical camera and
recording equipment in the short term. When technology permits, HDTV production
equipment will utilize progressive scanning.
Although current-generation 1125/60 acquisition and recording equipment is
universally 2:1 interlaced, there is general agreement that the industry will tend
towards progressive (non-interlaced) systems for transmission and display. Zenith and
ATVA/MIT have proposed transmission systems that rely on a 787.5/59.94/1:1
production standard with progressive scan. Essentially, these proposals take a factor of
two penalty in spatial resolution — from 2 Mpx to 900 Kpx — in return for a factor of
two increase in temporal resolution.
The claim is made that these systems have better temporal resolution than interlaced
systems, but the cameras that have been shown for 787.5/59.94 have relatively poor
performance compared to the best available 1125/60 cameras, and to date no
conclusive experiments on HDTV/ATV motion rendition have been conducted.

Check Your Progress 2


1. What are High Definition Television (HDTV) standards?
……………………………………………………………………………..
……………………………………………………………………………..
2. What are Square Pixels?
……………………………………………………………………………..
……………………………………………………………………………..

11.7 LET US SUM UP


HDTV has the dubious distinction of being the most controversial item of new
technology to be introduced in recent times. Enmeshed in long running arguments
over basic technology, price and market access, HDTV is having a painful birth
worldwide, and the Australian experience differs little from that overseas. While the

focus of the domestic arguments in many respects differs from the focus of the
international arguments, the common factor is that the standard evokes loud and
partisan argument. Digital television is important for future military planning for a
variety of reasons. The first is that COTS equipment based on this family of
technologies will become widely used in C3 applications, and compatibility with
modulations and formats will be required. Display and imaging technology developed
for digital TV applications is also finding its way into digital ISR systems. Finally, IW
applications such as propaganda broadcast will have to account for increasing
adoption of digital TV globally. The lesson highlights and explores the varied
dimensions of High Definition Television and Desktop Computing.

11.8 LESSON END ACTIVITY


Discuss among the group the arguments and knowledge aspects about the High
Definition Television and Desktop Computing.

11.9 KEYWORDS
HDTV: High Definition Television; it differs from conventional video in having
greater resolution (2 Mpx vs. 300 Kpx) and improved color accuracy.
Resolution: Resolution refers to the capability of an imaging system to reproduce fine
detail. As picture detail increases in frequency, the response of an imaging system
generally deteriorates.
ATV: Advanced Television (ATV) refers to delivery of entertainment television to
consumers at a quality level substantially improved over conventional television.

11.10 QUESTIONS FOR DISCUSSION


1. What is Psychophysics?
2. Discuss Quality features for High Definition Television.
3. What are ATV Transmission Standards?

Check Your Progress: Model Answers


CYP 1
1. High definition television (HDTV) is defined as having twice the vertical
and twice the horizontal resolution of conventional television, a picture
aspect ratio of 16:9, a frame rate of at least 24 Hz, and at least two channels
of CD-quality sound. HDTV studio equipment commercially available at
the moment has about two megapixels per frame (in a 1920x1035
format), six times the number of pixels of conventional television. The
data rate of current studio-quality HDTV is about 120 megabytes per
second. The parameters of HDTV are optimized for a viewing distance of
about three times picture height. This enables a horizontal picture viewing
angle of about thirty degrees, three times that of conventional television.
2. Resolution refers to the capability of an imaging system to reproduce fine
detail. As picture detail increases in frequency, the response of an
imaging system generally deteriorates. In film, resolution is measured as
the finest pattern of straight, parallel lines that can be reproduced,
expressed in line pairs per millimeter (lp/mm). A line pair contains a
black region and a white region.

CYP 2
1. Standards for motion pictures and video exist in three tiers: production,
exchange, and distribution. Production is the shooting and assembling of
program material. Exchange of programs takes place among program
producers and distributors. Distribution to the consumer may take place
using physical media such as videotape or videodisc, or through one of
four transmission media: terrestrial VHF/UHF broadcast, cable television
(CATV), direct broadcast from satellite (DBS) or telecommunications.
2. Current digital 1125/60 HDTV production equipment conforms to
SMPTE 260M, which has 1920 samples per active line (S/AL), 1035 lines
per picture height (L/PH) and conforms to the 16:9 aspect ratio of SMPTE
240M. This combination of parameters yields samples spaced about 4%
closer horizontally than vertically, that is, a sample aspect ratio of about
0.96. This situation came about due to lack of a cohesive input from the
computer industry during the standards development process. Some U.S.
interests were buoyant at this development, perceiving that non-square
pixels would deter the deployment of non-American HDTV equipment.
Others were dismayed that since no American equipment was available,
the effect of the standard would be to deter the computer industry from
exploiting HDTV.

11.11 SUGGESTED READINGS


Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

LESSON 12

KNOWLEDGE-BASED MULTIMEDIA SYSTEMS

CONTENTS
12.0 Aims and Objectives
12.1 Introduction
12.2 Problems Facing Multimedia Systems
12.3 Anatomy of an Intelligent Multimedia System
12.3.1 Intelligent Multimedia System Design
12.3.2 The Multimedia
12.4 Knowledge Sources for Multimedia Interaction
12.5 Multimedia Language Understanding
12.6 Intelligent Automated Multimedia Output Generation
12.7 Knowledge-based Development Tools for Multimedia Systems
12.8 Let us Sum up
12.9 Lesson End Activity
12.10 Keywords
12.11 Questions for Discussion
12.12 Suggested Readings

12.0 AIMS AND OBJECTIVES


After studying this lesson, you would be able to understand:
• The role of artificial intelligence in multimedia systems
• The ability to communicate and make presentations in coordinated multiple
media/modalities
• The integration of the various subsystems and functionality
• The operator's interaction with sophisticated computer systems

12.1 INTRODUCTION
Multimedia systems hold the promise of great benefits such as increasing people's
productivity, efficiency and effectiveness, and increasing the utility and enjoyment of
our vast information resources. Multimedia systems promise to provide the needed
increase in the bandwidth of information exchange between humans and computers,
and to enhance human understanding of complex information through better
presentation technologies and appropriate combinations of these technologies for
information presentation. The extent to which these promises are fulfilled depends on
continued improvement in hardware technology, development of much needed

supportive software technology, and the growth of a community of trained multimedia
authors and technologists.
The scope of today's multimedia systems is very limited, and the functionality of the
different types of systems is not integrated to form a productive workplace. In fact,
multimedia means different things to different people. To some it means video for
conferencing, to others it means hypermedia documents, and to others it means
multimedia human-computer dialogue. Also, some people view multimedia
documents as static and fixed, and others view documents and data as "live." For
example, Clark states that multimedia should be referred to as ‘interactive electronic
presentation (IEP)’ to describe a collusion of sounds and images elicited from a piece
of (electromechanical) machinery by the user's persistent activity. However, Clark's
definition includes only the concept of self-contained multimedia "books" to be
consumed by "readers" or "viewers," and he states that an IEP is closed and finite. On
the other hand, Boy and Cornell, Suthers, and Woolf stress that documents and data
should be treated as "live" or dynamic.
Certainly the view of documents and data being static and fixed is inadequate for
people engaged in productive activity or problem-solving tasks. People will need to be
able to locate, retrieve, use, save, and possibly manipulate relevant multimedia
documents/data in an environment that will support the accomplishment of their tasks,
possibly in cooperation with others.
We take the position that multimedia does not simply mean self-contained static
documents, nor does it simply mean that computer-based video is used to provide a
"media space" to support cooperative work among co-located or remotely located
people. Our concept of a multimedia system is that of an integrated work environment
with a human-computer interface designed as an intelligent agent with the ability to
communicate and make presentations in coordinated multiple media/modalities. The
objective is to integrate the various subsystems and functionality that a user needs in a
workstation environment, to simplify operator interaction with sophisticated computer
systems, and to minimize the time and effort spent by the user on manipulating the
interface. The human-computer interface should have the ability to: conduct dialogue
with the user; act as an intelligent assistant for accessing application systems; accept
and understand input expressed in multimodal language; decide how information and
responses are to be presented to the user, including the selection of media/modalities
for information presentation, composition, and presentation of the output in multiple
modalities; adhere to respected human factors guidelines for human-computer
interaction and information presentation; and support cooperative work with others.
This lesson discusses the possible role of artificial intelligence in multimedia systems
and some of the current research being conducted.

12.2 PROBLEMS FACING MULTIMEDIA SYSTEMS


Although great benefits are to be gained from multimedia systems, their incorporation
into the workplace, school, and home is not an easy task. The requisite hardware is
becoming widely available at reasonable cost, but other problems remain to be solved.
These problems are primarily in two areas, personnel and technology:
1. Personnel: There is a lack of people trained in the development and management
of distributed databases and document repositories. In the words of Grimes and
Potel, “a fundamental problem afflicts multimedia authoring - not enough people
have the necessary skills”.
2. Technology: There is a lack of software designed to integrate, control, coordinate,
manage, and adapt the various media for human computer interfaces.

• There is a lack of support software for facilitating the authoring, composition,
and production of multimedia documents.
• There is a lack of support technology in the area of multimedia data and
document storage and manipulation.
• There is a lack of search and pattern recognition capability for locating
information and/or documents that are of interest in multimedia storage
facilities.
• There is a lack of software support technology for group decision making and
cooperative work, especially in the application of multimedia technology to
cooperative decision making and work.
All of the tasks listed above are difficult and provide good candidates for application
of artificial intelligence technology to help solve the problem.

12.3 ANATOMY OF AN INTELLIGENT MULTIMEDIA SYSTEM
As mentioned earlier, our concept of a multimedia system consists of an integrated
work environment with a human-computer interface designed as an intelligent agent
with the ability to conduct dialogue with the user in coordinated multiple
media/modalities. The human-computer interaction is modeled on the manner in
which two or more people naturally communicate in coordinated multiple modalities
when working with graphics, video, and other devices at hand.
The system should have the ability to:
• Conduct dialogue with the user:
- Adhere to respected principles of conversation, and
- Adhere to respected human factors guidelines for human-computer interaction
and information presentation, including maintaining the context of the
dialogue and maintaining consistency in displays and presentations.
• Maintain knowledge and belief models to enable the system to understand user
inputs and compose system outputs:
- Track and model the dynamic focus of the dialogue in order to maintain
context during the dialogue.
- Model the user's task(s) and the state of the user's accomplishments and
progress with respect to the task(s).
• Maintain knowledge bases of information about:
- Modalities and user interaction,
- World knowledge, and
- Application-specific knowledge.
• Act as an intelligent assistant for accessing and using application systems, through
such activities as:
- Assisting the user in finding relevant information on topics of interest.
- Assisting the user with finding, selecting, and accessing appropriate tools to
apply to the task.
- Assisting and guiding the user in the accomplishment of tasks.
- Providing explanations and multimedia presentations to aid in user
comprehension of relevant information.

• Accept and understand input expressed in multimedia language.
• Provide the user with flexibility in the media that is selected and combined for
expressing input to the system.
• Decide how information and responses are to be presented to the user:
- Select modalities/media for information presentation.
- Compose the output in multiple modalities.
- Present the multimedia output in a coordinated manner.
- Manage the windows by intelligently performing the window operations (i.e.,
creation, placement, sizing/resizing, moving, iconization, retrieval, and
destruction) to relieve the user of the burden of performing these chores.

12.3.1 Intelligent Multimedia System Design

Figure 12.1: Multimedia System Design


Figure 12.1 provides an overview of our design for an intelligent multimedia system.
An implemented system, called CUBRICON, has been developed as a proof-of-
concept prototype as part of the Intelligent Multimedia Interfaces Project. The

CUBRICON prototype includes implementation of all the components shown in the
box labeled "Multimedia System." The application used for the CUBRICON
prototype was that of air force mission planning. The input modes that were
implemented in the CUBRICON system were speech, keyboard, and mouse. All of the
output modes implemented in the CUBRICON system except for video, and the
implemented graphics and animation capability was fairly simple.
The CUBRICON system design is based upon an integrated use of communication
modes or media, whether verbal, visual, tactile, or gestural. Human beings primarily
communicate with each other via written and spoken natural language and gestures,
supplemented with pictures, diagrams, video, and other sounds. The CUBRICON
system design provides for the use of a unified multimedia language. Input and output
streams are treated as compound streams with components corresponding to different
media. This approach is intended to imitate, to a certain extent, the ability of humans
to simultaneously accept input from different sensory devices (such as eyes and ears),
and to simultaneously produce output in different media (such as voice, pointing
motions, and drawings).
The CUBRICON system includes: (a) language parsing and generation that processes
and supports synchronized multimedia input and output streams, (b) knowledge
representation and inferencing to provide reasoning ability, (c) knowledge bases and
models to provide a basis for its decision-making ability, and (d) automated
knowledge-based medium selection and formulation of responses.
CUBRICON possesses the following critical functionality. CUBRICON:
Accepts and understands multimedia input such that references to entities in a natural
language sentence can be accompanied by coordinated simultaneous pointing to the
respective entities on a graphics display:
z is able to use a simultaneous pointing reference and natural language reference to
disambiguate one another when appropriate;
z infers the intended referent of a point gesture which is inconsistent with the
accompanying natural language.
Automatically composes and generates relevant output to the user in coordinated
multimedia:
z automatically selects appropriate output media/modalities for expressing
information to the user, with the selection based on the nature of the information,
discourse context, and the importance of the information to the user's task;
z uses its media/modalities in a highly integrated manner including simulated
parallelism;
z judges the relevance of information with respect to the discourse context and user
task and responds in a context-sensitive manner;
z adheres to respected human factors guidelines for human-computer interaction and
information presentation; these guidelines include:
™ maintain the context of the user/computer dialogue,
™ maintain consistency throughout a display, and
™ maintain consistency across displays.
Automatically performs the window manipulation operations (i.e., creation,
placement, sizing/resizing, moving, iconization, retrieval, and destruction) so as to
relieve the user of the need to manipulate the interface.
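This automated window management can be pictured with a small sketch. The Python
fragment below is a hypothetical illustration only — the WindowManager class, its
eviction policy, and the window names are assumptions made for this example, not
CUBRICON's actual window logic (which was implemented in Common Lisp):

class WindowManager:
    """Hypothetical sketch of automated window management: the system,
    not the user, creates, places, iconizes, and retrieves windows."""
    def __init__(self, max_open=3):
        self.max_open = max_open
        self.open_windows = []          # least recently relevant first

    def show(self, name):
        if name in self.open_windows:   # retrieval: bring an open window forward
            self.open_windows.remove(name)
        elif len(self.open_windows) >= self.max_open:
            iconized = self.open_windows.pop(0)  # iconize least relevant window
            print("iconize", iconized)
        self.open_windows.append(name)  # creation/placement at the next slot
        print("display", name, "at slot", len(self.open_windows) - 1)

wm = WindowManager(max_open=2)
for w in ["mission-map", "target-table", "sortie-form"]:
    wm.show(w)
# "mission-map" is iconized automatically when a third window is needed.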
CUBRICON accepts input from three input devices: speech input device, keyboard,
and a mouse device for pointing to objects on a graphics display. CUBRICON produces
output for three output devices: high-resolution color graphics display, monochrome
display, and speech output device. The primary path that the input data follows is
indicated by the modules that are numbered:
1. Input Coordinator,
2. Multimedia Parser Interpreter,
3. Executor/Communicator to Target System,
4. Multimedia Output Planner, and
5. Coordinated Output Generator.
The Input Coordinator module accepts input from the three input devices and fuses the
input streams into a single compound stream, maintaining the temporal order of
tokens in the original streams. The Multimedia Parser/Interpreter is an augmented
transition network (ATN) that has been extended to accept the compound stream
produced by the Input Coordinator and produce an interpretation of this compound
stream. Appropriate action is then taken by the Executor module. This action may be a
command to the mission planning system, a database query, or an action that entails
participation of the interface system only. An expression of the results of the action is
then planned by the Multimedia Output Planner for communication to the user. The
Output Planner is a generalized ATN that produces a multimedia output stream
representation with components targeted for different devices (e.g., speech device,
color graphics display, monochrome display). This output representation is translated
into visual/auditory output by the Coordinated Output Generator module. This module
is responsible for producing the multimedia output in a coordinated manner in real
time (e.g., the Planner module can specify that a certain icon on the color graphics
display must be highlighted when the entity represented by the icon is mentioned in
the simultaneous natural language output).
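The first step of this path can be made concrete with a short sketch. The Python
fragment below is a hypothetical illustration of the Input Coordinator's fusion step
only — the InputToken class, the fuse_streams function, and the sample tokens are
assumptions for this example (CUBRICON itself was written in Common Lisp). It merges
time-stamped tokens from the devices into one compound stream while preserving their
temporal order:

import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class InputToken:
    # Tokens compare by arrival time, so the fused stream preserves
    # the temporal order of tokens in the original streams.
    timestamp: float
    device: str = field(compare=False)     # "speech", "keyboard", or "mouse"
    content: object = field(compare=False)

def fuse_streams(*streams):
    # Each per-device stream is already time-ordered, so a k-way
    # merge yields the single compound stream (hypothetical sketch).
    return list(heapq.merge(*streams))

# Example: the user says "this SAM" while pointing at an icon.
speech = [InputToken(0.10, "speech", "this"), InputToken(0.35, "speech", "SAM")]
mouse = [InputToken(0.20, "mouse", ("point", "icon-SAM-3"))]
for token in fuse_streams(speech, mouse):
    print(token.timestamp, token.device, token.content)

In the fused stream the point gesture falls between the two speech tokens, which is
what allows the Multimedia Parser/Interpreter to treat the utterance and the gesture
as one compound reference.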
The CUBRICON system includes several knowledge sources to be used during
processing. The knowledge sources include:
z a lexicon,
z a grammar defining the language used by the system for multimedia input and
output,
z a discourse model,
z a user model,
z a knowledge base of human-computer interaction knowledge, including output
planning strategies to govern the composition of multimedia responses to the user,
z a knowledge base of information about generally shared world knowledge, and
z a knowledge base of information about the specific task domain of tactical air
control.
These knowledge sources are used for both understanding input to the system and
planning/generating output from the system. They are discussed in more detail later.
The CUBRICON system is implemented on a Symbolics Lisp Machine with a color
graphics monitor, a monochrome monitor, and a mouse pointing device. Speech
recognition is handled by a Dragon Systems VoiceScribe 1000. Speech output is
produced by a DECtalk speech production system. CUBRICON software is
implemented using the SNePS semantic network processing system, an ATN
parser/generator, and Common Lisp. SNePS is a fully intentional propositional
semantic network and has been used for a variety of purposes and applications.
SNePS provides:
z a flexible knowledge representation facility in the semantic network formalism;

z representation of rules in the network in a declarative form so they can be
reasoned about like any other data;
z a bi-directional inference subsystem which focuses attention towards the active
processes and cuts down the fan-out of a pure forward or backward chaining;
z a simulated multiprocessing control structure;
z special nonstandard connectives to model human reasoning processes.

12.3.2 The Multimedia


The CUBRICON design and implementation incorporates the following media or
modalities: spoken natural language, typed or printed natural language, pointing
gestures, geographical maps, color graphic pictorial displays, tables, and "fill in the
blank" forms. This list does not exhaust the possibilities, of course, but provides a
good variety with which to prove our concept and upon which to build. Other media,
such as video and eye-tracking devices, were not used in the prototype but would be a
natural extension of the system.
One of the significant features of the CUBRICON system is that it not only generates
output in multiple modalities, but also decides which modalities to use and how to use
and combine them. CUBRICON modality selection is primarily based on the nature
and characteristics of the information and the purpose for which the modality is being
used. Our system design is based on the premise that graphic/pictorial presentation is
always desirable. The following is a list of the CUBRICON modalities and a brief
summary of the selection criteria; a small illustrative sketch of such selection
rules appears after the list.
1. Color graphics: Selected whenever the CUBRICON system knows how to
represent the information pictorially.
2. Geographic maps: Selected when the information is geographically locative or
has a locative attribute.
3. Table: Selected when the values of common attribute(s) of several entities must
be expressed.
4. Forms: A predefined form is selected when the task engaged in by the user
requires the form. An example is a form modeled on one of the forms used by
air force mission planners.
5. Animation: Simple animation is used for information or objects that can be
visually presented and which are temporally changing or moving.
6. Deictic gestures: Selected for emphasis or to call the user's attention to one or
more objects on the screen(s).
7. Natural language: Selected for the expression of a proposition, relation, event, or
combination thereof when the types of knowledge structures being expressed are
heterogeneous. Natural language can be presented in either spoken or written
form. Printed natural language (printed on the screen) is selected for longer
technical responses that would strain the user's short-term memory if speech were
used.
Spoken natural language is used in a manner that is designed to avoid
overwhelming the user's short-term memory.
It is selected for:
(a) Dialogue descriptions to assist the user in comprehending the presented
information. These include explanations of graphic displays or display
changes and verbal highlighting of objects on the displays (e.g., "The enemy
airbases are highlighted in red").

(b) Warnings to alert the user of important events that have taken place or are
about to take place (e.g., new critical information comes into the application
system database and the system notifies the user: "The XXX airbase has been
damaged by enemy shellfire").
(c) Informing the user about the system's activity (e.g., "I'm still working" when
the user must wait for output from the system).
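To make these criteria concrete, here is a minimal rule-based sketch of such a
selection step. Everything in it — the select_modalities function and attribute names
such as "pictorial" and "locative" — is a hypothetical illustration of the kind of
knowledge-based decision described, not CUBRICON's implementation:

def select_modalities(info):
    # Map properties of the information to presentation modalities
    # (a hypothetical rule set paraphrasing the criteria above).
    chosen = []
    if info.get("pictorial"):      # system knows a pictorial representation
        chosen.append("color graphics")
    if info.get("locative"):       # geographically locative attribute
        chosen.append("geographic map")
    if info.get("tabular"):        # common attributes of several entities
        chosen.append("table")
    if info.get("changing"):       # visually presentable and temporally changing
        chosen.append("animation")
    if info.get("emphasis"):       # call attention to objects on the screen
        chosen.append("deictic gesture")
    if info.get("heterogeneous"):  # mixed propositions, relations, events
        # Speech only for short responses, to spare short-term memory.
        chosen.append("spoken NL" if info.get("short") else "printed NL")
    return chosen

print(select_modalities({"locative": True, "heterogeneous": True, "short": True}))
# -> ['geographic map', 'spoken NL']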
Most frequently, multiple modalities are desirable to present a body of information to
the user. For example, to inform the user about the movements of a certain tank
battalion, a desirable presentation might be an explanation delivered in spoken
natural language combined with coordinated drawing on a graphic map display showing
movements of the battalion, as well as a printed textual summary with ancillary
information on the monochrome display.
The multiple modalities should be selected to complement and enhance one another.
Andriole has used graphic equivalence effectively using dual displays or split screens
to present the same material in different forms to aid user comprehension and
problem-solving performance. We are not restricting the system to presenting the
same material in different forms, but, instead, our system presents related material or
different aspects of a given event or concept in different forms/modalities (as
appropriate based on the nature and characteristics of the information). We are also
not restricted to graphic display presentations.

Check Your Progress 1


1. What are Knowledge-based Multimedia Systems?
…………………………………………………………………………….
…………………………………………………………………………….
2. What are the problems facing Multimedia Systems?
…………………………………………………………………………….
…………………………………………………………………………….

12.4 KNOWLEDGE SOURCES FOR MULTIMEDIA INTERACTION
The system includes several knowledge sources for use in multimedia language
understanding and production. These knowledge sources are a lexicon and grammar;
a discourse model; a user model; a knowledge base of human-computer interaction
knowledge, including output planning strategies to govern the composition of
multimedia responses to the user; a knowledge base of information about generally
shared world knowledge; and a knowledge base of information about the application
task domain used in this research effort, namely, tactical air control.
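One plausible way to picture this bundle of knowledge sources is as a single
structure consulted by both the understanding and generation sides. The dataclass
below is a purely hypothetical illustration — the field names paraphrase the list
above and do not reflect CUBRICON's actual representation, which used SNePS semantic
networks:

from dataclasses import dataclass, field

@dataclass
class KnowledgeSources:
    # Hypothetical container mirroring the knowledge sources listed above.
    lexicon: dict = field(default_factory=dict)
    grammar: object = None                                 # multimedia language
    discourse_model: list = field(default_factory=list)   # dialogue focus history
    user_model: dict = field(default_factory=dict)        # goals, expertise
    hci_knowledge: dict = field(default_factory=dict)     # output planning strategies
    world_knowledge: dict = field(default_factory=dict)
    domain_knowledge: dict = field(default_factory=dict)  # e.g., tactical air control

ks = KnowledgeSources()
ks.user_model["role"] = "mission planner"   # consulted by parsing and generation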

12.5 MULTIMEDIA LANGUAGE UNDERSTANDING


A user communicates with the system using natural language and gestures (pointing
via a mouse device). Typically, the user speaks to the system, but keyboard input is
just as acceptable. The use of pointing combined with natural language forms a very
efficient means of expressing a definite reference. This enables a person to use a
demonstrative pronoun as a determiner in a noun phrase and simultaneously point to
an entity on the graphics display to form a succinct reference. Thus, a person would be
able to say "this SAM" (surface-to-air missile system) and point to an object on the
display to disambiguate which of several SAM systems is meant.
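As a sketch of this disambiguation, the hypothetical Python fragment below combines
the type constraint from the noun phrase ("this SAM") with the pointing gesture; the
function, data layout, and object names are illustrative assumptions, not the
system's actual algorithm:

def resolve_reference(noun_phrase_type, pointed_object, candidates):
    # Combine a demonstrative NP (e.g., "this SAM") with a pointing
    # gesture to pick out one referent (hypothetical sketch).
    type_matches = [c for c in candidates if c["type"] == noun_phrase_type]
    if pointed_object in type_matches:
        return pointed_object              # point and noun phrase agree
    if len(type_matches) == 1:
        # Point was inconsistent (e.g., slipped onto a nearby icon):
        # fall back on the natural-language type constraint.
        return type_matches[0]
    # Otherwise prefer the type-consistent candidate nearest the point.
    return min(type_matches,
               key=lambda c: abs(c["x"] - pointed_object["x"]) +
                             abs(c["y"] - pointed_object["y"]))

sams = [{"type": "SAM", "id": "SAM-1", "x": 10, "y": 5},
        {"type": "SAM", "id": "SAM-2", "x": 40, "y": 22}]
runway = {"type": "runway", "id": "RW-1", "x": 41, "y": 23}
print(resolve_reference("SAM", runway, sams)["id"])   # -> SAM-2

The same fragment also illustrates the inverse case noted in Section 12.3.1: when
the point gesture is inconsistent with the noun phrase, the natural-language
constraint is used to infer the intended referent.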

12.6 INTELLIGENT AUTOMATED MULTIMEDIA OUTPUT GENERATION
As the volume and types of information and presentation types become more
numerous and varied, it will become increasingly important for computer systems to
make automated knowledge-based decisions regarding information presentation to
users. If we consider the status of the technology for composing and temporally and
spatially coordinating information presentations using just two media, natural
language and graphics, we find that the problem is difficult and the technology is just
in its infancy. As more media and modalities are added to the presentation suite, the
problems become more severe and the need for research becomes more significant.
Such intelligent automated multimedia information presentation has application in
many areas, including multimedia or hypermedia document presentation, information
retrieval from multimedia databases and knowledge bases, and explanation
subsystems of user help facilities.
Some of the subproblems of multimedia output generation include:
1. Selection of media and apportionment of information content among the media
used for information presentation.
2. Coordination of the media with respect to both space and time (a sketch of this
appears after the list).
3. Consistency of selection, composition, and generation across presentations.
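The second subproblem — temporal coordination — can be pictured with a tiny
scheduling sketch. The event tuples and coordinate function below are hypothetical
illustrations; they show the kind of plan in which an icon highlight is timed to
coincide with the spoken mention of an entity, as in the Coordinated Output
Generator example earlier:

# Each planned output event: (start_time_in_seconds, device, action).
plan = [
    (0.0, "speech", 'say: "The enemy airbases are highlighted in red."'),
    (0.9, "graphics", "highlight icon-airbase-7 in red"),  # timed to the mention
    (0.9, "graphics", "highlight icon-airbase-9 in red"),
]

def coordinate(plan):
    # Emit events in temporal order so that simultaneous actions on
    # different devices stay synchronized (hypothetical sketch).
    for start, device, action in sorted(plan, key=lambda event: event[0]):
        print("t=%.1fs [%s] %s" % (start, device, action))

coordinate(plan)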

12.7 KNOWLEDGE-BASED DEVELOPMENT TOOLS FOR MULTIMEDIA SYSTEMS
As more media become viable and available for computer systems, the need for more
sophisticated development tools becomes more critical. For example, high-quality
authoring systems for hypermedia or multimedia presentation/document authoring are
needed as well as development tools for multimedia human-computer interfaces.
Generic intelligent automated multimedia composition and generation technology,
mentioned above, could play an important role in alleviating some of the development
problems.
Hypermedia document systems: As mentioned above, the area of
hypermedia/multimedia document authoring and presentation systems requires better
system development tools and intelligent automated multimedia presentation
technology.
Another area in which hypermedia document technology could benefit is in the area of
"smart links," that is, hypertext/media links which have some decision-making ability
to lead the viewer/reader to appropriate information based on factors such as what the
system knows the viewer has already "read," the viewer's goals and objectives for
viewing the material, and the viewer's background and level of expertise in relevant
fields.
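As a sketch of such a "smart link", the hypothetical Python fragment below computes
a link destination from a simple reader model (expertise and reading history); the
class, the model fields, and the node names are assumptions for illustration:

class SmartLink:
    # A hypertext link that chooses its destination from a model of
    # the reader (hypothetical sketch of a "smart link").
    def __init__(self, intro_node, detail_node, expert_node):
        self.intro_node = intro_node
        self.detail_node = detail_node
        self.expert_node = expert_node

    def follow(self, reader):
        # Route by expertise, then by what the reader has already read.
        if reader["expertise"] == "expert":
            return self.expert_node
        if self.intro_node in reader["already_read"]:
            return self.detail_node
        return self.intro_node

link = SmartLink("atn-overview", "atn-details", "atn-formal-spec")
novice = {"expertise": "novice", "already_read": {"atn-overview"}}
print(link.follow(novice))   # -> atn-details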

Check Your Progress 2


1. What are Knowledge Sources for Multimedia Interaction?
…………………………………………………………………………….
…………………………………………………………………………….
2. What are Knowledge-based development tools for multimedia systems?
…………………………………………………………………………….
…………………………………………………………………………….

12.8 LET US SUM UP
Multimedia systems hold the promise of great benefits in terms of increased
productivity, efficiency, effectiveness, and information enjoyment. Multimedia
systems promise to provide the needed increase in the bandwidth of information
exchange between humans and computers, and to enhance human understanding of
complex information through better presentation technologies and appropriate
combinations of these technologies for information presentation. However, before
these promises can be fulfilled, there are many problems that need to be solved. Many
of these problems are in technology areas such as multimedia document authoring,
multimedia information and document storage and management, search techniques,
computer-supported collaborative work, and multimedia human-computer interaction.
The field of artificial intelligence will help provide solutions to these problems. This
lesson discusses the possible role of artificial intelligence in multimedia systems. A
concept for a multimedia system is presented that provides an integrated work
environment with a human-computer interface designed as an intelligent agent with
the ability to communicate and make presentations in coordinated multiple
media/modalities. The objective is to integrate the various subsystems and
functionality that a user needs in a workstation environment, to simplify operator
interaction with sophisticated computer systems, and to minimize the time and effort
spent by the user on manipulating the interface. This lesson also reviews some of the
current research being conducted in the area of artificial intelligence applied to
multimedia systems.

12.9 LESSON END ACTIVITY


Discuss in a group the arguments about, and the knowledge aspects of,
knowledge-based multimedia systems. Search the web and multimedia magazines to
explore the dimensions of knowledge-based multimedia systems.

12.10 KEYWORDS
CUBRICON: An intelligent system (CUBRC Intelligent CONversationalist)
Input Coordinator: The module which accepts input from the three input devices and
fuses the input streams into a single compound stream, maintaining the temporal order
of tokens in the original streams.
Knowledge Sources: Knowledge Sources are used for both understanding input to the
system and planning/generating output from the system.

12.11 QUESTIONS FOR DISCUSSION


1. Discuss the nature and scope of an Intelligent Multimedia System.
2. How can an Intelligent Multimedia System be designed?
3. What are the development tools for multimedia systems?

Check Your Progress: Model Answers


CYP 1
1. Multimedia systems hold the promise of great benefits such as increasing
people's productivity, efficiency and effectiveness, and increasing the
utility and enjoyment of our vast information resources. Multimedia
systems promise to provide the needed increase in the bandwidth of
information exchange between humans and computers, and to enhance
human understanding of complex information through better presentation
technologies and appropriate combinations of these technologies for
information presentation. The extent to which these promises are fulfilled
depends on continued improvement in hardware technology, development
of much needed supportive software technology, and the growth of a
community of trained multimedia authors and technologists. The scope of
today's multimedia systems is very limited, and the functionality of the
different types of systems is not integrated to form a productive
workplace. In fact, multimedia means different things to different people.
To some it means video for conferencing, to others it means hypermedia
documents, and to others it means multimedia human-computer dialogue.
2. Although great benefits are to be gained from multimedia systems, their
incorporation into the workplace, school, and home is not an easy task.
The requisite hardware is becoming widely available at reasonable cost,
but other problems remain to be solved. These problems lie primarily in
two areas – personnel and technology:
™ There is a lack of people trained in the development and management of
distributed databases and document repositories.
™ There is a lack of software designed to integrate, control, coordinate,
manage, and adapt the various media for human computer interfaces.
™ There is a lack of support software for facilitating the authoring,
composition, and production of multimedia documents.
™ There is a lack of support technology in the area of multimedia data and
document storage and manipulation.
™ There is a lack of search and pattern recognition capability for locating
information and/or documents that are of interest in multimedia storage
facilities.

CYP 2
1. The system includes several knowledge sources for use in multimedia
language understanding and production. These knowledge sources are a
lexicon and grammar; a discourse model; a user model; a knowledge base
of human-computer interaction knowledge, including output planning
strategies to govern the composition of multimedia responses to the user;
a knowledge base of information about generally shared world
knowledge; and a knowledge base of information about the application
task domain used in this research effort, namely, tactical air control.
2. As more media become viable and available for computer systems, the
need for more sophisticated development tools becomes more critical. For
example, high-quality authoring systems for hypermedia or multimedia
presentation/document authoring are needed as well as development tools
for multimedia human-computer interfaces. Generic intelligent automated
multimedia composition and generation technology, mentioned above,
could play an important role in alleviating some of the development
problems. Hypermedia document systems: As mentioned above, the area
of hypermedia/multimedia document authoring and presentation systems
requires better system development tools and intelligent automated
multimedia presentation technology. Another area in which hypermedia
document technology could benefit is in the area of "smart links," that is,
hypertext/media links which have some decision-making ability to lead
the viewer/reader to appropriate information based on factors such as
what the system knows the viewer has already "read," the viewer's goals
and objectives for viewing the material, and the viewer's background and
level of expertise in relevant fields.

12.12 SUGGESTED READINGS
Dhiraj Sharma, Foundations of IT, Excel Books, 2008.
Vaughan, Multimedia Making IT Work, Fifth Edition, Tata McGraw Hill.
John F. Koegel Bufford, Multimedia Systems, Pearson Education, 2003.
Judith Jeffloate, Multimedia in Practice (Technology and Applications), PHI, 2003.
Ze-Nian Li and Mark S. Drew, Fundamentals of Multimedia, Prentice-Hall, 2004.

MODEL QUESTION PAPER

MCA
Second Year
Sub: Multimedia and its Applications
Time: 3 hours Total Marks: 100
Directions: There are a total of eight questions, each carrying 20 marks. You
have to attempt any five questions.

1. Discuss the suitability of Macintosh and Windows production platforms.


2. Discuss and differentiate Authoring Tools with Programming Tools.
3. How can Symbols and Icons add value in Multimedia applications?
4. What are file formats for preservation of digital sound?
5. Give some software packages that are suitable as page layout generators.
6. What are different types of Animation Systems?
7. Why is it said that Video is a continuous stream of images? Discuss.
8. What are the requirements of multimedia?
