
MULTIMEDIA COMMUNICATION

Introduction, Multimedia information representation, Multimedia networks,


Multimedia applications, Application and Networking Terminology.
(Chap 1 Text 1)

1.1 INTRODUCTION:
 Multimedia Communication embraces a range of applications and networking
infrastructures.
 Multimedia is used to indicate that the information / data relating to an application
may be composed of a number of different types of media which are integrated
together.
 The different media types are text, images, speech, audio and video. Some example applications are video telephony (speech and video), multimedia electronic mail (text, images and audio, for example), interactive television (text, audio and video), electronic commerce (text, images, audio and video), web television (text, audio and video), and many others.
 There are a number of different types of networks that are used to provide the networking infrastructure.
 These include not only networks that were designed from the outset to provide
Multimedia Communication services – normally referred to as broadband
multiservice networks – but also networks that were designed initially to provide
just a single type of service and it is as a result of advances in various technologies
that these can now support a range of other (Multimedia)services.
 For example, public and private switched telephone networks were designed to provide a basic telephony service, but they are now used to carry the different media types.
 Similarly, computer networks such as the internet, which were designed initially to
provide general data communication services such as electronic mail and file
transfers, can now support a much richer set of Multimedia applications.
 In terms of the different types of media, Text and images are generated and
represented in a digital form.
 Speech, audio and video, however, are generated in the form of continuously varying signals – normally referred to as analog signals.
 Hence in order to integrate all of the different media types together, it is necessary to
first convert the various analog signals into a digital form.

 The integrated digital information stream can then be stored within a computer and transmitted over a network in a unified way.
 In addition, unlike Text and images which are created in the form of a single block of
digital information, since speech, audio and video are continuously varying signals,
the digitization process can produce large volumes of information which carries on
increasing with time.
 Hence in most multimedia applications, in order to reduce the volume of information to be transferred, a range of compression algorithms are applied to the different media types prior to integrating them together.
 In addition to the compression algorithms that have been used for many years with
Text and images, there is now available a wide range of algorithms for the
compression of speech, audio and video.
 However, because of the relatively low levels of compression that could initially be achieved, multimedia applications involving speech, audio and video – video telephony and video conferencing, for example – required a high-capacity transmission channel to transmit the integrated source information.
 The capacity of the transmission channel required has reduced to the point that most
types of communication network can now support a range of Multimedia
applications.
 In addition, it is as a result of the same advances in compression algorithms, coupled with the development of the associated integrated circuits, that most television broadcasts are now in a digital form.
 A major issue in relation to analog television has always been the high level of
transmission capacity that is required to broadcast the composite television signal
containing the integrated audio and video signals.
 The move to digital means that a transmission channel that was once used to broadcast a single (analog) television program can now be used to broadcast multiple (digital) programs, and additional services can use the same channels, so enabling multimedia applications such as interactive television and electronic commerce to be supported.

 In addition, with all applications that involve the use of a communication network, it is imperative that the two or more items of equipment that are attached to the network to provide the service operate and interpret the transmitted information in the same way.
 This can only be achieved by the adoption of international standards for all applications and for all of the different types of networks, and by their adoption by all the manufacturers of the related equipment.
 Multimedia communication includes a range of applications and networking
infrastructures.
 Definition1: The term "multimedia" is used to indicate that the information/data being
transferred over the network may be composed of one or more of the following media
types:

1. Text: Includes both unformatted text – comprising strings of characters from a limited character set – and formatted text – comprising strings as used for the structuring, access, and presentation of electronic documents.
2. Images: Includes Computer Generated Image - comprising lines, curves, and circles,
and Digitized Images of documents and pictures.
3. Audio: Includes both low-fidelity speech – as used in telephony – and high-fidelity audio – such as the stereophonic music used with compact discs.
4. Video: Includes short sequences of moving images (also known as video clips) and
complete movies/films
 Definition2: Multimedia is any combination of text, art, sound, animation, and video. It is
delivered to the user by electronic or digitally manipulated means. A multimedia project
development requires creative, technical, organizational, and business skills.
 Definition3: Multimedia is the presentation of a (usually interactive) computer
application, incorporating media elements such as text, graphics, video, animation and
sound on computer.
 Multimedia applications may involve either of the following:
o Person-to-Person communications or
o Person-to-System communications



 Person-to-Person communications: the two persons communicate using suitable Terminal Equipment (TE).
 Person-to-System communications:
o Person interacts with the system using a suitable digital device such as a workstation or multimedia personal computer (PC).
o These digital devices are located either in homes or offices.
o The system is basically a server containing a collection of files or documents – each comprising digitized text, images, audio, and video information, either singly or integrated together in some way; alternatively it may contain a library of digitized movies/videos.
o User interacts with the server by means of a suitable selection device connected to the Set-
top box (STB) associated with a television or modem used with the computers
 Networking infrastructure: provided using a number of different types of network
 Networks: Two types
o Networks designed initially to provide just a single type of service; due to advances in various technologies these networks can now provide a range of other services.
o a. Ex 1: PSTN (Public Switched Telephone Network) or GSTN (General Switched Telephone Network) – designed initially to provide the basic switched telephone service, but due to advances in digital signal processing hardware and associated software PSTNs/GSTNs now provide a range of more advanced services involving text, images, and video.
o b. Ex 2: Data networks – designed initially to support basic data applications (e-mail, file transfers, and others); they now support a much richer set of applications which involve images, audio, and video.
o Networks designed from the outset to provide multimedia communication services. Ex: ATM networks.

1.2 Multimedia Information Representation:


o Applications involving text and images - comprise blocks of digital data units.
o Text data - the typical unit is a block of characters, with each character represented by a fixed number of binary digits (bits), or codeword.
o Digitized image data - comprises a 2-D block of pixels (picture elements), with each pixel represented by a fixed number of bits.
o Applications involving text and images typically comprise a short request for a file followed by the file contents being returned; the duration of the overall transaction is relatively short.
o Applications involving Audio and Video Signals: Vary continuously with time as the
amplitude of the speech, audio, or video signal varies.
o Ex.: Typical telephone conversation can last for several minutes and Movie
(comprising audio and video) can last for a number of hours.
o Applications involving a single type of media: the basic form of representation of the particular media type is often used.
o Applications involving either text-and-images or audio-and-video: their basic form is often used, since the two media types in these applications have the same form of representation.
o Applications involving different media types integrated together in some way: it is necessary to represent all four media types in a digital form.
o For text and images: This (digital) is their standard form of representation.
o For audio and video: since their basic forms of representation are analog signals, these must be converted into a corresponding digital form before they can be integrated with the two other media types.
o Digitization of an audio signal: the amplitude of the signal varies continuously with time, and digitization produces a digital signal of relatively high bit rate, measured in bits per second (bps); for a speech signal a typical bit rate is 64 kbps.
o Applications involving audio can be of long duration, so this bit rate must be sustained for an equally long time period.
o Digitization of a video signal: the same applies as for audio signals, except that much higher bit rates and longer time durations are involved (a bit-rate sketch follows this list).
o In general, the communication networks that are used to support applications involving audio and video cannot support the very high bit rates that are required to represent these media types in a digital form; hence compression is applied.
o Compression: a technique that is first applied to the digitized signals in order to reduce the resulting bit rate to a level which can be supported by the various networks.
o Compression of text and images: used to reduce the time delay between a request being made for some information and the information becoming available on the screen of a computer or other device.
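The following sketch (not part of the text) shows how digitized bit rates such as the 64 kbps speech figure arise, and how large a compression ratio is needed to fit a source into a given channel; the CD-audio and 128 kbps channel figures are illustrative assumptions.

```python
# Minimal sketch: raw PCM bit rate and the compression ratio needed for a channel.

def pcm_bit_rate(sample_rate_hz, bits_per_sample, channels=1):
    """Raw (uncompressed) bit rate of a digitized signal in bits per second."""
    return sample_rate_hz * bits_per_sample * channels

def required_compression_ratio(source_bps, channel_bps):
    """Factor by which the source must be compressed to fit the channel."""
    return source_bps / channel_bps

speech = pcm_bit_rate(8_000, 8)                  # telephony speech -> 64 kbps
cd_audio = pcm_bit_rate(44_100, 16, channels=2)  # CD-quality stereo -> ~1.41 Mbps

print(f"Speech:   {speech / 1e3:.0f} kbps")
print(f"CD audio: {cd_audio / 1e6:.2f} Mbps")
# e.g. to carry CD-quality audio over an (assumed) 128 kbps ISDN channel:
print(f"Compression needed: {required_compression_ratio(cd_audio, 128_000):.1f}x")
```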

1.3 Multimedia Networks:


 Five basic types of communication networks are used to provide multimedia
communication services:
o 1. Telephone networks.
o 2. Data networks.
o 3. Broadcast television networks.
o 4. Integrated services digital networks.
o 5. Broadband multiservice networks.
o Networks 1, 2, and 3 were initially designed to provide just a single type of service, as listed below:
o 1. Telephone networks: telephony
o 2. Data networks: data communications
o 3. Broadcast television networks: broadcast television
o Technological developments enabled these networks to provide additional services.
o Networks 4 and 5: designed from the outset to provide multiple services.
1.3.1 Telephone networks:
Main components of the network are shown in the Fig below.
 Public Switched Telephone Networks (PSTNs) have been in existence for many years and have gone through many changes over time.
 Designed to provide a basic switched telephone service which, with the advent of the
other network types has become known as POTS (Plain Old Telephone Service).
 'Switched': the term is used to indicate that a subscriber can make a call to any other telephone that is connected to the total network.
 Initially such networks spanned just a single country; later, the telephone networks of different countries were interconnected so that they now provide an international switched service.

Local Exchange/End Office: Telephones located in the home or in a small business are
connected directly to their nearest LEs/Eos.
Private Branch Exchange (PBX):
 Telephones located in the medium or large office/site are connected to a PBX or
Private switching Office.
 Provides a (free) switched service between any two telephones that are connected to it.
 Connected to its nearest (public) LE, which enables the telephones that are connected to the PBX also to make calls through the PSTN.

Cellular Phone Networks: These have been introduced to provide a similar service to mobile subscribers by means of handsets that are linked to the cellular phone network infrastructure by radio.

MSC (Mobile Switching Center): the switch used in the cellular phone network. Like a PBX, it is also connected to a switching office in a PSTN, which enables both sets of subscribers to make calls to one another.

IGE (International Gateway Exchange): routes and switches international calls.

General scheme of MODEM is shown in the Fig below


 Speech signal: an analog signal that varies continuously with time according to the amplitude and frequency variations of the sound resulting from the speech.
 Microphone: used to convert this into an analog electrical signal.
 Telephone networks operate in circuit mode, which means that for each call a separate circuit of the necessary capacity is set up through the network for the duration of the call.
 Access circuits: the circuits that link the telephone handsets to a PSTN or PBX; they were designed to carry the two-way analog signals associated with a call.
 Within the PSTN, however, all the switches and the transmission circuits that interconnect them operate in digital mode; hence, to carry a digital signal – a stream of binary 1s and 0s – over the analog access circuits requires a device called a modem.
Modem:
o At the sending side: the modem converts the digital signal output by the source digital device into an analog signal which is compatible with a normal speech signal, so that it is routed through the network in the same way as a speech signal.
o At the receiving side: the modem converts the analog signal back again into its digital form before relaying it to the destination digital device.
o Modems have the necessary circuits to set up and terminate a call.
o Using a pair of modems, one at each subscriber access point, a PSTN can also be used to provide a switched digital service.
o Early modems supported only a very low bit rate service of 300 bps.
o Modems now support bit rates of up to 56 kbps as a result of advances in digital signal processing circuits; this is sufficient to support various applications comprising text and images integrated together, and also services that comprise speech and low-resolution video.
o Modems are now also available for use with the same access circuits that provide a high bit rate channel in addition to the speech channel used for telephony.
o Typically the bit rate of this second channel is such that it can support high-resolution audio and video; hence such modems are used to provide access to servers that support a range of entertainment-related applications.
o The figure below shows the general scheme; such applications need bit rates in excess of 1.5 Mbps.

 As a result of these technological advances in modems, PSTNs can now support not only speech applications but also a wide range of other multimedia communication applications (transfer times at the different modem bit rates are sketched below).
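As a rough illustration (the 200 kB image size is an assumed value) of why these modem advances matter, the sketch below compares the time to transfer the same block of data at the bit rates mentioned above.

```python
# Time to transfer a block of data is simply its size divided by the channel bit rate.

def transfer_time_seconds(size_bytes, bit_rate_bps):
    return size_bytes * 8 / bit_rate_bps

image_bytes = 200_000  # hypothetical 200 kB compressed image

for label, rate in [("300 bps early modem", 300),
                    ("56 kbps modem", 56_000),
                    ("1.5 Mbps channel", 1_500_000)]:
    print(f"{label:>20}: {transfer_time_seconds(image_bytes, rate):8.1f} s")
```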

1.3.2 Data networks:
 Designed to provide basic data communication services such as e-mail and general file transfers.
 User equipment connected to data networks: computers such as a PC, a workstation, or an e-mail/file server.
 Two widely deployed types of data networks: 1. X.25 network and 2. Internet.
 X.25 network: operational mode is restricted to relatively low bit rate data applications.
Hence, unsuitable for most multimedia applications.
 Internet: Made up of a vast collection of interconnected networks all of which operate
using the same set of communication protocols.
 Communication protocol: an agreed set of rules that are adhered to by all communicating
parties for the exchange of information.
 Rules define the sequence of messages that are exchanged between the communication
parties and the syntax of these messages.
 By using the same set of communication protocols, all the computers that are connected to the Internet can communicate freely with each other irrespective of their type or manufacturer; this is the origin of the term "open systems interconnection".
 Figure below shows a selection of the different types of interconnected
network
 A user at home or in a small business gains access to the Internet through an intermediate ISP (Internet Service Provider) network. Normally this type of user wants access to the Internet only intermittently; the user devices are connected to the ISP network either through a PSTN with modems or through an ISDN (Integrated Services Digital Network), which provides access at a higher bit rate.
 A business user obtains access through a site/campus network if the business comprises only a single site, or through an enterprise-wide private network if it comprises multiple sites.
 In the case of colleges, universities, and businesses occupying a single site/campus, the network is known as a (private) LAN (Local Area Network). In the case of multiple sites interconnected using an inter-site backbone network to provide a set of enterprise-wide communication services, the network is known as an enterprise-wide private network, providing the communication protocols used by all the computers connected to the network are the same as those defined for use with the Internet.
 Enterprise network (Intranet): all internal services are provided by using the same set of
communication protocols, as those defined for the Internet.
 IBN (Internet Backbone Network): different types of network are all connected to it
through an interworking unit called gateways.
 Gateway: an interworking unit that connects the IBN and the different types of network; it is responsible for routing and relaying all messages to and from the connected network and hence is also called a router.
 Packet mode: all data networks operate in this mode.
 Packet: a container for a block of data; it has a header which contains the address of the intended recipient computer and which is used to route the packet through the network.
 This mode of operation is chosen since the format of the data associated with data applications is normally in the form of discrete blocks of text or binary data with varying time intervals between each block.
 Multimedia PCs: have become available that support a range of other applications.
 Ex.: with the addition of a microphone and a pair of speakers, together with a sound card and associated software to digitize the speech, PCs can now be used to support telephony and other speech-related applications; with the addition of a video camera and associated hardware and software, a range of other applications involving video can be supported.
 In addition, higher bit rate transmission circuits and routing nodes have become available, together with more efficient algorithms to represent speech, audio, and video in a digital form.
 As a result, packet-mode networks, and the Internet in particular, now support not only general data communication applications but also a range of other multimedia communication applications involving speech, audio, and video.

1.3.3 Broadcast television networks:
 Designed to support the diffusion of analog television and radio programs throughout wide geographical areas.
 A cable distribution network is used as the broadcast medium, normally in a large town or city.
 Satellite and terrestrial broadcast networks: used as the broadcast medium for larger areas; digital television services have become available.
 A low bit rate return channel for interaction purposes, together with the digital television services, provides a range of additional services (such as games, home shopping, etc.).
 Figure below shows the general architecture of a cable distribution network and
a satellite/terrestrial broadcast network.
 General architecture of a cable distribution network: consists of a set-top box (STB) attached to the cable distribution network, which provides:
1. Control of the television channels that are received.
2. Access to other services.

 Ex.: Cable modem: integrated into the STB provides a low bit rate channel and a
high bit rate channel from the subscriber back to the cable head end.
 Low bit rate channel: used to connect the subscriber to a PSTN.

 High bit rate channel: used to connect the subscriber to the Internet.
 Cable distribution networks therefore provide not only the basic broadcast radio and television services but also access to the range of multimedia communication services that are available with both the PSTN and the Internet.

 Satellite and terrestrial broadcast networks: a modem integrated into the STB provides the subscriber with an interaction channel, so enhancing the range of services; this is the origin of the term "interactive television".

 Figure below shows the general architecture of the satellite and terrestrial
broadcast networks

1.3.4 Integrated services digital networks (ISDN):
 Started to be deployed in the early 1980s.
 Originally designed to provide PSTN users with the capability of having additional services.
 Achieved by converting the access circuits that connect user equipment to the
network (Ex.: telephone network) into an all digital form.
 Providing two separate communication channels over these circuits allows users either to have two different telephone calls in progress simultaneously or two different calls such as a telephone call and a data call.
 Access circuit with ISDN: known as DSL (Digital Subscriber Line).
 Subscriber telephone: can be either a digital phone or a conventional analog one.
 Case of digital phone: electronics that are needed to convert the analog voice and call
setup signals into a digital form are integrated into the phone handset.
 Case of analog phone: electronics that are needed to convert the analog voice and call
setup signals into a digital form are located in the network termination equipment making
the digital mode of operation of the network transparent to the subscriber phone.
 Digitization of a telephone-quality analog speech signal produces a constant bit rate binary stream, normally referred to as a bitstream, of 64 kbps.
 BRI (Basic Rate Access or Basic DSL of ISDN): support two 64kbps channels which
can be used either independently (as they were intended) or as a single combined 128kbps
channel.
 Design of ISDN: the two channels were intended for two different calls and require two separate circuits to be set up through the switching network independently; hence, to synchronize the two separate 64 kbps bitstreams into a single 128 kbps stream requires an additional box of electronics to perform the aggregation function.
 PRI (Primary Rate Access): single higher bit rate channel of either 1.5 or 2 Mbps is
used.
 A more flexible way of obtaining a switched 128 kbps service has been introduced by many network operators: the service provided has been enhanced so that a single switched channel now supports p × 64 kbps, where p = 1, 2, 3, ... 30 (see the sketch after this list).

 An ISDN can therefore support a range of multimedia applications; however, due to the relatively high cost of digitizing the access circuits, the cost of the services associated with an ISDN is higher than that of the equivalent service provided by a PSTN.
 Figure below shows the summarization of the various services provided.
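A minimal sketch of the channel rates discussed above; the function simply evaluates p × 64 kbps for the switched service, so the 128 kbps combined channel and the roughly 2 Mbps primary rate fall out as special cases.

```python
# ISDN bearer channels: p x 64 kbps, with p between 1 and 30.
BEARER_KBPS = 64

def isdn_channel_kbps(p):
    """Aggregate bit rate of a switched p x 64 kbps channel."""
    if not 1 <= p <= 30:
        raise ValueError("p must be between 1 and 30")
    return p * BEARER_KBPS

print(isdn_channel_kbps(2))    # 128 kbps (two basic-rate channels combined)
print(isdn_channel_kbps(30))   # 1920 kbps, i.e. about 2 Mbps primary rate
```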
1.3.5 Broadband multiservice networks:
 Designed in the mid-1980s for use as public switched networks to support a wide range of multimedia communication applications.
 Broadband: term used to indicate that the circuits associated with a call could have bit rates in excess of the maximum bit rate of 2 Mbps (30 × 64 kbps) provided by an ISDN.
 B-ISDN (Broadband Integrated Services Digital Network): an alternative name for broadband multiservice networks, since they were designed to be an enhanced ISDN.
 N-ISDN (Narrowband Integrated Services Digital Network): an alternative name for ISDN.

 When B-ISDN was first proposed, the technology associated with the digitization of video signals meant that, in general, an ISDN could not support services that included video.
 Due to considerable advances in the field of compression, an ISDN can now support multimedia communication applications that include video, and so can the other three types of network; the combined effect has been to slow down considerably the deployment of B-ISDN.

 A number of the basic design features associated with B-ISDN have been used as the basis of other broadband multiservice networks.
 A multiservice network implies that the network must support multiple services.
 Different multimedia applications require different bit rates, the rate being determined by the types of media that are involved; hence the switching and transmission methods used within these networks must be more flexible than those used in networks such as a PSTN or ISDN, which were initially designed to provide a single type of service.
 To have this flexibility:
1. All the different media types associated with a particular application are
first converted in the source equipment into a digital form.
2. These are then integrated together.
3. The resulting binary stream is divided into multiple fixed-size packets called cells (a segmentation sketch is given below).
 Information streams of this type provide a more flexible way of both transmitting and switching the multimedia information associated with the different types of application.
 Ex.: in transmission terms, cells relating to the different applications can be integrated together more flexibly.
 The use of fixed-size cells means that the switching of cells can be carried out much faster than if variable-length packets were used.
 Different multimedia applications generate cell streams of different rates; with this mode of operation the rate of transfer of cells through the network also varies, hence the name ATM (Asynchronous Transfer Mode). ATM networks (broadband multiservice networks) are also known as cell-switching networks.
 Ex.: ATM LANs - span a single site, ATM MANs – span large town or city.
 Ex.: For broadband multiservice network is shown in the Figure below

 It is being used as a high-speed backbone network to interconnect a number of LANs distributed around a large town or city.
 Note: two of the LANs are ATM LANs and the other two are simply higher-speed versions of older data-only LANs; this is typical of ATM networks, which must often interwork with older (legacy) networks.
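The sketch below illustrates the cell-segmentation step described above. The 48-byte payload size matches the value used by ATM in practice, but here it is just a parameter, and the byte stream is made up.

```python
# Divide an integrated binary stream into fixed-size cell payloads.

def segment_into_cells(stream: bytes, payload_size: int = 48) -> list[bytes]:
    """Split a byte stream into fixed-size cell payloads, padding the last one."""
    cells = []
    for i in range(0, len(stream), payload_size):
        payload = stream[i:i + payload_size]
        payload = payload.ljust(payload_size, b"\x00")   # pad the final cell
        cells.append(payload)
    return cells

data = bytes(range(130))                      # an arbitrary 130-byte stream
cells = segment_into_cells(data)
print(len(cells), [len(c) for c in cells])    # 3 cells of 48 bytes each
```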

1.4 Multimedia Applications:
 There are many and varied applications involving multiple media types.
 Major categories of multimedia applications:
1. Interpersonal communications.
2. Interactive applications over the Internet.
3. Entertainment applications.
 In many instances the networks used to support these applications were initially designed to provide a service involving just a single type of medium; advances in technology have made it possible to support multimedia applications in addition, and in some cases the basic applications themselves are now available in a more enhanced form.
1.4.1 Interpersonal communications:
 Interpersonal communications may involve speech, image, text, or video.
 They may involve a single media type or two or more media types integrated together:
1. Speech only
2. Image only
3. Text Only
4. Text and Images
5. Speech and Video
6. Multimedia

Speech only:
 Traditionally, this involves speech – that is, telephony.
 Service is provided using telephones which are connected either to PSTN/ISDN/Cellular
network or PBX.
 If a multimedia PC with a microphone and speakers is used, the user can make telephone calls through the PC.
 This requires a telephone interface card and associated software, an arrangement called CTI (Computer Telephony Integration).
 The advantages of using PC, instead of conventional telephone for calls are:
1. User can create his or her own private directory of numbers and initiate a call simply by
selecting the desired number on the PC screen.
2. The circuit's bandwidth can be greater (providing the access circuits to the network have sufficient capacity).
3. Integration of telephony with all the other networked services is possible with a PC.
4. In addition to Telephony many public and private networks support additional services.
 Ex.: Voice-mail and Teleconferencing
 Voice-mail: used when the called party is unavailable; a spoken message is then left in the voice mailbox of the called party. A voice-mail server, located at a central repository, holds the voice mailboxes; a message can be read by the owner of the mailbox the next time he or she contacts the server.
 Teleconferencing: calls involve multiple interconnected telephones/PCs, and each person can hear and talk to all of the others involved in the call. This is called a conference call – a teleconference call, since it involves a telephone network, or an audio conference call – and requires an audio bridge, a central unit which supports setting up a conference call automatically.
 The Internet is also used to support telephony. Since the Internet was designed initially to support computer-to-computer communications, at first just (multimedia) PC-to-PC telephony was supported; subsequently this was extended so that a standard telephone could also be used.
 Figure below shows the general scheme
 PC-to-PC telephone call: the standard addresses that are used to identify individual computers connected to the Internet are used in the same way as for a data transfer application.
 The Internet operates in packet mode, so both PCs must have the necessary hardware and software to convert the speech signal from the microphone into packets on input, and back again prior to output to the speakers (a packetization sketch is given after this list).
 Thus telephony over the Internet is known as packet voice or, since the network protocol associated with the Internet is called the Internet Protocol (IP), Voice over IP (VoIP).
 Telephony gateway: an interworking unit used to connect a PC connected to the Internet and a telephone connected to the PSTN/ISDN, since the latter operates in circuit mode. The PC user sends a request to make a telephone call to a preallocated telephony gateway using the latter's Internet address. The gateway requests from the source PC the telephone number of the called party, assuming the user is registered for this service. On receipt of this, the source gateway initiates a session (call) with the telephony gateway nearest to the called party using the Internet address of that gateway. The called gateway then initiates a call to the recipient telephone using its telephone number and the standard call procedures of the PSTN/ISDN.
 Assuming the called party answers, the called gateway signals back to the PC user through the source gateway that the call can commence. A similar procedure is followed to clear the call on completion.
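The sketch below (assumed frame duration and destination address) illustrates the packetization step mentioned above: the 64 kbps digitized speech stream is cut into short fixed-duration frames, each carried in one packet whose header identifies the destination.

```python
# Packetize a digitized speech stream for transfer over a packet-mode network.
SAMPLE_RATE = 8_000      # samples per second (telephony speech)
BYTES_PER_SAMPLE = 1     # 8-bit samples -> 64 kbps
FRAME_MS = 20            # speech carried per packet (an assumed, common choice)

def speech_to_packets(pcm: bytes, dest_addr: str) -> list[dict]:
    frame_bytes = SAMPLE_RATE * BYTES_PER_SAMPLE * FRAME_MS // 1000  # 160 bytes
    packets = []
    for seq, i in enumerate(range(0, len(pcm), frame_bytes)):
        packets.append({"dest": dest_addr, "seq": seq,
                        "payload": pcm[i:i + frame_bytes]})
    return packets

one_second_of_speech = bytes(SAMPLE_RATE * BYTES_PER_SAMPLE)
pkts = speech_to_packets(one_second_of_speech, "203.0.113.7")  # made-up address
print(len(pkts), "packets per second")   # 50 packets/s at 20 ms per packet
```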

Image only:
 Exchange of electronic images of documents is an alternative form of interpersonal communication over a PSTN/ISDN, known as facsimile (or simply fax).
 Communication involves the use of a pair of fax machines, one at each network termination point.
 Document sending: the caller keys in the telephone number of the intended recipient and a circuit is set up through the network in the same way as for a telephone call. The two fax machines communicate with each other to establish operational parameters, after which the sending machine starts to scan and digitize each page of the document in turn. Both fax machines have an integral modem within them; as each page is scanned, its digitized image is simultaneously transmitted over the network, and as it is received at the called side a printed version of the document is produced. After the last page of the document has been sent and received, the connection through the network is cleared by the calling machine in the normal way.
 PC fax: PC can be used instead of the normal fax machine to send an electronic version of
document stored directly within the PCs memory. Digital image of each page of the
document is sent in the same way as the scanned image produced by a conventional fax
machine.
 Figure: Facsimile – image-only interpersonal communication.

 As with telephony, this requires a telephone (fax) interface card and associated software; the latter operates in the same way as a fax machine, so the terminal at the called side can be either a fax machine or another similar PC.

 It is also possible to send the digitized document (by using a LAN interface card and associated software) over other network types such as an enterprise network; this mode of operation is particularly useful when working with paper-based documents such as invoices.

Text only:
Ex.: E-mail (electronic mail).

 The user terminal is normally a PC or a workstation. A user at home gains access to the Internet through the PSTN/ISDN and an intermediate ISP network.
 Business users obtain access either through an enterprise network or a site/campus network.
 E-mail servers: one or more are associated with each network; collectively they contain a mailbox for each user connected to that network. A user can both create and deposit mail in his/her mailbox and read mail from it.
 A standard Internet communication protocol is used by the e-mail servers and internetwork gateways.
 Figure below shows various operational scenarios

Figure below shows the format of a text-only e-mail message.

 At the head: the unique Internet-wide names of both the sender and the recipient of the mail. In addition, a copy of the mail can be sent to multiple recipients, each of whom is listed in the cc part of the mail header; 'cc' is an acronym for carbon copy, the original means of making (paper) copies of documents. The content of a text-only mail comprises unformatted text – typically strings of ASCII characters (a sketch of such a message follows below).
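A minimal sketch of such a text-only message, built with Python's standard email library; the addresses are made up.

```python
# Construct a text-only e-mail message: sender, recipient, cc list, plain text body.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.org"
msg["Cc"] = "carol@example.org, dave@example.net"   # copies to other recipients
msg["Subject"] = "Text-only e-mail"
msg.set_content("This body is plain unformatted ASCII text.")

print(msg.as_string())   # header fields followed by the text content
```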
Text and images:
 CSCW (Computer-Supported Cooperative Working) application: involves text and images integrated together.
 Network used: enterprise network/LAN/Internet.
 Figure below. shows the general scheme

 Typically a distributed group of people, each at their own place of work, are all working on the same project.
 User terminal is either a PC or a workstation.

 Shared whiteboard: a window on each person's display is used as the shared workspace; the display comprises integrated text and images.
 The associated software comprises the whiteboard program – a central program – and a linked set of support programs, one in each PC/workstation.
 Each support program is made up of a change-notification part and an update-control part.
 Change-notification part: sends details of the changes in the whiteboard program whenever a member of the group updates the contents of the whiteboard.
 Update-control part: present in each of the other PCs/workstations; it obtains the above change information and, in turn, proceeds to update the contents of its copy of the whiteboard (a sketch of this idea follows below).
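The sketch below is not the actual whiteboard software described above, only an illustration of the change-notification / update-control idea: a change made by one member is relayed to, and applied by, every other member's local copy.

```python
# Change-notification / update-control sketch for a shared whiteboard.

class WhiteboardMember:
    def __init__(self, name, group):
        self.name = name
        self.contents = []           # local copy of the shared whiteboard
        self.group = group
        group.append(self)

    def make_change(self, item):
        self.contents.append(item)            # update own copy
        for member in self.group:             # change-notification part
            if member is not self:
                member.apply_change(item)     # update-control part of the others

    def apply_change(self, item):
        self.contents.append(item)

group = []
a, b, c = (WhiteboardMember(n, group) for n in ("A", "B", "C"))
a.make_change("circle at (10, 20)")
print(b.contents, c.contents)   # both copies now contain the change
```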
Speech and video:
 Ex.: video telephony uses integrated speech and video, and is now supported by all the network types.
 Figure below shows the general scheme
 In the case of home use: the terminals used are normally dedicated to providing the videophone service.
 In the case of office use: a single multimedia PC/workstation is used to provide the videophone service together with a range of other services.
 In both cases: a video camera, microphone, and speakers are used by the terminals/PCs.
 A dedicated terminal uses a separate screen for the display, whereas a multimedia PC or workstation uses a window on the PC/workstation screen to display the moving image of the called party.
 Network must provide two-way communication channel between 2 parties of sufficient
BW to support the integrated speech-and-video generated by each terminal/PC.
 Integration of video and speech: the bandwidth of the access circuits required to support this is higher than that required for speech only.
 Desktop videoconferencing: as with telephony, a call may involve not just two persons/terminals/PCs but several people, each located in their own office.
 It is used widely in large corporations with multiple geographically distributed sites to minimize travel between the various locations. Large corporations of this type have an enterprise-wide network to link the sites together, with an MCU (Multipoint Control Unit) – a central unit – to support the videoconferencing; in some cases a videoconferencing server associated with the network is used instead.
 The figure above shows that, ideally, a separate window on the screen of each participant's PC/workstation would be used to display the video image of all the other participants.
 To implement this would require multiple integrated speech-and-video communication channels, one from each participant being sent to each of the other participants; this requires more bandwidth than is normally available.
 Instead, the integrated speech-and-video information stream from each participant is sent to the MCU, which then selects just a single information stream to send to each participant. Ex.: a voice-activated MCU.

 Whenever the MCU detects a participant speaking, it relays the information stream from that participant to all the other participants; hence only a single two-way communication channel is needed between each location and the MCU, thereby reducing the communication bandwidth needed considerably (a sketch of this selection step follows below).
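A minimal sketch of the voice-activated selection just described; here the MCU simply treats the participant with the highest speech level as the current speaker and relays that stream to everyone else. The names and levels are made up.

```python
# Voice-activated MCU: pick the active speaker and relay only that stream.

def select_and_relay(streams: dict[str, dict]) -> dict[str, str]:
    """streams maps participant -> {'level': speech level, 'media': data}.
    Returns which participant's stream each other participant should receive."""
    speaker = max(streams, key=lambda p: streams[p]["level"])
    return {p: speaker for p in streams if p != speaker}

streams = {"site_A": {"level": 0.10, "media": "..."},
           "site_B": {"level": 0.72, "media": "..."},   # currently speaking
           "site_C": {"level": 0.05, "media": "..."}}
print(select_and_relay(streams))   # {'site_A': 'site_B', 'site_C': 'site_B'}
```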

 Some networks, such as LANs and the Internet, support multicasting, whereby all transmissions from any of the PCs/workstations belonging to a predefined multicast group are received by all the other members of the group; with such networks it is possible to hold a conferencing session without an MCU.
 The figure below shows the principle; this is only feasible when a limited number of participants are involved, owing to the high load it places on the network.
 In Figure below a person at one location is communicating with a group of people at
another location.
 Ex.: transmission of a live lecture or seminar. Typically, the information stream transferred from the lecturer to the remote class would be integrated speech-and-video together with electronic copies of transparencies and other documents used in the lecture; in the reverse direction the information may comprise just speech (for questions) or integrated speech-and-video, to enable the lecturer to both see and hear the members of the class at the remote location.
 In terms of communication requirements, these are similar to those for a two-party videophone call.
 If the lecture is relayed to multiple locations, either a separate communication channel is required to each remote site or an MCU is used at the lecturer's site.
 Because of the relatively high bandwidth involved, the network is either an ISDN (supporting multiple 64 kbps channels) or a broadband multiservice network if one is available.
 In the figure above there is a group of people at each location; this arrangement has been in use for many years and was the first example of videoconferencing.
 Videoconferencing studios: Specially equipped rooms are used – which contain all
the necessary audio and video equipment, comprising of one or more video cameras, a
large- screen display, and associated audio equipment, all of which are connected to a
unit called videoconferencing system.
 A conference can involve just two locations or, more usually, multiple locations; in the latter case an MCU is normally used to minimize the bandwidth demands on the access circuits to the network. As shown in the figure, the MCU acts as the central facility within the network and hence only a single two-way communications channel is required for each access circuit of the network; an example of this type of arrangement is a conference provided by a telecommunications provider.
 If a private network is used instead, the MCU is normally located at one of the sites; the communication requirements of that site are then more demanding since it must support multiple input channels, one for each of the other sites, and a single output channel, the stream from which must be broadcast to all of the other sites.

Multimedia:

 In the earlier discussion it was assumed that the information content of each e-mail message consisted of text only.
 In addition, mail containing other media types such as images, audio, and video is also used – for example voice-mail, video-mail, and multimedia mail.
 Voice-mail: similar in principle to the voice-mail discussed earlier for telephone networks. With Internet-based voice-mail there is a voice-mail server associated with each network, in addition to the e-mail server.
 Figure below Shows this
 The user first enters the voice message addressed to the intended recipient; the local voice-mail server then relays this to the server associated with the intended recipient's network, and the stored voice message is played out the next time the recipient accesses his or her voice mailbox.
 Same mode of operation is used for video-mail except, the mail message comprises an
integrated speech-and video sequence.
 Multimedia mail: An extension of text-only mail in as much as the basic content of the
mail comprises textual information.
 Textual information is annotated with a digitized image, a speech message, or a video
message, as in Figure.
 In the case of speech and video annotations, these can either be sent directly to the mailbox of the intended recipient together with the original textual message – and hence stored and played out in the normal way – or they may have to be requested specifically by the recipient when the textual message is being read.

 The recipient can always receive the basic text-only message, but the multimedia annotations can be received only if the terminal being used by the recipient supports voice and/or video.

1.4.2 Interactive applications over the Internet:
 The Internet is used to support a range of interactive applications in addition to interpersonal communication applications.

 Ex.: the WWW (World Wide Web), or simply the Web, comprises a linked set of multimedia information servers that are geographically distributed around the Internet.
 The total information stored on all the servers is equivalent to a vast library of documents.

 Figure below shows the general principle


 Each document comprises a linked set of pages and linkages between the pages are
known as hyperlinks.
 Hyperlinks are pointers – also known as references – to other pages of the same document or to any other document within the total Web; so a reader of the document has the option, at well-defined points throughout the pages that make up a document, to jump either to a different page of the same document or to a different document, and also to return subsequently to a specific point on a page at a later time.
 Optional linkage points within documents are defined by the creator of the document and are
known as anchors for which the necessary linkage information is attached.
 Documents comprising only text are created using hypertext.
 Documents comprising multimedia information are created using hypermedia.
 Figure below shows the general structure of this type of document.

 There is no central authority for the introduction of new documents into the Web: anyone can create a new document – providing the server has been allocated an Internet address – and make hyperlink references from it to any other document on the Web.
 URL (Uniform Resource Locator): a document's unique address, which identifies both the location of the server on the Internet where the first page of the document is stored and the file reference on the server.
 Home page: the first page of the document. All the hyperlinks on this and other pages have similar URLs associated with them; the physical location of a page is transparent to the user and in theory a page can be located anywhere on the Web.
 A standard format known as HTML (HyperText Markup Language) is used for writing documents; client software is used to explore the total contents of the Web, i.e. the contents of the linked information on all the Web servers.
 A browser is this client function, and there are a number of user-friendly browsers available to explore the various servers and to open up a dialog with a particular server at the click of the mouse. Once the desired document has been located, the user simply clicks on an anchor point within a page of the document to activate the linkage information stored at that point; it is possible to return to the previous anchor at any time.
 With a hypertext document, an anchor is usually an underlined word or phrase.
 Ex.: with a hypermedia document, an anchor may be an icon such as a loudspeaker for a sound annotation or a video camera for a video clip.
 In some applications the client simply wishes to browse through the information stored at a particular site – for example sales literature, product information, application notes, periodicals, newspapers, and so on. In general there is no charge for accessing this information; however, access to books, journals, and similar documents may be by subscription only.
 Teleshopping (home shopping) / telebanking (home banking) applications: a client may wish not only to browse through the information at a site but also to initiate a transaction. The server must provide additional transaction processing support for, say, ordering and purchasing; since this will often involve a financial transaction, more rigorous security procedures are required for access and authentication purposes.

1.4.3 Entertainment applications:
 Entertainment applications can be of two types:
1. Movie/video-on-demand
2. Interactive television

 The video and audio associated with entertainment applications must be of a much higher quality/resolution, since wide-screen televisions and stereophonic sound are often used.
 A digitized movie/video with sound requires a minimum channel bit rate (bandwidth) of 1.5 Mbps. Hence the network used to support this type of application must be either a PSTN with a high bit rate modem or a cable network.
 For PSTN: high bit rate channel provided by the modem used only over the
access circuit and provides additional services to the other switched services that
the PSTN supports.
 Figure below shows the general operating scheme in both cases.

 Information stored on the server: a collection of digitized movies/videos. Normally the subscriber terminal comprises a conventional television with a device for interaction purposes. User interactions are relayed to the server through a set-top box, which also contains the high bit rate modem.
 MOD (Movie-On-Demand) / VOD (Video-On-Demand): from a suitable menu the subscriber is able to browse through the set of movies/videos available and initiate the showing of a selected movie. The subscriber can control the showing of the movie using similar controls to those on a conventional VCR (Video Cassette Recorder), i.e. pause, fast-forward, and so on.
 Key feature of MOD: a subscriber can initiate showing of a movie selected from a large
library of movies at any time of the day or night.
 As the figure below shows, the server must be capable of playing out simultaneously a large number of video streams, equal to the number of subscribers currently watching a movie. This requires the information flow from the server to be extremely high, since it must support not just the transmission of a possibly large number of different movies but also multiple copies of each movie; this is very challenging and costly, since the cost of the server is directly related to the aggregate information flow rate from it (a rough calculation is sketched below).
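A rough calculation (the subscriber counts are assumed values) of the aggregate output rate such a server must sustain, using the 1.5 Mbps per stream figure quoted earlier.

```python
# Aggregate server output = number of concurrent viewers x per-stream bit rate.
STREAM_MBPS = 1.5

def server_output_gbps(active_subscribers: int) -> float:
    return active_subscribers * STREAM_MBPS / 1000

for subscribers in (100, 1_000, 10_000):
    print(f"{subscribers:>6} viewers -> {server_output_gbps(subscribers):6.2f} Gbps")
```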

 If the server is supporting a large number of subscribers, it is common for several subscribers to request the same movie within a relatively short time interval of each other.
 There is an alternative mode in which requests for a particular movie are not played out immediately but instead are queued until the start of the next playout time of that movie, as shown in the figure below.
 N-MOD (Near Movie-On-Demand): in this mode of operation, all requests for the same movie which are made during the period up to the next playout time are satisfied simultaneously by the server outputting a single video stream; clearly, the viewer is unable to control the playout of the movie.
 Similar applications are used in a business environment, except that the stored information on the server is typically training and general educational material, company news, and so on; thus the number of stored videos is normally much less, as is the number of simultaneous users, so the video servers required are less sophisticated than those used in public MOD/N-MOD systems.
 The stored video streams/programs are often in a format such as that used with CD-ROMs, since the received video stream can then be displayed directly on the screen of a multimedia PC or workstation.
 The communication requirements of the private networks are the same as those identified for use with public networks.

Interactive television:
 Broadcast television networks: these include cable, satellite, and terrestrial networks.
 The basic service of these networks is the diffusion of both analog and digital television (and radio) programs.
 STB (Set-Top Box): associated with these networks has a modem within it.
 For cable networks as in Figure below. , STB provides both a low bit rate connection to
the PSTN and a high bit rate connection to the Internet.
 By connecting appropriate TE to the STB – a keyboard, telephone, and so on – the subscriber is able to gain access to all the services provided through the PSTN and the Internet. Through the connection to the PSTN the subscriber is able to actively respond to the information being broadcast; this is the origin of the term interactive television.
 Typical uses of the return channel are for voting, participation in games, home shopping,
and so on.
 As shown in the figure, a similar set of services is available through satellite and terrestrial broadcast networks, except that the STB associated with these networks requires a high-speed modem to provide the connections to the PSTN and the Internet.

1.5 Application and Networking Terminology:
 Figure below shows a selection of the terms used with multimedia.
 The information flow associated with the different applications can be either continuous or block mode.

Continuous media:
1. The information stream is generated by the source continuously in a time-dependent way.
2. Continuous media are passed directly to the destination as they are generated, and at the destination the information stream is played out directly as it is received; this mode of operation is called streaming.
3. Continuous media generated in a time-dependent way are also called real-time media.
4. With continuous media, the bit rate of the communication channel that is used must be compatible with the rate at which the source media is being generated.
5. Ex.: media types that generate continuous streams of information in real time are audio and video.
6. Bit rate of source information stream can be either CBR (Constant Bit Rate)/ VBR
(Variable Bit Rate).
7. Audio: Ex.: a digitized audio stream is generated at a constant bit rate, which is determined by the frequency at which the audio waveform is sampled and the number of bits that are used to digitize each sample.
8. Video: Ex.: the individual pictures/frames that make up the video are generated at a constant rate, but after compression the amount of information associated with each frame varies; in general, the information stream associated with compressed video is generated at fixed time intervals but the resulting bit rate is variable (see the sketch below).
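The sketch below illustrates the point: frames leave the encoder at a fixed rate, but because the number of bits per frame varies after compression, the instantaneous bit rate varies too. The frame sizes are made-up illustrative values.

```python
# Variable bit rate from fixed-interval frames of varying size.
FRAME_RATE = 25                       # frames per second (fixed interval)
frame_bits = [90_000, 20_000, 25_000, 18_000, 95_000, 22_000]  # varies per frame

rates_mbps = [bits * FRAME_RATE / 1e6 for bits in frame_bits]
print([f"{r:.2f}" for r in rates_mbps])               # instantaneous rate per frame
print(f"mean {sum(rates_mbps) / len(rates_mbps):.2f} Mbps")
```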

Block mode:
1. The source information comprises a single block of information that is created in a time-independent way.
2. Ex.: a block of text representing an e-mail or a computer program, a 2-D matrix of pixel values that represents an image, and so on.
3. Block-mode media created in a time-independent way are often stored at the source in, say, a file; when the block of information is requested it is downloaded, that is, transferred across the network to the destination, where it is again stored and subsequently output/displayed at a time determined by the requesting application program.
4. The bit rate of the communications channel need not be constant, but should be such that, when a block is requested, it is received within an acceptable time.
 RTD (Round-Trip Delay): the delay between a request being made and the contents of the block being output at the destination; for a good HCI (Human-Computer Interface) the RTD can be no more than a few seconds.

Transfer of the information streams associated with an application can be in five modes:
1. Simplex
2. Half-duplex (two-way alternate)
3. Duplex (two-way simultaneous)
4. Broadcast
5. Multicast
Simplex:
 Information associated with the application flows in one direction only.
 Ex.: transmission of photographic images from a deep-space probe at predetermined times; this involves a unidirectional flow of information from the probe to an earth station.

Half-duplex (two-way alternate):
 Information flows in both directions, but alternately.
 Ex.: a user making a request for some information from a remote server, which returns the requested information.

Duplex (two-way simultaneous):
 Information flows in both directions simultaneously.
 Ex.: the two-way flow of the digitized speech and video associated with a video telephony application.
Broadcast:
 Information output by a single source node is received by all the other nodes (computers and other devices) that are connected to the same network.
 Ex.: broadcast of a television program over a cable network, since all the television receivers that are connected to the network receive the same set of programs.

Multicast:
 Similar to broadcast, except that the information output by the source is received by only a specific subset of the nodes that are connected to the network (the multicast group).
 Ex.: videoconferencing involving a predefined group of terminals/computers connected to a network exchanging integrated speech and video streams.
 In half-duplex and duplex communications the bit rate associated with the flow of
information in each direction can be same or different.
1. If the rate associated with the flow of information in each direction is equal, the channel is called symmetric.
2. If the rate associated with the flow of information in each direction is unequal, the channel is called asymmetric.
 Ex.: a video telephone call involves the exchange of integrated digitized speech and video streams in both directions simultaneously, so a symmetric duplex communications channel is required.
 Application involving a browser (program) and a web server:
1. A low bit rate channel from the browser to the web server is required for request and control purposes.
2. A high bit rate channel from the server to the subscriber is required for the transfer of, say, a requested file; hence an asymmetric half-duplex communications channel is required.

 Types of information stream associated with the different media types are:
1. Continuous mode
2. Block mode

 There are two types of communications channel associated with the various network types:
1. Circuit-mode: operates in a time-dependent way; also called a synchronous communications channel, as it provides a constant bit rate service at a specified rate.
2. Packet-mode: operates in a time-varying way; also called an asynchronous communications channel, as it provides a variable bit rate service, the actual rate being determined by the variable transfer rate of packets across the network.

 Figure below shows the circuit-mode network.


 Circuit-switched networks comprise an interconnected set of switching offices/exchanges to which the subscriber terminals/computers are connected.
 Prior to sending any information, the source must set up a connection through the network.
 Each subscriber terminal/computer has a unique network-wide number/address associated with it.
 To make a call, the source first enters the number/address of the intended communication partner; the local switching office/exchange uses this to set up a connection through the network to the switching office/exchange to which the destination is connected.
 Assuming the destination is free and ready to receive a call, a message is returned to the source indicating that it can start to transfer/exchange information. After all the information has been transferred/exchanged, either the source or the destination requests the connection to be cleared.
 The bit rate associated with the connection is fixed and is determined by the bit rate that is used over the access circuits that connect the source and destination terminal/computer to the network; the messages associated with the setting up and clearing of a connection are known as signaling messages.
 Call/connection setup delay is the time delay while a connection is being established.

 Ex.: PSTN and ISDN.


 PSTN: the call setup delay can range from a fraction of a second for a local call through to several seconds for an international call.
 ISDN: the delay ranges from tens of milliseconds through to several hundred milliseconds.

Packet-mode networks are of two types:
1. Connection-Oriented (CO)
2. Connectionless (CL)
Connection-Oriented (CO):

 Figure below shows principle of operation of a CO network.

 A CO network comprises an interconnected set of PSEs (Packet-Switching Exchanges).
 A connection-oriented network is also called a packet-switched network.
 Each terminal/computer that is connected to the network has a unique network-wide number/address associated with it. Prior to sending any information, a connection is first set up through the network using the addresses of the source and destination terminals.
 The connection/circuit that is set up utilizes only a variable portion of the bandwidth of each link; hence the connection is known as a virtual connection or virtual circuit (VC).

VC Set up:

 The source terminal/computer sends a call request control packet to its local PSE, which
contains the addresses of the source and destination terminal/computer and a Virtual Circuit
Identifier (VCI), a short identifier.
 Each PSE maintains a table which specifies the outgoing link that should be used to
reach each network address. On receipt of the call request packet, the PSE uses the
destination address within the packet to determine the outgoing link to be used.
 Next free identifier (VCI) for this link is then selected and two entries are made in a
routing table.
1. First entry: Specifies incoming link/VCI and the corresponding outgoing
link/VCI.
2. Second entry: To route packets in the reverse direction (the inverse of these as
we show in the example in the Figure).
 The call request packet is then forwarded on the selected outgoing link. The same procedure is
followed at each PSE along the route until the destination terminal/computer is reached; the
VCIs used on the various links collectively form the VC.
 At the destination, assuming the call is accepted, a call accepted packet is returned to
the source over the same route/VC.
 The information transfer phase can then start but, since a VC is now in place, only the VCI is
needed in the packet header instead of the full network-wide address.
 Each PSE first uses the incoming link/VCI to determine the outgoing link/VCI from the
routing table; the existing VCI in the packet header is replaced with that obtained from the
routing table and the packet is forwarded on the identified outgoing link (a small sketch of this
step follows below).
 The same procedure is followed to return information in the reverse direction. When all the
information has been transferred/exchanged, the VC is cleared and the appropriate VCIs are
released by passing a call clear packet along the VC.
Connectionless:

 In a connectionless network, establishment of a connection is not required. Two
communicating terminals/computers can communicate and exchange
information as and when they wish.
 Figure below shows that each packet must carry the full source and destination
addresses in its header in order for each PSE to route the packet onto the appropriate
outgoing link.
 In a connectionless network a router is used, rather than a packet-switching exchange.
 In both network types (CO and CL), each packet is received by a PSE/router on an
incoming link and is stored in its entirety in a memory buffer. A check is made to determine if
any transmission/bit errors are present in the packet header, i.e., the signal that is used to
represent a binary 0 is corrupted and is interpreted by the receiver as a binary 1, and vice
versa.
 If errors are detected, the packet is discarded; hence the service offered by the packet-switched
network is a Best-effort service.
 If no errors are detected the addresses/VCIs carried in the packet header are read to
determine the outgoing link that should be used.
 Packet is placed in a queue ready for forwarding on the selected outgoing link. All packets
are transmitted at the maximum link bit rate.
 With this mode of operation it is possible for a sequence of packets to be received on a
number of incoming links all of which need forwarding on the same outgoing link. Hence, a
packet may experience an additional delay while it is in the output queue for a link waiting
to be transmitted.
 This delay is variable because it depends on the number of packets that are currently present
in the queue when a new packet arrives for forwarding. This mode of operation is known as
packet store-and-forward.
 There is a packet store-and-forward delay in each PSE/router. Sum of the store-and-
forward delays in each PSE/router contributes to the overall transfer delay of the packet
across the network.
 The mean of this delay is called the Mean packet transfer delay and the variation about the
mean is known as the Delay variation (jitter).
 Ex.:
1. Internet (Ex.: for packet-switched network – that operates in the CL mode).
2. International X.25 packet-switching network and ATM (Ex.: for networks that
operate in the CO mode).
Multipoint conferencing:
 Multipoint conferencing features in many interpersonal applications including audio and video
conferencing, data sharing, and computer-supported cooperative working.
 These involve exchange of information between 3 or more terminals/computers.
 Multipoint conferencing is implemented in one of following ways:
1. Centralized mode
2. Decentralized mode
3. Hybrid mode
Centralized mode:
 Used with circuit-switched networks such as PSTN/ISDN.
 Figure above shows that a centralized conference server is used.
 Prior to sending any information, each terminal/computer to be involved in the
conference must first set up a connection to the server.
 Each terminal/computer then, sends its own media stream comprising, say, audio,
video, and data integrated together in some way to the server using the established
connection.
 Server in turn, distributes either the media stream received from a selected
terminal/computer or a mix of the media streams received from several
terminals/computers back to all the other terminals/computers that are involved in the
conference.
Decentralized mode:
 Used with packet-switched networks that support multicast communications.
 Ex.: LANs, intranets, and the Internet.
 Figure above shows that the output of each terminal/computer is received by all the
other members of the conference/multicast group.
 A conference server is not normally used; instead, each terminal/computer
manages the information streams that it receives from the other members.
Hybrid mode:
 Used when the various terminals/computers that make up the conference are
attached to different network types.
 Figure above shows an example: the conference comprises 4 terminals/computers, 2
attached to a circuit-switched network and 2 to a packet-switched network
that supports multicasting.
 As in the centralized mode, a conference server is used. The output of each
terminal/computer is sent to the server, either over individual circuits
(terminals A and B) or using multicasting (terminals C and D). The server
determines the output stream(s) to be sent to each terminal.
There are FOUR types of conferencing:
1. Data conferencing
2. Audio conferencing
3. Videoconferencing
4. Multimedia conferencing
 Data conferencing: Involves data only. Examples include data sharing and computer-
supported cooperative working.
 Audio conferencing: Involves audio (speech) only.
 Videoconferencing: Involves speech and video synchronized and integrated together.
 Multimedia conferencing: Involves speech, video, and data integrated together.
 With data conferencing, the information flow between the various parties is relatively
infrequent; hence, the conference server is a general-purpose computer with the conference
function implemented in software.
 With the other 3 types of conferencing the information flows demand the use
of special purpose units.
 For audio conferencing, the special-purpose unit is an audio bridge.
 Typical units support 6 through 48 conference participants.
 For video and multimedia conferencing, the unit is an MCU (Multipoint Control Unit).
 Due to the volume and rate of the information being exchanged, the centralized
mode of working is used with both network types.
An MCU consists of two parts:
 Multipoint Controller part (MC): Concerned with the establishment of
connections to each of the conference participants and with the negotiation
of an agreed set of operational parameters - screen resolution, refresh rate, and
others.
 Multipoint Processor (MP): Concerned with the distribution of the
information streams generated during the conference. Includes such functions
as the mixing of the various media streams into an integrated stream, voice-
activated switching, and continuous presence.
 When using an audio bridge, a call is scheduled for a particular date, time, and
duration.
 Everyone who is to take part in the call is assigned a user ID and password.
At the appropriate time all participants call in and, after being verified, they can hear
and speak to the other participants.
 In the same way, when using an MCU, a call is scheduled as for an audio bridge.
 Once the conference starts, each participant can hear, see, and share data with
the other participants. An MCU can operate in:
1. Dial-in mode: the participants call in.
2. Dial-out mode: the MCU calls the participants; this provides better security.
 Voice-activated switching mode: The face of the participant is displayed in a
window on the screen of the participant’s terminal/computer and, in a second
window, the face of the remote participant who is currently talking. When
another participant starts to talk, the face of the new speaker replaces the face of the
current remote participant. In the event of two or more participants starting to talk
at the same time, the MCU normally selects the person who speaks the loudest.
 Continuous-presence mode: The remote window is divided into a number of
smaller windows, each of which displays the face of the last set of participants
who spoke or who are currently speaking. With both modes, the speech from all
participants is normally mixed into a single stream and hence each participant
can always hear what is said by all the other participants.
Network QoS:
 Definition: “Operational parameters associated with a communications channel
through a network collectively determine the suitability of the channel in
relation to its use for a particular application”.
 QoS parameters are different for circuit-switched and packet-switched networks.
 The QoS parameters associated with a constant bit rate channel that is set up through a
circuit-switched network are:
1. Bit rate
2. Mean bit error rate
3. Transmission delay
Bit rate:
 In digital telecommunication, the bit rate is the number of bits that pass a given
point in a telecommunication network in a given amount of time, usually a
second.
 A bit rate is usually measured in some multiple of bits per second.
 The term bit rate is a synonym for data transfer rate (or simply data rate).
Mean bit error rate (BER):
 Probability of a bit being corrupted during its transmission across the channel in
a defined time interval.
 For a constant bit rate channel: the mean BER is the probability of a bit being
corrupted in a defined number of bits. A mean BER of 10^-3 means, on average, for
every 1000 bits that are transmitted, 1 of these bits is corrupted.
 In some applications, providing the occurrence of bit errors is relatively infrequent,
their presence is acceptable, while in other applications it is imperative that no
residual bit errors are present in the received information.
1. Ex.: If the application involves speech, then an occasional bit error will go
unnoticed.
2. If the application involves the transfer of, say, financial information, it is essential
that the received information contains no errors. In such applications, prior
to transmission the source information is normally divided into blocks, the
maximum size of which is determined by the mean BER of the
communications channel.
 Ex.: if the mean BER is 10^-3, the number of bits in a block must be considerably < 1000,
otherwise, on average, every block will contain an error and will be
discarded.
 Normally bit errors occur randomly; hence, even with a block size of, say, 100
bits, blocks may still contain an error, but the probability of this occurring is
considerably less.
 In general, if the BER probability is P, the number of bits in a block is N, and the probability of a
block containing a bit error is PB, then,
 Assuming random errors: PB = 1 - (1 - P)^N, which approximates to N×P if N×P < 1.
 In practice, both circuit-switched and packet-switched networks provide an unreliable
service (best-try or best-effort service).
 An unreliable service (best-try/best-effort service) is a type of service where any
blocks containing bit errors are discarded, either within the network (packet-
switched network) or in the network interface at the destination (both packet-
switched and circuit-switched networks).
 If the application dictates that only error-free blocks are acceptable, it is necessary for
the sending terminal/computer to divide the source information into blocks of a
defined maximum size and for the destination to detect when a block is
missing.
 When this occurs, the destination must request that the source sends another copy of the
missing block. This type of service is called a Reliable service.
 The retransmission of a block introduces delay, so the retransmission procedure should
be invoked relatively infrequently, which dictates a small block size. This leads
to high overheads since each block must contain additional information that is
associated with the retransmission procedure.
 So, the choice of block size is a compromise between the increased delay resulting from a
large block size - and hence retransmissions - and the loss of transmission BW from the
high overheads of using a small block size.
Transmission delay:
 The transmission delay associated with the channel is determined by:
1. The bit rate used.
2. The delays that occur in the terminal/computer network interfaces (codec delays) plus
the propagation delay of the digital signals as they pass from source to destination
across the network.
 The propagation delay is determined by the physical separation of the 2 communicating devices
and the velocity of propagation of a signal across the transmission medium (free space:
3×10^8 m/s and a fraction of this in physical media, a typical value being 2×10^8 m/s).
 The propagation delay in each case is independent of the bit rate of the
communication channel. Assuming the codec delay remains constant, the propagation
delay remains the same whether the bit rate is 1 kbps, 1 Mbps, or 1 Gbps.
Example: Derive the maximum block size that should be used over a channel which has a mean BER
probability of 10^-4 if the probability of a block containing an error - and hence being
discarded - is to be 10^-1.
ANSWER:
PB = 1 - (1 - P)^N
Hence 0.1 = 1 - (1 - 10^-4)^N and N ≈ 1053 bits
Alternatively, PB ≈ N × P
Hence 0.1 = N × 10^-4 and N = 1000 bits
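A quick numerical check of this answer (a Python sketch using only the figures given in the example):

import math

P = 1e-4      # mean BER from the example
PB = 0.1      # required probability of a block containing an error

# Exact formula: PB = 1 - (1 - P)**N  =>  N = ln(1 - PB) / ln(1 - P)
N_exact = math.log(1 - PB) / math.log(1 - P)

# Approximation: PB ~= N * P (valid while N * P is small)
N_approx = PB / P

print(f"exact:       N = {N_exact:.1f} bits")   # about 1053.6 bits
print(f"approximate: N = {N_approx:.0f} bits")  # 1000 bits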
 The QoS parameters associated with a packet-switched network include:
1. Maximum packet size
2. Mean packet transfer rate
3. Mean packet error rate
4. Mean packet transfer delay
5. Worst-case jitter
6. Transmission delay
 In a packet-switched network, the rate of transfer of packets across the network is influenced
strongly by the bit rate of the interconnecting links and, due to the variable store-and-forward
delays in each PSE/router, the actual rate of transfer of packets across the network is also variable.
 Mean packet transfer rate: A measure of the average number of packets transferred
across the network per second; coupled with the packet size being used, it determines the
equivalent mean bit rate of the channel (see the sketch after this list of parameters).
 Mean PER (Mean Packet Error Rate): The probability of a received packet containing one
or more bit errors. It is the same as the block error rate associated with a circuit-switched network
and is thus related to both the maximum packet size and the worst-case BER of the transmission
links which interconnect the PSEs/routers that make up the network.
 Mean packet transfer delay: The summation of the mean store-and-forward delays that a
packet experiences in each PSE/router it encounters along a route.
 Jitter: The worst-case variation in the mean packet transfer delay.
 Transmission delay: The same whether the network operates in packet mode or circuit mode.
It includes the codec delay in each of the two communicating computers and the signal
propagation delay.
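 For example, the equivalent mean bit rate follows directly from the mean packet transfer rate
and the packet size in use (a one-line calculation; the figures below are assumed values):

# Equivalent mean bit rate of a packet-mode channel - assumed example figures.
mean_packet_rate = 100        # packets transferred across the network per second
packet_size_bytes = 1000      # packet size in use

equivalent_bit_rate = mean_packet_rate * packet_size_bytes * 8   # bits per second
print(f"equivalent mean bit rate = {equivalent_bit_rate/1e3:.0f} kbps")   # 800 kbps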
Application QoS:
 Network QoS parameters define what the particular network being used provides
rather than what an application requires. Each application has its own QoS parameter
requirements associated with it. For an application involving images, for example, parameters
may include the minimum image resolution and size; for an application involving video, parameters
may include the digitization format and refresh rate.
Application QoS parameters that relate to the network include:
1. Required bit rate or mean packet transfer rate
2. Maximum startup delay
3. Maximum end-to-end delay
4. Maximum delay variation/jitter
5. Maximum round-trip delay
For applications involving the transfer of a constant bit rate stream, the important parameters
are:
 Required bit rate/mean packet transfer rate
 End-to-end delay
 Delay variation/jitter (can cause problems in the destination decoder if the
rate of arrival of the bit stream is variable)
For interactive applications, the following parameters are the prime factors:
1. Startup delay: The amount of time that elapses between an application making a request to
start a session and the confirmation being received from the application at the destination/server
that it is prepared to accept the request. It includes the time required to establish a network
connection plus, if required, the delay introduced in both the source and destination computers
while negotiating how the session can take place.
2. Round-trip delay: The delay between a request for some information being made and the start
of the information being received/displayed. It is important for HCI (human-computer
interaction) to be successful; it should be as short as possible, ideally less than a few
seconds.
 For applications involving the transfer of a constant bit rate stream, a circuit-switched
network would appear to be most appropriate since the call setup delay is often not important
and the channel provides a constant bit rate service of known rate.
 For interactive applications, a connectionless packet-switched network is appropriate
since there is no network call setup delay and any variations in the packet transfer delay are not
important.
 PSTN and ISDN: Operation is circuit-switched; they provide a constant bit rate channel of
the order of 28.8 kbps (PSTN with modem) and 64/128 kbps (ISDN).
 Cable modem: Operation is packet-switched. The modems in each of the homes in a cable
region time-share the use of a single high bit rate channel/circuit.
 Typical bit rate of the shared channel: 27 Mbps; the number of concurrent users of the channel
may be several hundred. So, if there are 270 concurrent users, each user would get a mean data
rate of 100 kbps.
 In these applications the main parameter of interest is not the mean data/bit rate but the time to
transmit the complete file. With PSTN/ISDN it is related to the channel bit rate and the size of the
file.
 Cable modems time-share the use of the 27 Mbps channel but, when a modem gains access to it,
the file transfer takes place at the full rate.
 Assuming a file size of 100 Mbits, the minimum time to transmit the file using the different
Internet access modes is as follows (a quick arithmetic check is sketched after the list):
1. PSTN and 28.8 kbps modem: 57.8 minutes
2. ISDN at 64 kbps: 26 minutes
3. ISDN at 128 kbps: 13 minutes
4. Cable modem at 27 Mbps: 3.7 seconds
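 The figures above can be verified with a few lines of arithmetic (a Python sketch; the 100 Mbit
file size and the 270-user figure are the assumed values used in the text):

FILE_SIZE_BITS = 100e6        # 100 Mbit file from the example

# Minimum transfer time = file size / channel bit rate.
for name, bit_rate in [("PSTN + 28.8 kbps modem", 28.8e3),
                       ("ISDN at 64 kbps",        64e3),
                       ("ISDN at 128 kbps",       128e3),
                       ("cable modem at 27 Mbps", 27e6)]:
    seconds = FILE_SIZE_BITS / bit_rate
    print(f"{name:>23}: {seconds/60:6.1f} min ({seconds:7.1f} s)")

# Mean data rate per user when 270 users time-share the 27 Mbps cable channel.
print(f"shared cable rate = {27e6 / 270 / 1e3:.0f} kbps per user")   # 100 kbps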
 If another transfer request occurs during the time the file is being transmitted, the
completion time of each transfer request will increase as they share the use of the
channel; however, the probability of multiple users requesting a transfer in this short window
of time is relatively low.
 For interactive applications: the call setup delay with an ISDN or an ATM
network - and a PSTN for local calls - is very fast and, for many applications, quite
acceptable.
 For constant bit rate applications: providing the equivalent mean bit rate provided
by the network is greater than the input bit rate and the maximum jitter is less than a defined
value, a packet-switched network can be used.
 Buffering: Used to overcome the effect of jitter.
 Figure below shows the general principle.
 Figure shows that, when using a packet-switched network for this type of application, an
additional delay is incurred at the source as the information bitstream is converted into
packets.
 The effect of jitter is overcome by retaining a defined number of packets in a memory buffer at
the destination before play-out of the information bitstream is started. The memory buffer
operates using a first-in, first-out (FIFO) discipline.
 The number of packets retained in the buffer before output starts is determined by the
worst-case jitter and the bit rate of the information stream.
 Packetization delay: The additional delay incurred at the source as the information bitstream
is converted into packets; it adds to the transmission delay of the channel.
 To minimize the overall input-to-output delay, the packet size used for the application is kept
small, but of sufficient size to overcome the effect of the worst-case jitter.
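 A minimal sketch of this FIFO play-out buffering (Python; the bit rate, packet size, and
worst-case jitter are assumed example figures): the destination retains just enough packets to
keep playing out for the worst-case jitter period before output starts:

import math
from collections import deque

BIT_RATE = 64_000            # bps of the constant bit rate stream (assumed)
PACKET_SIZE_BITS = 1_280     # bits of stream carried per packet (assumed)
WORST_CASE_JITTER = 0.10     # seconds (assumed)

# Each packet carries PACKET_SIZE_BITS / BIT_RATE seconds of the stream, so the
# buffer must hold enough packets to cover the worst-case jitter before play-out.
packet_duration = PACKET_SIZE_BITS / BIT_RATE            # 0.02 s per packet here
packets_to_buffer = math.ceil(WORST_CASE_JITTER / packet_duration)
print(f"retain {packets_to_buffer} packets before starting play-out")   # 5 packets

playout_buffer = deque()     # first-in, first-out discipline

def on_packet_arrival(packet):
    playout_buffer.append(packet)      # arrivals join the tail of the buffer

def next_packet_for_decoder():
    return playout_buffer.popleft()    # play-out removes from the head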
 To simplify determining whether a particular network can meet the QoS requirements of an
application, a number of standard application service classes have been defined.
 Each service class has a specific set of QoS parameters associated with it, which a network can
either meet or not.
 Networks support a number of different service classes.
 Ex.: In the Internet - to ensure the QoS parameters associated with each class are met - packets
relating to each class are given a different priority; the packets of each class can then be
treated differently.
 Internet packets relating to multimedia applications involving real-time streams are given
higher priority than packets relating to applications such as e-mail.
 Packets containing real-time streams such as audio and video are more sensitive to delay
and jitter than packets containing textual information. Hence, during periods of
congestion, packets containing real-time streams are transmitted first. Packets containing
video are more sensitive to packet loss than packets containing audio and hence are given
more priority.