Infotheory&Coding BJS Compiled
Published by: Tejus Prasad on Apr 01, 2012
INFORMATION THEORY AND CODING
Session – 1: Measure of Information
What is communication?
Communication explicitly involves the transmission of information from one point to another, through a succession of processes.
What are the basic elements of every communication system?
o Transmitter
o Channel
o Receiver
How are information sources classified?
Source definitions:
Analog: emits a continuous-amplitude, continuous-time electrical waveform.
Discrete: emits a sequence of letters or symbols.
How to transform an analog information source into a discrete one?
By sampling and quantizing it: sampling converts the continuous-time waveform into a discrete-time sequence, and quantizing maps each sample to one of a finite set of amplitude levels.
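One standard way to make an analog source discrete is to sample it in time and then quantize each sample's amplitude. The following sketch illustrates the idea; the function name, signal, and parameter choices are illustrative and not from the source:

```python
import math

def sample_and_quantize(signal, duration, fs, n_bits):
    """Sample a continuous-time signal at rate fs (Hz) and quantize
    each sample to one of 2**n_bits uniform levels over [-1, 1)."""
    levels = 2 ** n_bits
    step = 2.0 / levels
    samples = []
    for n in range(int(duration * fs)):
        x = signal(n / fs)  # sampling: continuous time -> discrete time
        # quantization: continuous amplitude -> one of a finite set of symbols
        q = max(0, min(levels - 1, int((x + 1.0) / step)))
        samples.append(q)
    return samples

# Example: a 1 Hz sine wave sampled at 8 Hz with 3-bit quantization
symbols = sample_and_quantize(lambda t: math.sin(2 * math.pi * t), 1.0, 8, 3)
print(symbols)  # a sequence of symbols from the alphabet {0, ..., 7}
```

The output is exactly what the next question describes: a string of symbols drawn from a finite alphabet.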
What will be the output of a discrete information source?
A string or sequence of symbols.

[Figure: Communication system block diagram — Source of information → Transmitter → Channel → Receiver → User of information; the message signal becomes the transmitted signal, then the received signal, and finally an estimate of the message signal.]

[Figure: Classification of information sources — analog and discrete.]
EVERY MESSAGE PUT OUT BY A SOURCE CONTAINS SOME INFORMATION. SOME MESSAGES CONVEY MORE INFORMATION THAN OTHERS.
How to measure the information content of a message quantitatively?
To answer this, we need to arrive at an intuitive concept of the amount of information. Consider the following example: forecasts for a trip to Mercara (Coorg) in the winter time, during the evening hours:
1. It is a cold day
2. It is a cloudy day
3. Possible snow flurries
The amount of information received is obviously different for these messages.
o Message (1) contains very little information, since the weather in Coorg is 'cold' for most of the winter season.
o The forecast of a 'cloudy day' contains more information, since it is not an event that occurs often.
o In contrast, the forecast of 'snow flurries' conveys even more information, since the occurrence of snow in Coorg is a rare event.
On an intuitive basis, then, with knowledge of the occurrence of an event, what can be said about the amount of information conveyed?
It is related to the probability of occurrence of the event.
What do you conclude from the above example with regard to the quantity of information?
The message associated with the event 'least likely to occur' contains the most information.
How can the information content of a message be expressed quantitatively?
The above concepts can now be formalized in terms of probabilities as follows:
Say that an information source emits one of 'q' possible messages m_1, m_2, ..., m_q with p_1, p_2, ..., p_q as their probabilities of occurrence.
 
Based on the above intuition, the information content of the k-th message can be written as

I(m_k) ∝ 1/p_k

Also, to satisfy the intuitive concept of information, I(m_k) must tend to zero as p_k tends to 1.
Can I(m_k) be negative?
What can I(m_k) be at worst?
What then is the summary from the above discussion?

I(m_k) > I(m_j), if p_k < p_j
I(m_k) → 0, as p_k → 1          ------ I
I(m_k) ≥ 0, when 0 < p_k < 1

Another requirement is that when two independent messages are received, the total information content is the sum of the information conveyed by each of the messages. Thus we have

I(m_k & m_q) = I(m_k) + I(m_q)          ------ II
Can you now suggest a continuous function of p_k that satisfies the constraints specified in I and II above?
We can define a measure of information as

I(m_k) = log(1/p_k)          ------ III
