
Information Theory and Source Coding


• Scope of Information Theory:
1. Determine the irreducible limit below which a signal cannot be compressed.
2. Deduce the ultimate transmission rate for reliable communication over a noisy channel.
3. Define channel capacity, the intrinsic ability of a channel to convey information.

The basic setup in Information Theory has:
– a source,
– a channel, and
– a destination.

The output from the source is conveyed through the channel and received at the destination.

The source is a random variable S which takes symbols from a finite alphabet, i.e. S = {s0, s1, s2, · · · , sK−1}, with probabilities P(S = sk) = pk, where k = 0, 1, · · · , K−1 and

    Σ_{k=0}^{K−1} pk = 1

The following assumptions are made about the source:
1. The source is memoryless, i.e. the choice of the present symbol does not depend on the previous choices.
2. The source generates symbols that are statistically independent.
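As a concrete illustration of these assumptions, here is a minimal sketch that draws statistically independent symbols from a discrete memoryless source; the alphabet and probabilities are hypothetical values chosen for the example, not taken from the text.

```python
import random

# Hypothetical discrete memoryless source: a 4-symbol alphabet whose
# probabilities sum to 1. Each draw is independent of previous draws,
# matching the memoryless and independence assumptions above.
alphabet = ["s0", "s1", "s2", "s3"]
probabilities = [0.5, 0.25, 0.125, 0.125]

def emit_symbols(n):
    """Generate n statistically independent symbols from the source."""
    return random.choices(alphabet, weights=probabilities, k=n)

print(emit_symbols(10))
```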

• Information: The information conveyed by a symbol sk is defined as

    I(sk) = log_base(1/pk)

• In digital communication, data is binary, so the base is always 2.
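A minimal sketch of this definition, assuming base 2 so that information is measured in bits; the probabilities below are made-up values for illustration.

```python
import math

def information_bits(p):
    """I(sk) = log2(1/pk): information, in bits, conveyed by a symbol of probability p."""
    return math.log2(1.0 / p)

# A rare symbol conveys more information than a common one.
print(information_bits(0.5))    # 1.0 bit
print(information_bits(0.125))  # 3.0 bits
```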

• Properties of Information:
1. Information conveyed by a deterministic event is nothing, i.e. I(sk) = 0 for pk = 1 (an example of a deterministic event is the sun rising in the east, as opposed to the occurrence of a tsunami).
2. Information is always positive, i.e. I(sk) ≥ 0 for pk ≤ 1.

3. Information is never lost, i.e. I(sk) is never negative.
4. More information is conveyed by a less probable event than by a more probable event, i.e. I(sk) > I(si) for pk < pi (for example, a program that runs without faults, as opposed to a program that gives a segmentation fault).

• Entropy: The entropy H(s) of a source is defined as the average information generated by a discrete memoryless source; it quantifies the uncertainty in the source:

    H(s) = Σ_{k=0}^{K−1} pk I(sk)
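The following sketch computes the entropy of a source directly from this definition; the probability vector is an arbitrary example, not one used in the text.

```python
import math

def entropy_bits(probs):
    """H(s) = sum over k of pk * log2(1/pk), skipping zero-probability symbols."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical source distribution (must sum to 1).
probs = [0.5, 0.25, 0.125, 0.125]
print(entropy_bits(probs))  # 1.75 bits of average information per symbol
```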

• Properties of Entropy:
1. If K is the number of symbols generated by the source, its entropy is bounded by 0 ≤ H(s) ≤ log2 K.
2. If the source has an inevitable symbol sk for which pk = 1, then H(s) = 0.
3. If the source generates all the symbols equiprobably, i.e. pk = 1/K, ∀ k, then H(s) = log2 K.

• Proof:
– H(s) ≥ 0: If the source has one symbol sk with P(S = sk) = pk = 1, then by the properties of probability all other probabilities must be zero, i.e. pi = 0, ∀ i ≠ k. Therefore, from the definition of entropy,

    H(s) = Σ_{k=0}^{K−1} pk log2(1/pk) = 1 · log2(1) = 0

(the term with pk = 1 contributes log2(1) = 0, and all other terms vanish because pi = 0). More generally, since 0 ≤ pk ≤ 1, every term pk log2(1/pk) is non-negative. Therefore H(s) ≥ 0, with equality for the degenerate source above.

– H(s) ≤ log2 K: Consider a discrete memoryless source which has two different probability distributions {p0, p1, · · · , pK−1}

and {q0, q1, · · · , qK−1}, both defined on the alphabet S = {s0, s1, · · · , sK−1}. We begin with the equation

    Σ_{k=0}^{K−1} pk log2(qk/pk) = (1/log_e 2) Σ_{k=0}^{K−1} pk log_e(qk/pk)        (1)

To analyze Eq. 1, we consider the following inequality:

    log_e x ≤ x − 1        (2)

Now, using Eq. 2 in Eq. 1,

    Σ_{k=0}^{K−1} pk log2(qk/pk) ≤ (1/log_e 2) Σ_{k=0}^{K−1} pk (qk/pk − 1)
                                 = (1/log_e 2) (Σ_{k=0}^{K−1} qk − Σ_{k=0}^{K−1} pk)
                                 = (1/log_e 2) (1 − 1) = 0

    =⇒ Σ_{k=0}^{K−1} pk log2(qk/pk) ≤ 0        (3)

Suppose qk = 1/K, ∀ k; then Σ_{k=0}^{K−1} qk log2(1/qk) = log2 K.

So, Eq. 3 becomes

    Σ_{k=0}^{K−1} pk log2 qk − Σ_{k=0}^{K−1} pk log2 pk ≤ 0

    =⇒ −Σ_{k=0}^{K−1} pk log2 K ≤ Σ_{k=0}^{K−1} pk log2 pk

    =⇒ Σ_{k=0}^{K−1} pk log2 K ≥ Σ_{k=0}^{K−1} pk log2(1/pk)

    =⇒ log2 K · Σ_{k=0}^{K−1} pk ≥ Σ_{k=0}^{K−1} pk log2(1/pk)

    =⇒ Σ_{k=0}^{K−1} pk log2(1/pk) ≤ log2 K

Therefore H(s) ≤ log2 K.
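To make the derivation concrete, the hedged sketch below numerically checks both Eq. 3 (with qk = 1/K) and the resulting bound H(s) ≤ log2 K for a randomly generated distribution; the distribution itself is arbitrary and not part of the original proof.

```python
import math
import random

def entropy_bits(probs):
    """H(s) = sum over k of pk * log2(1/pk)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

K = 8
weights = [random.random() for _ in range(K)]
p = [w / sum(weights) for w in weights]   # arbitrary source distribution
q = [1.0 / K] * K                         # uniform distribution, qk = 1/K

# Eq. 3: sum over k of pk * log2(qk/pk) <= 0 (zero only when p is uniform).
gibbs_sum = sum(pk * math.log2(qk / pk) for pk, qk in zip(p, q) if pk > 0)
print(gibbs_sum <= 1e-12)

# Consequence: H(s) <= log2 K.
print(entropy_bits(p) <= math.log2(K))
```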

• Entropy of a binary memoryless source: Consider a discrete memoryless binary source defined on the alphabet S = {0, 1}. Let the probabilities of symbols 0 and 1 be p0 and 1 − p0 respectively. The entropy of this source is given by

    H(s) = −p0 log2 p0 − (1 − p0) log2(1 − p0)

According to the properties of entropy, H(s) has its maximum value when p0 = 1/2, as demonstrated in Figure 1.

[Figure 1: Entropy H(s) of a discrete memoryless binary source versus p0; H(s) peaks at 1 bit for p0 = 1/2.]
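As a quick check of the curve in Figure 1, here is a small sketch of the binary entropy function evaluated at a few illustrative points; the sample values of p0 are arbitrary choices.

```python
import math

def binary_entropy(p0):
    """H(s) = -p0*log2(p0) - (1-p0)*log2(1-p0), defined as 0 at p0 = 0 or 1."""
    if p0 in (0.0, 1.0):
        return 0.0
    return -p0 * math.log2(p0) - (1 - p0) * math.log2(1 - p0)

for p0 in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"p0 = {p0:4.2f}  H(s) = {binary_entropy(p0):.3f} bits")
# The entropy peaks at 1 bit when p0 = 1/2, consistent with Figure 1.
```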
