
Focus Group: Electrical Engineering

Indian Institute of Technology Jodhpur

EE 321: Information Theory and Coding


2014-15 Second Semester (December 2014 - April 2015)

Homework 3
Entropy Rates of a Stochastic Process

Question 3.1
Suppose we observe one of two stochastic processes but don't know which. What is the entropy rate?
Specifically, let $X_{11}, X_{12}, X_{13}, \ldots$ be a Bernoulli process with parameter $p_1$, and let $X_{21}, X_{22}, X_{23}, \ldots$ be a Bernoulli process with parameter $p_2$. Let

$$\theta = \begin{cases} 1 & \text{with probability } \tfrac{1}{2} \\ 2 & \text{with probability } \tfrac{1}{2} \end{cases} \qquad (1)$$

and let $Y_i = X_{\theta i}$, $i = 1, 2, \ldots$, be the stochastic process observed. Thus $Y$ observes the process $\{X_{1i}\}$ or $\{X_{2i}\}$. Eventually, $Y$ will know which.
1) Is {Yi } stationary?
2) Is {Yi } an i.i.d. process?
3) What is the entropy rate H of {Yi }?
4) Does $-\frac{1}{n} \log p(Y_1, Y_2, \ldots, Y_n) \to H$?
5) Is there a code that achieves an expected per-symbol description length $\frac{1}{n} E L_n \to H$?
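
The following simulation sketch is not part of the assignment; it may help in exploring parts 3 and 4 numerically. It draws the switch variable $\theta$ once, generates $Y^n$ from the chosen Bernoulli process, and evaluates $-\frac{1}{n}\log_2 p(Y^n)$ under the mixture distribution $p(y^n) = \tfrac{1}{2} p_1(y^n) + \tfrac{1}{2} p_2(y^n)$. The helper names (mixture_log_loss, binary_entropy) and the parameter values p1 = 0.3, p2 = 0.8 are illustrative choices, not part of the problem.

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy H(p) in bits; assumes 0 <= p <= 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mixture_log_loss(n, p1=0.3, p2=0.8, seed=None):
    """Draw theta uniformly from {1, 2}, generate Y^n from Bernoulli(p_theta),
    and return (theta, -(1/n) log2 p(Y^n)) under the half-half mixture."""
    rng = np.random.default_rng(seed)
    theta = rng.integers(1, 3)                # which process is observed
    p = p1 if theta == 1 else p2
    y = rng.random(n) < p                     # Bernoulli(p) sample path
    k = int(y.sum())                          # number of ones in Y^n
    # log2-likelihood of y^n under each candidate process
    ll1 = k * np.log2(p1) + (n - k) * np.log2(1 - p1)
    ll2 = k * np.log2(p2) + (n - k) * np.log2(1 - p2)
    # log2 of the mixture probability; subtracting 1 bit accounts for the 1/2 prior
    log_mix = np.logaddexp2(ll1, ll2) - 1.0
    return theta, -log_mix / n

if __name__ == "__main__":
    p1, p2 = 0.3, 0.8
    print("0.5*(H(p1)+H(p2)) =", 0.5 * (binary_entropy(p1) + binary_entropy(p2)))
    for n in (100, 10_000, 1_000_000):
        theta, loss = mixture_log_loss(n, p1, p2, seed=0)
        print(f"n={n:>8}  theta={theta}  -(1/n) log2 p(Y^n) = {loss:.4f}")
```

Comparing the printed per-symbol log-loss against $H(p_1)$, $H(p_2)$, and their average for large $n$ gives an empirical hint toward parts 3 and 4.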

Question 3.2
Let $\{X_i\}$ be i.i.d. $\sim$ Bernoulli$(p)$. Consider the associated Markov chain $\{Y_i\}_{i=1}^{n}$, where $Y_i$ = (the number of 1s in the current run of 1s). For example, if $X^n = 101110$, we have $Y^n = 101230$.
1) Find the entropy rate of $X^n$.
2) Find the entropy rate of $Y^n$.
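
The mapping from $X^n$ to $Y^n$ can be made concrete with a short sketch (not part of the assignment). It reproduces the example $101110 \mapsto 101230$ from the problem statement and prints $H(p)$ for comparison with part 1; the function name run_length_chain and the values p = 0.6, n = 20 are illustrative choices.

```python
import numpy as np

def run_length_chain(x):
    """Map a 0/1 sequence x^n to y^n, where y_i counts the number of 1s
    in the current run of 1s and resets to 0 whenever x_i = 0."""
    y, count = [], 0
    for bit in x:
        count = count + 1 if bit == 1 else 0
        y.append(count)
    return y

def binary_entropy(p):
    """Binary entropy H(p) in bits; assumes 0 <= p <= 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

if __name__ == "__main__":
    # Reproduce the example from the problem statement: 101110 -> 101230
    print(run_length_chain([1, 0, 1, 1, 1, 0]))

    # A random Bernoulli(p) sample path and its associated chain
    p, n = 0.6, 20
    rng = np.random.default_rng(0)
    x = (rng.random(n) < p).astype(int)
    print("X^n:", x.tolist())
    print("Y^n:", run_length_chain(x))
    print("H(p) =", binary_entropy(p), "bits/symbol")
```

Note that $Y^n$ is a deterministic function of $X^n$, which is the observation parts 1 and 2 ask you to exploit.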
