
# ECE 534 RANDOM PROCESSES FALL 2011

## PROBLEM SET 5 Due Tuesday, November 1

5. Two faces of Markov processes: inference and dynamics
Assigned Reading: Sections 4.9 and 4.10, Chapter 5, and Chapter 6 through section 8.
Problems to be handed in:
1. Estimation of the parameter of an exponential in additive exponential noise. Suppose
an observation Y has the form Y = Z + N, where Z and N are independent, Z has the exponential
distribution with parameter θ, N has the exponential distribution with parameter one, and θ > 0
is an unknown parameter. We consider two approaches to finding θ̂_ML(y).
(a) Show that

f_cd(y, z | θ) = θ e^{−y + (1−θ)z} for 0 ≤ z ≤ y, and f_cd(y, z | θ) = 0 otherwise.
(b) Find f(y | θ). The direct approach to finding θ̂_ML(y) is to maximize f(y | θ) (or its log) with
respect to θ. You needn't attempt the maximization.
(c) Derive the EM algorithm for finding θ̂_ML(y). The algorithm can be expressed in terms of the
function φ defined by:

φ(y, θ) = E[Z | y, θ] = 1/(θ−1) − y/(e^{(θ−1)y} − 1) for θ ≠ 1, and φ(y, 1) = y/2.

You needn't implement the algorithm.
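Although no implementation is required, the function φ above is easy to compute numerically; a small sketch (the function name and the tolerance for switching to the θ = 1 limit are my own choices) might look like:

```python
import math

def phi(y, theta, tol=1e-8):
    """E[Z | Y = y, theta] for the model Y = Z + N with Z ~ Exp(theta), N ~ Exp(1).

    Uses the closed form from the problem statement; when theta is numerically
    close to 1, falls back to the limiting value y/2 to avoid 0/0.
    """
    b = theta - 1.0
    if abs(b) < tol:
        return y / 2.0
    # expm1 computes e^{by} - 1 accurately even when b*y is small.
    return 1.0 / b - y / math.expm1(b * y)
```

As a sanity check, φ(y, θ) always lies strictly between 0 and y, since it is the mean of a distribution supported on [0, y].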
2. Maximum likelihood estimation for HMMs. Consider an HMM with unobserved data
Z = (Z_1, . . . , Z_T), observed data Y = (Y_1, . . . , Y_T), and parameter vector θ = (π, A, B). Explain
how the forward-backward algorithm or the Viterbi algorithm can be used or modified to compute
the following:
(a) The ML estimator, Ẑ_ML, of Z based on Y, assuming any initial state and any transitions i → j
are possible for Z. (Hint: Your answer should not depend on π or A.)
(b) The ML estimator, Ẑ_ML, of Z based on Y, subject to the constraint that Ẑ_ML takes values in
the set {z : P{Z = z} > 0}. (Hint: Your answer should depend on π and A only through which
coordinates of π and A are nonzero.)
(c) The ML estimator, Ẑ_{1,ML}, of Z_1 based on Y.
(d) The ML estimator, Ẑ_{t_o,ML}, of Z_{t_o} based on Y, for some fixed t_o with 1 ≤ t_o ≤ T.
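For orientation, here is the unmodified Viterbi recursion that the problem asks you to adapt, sketched in log space (the function name and the matrix conventions are my own; this is the generic algorithm, not a solution to any part):

```python
import numpy as np

def viterbi(pi, A, B, y):
    """Most likely state sequence z maximizing P(Z = z, Y = y) for an HMM.

    pi: (Ns,) initial distribution; A: (Ns, Ns) transition matrix;
    B: (Ns, Ny) emission probabilities; y: list of observation indices.
    Works in log space, so log(0) = -inf naturally rules out impossible moves.
    """
    with np.errstate(divide="ignore"):
        lp, lA, lB = np.log(pi), np.log(A), np.log(B)
    T, Ns = len(y), len(pi)
    delta = np.zeros((T, Ns))          # best log-probability ending in each state
    psi = np.zeros((T, Ns), dtype=int)  # argmax back-pointers
    delta[0] = lp + lB[:, y[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + lA   # scores[i, j]: come from i, go to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + lB[:, y[t]]
    # Backtrack from the best final state.
    z = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        z.append(int(psi[t][z[-1]]))
    return z[::-1]
```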
3. Estimation of a critical transition time of hidden state in HMM. Consider an HMM
with unobserved data Z = (Z_1, . . . , Z_T), observed data Y = (Y_1, . . . , Y_T), and parameter vector
θ = (π, A, B). Let F ⊂ {1, . . . , N_s} and let τ_F be the first time t such that Z_t ∈ F, with the
convention that τ_F = T + 1 if Z_t ∉ F for 1 ≤ t ≤ T.
(a) Describe how to find the conditional distribution of τ_F given Y, under the added assumption
that a_{i,j} = 0 for all (i, j) such that i ∈ F and j ∉ F, i.e. under the assumption that F is an
absorbing set for Z.
(b) Describe how to find the conditional distribution of τ_F given Y, without the added assumption.
4. On distributions of three discrete-time Markov processes. For each of the Markov pro-
cesses with the indicated one-step transition probability diagrams, determine the set of equilibrium
distributions and whether lim_{t→∞} π_n(t) exists for all choices of the initial distribution, π(0), and
all states n.
[Transition probability diagrams for parts (a), (b), and (c) appeared here; they are not recoverable from the extracted text.]
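An equilibrium distribution satisfies π = πP along with normalization. For a chain with finitely many states this linear system can be solved directly; here is a sketch under that finiteness assumption (the chains in this problem have infinitely many states, so this would only apply to truncations or finite examples):

```python
import numpy as np

def equilibrium(P):
    """Solve pi P = pi, sum(pi) = 1 for a finite transition matrix P.

    Stacks the normalization constraint onto (P^T - I) pi = 0 and uses
    least squares; if the chain has a unique equilibrium this recovers it
    exactly, and if several exist, one solution is returned.
    """
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi
```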
5. On distributions of three continuous-time Markov processes. For each of the Markov
processes with the indicated transition rate diagrams, determine the set of equilibrium
distributions and whether lim_{t→∞} π_n(t) exists for all choices of the initial distribution, π(0), and
all states n. (The answer for (c) may depend on the value of c; assume c ≥ 0.)
[Transition rate diagrams for parts (a), (b), and (c) appeared here; they are not recoverable from the extracted text.]
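In continuous time, an equilibrium distribution satisfies πQ = 0 for the generator matrix Q (rows summing to zero), plus normalization. The same finite-state linear-algebra sketch as for the discrete-time case applies, with the same caveat that the chains in this problem are infinite-state:

```python
import numpy as np

def ctmc_equilibrium(Q):
    """Solve pi Q = 0, sum(pi) = 1 for a finite generator matrix Q.

    Q[i, j] is the transition rate from i to j for i != j, and each row
    of Q sums to zero.
    """
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi
```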
6. Generating a random spanning tree. Let G = (V, E) be an undirected, connected graph
with n vertices and m edges (so |V| = n and |E| = m). Suppose that m ≥ n, so the graph has at
least one cycle. A spanning tree of G is a subset T of E with cardinality n − 1 and no cycles. Let
S denote the set of all spanning trees of G. We shall consider a Markov process with state space S;
the one-step transition probabilities are described as follows. Given a state T, an edge e is selected
at random from among the m − n + 1 edges in E − T, with all such edges having equal probability.
The set T ∪ {e} then has a single cycle. One of the edges in the cycle (possibly edge e) is selected
at random, with all edges in the cycle having equal probability of being selected, and is removed
from T ∪ {e} to produce the next state, T′.
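The transition just described can be sketched in code. The leaf-pruning cycle finder and the edge representation (each edge a sorted vertex pair) are my own choices, not part of the problem:

```python
import random
from collections import defaultdict

def cycle_edges(edges):
    """Edges of the unique cycle in a graph that is a tree plus one edge.

    Repeatedly delete edges incident to a degree-1 vertex; since every
    non-cycle edge hangs off the cycle, only the cycle survives.
    """
    edges = set(edges)
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    pruning = True
    while pruning:
        pruning = False
        for u, v in list(edges):
            if deg[u] == 1 or deg[v] == 1:
                edges.remove((u, v))
                deg[u] -= 1
                deg[v] -= 1
                pruning = True
    return edges

def step(all_edges, tree, rng=random):
    """One transition of the spanning-tree chain: add a uniformly chosen
    edge e not in the tree, then remove a uniformly chosen edge of the
    unique cycle of tree + {e}."""
    e = rng.choice(sorted(set(all_edges) - tree))
    cycle = cycle_edges(tree | {e})
    removed = rng.choice(sorted(cycle))
    return (tree | {e}) - {removed}
```

Running `step` repeatedly simulates the chain; part (a) asks whether this chain can reach every spanning tree from every other.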
(a) Is the Markov process irreducible (for any choice of G satisfying the conditions given)? Justify