CS771: Intro to ML
Probabilistic PCA (PPCA)
Assume each observation 𝒙_n ∈ ℝ^D is a linear mapping of a latent variable 𝒛_n ∈ ℝ^K plus Gaussian noise

    𝒙_n = 𝝁 + 𝑾𝒛_n + 𝝐_n

where 𝝁 ∈ ℝ^D is an offset vector, 𝑾 is the D × K mapping matrix, and the noise 𝝐_n is drawn from a zero-mean D-dim Gaussian, 𝝐_n ~ 𝒩(𝟎, σ²𝑰_D)
Equivalent to saying p(𝒙_n | 𝒛_n) = 𝒩(𝝁 + 𝑾𝒛_n, σ²𝑰_D)
Assume a zero-mean Gaussian prior on 𝒛_n, so p(𝒛_n) = 𝒩(𝟎, 𝑰_K)
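The generative process above can be sketched in NumPy. This is an illustrative simulation only; the dimensions, seed, and parameter values are arbitrary choices, not part of the model.

```python
import numpy as np

# Illustrative dimensions and parameters (arbitrary choices)
rng = np.random.default_rng(0)
D, K, N = 5, 2, 1000

mu = rng.normal(size=D)          # offset vector mu in R^D
W = rng.normal(size=(D, K))      # D x K mapping matrix
sigma2 = 0.1                     # noise variance sigma^2

# Generative process: z_n ~ N(0, I_K), eps_n ~ N(0, sigma^2 I_D)
Z = rng.normal(size=(N, K))
eps = np.sqrt(sigma2) * rng.normal(size=(N, D))
X = mu + Z @ W.T + eps           # x_n = mu + W z_n + eps_n

# Marginalizing z_n gives cov[x_n] = W W^T + sigma^2 I_D;
# the empirical covariance of the samples should be close to it
cov_model = W @ W.T + sigma2 * np.eye(D)
cov_empirical = np.cov(X.T)
```

Comparing `cov_empirical` against `cov_model` is a quick sanity check that the sampled data really has the PPCA marginal covariance.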
Learning PPCA using EM
The incomplete-data log-likelihood (ILL) is log p(𝑿 | 𝑾, 𝝁, σ²) with the latent variables 𝒛_n integrated out; marginalizing gives p(𝒙_n) = 𝒩(𝝁, 𝑾𝑾ᵀ + σ²𝑰_D)
Ignoring 𝝁 for notational simplicity, the ILL is

    log p(𝑿 | 𝑾, σ²) = Σ_{n=1}^N log 𝒩(𝒙_n | 𝟎, 𝑾𝑾ᵀ + σ²𝑰_D)

Can maximize the ILL directly, but doing so requires solving an eigen-decomposition problem (PRML: 12.2.1)
EM will instead maximize the expected CLL, with the CLL given by

    log p(𝑿, 𝒁 | 𝑾, σ²) = Σ_{n=1}^N [log p(𝒙_n | 𝒛_n) + log p(𝒛_n)]

The expected CLL will need 𝔼[𝒛_n] and 𝔼[𝒛_n𝒛_nᵀ], where the expectations are w.r.t. the conditional posterior of 𝒛_n, i.e., p(𝒛_n | 𝒙_n)
Learning PPCA using EM (contd.)
Using p(𝒛_n) = 𝒩(𝟎, 𝑰_K) and p(𝒙_n | 𝒛_n) = 𝒩(𝑾𝒛_n, σ²𝑰_D) and the Gaussian reverse-conditional property (ignoring the parameter 𝝁), the conditional posterior is

    p(𝒛_n | 𝒙_n) = 𝒩(𝒎_n, 𝚺), where 𝑴 = 𝑾ᵀ𝑾 + σ²𝑰_K, 𝒎_n = 𝑴⁻¹𝑾ᵀ𝒙_n, 𝚺 = σ²𝑴⁻¹

The EM algo for PPCA alternates between two steps
E step: Compute the conditional posterior of 𝒛_n given the current parameters (𝑾, σ²), which yields 𝔼[𝒛_n] = 𝒎_n and 𝔼[𝒛_n𝒛_nᵀ] = 𝚺 + 𝒎_n𝒎_nᵀ
M step: Maximize the expected CLL w.r.t. the parameters 𝑾 and σ²
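The posterior computation can be sketched in NumPy as follows; `e_step` is an illustrative helper name, and the data is assumed centered (𝝁 ignored, as on this slide).

```python
import numpy as np

def e_step(X, W, sigma2):
    """Posterior moments of z_n under p(z_n | x_n) = N(m_n, S),
    with M = W^T W + sigma^2 I_K, m_n = M^{-1} W^T x_n, S = sigma^2 M^{-1}.
    X is N x D (assumed centered), W is D x K."""
    K = W.shape[1]
    M = W.T @ W + sigma2 * np.eye(K)          # K x K; only a K x K inverse needed
    Minv = np.linalg.inv(M)
    Ez = X @ W @ Minv                          # N x K; row n is E[z_n] = M^{-1} W^T x_n
    S = sigma2 * Minv                          # posterior covariance, shared by all n
    # E[z_n z_n^T] = S + E[z n] E[z_n]^T, needed by the M step
    Ezz = S[None, :, :] + Ez[:, :, None] * Ez[:, None, :]
    return Ez, Ezz
```

Note that only a K × K matrix is inverted, which is the computational appeal of EM here when K ≪ D.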
Full EM algo for PPCA
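A minimal NumPy sketch of the full EM loop, assuming centered data and the standard PPCA update equations (PRML 12.54–12.57); `ppca_em` and the iteration count are illustrative choices.

```python
import numpy as np

def ppca_em(X, K, n_iters=100, seed=0):
    """EM for PPCA on centered data X (N x D); returns W (D x K) and sigma^2."""
    N, D = X.shape
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(D, K))    # random init
    sigma2 = 1.0
    for _ in range(n_iters):
        # E step: posterior moments of z_n
        M = W.T @ W + sigma2 * np.eye(K)
        Minv = np.linalg.inv(M)
        Ez = X @ W @ Minv                        # N x K, rows E[z_n]
        sum_Ezz = N * sigma2 * Minv + Ez.T @ Ez  # sum_n E[z_n z_n^T]
        # M step: maximize expected CLL w.r.t. W and sigma^2
        W = (X.T @ Ez) @ np.linalg.inv(sum_Ezz)
        sigma2 = (np.sum(X ** 2)
                  - 2 * np.sum(Ez * (X @ W))
                  + np.trace(sum_Ezz @ W.T @ W)) / (N * D)
    return W, sigma2
```

At convergence, 𝑾𝑾ᵀ + σ²𝑰_D approximates the data covariance, matching the direct eigen-decomposition solution (the recovered 𝑾 is identifiable only up to rotation).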
Other Generative Models for Dim-Red
Factor Analysis (FA) is similar to PPCA except that the noise covariance is a diagonal matrix 𝚿 = diag(ψ_1, …, ψ_D) instead of the spherical σ²𝑰_D, so each dimension of 𝒙_n can have its own noise variance
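The difference shows up in the marginal covariance of 𝒙_n. A small sketch with illustrative parameter values:

```python
import numpy as np

D, K = 4, 2
rng = np.random.default_rng(0)
W = rng.normal(size=(D, K))          # shared mapping matrix (illustrative)

sigma2 = 0.3                         # PPCA: spherical noise sigma^2 I_D
psi = rng.uniform(0.1, 1.0, size=D)  # FA: per-dimension noise variances, Psi = diag(psi)

cov_ppca = W @ W.T + sigma2 * np.eye(D)  # same noise variance in every dimension
cov_fa = W @ W.T + np.diag(psi)          # each dimension gets its own noise variance
```

FA can therefore absorb per-feature measurement noise into 𝚿, whereas PPCA forces all dimensions to share one noise variance.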