A Gaussian Mixture Model comprises a weighted sum of M Gaussian probability density functions, as defined below:

$$p(x \mid \lambda) = \sum_{i=1}^{M} w_i \, \mathcal{N}(x \mid \mu_i, \sigma_i^2)$$
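Before working through the parts below, the mixture density above can be evaluated numerically; here is a minimal sketch (the specific weights, means, and variances are made-up illustrative values, not taken from the problem):

```python
import math

def gmm_pdf(x, w, mu, var):
    """1-D Gaussian mixture density: sum_i w_i * N(x | mu_i, var_i)."""
    return sum(
        wi * math.exp(-(x - mi) ** 2 / (2 * vi)) / math.sqrt(2 * math.pi * vi)
        for wi, mi, vi in zip(w, mu, var)
    )

# Illustrative parameters; the weights are non-negative and sum to 1,
# which is exactly the normalization constraint part (a) asks about.
w = [0.3, 0.7]
mu = [0.0, 4.0]
var = [1.0, 2.0]

# The likelihood of a data set {x_1, ..., x_n} (part (c)) is the product
# of per-point densities; in practice one works with the log-likelihood.
data = [0.5, 3.2, 4.1, -0.3]
log_lik = sum(math.log(gmm_pdf(x, w, mu, var)) for x in data)
```

Because the weights sum to one and each component density integrates to one, the mixture itself integrates to one, which a quick numerical integration confirms.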
(a) How should the weights $w_i$ be constrained in order to ensure that the distribution is normalized?
(b) If we consider a generative approach to model the one-dimensional training vectors, how many parameter vectors are to be estimated?
(c) Given the set of $n$ one-dimensional training data $\{x_1, x_2, \ldots, x_n\}$, write down an expression for the likelihood function.
(d) Derive closed-form expressions for the Maximum Likelihood estimates of the mean $\mu_i$ and variance $\sigma_i^2$.
(e) By incorporating the constraint from part (a), derive a closed-form expression for the Maximum Likelihood estimate of the weights $w_i$.

Consider 2 equally probable categories $\omega_1$ and $\omega_2$ having Poisson distributions as the class-conditional densities, with parameters $\lambda_1$ and $\lambda_2$ respectively. Also assume $\lambda_1 > \lambda_2$.

$$P(x \mid \lambda) = \frac{e^{-\lambda} \lambda^x}{x!}$$

Compute (a) the Bayes classification decision (b) the Bayes error rate.
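For the Poisson problem, with equal priors the Bayes decision compares $P(x \mid \lambda_1)$ against $P(x \mid \lambda_2)$; taking logs of the likelihood ratio gives the threshold $x^* = (\lambda_1 - \lambda_2)/(\ln\lambda_1 - \ln\lambda_2)$. A short numerical sketch, using assumed illustrative values $\lambda_1 = 5$, $\lambda_2 = 2$ (not given in the problem):

```python
import math

def poisson_pmf(x, lam):
    """Poisson probability mass: e^{-lam} * lam^x / x!."""
    return math.exp(-lam) * lam ** x / math.factorial(x)

# Assumed illustrative parameters, satisfying lambda1 > lambda2.
lam1, lam2 = 5.0, 2.0

# Equal priors: decide omega_1 when P(x|lam1) > P(x|lam2).
# Taking logs reduces this to x > (lam1 - lam2) / (ln lam1 - ln lam2).
threshold = (lam1 - lam2) / (math.log(lam1) - math.log(lam2))

def decide(x):
    return 1 if x > threshold else 2

# Bayes error rate: the probability mass each class places on the other
# class's decision region, averaged with the (equal) priors.
# The infinite sum is truncated; the tail beyond x = 59 is negligible here.
err = 0.5 * sum(poisson_pmf(x, lam1) for x in range(60) if decide(x) == 2) \
    + 0.5 * sum(poisson_pmf(x, lam2) for x in range(60) if decide(x) == 1)
```

With these values the threshold falls at $x^* \approx 3.27$, so integer counts $x \le 3$ are assigned to $\omega_2$ and $x \ge 4$ to $\omega_1$; `err` then sums the misclassified mass from each class.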