Djamel Bouchaffra
- Introduction
- Mixture Densities and Identifiability
- ML Estimates
- Application to Normal Mixtures
- Other Neural Networks
Introduction
where θ = (θ_1, θ_2, …, θ_c)^t.

Definition
A density p(x | θ) is identifiable if θ ≠ θ' implies that there exists an x with p(x | θ) ≠ p(x | θ').

Example (x binary):

\[
P(x \mid \theta) = \frac{1}{2}\,\theta_1^{x}(1-\theta_1)^{1-x} + \frac{1}{2}\,\theta_2^{x}(1-\theta_2)^{1-x}
= \begin{cases} \dfrac{1}{2}(\theta_1+\theta_2) & \text{if } x = 1 \\[6pt] 1 - \dfrac{1}{2}(\theta_1+\theta_2) & \text{if } x = 0 \end{cases}
\]

Assume that P(x = 1 | θ) = 0.6 and P(x = 0 | θ) = 0.4. Replacing these probability values, we obtain θ_1 + θ_2 = 1.2: the sum is fixed, but θ_1 and θ_2 themselves are not determined, so this mixture is not identifiable.
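The non-identifiability can be checked numerically. A minimal sketch (the function name is mine):

```python
# Non-identifiability of the 50/50 Bernoulli mixture:
# P(x = 1 | theta) = (theta_1 + theta_2) / 2, so every parameter pair
# with the same sum produces exactly the same distribution over x.

def mixture_p1(theta1, theta2):
    """P(x = 1) under the two-component Bernoulli mixture."""
    return 0.5 * theta1 + 0.5 * theta2

# Two different parameter pairs, both summing to 1.2:
print(mixture_p1(0.6, 0.6))  # ~0.6
print(mixture_p1(0.8, 0.4))  # ~0.6 as well: theta_1, theta_2 are not recoverable
```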
ML Estimates
\[
\nabla_{\theta_i}\, l = \sum_{k=1}^{n} P(\omega_i \mid x_k, \theta)\, \nabla_{\theta_i} \ln p(x_k \mid \omega_i, \theta_i)
\]

where

\[
P(\omega_i \mid x_k, \theta) = \frac{p(x_k \mid \omega_i, \theta_i)\, P(\omega_i)}{\sum_{j=1}^{c} p(x_k \mid \omega_j, \theta_j)\, P(\omega_j)}
\]

This equation enables clustering.
Case   μ_i    Σ_i     P(ω_i)   c
 1      ?    known    known   known
 2      ?     ?        ?      known
 3      ?     ?        ?       ?

(? = unknown parameter)
\[
\ln p(x \mid \omega_i, \theta_i) = -\ln\left[(2\pi)^{d/2}\,|\Sigma_i|^{1/2}\right] - \frac{1}{2}(x-\mu_i)^t\, \Sigma_i^{-1}\, (x-\mu_i)
\]
\[
\hat{\mu}_i(j+1) = \frac{\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat{\mu}(j))\, x_k}{\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat{\mu}(j))}
\]
Example (Class):
Consider the simple two-component one-dimensional normal mixture

\[
p(x \mid \mu_1, \mu_2) = \frac{1}{3\sqrt{2\pi}} \exp\left[-\frac{1}{2}(x-\mu_1)^2\right] + \frac{2}{3\sqrt{2\pi}} \exp\left[-\frac{1}{2}(x-\mu_2)^2\right]
\]

(2 clusters!) Iterating the mean-update equation yields estimates that are not far from the true values μ_1 = −2 and μ_2 = +2.
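A sketch of the mean-update iteration applied to this example; the synthetic sample, its size, and the starting guesses are my own assumptions:

```python
import math
import random

random.seed(0)

# Draw synthetic data from the example mixture: weight 1/3 on N(-2, 1),
# weight 2/3 on N(+2, 1) (the true values quoted on the slide).
xs = [random.gauss(-2.0, 1.0) if random.random() < 1 / 3 else random.gauss(2.0, 1.0)
      for _ in range(500)]

def posterior(x, mu1, mu2):
    """P(omega_1 | x) for the 1/3 : 2/3 unit-variance mixture."""
    g1 = (1 / 3) * math.exp(-0.5 * (x - mu1) ** 2)
    g2 = (2 / 3) * math.exp(-0.5 * (x - mu2) ** 2)
    return g1 / (g1 + g2)

# Fixed-point iteration of the mean-update equation.
mu1, mu2 = -1.0, 1.0   # rough starting guesses (an assumption)
for _ in range(50):
    w1 = [posterior(x, mu1, mu2) for x in xs]
    mu1 = sum(wi * x for wi, x in zip(w1, xs)) / sum(w1)
    mu2 = sum((1 - wi) * x for wi, x in zip(w1, xs)) / sum(1 - wi for wi in w1)

print(mu1, mu2)  # should land near the true values -2 and +2
```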
The likelihood, however, can be made arbitrarily large (a singular solution). Consider the mixture

\[
p(x \mid \mu, \sigma^2) = \frac{1}{2\sqrt{2\pi}\,\sigma} \exp\left[-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2\right] + \frac{1}{2\sqrt{2\pi}} \exp\left[-\frac{1}{2}x^2\right]
\]

Setting μ = x_1 gives

\[
p(x_1 \mid \mu, \sigma^2) = \frac{1}{2\sqrt{2\pi}\,\sigma} + \frac{1}{2\sqrt{2\pi}} \exp\left[-\frac{1}{2}x_1^2\right]
\]

For the rest of the samples:

\[
p(x_k \mid \mu, \sigma^2) \ge \frac{1}{2\sqrt{2\pi}} \exp\left[-\frac{1}{2}x_k^2\right]
\]

Finally,

\[
p(x_1, \ldots, x_n \mid \mu, \sigma^2) \ge \underbrace{\left[\frac{1}{\sigma} + \exp\left(-\frac{1}{2}x_1^2\right)\right]}_{\text{this term} \,\to\, \infty \text{ as } \sigma \to 0} \frac{1}{(2\sqrt{2\pi})^n} \exp\left[-\frac{1}{2}\sum_{k=2}^{n} x_k^2\right]
\]

so the likelihood is unbounded and the ML "solution" is singular.
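The unbounded likelihood is easy to observe numerically; a sketch with a small made-up sample:

```python
import math

# A toy sample; the first point anchors the singular choice mu = x_1.
xs = [1.0, -0.5, 0.3, 2.0]

def log_likelihood(mu, sigma):
    """Log-likelihood of the mixture (1/2) N(mu, sigma^2) + (1/2) N(0, 1)."""
    ll = 0.0
    for x in xs:
        g1 = math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (2 * math.sqrt(2 * math.pi) * sigma)
        g2 = math.exp(-0.5 * x ** 2) / (2 * math.sqrt(2 * math.pi))
        ll += math.log(g1 + g2)
    return ll

# Pin mu at the first sample and shrink sigma:
# the likelihood grows without bound as sigma -> 0.
for sigma in (0.1, 0.01, 0.001, 0.0001):
    print(sigma, log_likelihood(xs[0], sigma))
```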
Adding an assumption
Consider the largest of the finite local maxima of the likelihood function and use it as the ML estimate. We obtain the following:
\[
\hat{P}(\omega_i) = \frac{1}{n} \sum_{k=1}^{n} \hat{P}(\omega_i \mid x_k, \hat{\theta})
\]

\[
\hat{\mu}_i = \frac{\sum_{k=1}^{n} \hat{P}(\omega_i \mid x_k, \hat{\theta})\, x_k}{\sum_{k=1}^{n} \hat{P}(\omega_i \mid x_k, \hat{\theta})}
\]

\[
\hat{\Sigma}_i = \frac{\sum_{k=1}^{n} \hat{P}(\omega_i \mid x_k, \hat{\theta})\,(x_k - \hat{\mu}_i)(x_k - \hat{\mu}_i)^t}{\sum_{k=1}^{n} \hat{P}(\omega_i \mid x_k, \hat{\theta})}
\]

(iterative scheme)

where:

\[
\hat{P}(\omega_i \mid x_k, \hat{\theta}) = \frac{|\hat{\Sigma}_i|^{-1/2} \exp\left[-\frac{1}{2}(x_k - \hat{\mu}_i)^t \hat{\Sigma}_i^{-1} (x_k - \hat{\mu}_i)\right] \hat{P}(\omega_i)}{\sum_{j=1}^{c} |\hat{\Sigma}_j|^{-1/2} \exp\left[-\frac{1}{2}(x_k - \hat{\mu}_j)^t \hat{\Sigma}_j^{-1} (x_k - \hat{\mu}_j)\right] \hat{P}(\omega_j)}
\]
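The three update equations, together with the posterior, form the familiar EM loop. A one-dimensional sketch (the data set, component count, and initial values are assumptions):

```python
import math
import random

random.seed(1)

# Toy 1-D data: two Gaussian clumps.
xs = [random.gauss(-3.0, 1.0) for _ in range(200)] + \
     [random.gauss(3.0, 1.0) for _ in range(200)]

def normal_pdf(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

priors, mus, variances = [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0]  # initial guesses

for _ in range(30):
    # E-step: posteriors P(omega_i | x_k) via Bayes' rule.
    post = []
    for x in xs:
        p = [priors[i] * normal_pdf(x, mus[i], variances[i]) for i in range(2)]
        s = sum(p)
        post.append([pi / s for pi in p])
    # M-step: the three update equations (prior, mean, variance).
    for i in range(2):
        w = [pk[i] for pk in post]
        sw = sum(w)
        priors[i] = sw / len(xs)
        mus[i] = sum(wk * x for wk, x in zip(w, xs)) / sw
        variances[i] = sum(wk * (x - mus[i]) ** 2 for wk, x in zip(w, xs)) / sw

print(priors, mus, variances)  # should recover roughly 0.5/0.5, -3/+3, 1/1
```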
K-Means Clustering
Begin
    initialize n, c, μ_1, μ_2, …, μ_c (randomly selected)
    do
        classify the n samples according to the nearest μ_i
        recompute μ_i
    until no change in μ_i
    return μ_1, μ_2, …, μ_c
End
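The pseudocode above might be rendered as follows (one-dimensional, with toy data; the helper names are mine):

```python
import random

random.seed(2)

def k_means(xs, c, max_iters=100):
    """Minimal 1-D k-means following the pseudocode above (a sketch)."""
    mus = random.sample(xs, c)                      # randomly selected initial means
    for _ in range(max_iters):
        # classify the samples according to the nearest mu_i
        clusters = [[] for _ in range(c)]
        for x in xs:
            i = min(range(c), key=lambda j: abs(x - mus[j]))
            clusters[i].append(x)
        # recompute mu_i; stop when no mean changes
        new_mus = [sum(cl) / len(cl) if cl else mus[i]
                   for i, cl in enumerate(clusters)]
        if new_mus == mus:
            break
        mus = new_mus
    return mus

# Toy data: two well-separated clumps.
xs = [random.gauss(0.0, 0.5) for _ in range(100)] + \
     [random.gauss(5.0, 0.5) for _ in range(100)]
centers = sorted(k_means(xs, 2))
print(centers)  # should be near 0 and 5
```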
[Figure: a two-layer competitive network; input units x_1, x_2, x_3 fully connected to the output units.]

Activation value a_j of output unit j:

\[
a_j = \sum_{i=1}^{3} x_i w_{ij} = x^t w_j = w_j^t x
\]

Weights are updated only for the winner, output unit k:

\[
w_k(t+1) = w_k(t) + \eta\left(x(t) - w_k(t)\right), \qquad w_j(t+1) = w_j(t) \quad (j \ne k)
\]
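A sketch of the winner-take-all rule; the learning rate, initial weights, and input set are all assumptions of mine:

```python
import random

random.seed(3)

eta = 0.1                               # learning rate (an assumption)
w = [[0.1, 0.0], [0.0, 0.1]]            # one weight vector per output unit

def winner(x):
    """Index of the output unit with the largest activation a_j = w_j^T x."""
    return max(range(len(w)), key=lambda j: sum(wi * xi for wi, xi in zip(w[j], x)))

# Inputs drawn from two directions; only the winning unit's weights move.
data = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]] * 50
random.shuffle(data)
for x in data:
    k = winner(x)
    w[k] = [wk + eta * (xi - wk) for wk, xi in zip(w[k], x)]

print(w)  # each unit's weights drift toward one cluster of inputs
```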