K-Means and Hierarchical Clustering: Some Data
Andrew would be delighted if you found this source material useful in giving your own lectures. Feel free to use these slides verbatim, or to modify them to fit your own needs. PowerPoint originals are available. If you make use of a significant portion of these slides in your own lecture, please include this message, or the following link to the source repository of Andrew's tutorials: http://www.cs.cmu.edu/~awm/tutorials . Comments and corrections gratefully received.
Some Data
This could easily be modeled by a Gaussian Mixture (with 5 components). But let's look at a satisfying, friendly and infinitely popular alternative…
Suppose you transmit the coordinates of points drawn randomly from this dataset. You can install decoding software at the receiver. You're only allowed to send two bits per point. It'll have to be a lossy transmission. Loss = Sum Squared Error between decoded coords and original coords. What encoder/decoder will lose the least information?
Copyright 2001, 2004, Andrew W. Moore
Lossy Compression

Idea One: Break into a grid; decode each bit-pair as the middle of each grid-cell.

[Figure: the dataset overlaid on a 2×2 grid, with cells labeled 00, 01, 10, 11]

Any Better Ideas?
Idea Two: Break into a grid; decode each bit-pair as the centroid of all data in that grid-cell.

[Figure: the dataset overlaid on a 2×2 grid, with cells labeled 00, 01, 10, 11]

Any Further Ideas?
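The two decoding ideas can be compared numerically. A minimal sketch, where the dataset (five made-up blobs in the unit square) and the 2×2 grid are illustrative assumptions, not the slide's actual data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the slide's dataset: a mixture of 5 blobs.
centers = np.array([[0.2, 0.2], [0.8, 0.2], [0.2, 0.8], [0.8, 0.8], [0.5, 0.5]])
points = np.vstack([c + 0.05 * rng.standard_normal((40, 2)) for c in centers])

# Two bits per point -> four codes. Split the unit square into a 2x2 grid.
cell = (points[:, 0] > 0.5).astype(int) * 2 + (points[:, 1] > 0.5).astype(int)

# Idea One: decode each code as the middle of its grid-cell.
midpoints = np.array([[0.25, 0.25], [0.25, 0.75], [0.75, 0.25], [0.75, 0.75]])
loss_mid = ((points - midpoints[cell]) ** 2).sum()

# Idea Two: decode each code as the centroid of the data in that cell.
cell_centroids = np.array([points[cell == j].mean(axis=0) for j in range(4)])
loss_cen = ((points - cell_centroids[cell]) ** 2).sum()

# Within each cell the centroid minimizes sum-squared error,
# so Idea Two can never do worse than Idea One.
print(loss_mid, loss_cen)
```

The per-cell centroid is the unique minimizer of within-cell sum-squared error, which is exactly why Idea Two improves on Idea One.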
K-means

1. Ask user how many clusters they'd like. (e.g. k = 5)
2. Randomly guess k cluster Center locations.
3. Each datapoint finds out which Center it's closest to. (Thus each Center "owns" a set of datapoints.)
4. Each Center finds the centroid of the points it owns…
5. …and jumps there.
6. Repeat until terminated!
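The steps above can be sketched directly in NumPy. A minimal, illustrative implementation (the function name and the toy two-blob dataset are my own, not from the slides):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means (Lloyd's algorithm), following the slide's steps 2-6."""
    rng = np.random.default_rng(seed)
    # 2. Randomly guess k cluster Center locations (here: k random datapoints).
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # 3. Each datapoint finds out which Center it's closest to.
        owner = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
        # 4-5. Each Center finds the centroid of the points it owns and jumps there.
        #      (A Center that owns no points simply stays put.)
        new_centers = np.array([X[owner == j].mean(axis=0) if (owner == j).any()
                                else centers[j] for j in range(k)])
        # 6. Repeat until terminated - here, until the Centers stop moving.
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, owner

# Toy usage on two well-separated blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(5, 0.1, (50, 2))])
centers, owner = kmeans(X, k=2)
```

On well-separated data like this, the procedure converges in a handful of iterations and recovers the two blobs.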
K-means Start
Advance apologies: in black and white this example will deteriorate. Example generated by Dan Pelleg's super-duper fast K-means system:
Dan Pelleg and Andrew Moore. Accelerating Exact k-means Algorithms with Geometric Reasoning. Proc. Conference on Knowledge Discovery in Databases 1999 (KDD99). (Available on www.autonlab.org/pap.html)
K-means continues… [eight slides of figures: the Centers repeatedly reassign points and jump to new centroids]

K-means terminates
K-means Questions
What is it trying to optimize?
Are we sure it will terminate?
Are we sure it will find an optimal clustering?
How should we start it?
How could we automatically choose the number of centers?

…we'll deal with these questions over the next few slides.
Distortion

Given…
  an encoder function: ENCODE : ℜ^m → [1..k]
  a decoder function: DECODE : [1..k] → ℜ^m

Define…

  Distortion = Σ_{i=1..R} ( x_i − DECODE[ENCODE(x_i)] )²

We may as well write DECODE[j] = c_j, so

  Distortion = Σ_{i=1..R} ( x_i − c_ENCODE(x_i) )²
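In code the definition is direct. A tiny sketch, where the two-point dataset and the nearest-center encoder are illustrative choices:

```python
import numpy as np

def distortion(X, centers, encode):
    """Sum of squared distances between each x_i and its decoded center."""
    return sum(((x - centers[encode(x)]) ** 2).sum() for x in X)

# Tiny check: two points, two centers, each point encoded to its nearest center.
X = np.array([[0.0, 0.0], [4.0, 0.0]])
centers = np.array([[1.0, 0.0], [3.0, 0.0]])
encode = lambda x: int(((x - centers) ** 2).sum(axis=1).argmin())

# (0-1)^2 + (4-3)^2 = 2.0
print(distortion(X, centers, encode))
```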
What properties must centers c_1, c_2, …, c_k have when distortion is minimized?

(1) x_i must be encoded by its nearest center. Why? Otherwise distortion could be reduced by replacing ENCODE[x_i] by the nearest center.
(2) The partial derivative of Distortion with respect to each center location must be zero.

  Distortion = Σ_{i=1..R} ( x_i − c_ENCODE(x_i) )²
             = Σ_{j=1..k} Σ_{i ∈ OwnedBy(c_j)} ( x_i − c_j )²

  ∂Distortion/∂c_j = ∂/∂c_j Σ_{i ∈ OwnedBy(c_j)} ( x_i − c_j )²
                   = −2 Σ_{i ∈ OwnedBy(c_j)} ( x_i − c_j )
                   = 0 (for a minimum)

Thus, at a minimum:

  c_j = (1 / |OwnedBy(c_j)|) Σ_{i ∈ OwnedBy(c_j)} x_i

i.e. each Center must sit at the centroid of the points it owns.
What properties must centers c_1, c_2, …, c_k have when distortion is minimized?
(1) x_i must be encoded by its nearest center.
(2) Each Center must be at the centroid of the points it owns.
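Property (2) can be sanity-checked numerically: moving a Center away from the centroid of the points it owns can only increase the within-cluster sum of squares. A small sketch with made-up points:

```python
import numpy as np

rng = np.random.default_rng(2)
owned = rng.standard_normal((30, 2))   # points owned by one Center
centroid = owned.mean(axis=0)

def sse(c):
    """Within-cluster sum of squared distances to a candidate center c."""
    return ((owned - c) ** 2).sum()

# Every perturbed center has strictly larger within-cluster SSE,
# since sse(c) = sse(centroid) + n * ||c - centroid||^2.
worse = [sse(centroid + d) for d in 0.1 * rng.standard_normal((20, 2))]
print(min(worse) > sse(centroid))
```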
What properties can be changed for centers c1 , c2 , , ck have when distortion is not minimized? (1) Change encoding so that xi is encoded by its nearest center (2) Set each Center to the centroid of points it owns. Theres no point applying either operation twice in succession. But it can be profitable to alternate. And thats K-means!
Easy to prove this procedure will terminate in a state at which neither (1) or (2) change the configuration. Why?
Copyright 2001, 2004, Andrew W. Moore K-means and Hierarchical Clustering: Slide 31
of way finite number R re are only a The . ( of possible 2 into k groups records Distortion = nux iberc ENCODE ( x i ) ) m roids of i only a finite=1 rs are the cent So there are hich all Cente w What properties canin n. configurations be changed for centers c1 , c2 , , ck t have ts they ow the distortion is not minimized? ration, it mus have when poin ges on an ite ation chan If the configur distortion. x is encoded by itsgo to a (1) Change encoding so that i ion changes it must nearest center improved the nfigurat ch time the co So eaCenter to the centroid before. (2) Set eachiguration its never been to of points tually run out it owns. conf would even on forever, it go So if it tr applying Theres no pointied to . either operation twice in succession. figurations of con
Easy to prove this procedure will terminate in a state at which neither (1) or (2) change the configuration. Why?
Copyright 2001, 2004, Andrew W. Moore K-means and Hierarchical Clustering: Slide 32
16
(k = # Centers, R = # Records)
1. Say "every point is its own cluster"
2. Find the most similar pair of clusters
3. Merge them into a parent cluster
4. Repeat… until you've merged the whole dataset into one cluster

Also known in the trade as Hierarchical Agglomerative Clustering (note the acronym).
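A naive version of these steps is easy to write. A minimal sketch, assuming single-link similarity (smallest inter-point distance) as the "most similar" criterion, since the slides do not fix a particular similarity measure:

```python
import numpy as np

def hac(points):
    """Naive hierarchical agglomerative clustering (single-link).
    Returns the sequence of merges as pairs of index lists."""
    clusters = [[i] for i in range(len(points))]  # 1. every point is its own cluster
    merges = []
    while len(clusters) > 1:
        # 2. Find the most similar pair of clusters
        #    (here: smallest distance between any two member points).
        best, pair = np.inf, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(np.linalg.norm(points[i] - points[j])
                        for i in clusters[a] for j in clusters[b])
                if d < best:
                    best, pair = d, (a, b)
        a, b = pair
        merges.append((clusters[a], clusters[b]))
        # 3. Merge them into a parent cluster.
        clusters = [c for i, c in enumerate(clusters) if i not in (a, b)] \
                   + [clusters[a] + clusters[b]]
    return merges  # 4. repeated until the whole dataset is one cluster

# Toy usage: two tight pairs merge first, then the two pairs merge.
points = np.array([[0.0, 0.0], [1.0, 0.0], [8.0, 8.0], [9.0, 8.0]])
merges = hac(points)
```

This O(n³)-ish loop is for exposition only; practical implementations maintain a distance matrix and update it incrementally after each merge.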