
Multi-class Entropy

Last time, you saw this equation for entropy for a bucket with m red balls and
n blue balls:

entropy = -\frac{m}{m+n}\log_2\left(\frac{m}{m+n}\right) - \frac{n}{m+n}\log_2\left(\frac{n}{m+n}\right)

We can state this in terms of probabilities instead, writing the probability of
drawing a red ball as p_1 = \frac{m}{m+n} and the probability of drawing a blue
ball as p_2 = \frac{n}{m+n}:

entropy = -p_1\log_2(p_1) - p_2\log_2(p_2)
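
As a quick sanity check of the two-class formula (an example of my own, not from the lesson): a bucket with equal numbers of red and blue balls has p_1 = p_2 = \frac{1}{2}, so

entropy = -\tfrac{1}{2}\log_2\left(\tfrac{1}{2}\right) - \tfrac{1}{2}\log_2\left(\tfrac{1}{2}\right) = \tfrac{1}{2} + \tfrac{1}{2} = 1

which is the maximum for two classes, while a single-color bucket has p_1 = 1 and entropy = -1 \cdot \log_2(1) = 0.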

This entropy equation can be extended to the multi-class case, where we have three
or more possible values:

entropy = -p_1\log_2(p_1) - p_2\log_2(p_2) - \ldots - p_n\log_2(p_n) = -\sum_{i=1}^{n} p_i\log_2(p_i)
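
The summation form maps directly to code. Below is a minimal sketch in Python (my own illustration, not part of the lesson) that computes multi-class entropy from a list of class counts; the function name and the example counts are assumptions for illustration:

```python
import math

def entropy(counts):
    """Multi-class entropy, -sum(p_i * log2(p_i)), computed from raw class counts."""
    total = sum(counts)
    result = 0.0
    for count in counts:
        if count == 0:
            continue  # p * log2(p) tends to 0 as p -> 0, so empty classes contribute nothing
        p = count / total
        result -= p * math.log2(p)
    return result

# Example: a bucket with 8 red, 3 blue, and 2 yellow balls
print(entropy([8, 3, 2]))  # approximately 1.3347
```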
