1. Establish the optimality of the Bayes classification rule. That is, show that any other partition of the input space into two regions results in a greater probability of error.
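Before attempting the proof, it can help to see the claim hold exhaustively on a toy discrete problem. The sketch below uses hypothetical numbers (two classes, x ∈ {0, 1, 2}) and enumerates every possible partition of the input space, confirming that none beats the Bayes partition:

```python
from itertools import product

# Hypothetical toy problem: x takes values {0, 1, 2}; two classes.
priors = [0.4, 0.6]
likelihoods = [
    [0.5, 0.3, 0.2],  # p(x | pi_1)
    [0.1, 0.3, 0.6],  # p(x | pi_2)
]

# Joint probabilities P(pi_k) * p(x | pi_k) for each class k and value x.
joint = [[priors[k] * likelihoods[k][x] for x in range(3)] for k in range(2)]

# Bayes rule: assign each x to the class with the larger joint probability.
bayes_rule = tuple(0 if joint[0][x] >= joint[1][x] else 1 for x in range(3))

def error_rate(rule):
    """Probability of error for a partition: mass of the class NOT chosen."""
    return sum(joint[1 - rule[x]][x] for x in range(3))

# Enumerate every partition of {0, 1, 2} into two regions.
errors = {rule: error_rate(rule) for rule in product([0, 1], repeat=3)}
best = min(errors, key=errors.get)

print(bayes_rule, errors[bayes_rule])  # Bayes partition and its error
print(best == bayes_rule)             # no partition does better
```

The enumeration is only feasible because the example is tiny; the exercise asks for the general argument, where the key step is that the Bayes rule picks the larger of the two joint probabilities at every x.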
2. Extend the minimum error rate argument to include the case when there
are more than two classes.
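As a starting point, the standard way to state the c-class result: the minimum-error rule assigns x to argmaxₖ P(πk)p(x|πk), and the resulting error probability is

```latex
P(\mathrm{error}) \;=\; 1 - \int \max_{k}\, P(\pi_k)\, p(x \mid \pi_k)\, dx
```

which reduces to the familiar two-region expression when c = 2.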
3. Derive the Bayes rule when the observations are discrete random variables.
That is, x = [x1 , x2 , · · · , xd ] where xi ∈ {0, 1} ∀i.
4. Consider a three-class problem with the following class-conditional densities:
• p(x|π1 ) = N (0, I)
• p(x|π2 ) = N ((1, 1), I)
• p(x|π3 ) = 0.5N ((0.5, 0.5), I) + 0.5N ((−0.5, 0.5), I)
Assume that all three classes are equiprobable. Classify the point
x = (0.3, 0.3)
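A numerical sketch of the classification step follows. Two reading assumptions: the third density is taken to be p(x|π3) (the source prints π2 twice), and the mixture weights are taken as 0.5 each so that they sum to one. Since the priors are equal, the class with the largest class-conditional density at x wins:

```python
import math

def gauss2d(x, mu, var=1.0):
    """Density of an isotropic 2-D Gaussian N(mu, var * I) at x."""
    d2 = (x[0] - mu[0]) ** 2 + (x[1] - mu[1]) ** 2
    return math.exp(-d2 / (2 * var)) / (2 * math.pi * var)

def class_densities(x):
    p1 = gauss2d(x, (0.0, 0.0))
    p2 = gauss2d(x, (1.0, 1.0))
    # Assumed equal mixture weights (0.5 each) so they sum to one.
    p3 = 0.5 * gauss2d(x, (0.5, 0.5)) + 0.5 * gauss2d(x, (-0.5, 0.5))
    return [p1, p2, p3]

x = (0.3, 0.3)
dens = class_densities(x)
# Equal priors: the posterior is proportional to the class-conditional density.
best = 1 + max(range(3), key=lambda k: dens[k])
print(dens)   # densities at x for pi_1, pi_2, pi_3
print(best)   # chosen class label
```

Under these assumptions the point lands in π1: the origin-centered Gaussian narrowly beats the two-component mixture at (0.3, 0.3).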
6. Consider a two-category case with two equiprobable classes (P (π1 ) =
P (π2 ) = 0.5) and respective distributions given by p(x|πi ) ∼ N (µi , σ 2 ).
Show that the probability of error is
Pe = (1/√(2π)) ∫ₐ^∞ e^(−u²/2) du, where a = |µ1 − µ2| / (2σ).
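The claimed formula can be sanity-checked numerically. The sketch below (with arbitrary parameter values) compares the closed form Pe = Q(a), written via the complementary error function, against a Monte Carlo estimate of the error of the nearer-mean rule, which is the Bayes rule for equal priors and equal variances:

```python
import math
import random

def bayes_error(mu1, mu2, sigma):
    """Closed form: Pe = Q(a) with a = |mu1 - mu2| / (2 sigma)."""
    a = abs(mu1 - mu2) / (2 * sigma)
    return 0.5 * math.erfc(a / math.sqrt(2))

def simulate(mu1, mu2, sigma, n=200_000, seed=0):
    """Monte Carlo estimate of the error of the nearer-mean decision rule."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n):
        # Pick a class with probability 1/2, then draw x from it.
        is_class2 = rng.random() < 0.5
        mu = mu2 if is_class2 else mu1
        x = rng.gauss(mu, sigma)
        # Bayes rule here: decide the class whose mean is nearer to x.
        decided_class2 = abs(x - mu2) < abs(x - mu1)
        errors += decided_class2 != is_class2
    return errors / n

closed = bayes_error(0.0, 2.0, 1.0)   # a = 1, so Pe = Q(1)
estimate = simulate(0.0, 2.0, 1.0)
print(closed, estimate)
```

Agreement to within Monte Carlo noise (a few parts in a thousand at this sample size) supports the formula; the proof itself follows by integrating each Gaussian over the half-line beyond the midpoint (µ1 + µ2)/2.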