
Practice Questions - Set 1

January 23, 2018

1. Establish the optimality of the Bayes classification rule. That is, show that any
other partition of the input space into two regions results in a probability of
error at least as large. (A sketch for questions 1 and 2 appears after the list.)
2. Extend the minimum-error-rate argument to the case of more than two
classes.
3. Derive the Bayes rule when the observations are discrete random variables.
That is, $x = [x_1, x_2, \dots, x_d]$ where $x_i \in \{0, 1\}\ \forall i$. The following are the
parameters of the system:

$q_i$ = prior probability of class $\pi_i$, $i = 1, 2$

$P(x_i = 1 \mid \pi_1) = p_i; \quad P(x_i = 1 \mid \pi_2) = r_i$

Derive the decision function under the Naive Bayes assumption.
4. Incorporate misclassification losses given by $\lambda_{12}$ and $\lambda_{21}$ into the previous
problem and find the decision criterion. (A sketch for questions 3 and 4
appears after the list.)
5. Suppose we have three categories with the underlying distributions:

• $p(x \mid \pi_1) = \mathcal{N}(0, I)$
• $p(x \mid \pi_2) = \mathcal{N}((1, 1), I)$
• $p(x \mid \pi_3) = 0.5\,\mathcal{N}((0.5, 0.5), I) + 0.5\,\mathcal{N}((-0.5, 0.5), I)$

Assume that all three classes are equiprobable. Classify the point
$x = (0.3, 0.3)$. (A numerical sketch appears after the list.)
6. Consider a two-category case with two equiprobable classes ($P(\pi_1) =
P(\pi_2) = 0.5$) and respective distributions given by $p(x \mid \pi_i) \sim \mathcal{N}(\mu_i, \sigma^2)$.
Show that the probability of error is
\[
P_e = \frac{1}{\sqrt{2\pi}} \int_a^{\infty} e^{-u^2/2}\, du, \qquad \text{where } a = \frac{|\mu_1 - \mu_2|}{2\sigma}.
\]
(A derivation sketch appears after the list.)
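A sketch for questions 1 and 2, assuming the posteriors $P(\pi_i \mid x)$ are well
defined (this is one standard way to set up the argument, not the only one). Any
decision rule partitions the input space into regions $R_1$ and $R_2$ (decide $\pi_i$ on
$R_i$), and its probability of error satisfies

\[
P(\text{error}) = \int_{R_1} P(\pi_2 \mid x)\, p(x)\, dx + \int_{R_2} P(\pi_1 \mid x)\, p(x)\, dx
\;\ge\; \int \min\big(P(\pi_1 \mid x),\, P(\pi_2 \mid x)\big)\, p(x)\, dx,
\]

with equality exactly when every $x$ is assigned to the class with the larger
posterior, i.e., by the Bayes rule. For question 2 the same pointwise argument is
easier to run on the probability of being correct,

\[
P(\text{correct}) = \sum_{i=1}^{c} \int_{R_i} P(\pi_i \mid x)\, p(x)\, dx,
\]

which is maximized by choosing $R_i = \{x : P(\pi_i \mid x) \ge P(\pi_j \mid x)\ \forall j\}$.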
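A sketch for questions 3 and 4, under the Naive Bayes assumption that the $x_i$
are conditionally independent given the class, and with the convention (not
fixed by the question, so treat it as an assumption) that $\lambda_{ij}$ is the loss for
deciding $\pi_i$ when the true class is $\pi_j$. Each likelihood factor is Bernoulli,
$P(x_i \mid \pi_1) = p_i^{x_i}(1-p_i)^{1-x_i}$, so the log-likelihood ratio is linear in $x$:
decide $\pi_1$ when

\[
g(x) = \sum_{i=1}^{d} \left[ x_i \ln\frac{p_i}{r_i} + (1 - x_i) \ln\frac{1 - p_i}{1 - r_i} \right] + \ln\frac{q_1}{q_2} \;>\; 0.
\]

For question 4, minimizing the conditional risk $R(\alpha_1 \mid x) = \lambda_{12} P(\pi_2 \mid x)$
versus $R(\alpha_2 \mid x) = \lambda_{21} P(\pi_1 \mid x)$ shifts the threshold: decide $\pi_1$ when
$g(x) > \ln(\lambda_{12}/\lambda_{21})$.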
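A minimal numerical sketch for question 5, in Python with numpy/scipy (which
the question does not prescribe); it assumes the equal-weight mixture for
$p(x \mid \pi_3)$ as written above. With equal priors, comparing the class-conditional
densities at $x$ gives the Bayes decision:

    import numpy as np
    from scipy.stats import multivariate_normal

    # Point to classify and shared identity covariance.
    x = np.array([0.3, 0.3])
    I2 = np.eye(2)

    # Class-conditional densities evaluated at x.
    p1 = multivariate_normal.pdf(x, mean=[0.0, 0.0], cov=I2)
    p2 = multivariate_normal.pdf(x, mean=[1.0, 1.0], cov=I2)
    # Assumed equal-weight two-component mixture for class 3.
    p3 = 0.5 * multivariate_normal.pdf(x, mean=[0.5, 0.5], cov=I2) \
       + 0.5 * multivariate_normal.pdf(x, mean=[-0.5, 0.5], cov=I2)

    # Equal priors: the largest likelihood wins.
    likelihoods = {1: p1, 2: p2, 3: p3}
    print(likelihoods)
    print("decide class", max(likelihoods, key=likelihoods.get))

Running this, $p(x \mid \pi_1)$ comes out largest at $x = (0.3, 0.3)$, so the point is
assigned to $\pi_1$ under these assumptions.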
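A derivation sketch for question 6 (scalar $x$; both classes share variance
$\sigma^2$). With equal priors the Bayes boundary is the midpoint $x_0 = (\mu_1 + \mu_2)/2$.
Taking $\mu_1 < \mu_2$ without loss of generality, the two error integrals are equal
by symmetry, so

\[
P_e = \tfrac{1}{2}\, P(x > x_0 \mid \pi_1) + \tfrac{1}{2}\, P(x < x_0 \mid \pi_2)
    = \int_{x_0}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu_1)^2/2\sigma^2}\, dx,
\]

and the substitution $u = (x - \mu_1)/\sigma$ gives

\[
P_e = \frac{1}{\sqrt{2\pi}} \int_{a}^{\infty} e^{-u^2/2}\, du, \qquad
a = \frac{x_0 - \mu_1}{\sigma} = \frac{|\mu_1 - \mu_2|}{2\sigma}.
\]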
