
Assignment #1

EC 3606: Introduction to Machine Intelligence

1. Given the following set of training samples from two classes ω1 and ω2 with
equally likely prior probabilities,
(3, 8), (2, 6), (4, 6), (3, 4) ∈ ω1
(3, 0), (3, -4), (1, -2), (5, -2) ∈ ω2
determine the decision surface between ω1 and ω2. Assume the class-conditional
probability density functions are normally distributed. Solve the problem using
(i) the standard equation of the discriminant function
(ii) the simplified solution of the discriminant function
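
A minimal NumPy sketch for cross-checking the hand derivation, assuming maximum-likelihood estimates (dividing by n) of each class's mean and covariance; the decision surface is where the two quadratic discriminants coincide, g1(x) = g2(x):

import numpy as np

# Training samples from the problem statement
X1 = np.array([[3, 8], [2, 6], [4, 6], [3, 4]], dtype=float)   # class w1
X2 = np.array([[3, 0], [3, -4], [1, -2], [5, -2]], dtype=float)  # class w2

def gaussian_params(X):
    # ML estimates: sample mean and covariance (divide by n, not n - 1)
    mu = X.mean(axis=0)
    Xc = X - mu
    return mu, Xc.T @ Xc / len(X)

def discriminant(mu, sigma, prior):
    # Coefficients of g(x) = x'Wx + w'x + w0 for a Gaussian class
    inv = np.linalg.inv(sigma)
    W = -0.5 * inv
    w = inv @ mu
    w0 = -0.5 * mu @ inv @ mu - 0.5 * np.log(np.linalg.det(sigma)) + np.log(prior)
    return W, w, w0

mu1, s1 = gaussian_params(X1)
mu2, s2 = gaussian_params(X2)
W1, w1, w01 = discriminant(mu1, s1, 0.5)  # equally likely priors
W2, w2, w02 = discriminant(mu2, s2, 0.5)

# Decision surface: x'(W1 - W2)x + (w1 - w2)'x + (w01 - w02) = 0
print("quadratic term:\n", W1 - W2)
print("linear term:", w1 - w2)
print("constant:", w01 - w02)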

2. Consider a two-class classification problem where the classes are A and B. The
feature vectors are bivariate normally distributed, of the form (x, y)′, and the
feature components are independent. For class A, x and y both have means of 0 and
standard deviations of 1 and 2, respectively. For class B, x and y have means of
4 and 0 and standard deviations of 2 and 1, respectively. The prior probabilities
are p(A) = 1/3 and p(B) = 2/3. Find the equation of the optimal decision boundary.
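
A short SymPy sketch (illustrative only; the helper name log_gaussian is mine) that builds each class's log-posterior and prints the boundary equation gA − gB = 0:

import sympy as sp

x, y = sp.symbols("x y", real=True)

def log_gaussian(v, mean, sd):
    # log-density of a univariate normal component
    return -sp.log(sd * sp.sqrt(2 * sp.pi)) - (v - mean) ** 2 / (2 * sd ** 2)

# Class A: x ~ N(0, 1^2), y ~ N(0, 2^2); class B: x ~ N(4, 2^2), y ~ N(0, 1^2)
gA = log_gaussian(x, 0, 1) + log_gaussian(y, 0, 2) + sp.log(sp.Rational(1, 3))
gB = log_gaussian(x, 4, 2) + log_gaussian(y, 0, 1) + sp.log(sp.Rational(2, 3))

# Optimal boundary: equal posteriors, i.e. gA - gB = 0
print(sp.Eq(sp.simplify(sp.expand(gA - gB)), 0))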

3. In a two-class problem with a single feature, the class-conditional probability
density functions are given below:
p(x|ω1) = (x − 1)/4 for 1 ≤ x < 3, (5 − x)/4 for 3 ≤ x < 5, and 0 otherwise
p(x|ω2) = 1/3 for 4 ≤ x < 7, and 0 otherwise
The prior probabilities are p(ω1) = 2/3 and p(ω2) = 1/3.
a. Compute the total classification error if the Bayes minimum-error classifier is used.
b. Using the Bayes minimum-error classifier, classify x = 2.5.
c. Compute the accuracy of the decision made in (b).
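
A numerical NumPy sketch for checking parts (a) and (b): the Bayes minimum-error rate is the integral of the smaller of the two joint densities, approximated here on a fine grid:

import numpy as np

def p1(x):  # p(x|w1): triangular density on [1, 5)
    return np.where((x >= 1) & (x < 3), (x - 1) / 4,
                    np.where((x >= 3) & (x < 5), (5 - x) / 4, 0.0))

def p2(x):  # p(x|w2): uniform density on [4, 7)
    return np.where((x >= 4) & (x < 7), 1 / 3, 0.0)

P1, P2 = 2 / 3, 1 / 3  # prior probabilities

xs = np.linspace(0, 8, 800001)
# At each x the Bayes classifier picks the larger joint density,
# so the contribution to the error is the smaller one.
print("total error ~", np.trapz(np.minimum(P1 * p1(xs), P2 * p2(xs)), xs))

x0 = 2.5
print("x = 2.5 ->", "w1" if P1 * p1(x0) >= P2 * p2(x0) else "w2")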

4. Consider a two-class problem having 3 independent binary features xi, i = 1, 2, 3.
Given p(ω1) = p(ω2), pi = P(xi = 1 | ω1) = 0.8 and qi = P(xi = 1 | ω2).
Sketch the decision boundary between the two classes.
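
A small sketch of the standard linear discriminant for independent binary features, g(x) = Σ wi·xi + w0 with "decide ω1 if g(x) > 0". The numerical value of qi is not legible in the statement above, so the code keeps it as a parameter; q = 0.5 below is purely an assumed placeholder:

import numpy as np

def binary_feature_discriminant(p, q, prior1=0.5, prior2=0.5):
    # Linear discriminant for independent binary features:
    #   w_i = ln[p_i (1 - q_i) / (q_i (1 - p_i))]
    #   w0  = sum_i ln[(1 - p_i) / (1 - q_i)] + ln[P(w1) / P(w2)]
    p, q = np.asarray(p, float), np.asarray(q, float)
    w = np.log(p * (1 - q) / (q * (1 - p)))
    w0 = np.log((1 - p) / (1 - q)).sum() + np.log(prior1 / prior2)
    return w, w0

# p_i = 0.8 as given; q_i = 0.5 is an assumption, not from the problem
w, w0 = binary_feature_discriminant([0.8] * 3, [0.5] * 3)
print("weights:", w, "bias:", w0)
# The boundary g(x) = 0 is a plane cutting the unit cube of (x1, x2, x3).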
