
NPTEL Online Certification Courses

Indian Institute of Technology Kharagpur

Deep Learning
Assignment- Week 2
TYPE OF QUESTION: MCQ/MSQ
Number of questions: 10; Total marks: 10 × 2 = 20
______________________________________________________________________________

QUESTION 1:
Two random variables 𝑋1 and 𝑋2 follow Gaussian distributions with the following means and
variances:
𝑋1~𝑁(0, 3) 𝑎𝑛𝑑 𝑋2~𝑁(0, 2)

Which of the following options is true?

a. The distribution of 𝑋1 will be flatter than the distribution of 𝑋2
b. The distribution of 𝑋2 will be flatter than the distribution of 𝑋1
c. The peaks of both distributions will be at the same height
d. None of the above

Correct Answer: a

Detailed Solution:

Since 𝑿𝟏 has a larger variance than 𝑿𝟐, the distribution of 𝑿𝟏 is more spread out than
that of 𝑿𝟐. Hence, the distribution of 𝑿𝟏 will be flatter than the distribution of 𝑿𝟐.
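The claim can be checked numerically: the peak of the 𝑁(𝜇, 𝜎²) density occurs at x = 𝜇 with height 1/√(2𝜋𝜎²), so a larger variance gives a lower, flatter peak. A minimal sketch (the variances 3 and 2 are taken from the question):

```python
import math

def gaussian_peak_height(variance):
    # Peak of the N(mu, sigma^2) pdf, attained at x = mu: 1 / sqrt(2*pi*sigma^2)
    return 1.0 / math.sqrt(2 * math.pi * variance)

peak_x1 = gaussian_peak_height(3)  # X1 ~ N(0, 3)
peak_x2 = gaussian_peak_height(2)  # X2 ~ N(0, 2)
print(peak_x1 < peak_x2)  # True: the larger-variance distribution has the lower peak
```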

QUESTION 2:
In which scenario will the discriminant function be linear when a two-class Bayesian classifier is
used to classify two classes of normally distributed points? Choose the correct option.

I. Σ1 = Σ2 but Σ is not an identity matrix
II. Σ1 = Σ2 but Σ is an identity matrix
III. Σ1 ≠ Σ2

a. Only II
b. Both I and II
c. Only III
d. None of the above

Correct Answer: b

Detailed Solution:

The discriminant function is linear whenever Σ1 = Σ2, regardless of whether the shared Σ is an identity matrix.

QUESTION 3:

Choose the correct option regarding discriminant functions gi(x) for multiclass classification (x is the
feature vector to be classified).

Statement i: The risk value R(αi|x) in the Bayes minimum risk classifier can be used as a discriminant
function.
Statement ii: The negative of the risk value R(αi|x) in the Bayes minimum risk classifier can be used as a
discriminant function.
Statement iii: The negative of the a posteriori probability P(ωi|x) in the Bayes minimum error classifier
can be used as a discriminant function.
Statement iv: The a posteriori probability P(ωi|x) in the Bayes minimum error classifier can be used as a
discriminant function.

a. Only Statement i is true
b. Both Statements ii and iii are true
c. Both Statements i and iv are true
d. Both Statements ii and iv are true

Correct Answer: d

Detailed Solution:

Follow Lecture 06 for a detailed explanation. In brief: a discriminant function assigns x to the class with the largest value. Risk is to be minimised, so its negative qualifies as a discriminant function, while the posterior probability is to be maximised, so it qualifies directly.

QUESTION 4:
If we choose the discriminant function 𝑔𝑖(𝑥) as a function of the posterior probability, i.e. 𝑔𝑖(𝑥) =
𝑓(𝑝(𝑤𝑖|𝑥)), then which of the following cannot be the function 𝑓( )?

a. f(x) = a^x, where a > 1
b. f(x) = a^(−x), where a > 1
c. f(x) = 2x + 3
d. f(x) = exp(x)

Correct Answer: b

Detailed Solution:

The function f( ) must be monotonically increasing so that it preserves the ordering of the posterior probabilities. f(x) = a^(−x) with a > 1 is monotonically decreasing, so it cannot be used.
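A quick numerical check, using a hypothetical pair of posteriors p1 > p2 and base a = 2: a valid transformation must preserve the ordering of the posteriors, and option b reverses it.

```python
a = 2
p1, p2 = 0.7, 0.3  # hypothetical posterior values with p1 > p2

print(a ** p1 > a ** p2)        # True:  f(x) = a^x with a > 1 preserves the order
print(a ** -p1 > a ** -p2)      # False: f(x) = a^(-x) with a > 1 reverses it
print(2 * p1 + 3 > 2 * p2 + 3)  # True:  f(x) = 2x + 3 preserves the order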



QUESTION 5:
For a two-class problem, the linear discriminant function is given by 𝑔(𝑥) = 𝑎𝑡 𝑦, where 𝑦 is the
augmented feature vector. What is the update rule for finding the weight vector 𝑎?

a. 𝑎( 𝑘 + 1 ) = 𝑎( 𝑘 ) + 𝜂 ∑ 𝑦
b. 𝑎( 𝑘 + 1 ) = 𝑎( 𝑘 ) − 𝜂 ∑ 𝑦
c. 𝑎( 𝑘 + 1 ) = 𝑎(𝑘 − 1) − 𝜂𝑎(𝑘)
d. 𝑎( 𝑘 + 1 ) = 𝑎(𝑘 − 1) + 𝜂𝑎(𝑘)

Correct Answer: a

Detailed Solution:

𝑎( 𝑘 + 1 ) = 𝑎( 𝑘 ) + 𝜂 ∑ 𝑦, where the sum runs over the misclassified samples.

For the derivation, refer to the video lectures.
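A sketch of one step of this batch update, assuming (as in the standard perceptron derivation) that the sum runs over the currently misclassified augmented samples; the weight vector and samples below are made up for illustration:

```python
import numpy as np

def perceptron_batch_update(a, misclassified, eta):
    # One step of the batch perceptron rule:
    # a(k+1) = a(k) + eta * sum of the misclassified augmented samples y
    return a + eta * np.sum(misclassified, axis=0)

# Hypothetical weight vector and two misclassified augmented samples (bias term first)
a = np.array([0.0, 1.0, -1.0])
misclassified = np.array([[1.0, 2.0, 3.0],
                          [1.0, -1.0, 0.5]])
a_next = perceptron_batch_update(a, misclassified, eta=0.1)
print(a_next)  # close to [0.2, 1.1, -0.65]
```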

QUESTION 6:
You are given some data points for two different classes.
Class 1 points: {(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)}
Class 2 points: {(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)}
Compute the mean vectors 𝜇1 and 𝜇2 for these two classes and choose the correct option.

a. 𝜇1 = (10, 10) and 𝜇2 = (6, 12)
b. 𝜇1 = (12, 8) and 𝜇2 = (12, 7)
c. 𝜇1 = (10, 7) and 𝜇2 = (10, 7)
d. 𝜇1 = (10, 8) and 𝜇2 = (12, 6)

Correct Answer: d

Detailed Solution: Sum the points of each class coordinate-wise and divide by the number of
points, i.e., 7. This gives 𝜇1 = (10, 8) and 𝜇2 = (12, 6).
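The computation can be verified in a few lines of Python (numpy assumed available):

```python
import numpy as np

class1 = np.array([(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)])
class2 = np.array([(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)])

mu1 = class1.mean(axis=0)  # coordinate-wise sum divided by 7
mu2 = class2.mean(axis=0)
print(mu1, mu2)  # [10.  8.] [12.  6.]
```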

QUESTION 7:
You are given some data points for two different classes.
Class 1 points: {(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)}
Class 2 points: {(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)}
Compute the covariance matrices Σ1 and Σ2 for these two classes and choose the correct option.

a. Σ1 = [8.29 0; 0 8.29] and Σ2 = [8.29 −1.0; −1.0 8.29]
b. Σ1 = [1 0; 0 1] and Σ2 = [1 0; 0 1]
c. Σ1 = [3.65 −1.0; −0.85 3.65] and Σ2 = [9.67 −0.85; −0.85 9.67]
d. Σ1 = [8.29 −0.85; −0.85 8.29] and Σ2 = [8.29 −0.85; −0.85 8.29]

Correct Answer: d

Detailed Solution:

𝚺𝟏 = 𝚺𝟐 = 𝚺 = [𝟖.𝟐𝟗 −𝟎.𝟖𝟓; −𝟎.𝟖𝟓 𝟖.𝟐𝟗]. Follow the steps mentioned in the lecture video.
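A quick check with numpy; bias=True uses the 1/N normalisation (divide by 7), matching the solution. The exact off-diagonal entry is −6/7 ≈ −0.857, quoted as −0.85 in the answer key.

```python
import numpy as np

class1 = np.array([(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)])
class2 = np.array([(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)])

# bias=True divides by N rather than N - 1
sigma1 = np.cov(class1.T, bias=True)
sigma2 = np.cov(class2.T, bias=True)

print(np.allclose(sigma1, sigma2))  # True: the two classes share the same covariance
```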

QUESTION 8:
You are given some data points for two different classes.
Class 1 points: {(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)}
Class 2 points: {(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)}
Assume that the points are samples from normal distributions and that a two-class Bayesian
classifier is used to classify them. Also assume that the prior probabilities of the classes are equal, i.e.,
𝑝(𝜔1) = 𝑝(𝜔2).
Which of the following is true about the corresponding decision boundary used in the classifier?
(Choose the correct option regarding the given statements.)

Statement i: The decision boundary passes through the midpoint of the line segment joining the
means of the two classes.
Statement ii: The decision boundary will be the orthogonal bisector of the line segment joining the
means of the two classes.

a. Only Statement i is true



b. Only Statement ii is true

c. Both Statement i and ii are true

d. None of the statements are true

Correct Answer: a

Detailed Solution:

𝚺𝟏 = 𝚺𝟐 = 𝚺 = [𝟖.𝟐𝟗 −𝟎.𝟖𝟓; −𝟎.𝟖𝟓 𝟖.𝟐𝟗], but 𝚺 is not an identity matrix. With a shared covariance and equal priors, the boundary is guaranteed to pass through the midpoint of the means, but it is guaranteed to be the orthogonal bisector only when 𝚺 is (a multiple of) the identity matrix. So only option a is correct.

QUESTION 9:
You are given some data points for two different classes.
Class 1 points: {(11,11), (13,11), (8,10), (9,9), (7,7), (7,5), (15,3)}
Class 2 points: {(7,11), (15,9), (15,7), (13,5), (14,4), (9,3), (11,3)}
Classify the following two new samples, 𝐴 = (6, 11) and 𝐵 = (14, 3), using the K-nearest neighbour
rule with K = 3. Use the Manhattan distance as the distance function.

Given two points (x_1, y_1) and (x_2, y_2), the Manhattan Distance d between them is:
d = |x_1 - x_2| + |y_1 - y_2|

a. A belongs to class 1 and B belongs to class 1.
b. A belongs to class 2 and B belongs to class 2.
c. A belongs to class 1 and B belongs to class 2.
d. A belongs to class 2 and B belongs to class 1.

Correct Answer: c

Detailed Solution:
Calculate the Manhattan distance of each point of Class 1 and Class 2 from A = (6, 11) and find
the 3 closest points. Two of the 3 nearest neighbours belong to Class 1 and one belongs to
Class 2, so A is assigned to Class 1.
Following the same procedure for B = (14, 3), two of the 3 nearest neighbours belong to Class 2,
so B is assigned to Class 2.
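The vote can be reproduced with a short pure-Python sketch:

```python
from collections import Counter

class1 = [(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)]
class2 = [(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)]
points = class1 + class2
labels = [1] * len(class1) + [2] * len(class2)

def knn_manhattan(q, k=3):
    # Label q by majority vote among its k nearest points under the
    # Manhattan distance; ties in distance are broken by list order.
    dists = [abs(q[0] - x) + abs(q[1] - y) for (x, y) in points]
    nearest = sorted(range(len(points)), key=lambda i: dists[i])[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

print(knn_manhattan((6, 11)))  # 1 -> A belongs to Class 1
print(knn_manhattan((14, 3)))  # 2 -> B belongs to Class 2
```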

QUESTION 10:
Suppose you are solving a four-class problem. How many discriminant functions will you need?

a. 1

b. 2
c. 3
d. 4

Correct Answer: d

Detailed Solution:
For an n-class problem we need n discriminant functions, one per class: a sample is assigned to the class with the largest discriminant value.
