KNN 2
K-Nearest Neighbors
https://almabetter.onlinetests.app/Assess.aspx?guid=5D62B689D7AE4738A386198E3E9FC0F1&a=R1 1/10
12/17/22, 7:43 AM Assessment Review - AlmaBetter
1 You are given the following two statements; which option is/are true for k-NN? 1) For a very large value of k, we may include points from other classes in the neighborhood. 2) For a very small value of k, the algorithm is very sensitive to noise.
Your Answer
1 and 2
Correct Answer
1 and 2
Justification
None.
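Both effects are easy to see on a toy example. The sketch below is a minimal pure-Python k-NN classifier on invented 1-D data, with one deliberately mislabeled point acting as noise: k = 1 latches onto the noisy point, while a larger k votes it down.

```python
from collections import Counter

# Toy 1-D training set: class "A" clusters near 0, class "B" near 10.
# The point at 2.0 is deliberately mislabeled "B" to act as noise.
train = [(0.0, "A"), (1.0, "A"), (2.0, "B"), (3.0, "A"), (4.0, "A"),
         (9.0, "B"), (10.0, "B"), (11.0, "B")]

def knn_predict(x, k):
    # Sort training points by distance to x and vote among the k nearest.
    neighbors = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

print(knn_predict(2.1, k=1))  # "B": k=1 latches onto the noisy point
print(knn_predict(2.1, k=5))  # "A": a larger k votes the noise down
```

Conversely, making k very large pulls in points from the distant "B" cluster as well, which is statement 1.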
Your Answer
When you increase k, the bias increases
Correct Answer
When you increase k, the bias increases
Justification
None.
3 For a large value of k, the k-nearest neighbor model becomes ______ and ______ .
Your Answer
Complex model, Underfit
Correct Answer
Simple model, Underfit
Justification
None.
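The underfitting at large k can be demonstrated at the extreme: with k equal to the training-set size, every query receives the overall majority class, so the model degenerates to a constant predictor. A minimal sketch on toy 1-D data invented for illustration:

```python
from collections import Counter

# Class "A" has 5 points near 0; class "B" has 3 points near 10.
train = [(0.0, "A"), (1.0, "A"), (2.0, "A"), (3.0, "A"), (4.0, "A"),
         (9.0, "B"), (10.0, "B"), (11.0, "B")]

def knn_predict(x, k):
    neighbors = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

# With k equal to the training-set size, every query gets the majority
# class -- the decision boundary disappears entirely (underfitting).
print(knn_predict(10.0, k=8))  # "A", even though 10.0 sits inside class B
print(knn_predict(10.0, k=3))  # "B": a smaller k still sees local structure
```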
4 Which of the following machine learning algorithms can be used for imputing missing values of both categorical and continuous variables?
Your Answer
k-NN
Correct Answer
k-NN
Justification
None.
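k-NN imputation fills a missing entry from the k most similar complete rows: the mean of the neighbors for a continuous feature, the mode for a categorical one (scikit-learn's `KNNImputer` covers the numeric case). Below is a stdlib sketch of the categorical side; the dataset and helper name are invented for illustration.

```python
from collections import Counter

# Rows: (height_cm, city); the second row is missing its city.
rows = [
    (150.0, "Pune"),
    (152.0, None),       # missing categorical value to impute
    (151.0, "Pune"),
    (180.0, "Delhi"),
    (182.0, "Delhi"),
]

def knn_impute_categorical(target_idx, k):
    """Impute the missing category as the mode of the k nearest
    complete rows, with distance measured on the numeric feature."""
    x = rows[target_idx][0]
    complete = [r for i, r in enumerate(rows)
                if i != target_idx and r[1] is not None]
    nearest = sorted(complete, key=lambda r: abs(r[0] - x))[:k]
    return Counter(r[1] for r in nearest).most_common(1)[0][0]

print(knn_impute_categorical(1, k=3))  # "Pune": the 3 nearest heights
```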
Your Answer
When you increase k, the bias increases
Correct Answer
When you increase k, the bias increases
Justification
None.
6 Two statements are given below; which of them is/are true? 1) k-NN is a memory-based approach, meaning the classifier immediately adapts as we collect new training data. 2) The computational complexity of classifying new samples grows linearly with the number of samples in the training dataset in the worst case.
Your Answer
1 and 2
Correct Answer
1 and 2
Justification
None.
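Statement 2 can be observed directly: a brute-force k-NN query computes one distance per stored training sample, so the work per classification is O(n). A stdlib sketch with an instrumented distance function (all names invented for illustration):

```python
# Brute-force k-NN must compute a distance to every stored training
# point for each query, so classification cost grows linearly with n.
distance_calls = 0

def distance(a, b):
    global distance_calls
    distance_calls += 1
    return abs(a - b)

train = list(range(1000))   # n = 1000 stored samples

def nearest(x):
    # One distance evaluation per training sample.
    return min(train, key=lambda p: distance(p, x))

nearest(3.7)
print(distance_calls)  # 1000 -- one distance per training sample
```

This is also why k-NN has no training step to redo: adding data just appends to `train` (statement 1), at the cost of slower queries.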
7 What is the Euclidean distance between the two data points A(1,3) and B(2,3)?
Your Answer
1
Correct Answer
1
Justification
None.
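Worked out: d(A, B) = sqrt((2-1)^2 + (3-3)^2) = sqrt(1) = 1. The same calculation in Python:

```python
import math

A, B = (1, 3), (2, 3)

# sqrt((2-1)^2 + (3-3)^2) = sqrt(1) = 1
d = math.sqrt((B[0] - A[0]) ** 2 + (B[1] - A[1]) ** 2)
print(d)                # 1.0
print(math.dist(A, B))  # 1.0 -- same result via the stdlib (Python 3.8+)
```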
8 The following two statements are given for the k-NN algorithm; which statement(s) is/are true? 1) We can choose the optimal value of k with the help of cross-validation. 2) Euclidean distance treats each feature as equally important.
Your Answer
1 and 2
Correct Answer
1 and 2
Justification
None.
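Statement 1 in practice: score each candidate k by cross-validation and keep the best. The sketch below uses leave-one-out cross-validation on toy 1-D data (dataset and helper names invented for illustration); scikit-learn's `GridSearchCV` does the same job on real data.

```python
from collections import Counter

train = [(0.0, "A"), (1.0, "A"), (2.0, "B"), (3.0, "A"), (4.0, "A"),
         (9.0, "B"), (10.0, "B"), (11.0, "B")]

def knn_predict(x, data, k):
    neighbors = sorted(data, key=lambda p: abs(p[0] - x))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

def loocv_accuracy(k):
    # Leave-one-out: predict each point from all the others.
    hits = sum(
        knn_predict(x, train[:i] + train[i + 1:], k) == label
        for i, (x, label) in enumerate(train)
    )
    return hits / len(train)

best_k = max([1, 3, 5], key=loocv_accuracy)
print(best_k)  # 3: k=1 overfits the mislabeled point at 2.0
```

Statement 2 holds because Euclidean distance sums squared differences over all features with equal weight, which is why feature scaling matters for k-NN.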