Unit2 QB
3) Define entropy and information gain. For the following data set, identify which
attribute gives the maximum information gain.
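As a study aid for the entropy and information-gain questions, here is a minimal sketch of both computations in Python. The toy labels and attribute values are hypothetical, not the question's (missing) table:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a collection of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, attribute_values):
    """Entropy reduction achieved by splitting `labels` on an attribute."""
    n = len(labels)
    groups = {}
    for value, label in zip(attribute_values, labels):
        groups.setdefault(value, []).append(label)
    # Weighted entropy of the partitions induced by the attribute
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Toy example: two positives, two negatives, perfectly separated by `a`
y = ["+", "+", "-", "-"]
a = ["T", "T", "F", "F"]
print(entropy(y))              # 1.0 bit for a 50/50 class split
print(information_gain(y, a))  # 1.0: the split removes all uncertainty
```

The attribute with the largest gain is the one chosen as the split in ID3-style decision-tree induction.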
4) What is probability estimation? Give the steps to construct a probability estimation tree with
smoothed probabilities.
5) Discuss the different measures for selecting the attribute with the best split. Give a suitable
example.
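For question 5, the common alternatives to information gain are Gini impurity and misclassification error. A minimal sketch of both, on a hypothetical 3-vs-1 label set:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def classification_error(labels):
    """Misclassification error: 1 minus the majority-class proportion."""
    n = len(labels)
    return 1.0 - max(Counter(labels).values()) / n

mixed = ["+", "+", "+", "-"]
print(gini(mixed))                  # 1 - (0.75^2 + 0.25^2) = 0.375
print(classification_error(mixed))  # 1 - 0.75 = 0.25
```

All three measures are zero for a pure node and maximal for a uniform class mix; entropy and Gini are the ones typically used because they are smoother and reward partial purification more than raw error does.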
6) Consider the training examples shown in the table below for a binary classification problem.
i. What is the entropy of this collection of training examples with respect to the positive class?
ii. What are the information gains of a1 and a2 relative to these training examples?
7) Construct a probability estimation tree with smoothed probabilities for
the following data set.
9) Construct a probability estimation tree with smoothed probability for the following data set.
10) Define entropy and information gain. For the following data set, identify which attribute gives
the maximum information gain.
11) Construct a probability estimation tree with smoothed probabilities for the following data set:
12) Consider the following data set for a binary class problem. (a) Calculate the information gain
when splitting on A and on B. Which attribute is chosen as the best split?
13) Define entropy and information gain. Construct a decision tree for the following data set.
14) Why is the "Laplace correction" needed for probability estimation? Construct a probability
estimation tree for the following training set with "smoothed" probabilities.
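For question 14, the Laplace correction replaces a leaf's raw class frequency k/n with (k + 1)/(n + C) for C classes, so that small leaves never emit the overconfident extremes 0 or 1. A minimal sketch with made-up leaf counts:

```python
def laplace_probability(k, n, num_classes=2):
    """Laplace-smoothed estimate of P(class | leaf): (k + 1) / (n + C),
    where k positives out of n examples reach the leaf and C is the
    number of classes. Pulls estimates away from 0 and 1."""
    return (k + 1) / (n + num_classes)

# A leaf covering 2 positive examples out of 2 (hypothetical counts):
print(2 / 2)                      # raw frequency: 1.0 (overconfident)
print(laplace_probability(2, 2))  # smoothed: 3/4 = 0.75

# An empty leaf still gets the prior-like estimate 1/C:
print(laplace_probability(0, 0))  # 0.5
```

As n grows, the smoothed estimate converges to the raw frequency, so the correction matters mainly at small leaves, which is exactly where probability estimation trees are most fragile.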
15) Construct the probability estimation tree for the given training set.
16) Construct a decision tree for the following data set using information gain. Predict the class
label for a data point with values <Green, Circle, Small, ?>.