

1. A ______ classifier is a technique for classifying records using a collection of if-then rules.
a) Bayesian classifier b) rule-based classifier c) nearest neighbor classifier d) none of these
2. Inductive and deductive steps are used in
a) rule-based classifier b) Bayesian classifier c) nearest
neighbor classifier
d) Both a and b
3. A ______ provides a graphical representation of probabilistic relationships among a set of random variables.
a) Bayesian belief network b) rule belief network c) nearest neighbor network d) DAG
4. Multilayer neural networks with at least one hidden layer are
a) Universal approximators b) hypothesis space c) Model overfitting d) none of these
5. The phase whose objective is to find all the itemsets that satisfy the minsup threshold is called
a) Rule generation b) FP-growth generation c) frequent itemset generation d) universal generation
6. The algorithm that takes a different approach to discovering frequent itemsets is
a) Cluster algorithm b) DBSCAN c) skewed algorithm d) FP-growth algorithm
7. The ______ approach keeps the human user in the loop.
a) Template-based b) Visualization c) Objective measure d) Subjective measure
8. The analysis that groups data objects based on information found in the data that describes the objects and their relationships is
a) Cluster analysis b) Association analysis c) Classification analysis d) both a and c
9. Which of the following is not a characteristic of clusters?
a) Shape b) Size c) Density d) Noise
10. ______ is a grid-based clustering algorithm that methodically finds subspace clusters.
a) CLIQUE b) DBSCAN c) Chameleon d) BIRCH
11. FP-growth depends on
a) Compaction factor b) visualization c) template d) both a and b
12. Initial paths in a graph are called
a) Postfix paths b) prefix paths c) first fix paths d) none of these
13. Nodes embedded in an ANN are called
a) Hidden nodes b) neural nodes c) recurrent nodes d) feed-forward nodes
14. SVM stands for ______.
15. The formula to find the true positive rate is ______.
16. The fraction of transactions that contain an itemset is the ______.
17. "If an itemset is frequent, then all of its subsets must also be frequent" is the ______ principle.
18. OPOSSUM stands for ______.
19. ______ and ______ are examples of ensemble methods.
20. The FP-growth algorithm uses a ______ representation.
21. The sequential covering algorithm is often used to extract rules directly from data [ ]
22. An example of a lazy learner is the Rote classifier [ ]
23. The FP-growth algorithm takes a similar approach to discovering frequent itemsets [ ]
24. The Apriori algorithm uses a level-wise approach for generating association rules [ ]
25. Sparse data consists of symmetric attributes [ ]
26. The Apriori algorithm's run time increases with a larger number of transactions [ ]
27. The inversion property is a property of objective measures [ ]
28. In fuzzy clustering, every object belongs to every cluster with a weight that is between 0 and 1 [ ]
29. Prototype-based clustering techniques create a two-level partitioning of the data objects [ ]
30. MLE stands for maximum likelihood estimation [ ]

ANSWERS: 1.b 2.c 3.a 4.a 5.c 7.b 8.a 9.d 10.a 11.a 12.b 13.a
14. Support vector machine 15. TP/(TP+FN) 16. Support 17. Apriori principle
18. Optimal Partitioning Of Sparse Similarities Using METIS
19. Bagging and boosting 20. FP-tree representation
21. T 22. T 23. F 24. T 25. F 26. T 27. T 28. T 29. F 30. T
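The formulas behind answers 15 and 16 can be checked with a short sketch. The transaction data below is made up for illustration; only the two formulas (TPR = TP/(TP+FN) and support = fraction of transactions containing the itemset) come from the answer key.

```python
# Illustrative sketch of answers 15 and 16 (example data is hypothetical).

def true_positive_rate(tp, fn):
    """TPR = TP / (TP + FN): fraction of actual positives correctly found."""
    return tp / (tp + fn)

def support(itemset, transactions):
    """Fraction of transactions that contain every item in `itemset`."""
    itemset = set(itemset)
    hits = sum(1 for t in transactions if itemset <= set(t))
    return hits / len(transactions)

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers"},
]

print(true_positive_rate(8, 2))                   # 8/(8+2) = 0.8
print(support({"bread", "milk"}, transactions))   # 2 of 4 transactions = 0.5
```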
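The Apriori principle from question 17 can likewise be demonstrated on toy data: if an itemset meets the minimum support count, every non-empty subset of it must meet it too. The transactions and minsup value below are invented for the demonstration.

```python
# Illustrative sketch of the Apriori principle (question 17); data is hypothetical.
from itertools import combinations

def support_count(itemset, transactions):
    """Number of transactions containing every item in `itemset`."""
    return sum(1 for t in transactions if set(itemset) <= t)

transactions = [
    {"a", "b", "c"},
    {"a", "b"},
    {"a", "c"},
    {"a", "b", "c", "d"},
]
minsup = 2  # minimum support count

frequent = ("a", "b")  # appears in 3 of 4 transactions, so it is frequent
# Apriori principle: every non-empty proper subset must also be frequent.
for k in range(1, len(frequent)):
    for subset in combinations(frequent, k):
        assert support_count(subset, transactions) >= minsup
```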