# UNIVERSITY OF ENGINEERING & TECHNOLOGY LAHORE

**DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING**

Artificial Intelligence

Time Allowed: 3 Hrs

Maximum Marks: 100

Attempt Any Five Questions. All Questions Carry Equal Marks.

Question 1: A- Differentiate between informed and uninformed searches. (5)

B- Consider the tree shown below. The numbers on the arcs are the arc lengths; the numbers near states B, C, and D are the heuristic estimates; all other states have a heuristic estimate of 0. Assume that the children of a node are expanded in alphabetical order when no other order is specified by the search, and that the goal is state J. No "Visited" or "Expanded" lists are used. In what order would the states be expanded by each type of search? Write only the sequence of states expanded by each search. Please show all the calculations. (15)

1- Breadth First
2- Depth First
3- Progressive Deepening Search
4- Best First
5- A*
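The tree figure for this part is not reproduced here, so the sketch below uses a hypothetical tree, arc lengths, and heuristic values purely to illustrate how the expansion orders can be checked. The same `expansion_order` helper covers best-first and A* by changing the priority function; breadth-first, depth-first, and progressive deepening follow from replacing the priority queue with a FIFO queue, a stack, or depth-limited restarts.

```python
import heapq

# Hypothetical tree (the exam's figure is not reproduced): parent -> [(child, arc length)]
TREE = {
    "A": [("B", 2), ("C", 3)],
    "B": [("D", 4), ("E", 1)],
    "C": [("F", 2)],
}
H = {"A": 5, "B": 3, "C": 4, "D": 2, "E": 0, "F": 1}   # hypothetical heuristic estimates

def expansion_order(start, goal, priority):
    """Order in which states are expanded.
    priority(g, state) = H[state]      -> best-first search
    priority(g, state) = g + H[state]  -> A*
    No visited or expanded list is kept, matching the question's assumptions."""
    order, counter = [], 0
    frontier = [(priority(0, start), counter, 0, start)]
    while frontier:
        _, _, g, state = heapq.heappop(frontier)
        order.append(state)
        if state == goal:
            return order
        for child, cost in TREE.get(state, []):    # children listed in alphabetical order
            counter += 1                           # tie-breaker preserves that order
            heapq.heappush(frontier, (priority(g + cost, child), counter, g + cost, child))
    return order

print(expansion_order("A", "E", lambda g, s: H[s]))      # best-first expansion order
print(expansion_order("A", "E", lambda g, s: g + H[s]))  # A* expansion order
```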

Question 2: A- Consider the graph shown below, where the numbers on the links are link costs and the numbers next to the states are heuristic estimates. Here A is the start state and G is the goal state. Simulate A* search with a strict expanded list on this graph. Please show the length of the path and the total estimated cost, along with the expanded list. (14)
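Since the graph figure is not reproduced here, the sketch below uses a hypothetical graph and heuristic table; its only purpose is to show the bookkeeping that a "strict expanded list" implies: a state popped from the frontier is appended to the expanded list and is never expanded (or re-queued) again.

```python
import heapq

GRAPH = {   # state -> [(neighbour, link cost), ...]   (hypothetical)
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1), ("G", 6)],
    "D": [("G", 2)],
}
H = {"A": 5, "B": 4, "C": 2, "D": 2, "G": 0}   # heuristic estimates (hypothetical)

def a_star_strict(start, goal):
    expanded = []                      # the strict expanded list
    frontier = [(H[start], 0, start, [start])]
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state in expanded:          # strict: never expand a state twice
            continue
        expanded.append(state)
        if state == goal:
            return path, g, expanded   # path, path length (cost), expanded list
        for nbr, cost in GRAPH.get(state, []):
            if nbr not in expanded:
                heapq.heappush(frontier, (g + cost + H[nbr], g + cost, nbr, path + [nbr]))
    return None, float("inf"), expanded

path, cost, expanded = a_star_strict("A", "G")
print("path:", path, "cost:", cost, "expanded:", expanded)
```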

B- Prove that A* is an optimal search strategy. (6)

Question 3: A- What are the two consistency conditions in A* search? (4)

B- Explain the arc consistency condition in CSP problems. (3)

C- Explain the backtracking with forward checking technique for colouring a graph under the constraint that adjacent regions do not have the same colour. The domain values of the three variables are given inside the ellipses. (13)
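For part C, here is a small sketch of backtracking with forward checking on a three-variable colouring problem. The variables, domains, and adjacencies below are hypothetical, since the exam's figure (with the domains inside the ellipses) is not reproduced here; the mechanics, assigning a colour, pruning it from neighbouring domains, and backtracking on a wipe-out, are the ones the question asks about.

```python
NEIGHBOURS = {"X": ["Y", "Z"], "Y": ["X", "Z"], "Z": ["X", "Y"]}            # all adjacent (hypothetical)
DOMAINS = {"X": ["red", "green"], "Y": ["red", "green", "blue"], "Z": ["green", "blue"]}

def forward_check(var, colour, domains, assignment):
    """Prune `colour` from each unassigned neighbour's domain; fail on a wipe-out
    or if an already-assigned neighbour has the same colour."""
    pruned = {v: list(d) for v, d in domains.items()}
    pruned[var] = [colour]
    for nbr in NEIGHBOURS[var]:
        if nbr in assignment:
            if assignment[nbr] == colour:
                return None
        elif colour in pruned[nbr]:
            pruned[nbr].remove(colour)
            if not pruned[nbr]:
                return None            # a neighbour has no colour left: prune this branch
    return pruned

def backtrack(assignment, domains):
    if len(assignment) == len(DOMAINS):
        return assignment
    var = next(v for v in DOMAINS if v not in assignment)
    for colour in domains[var]:
        pruned = forward_check(var, colour, domains, assignment)
        if pruned is not None:
            result = backtrack({**assignment, var: colour}, pruned)
            if result is not None:
                return result
    return None                        # no colour works for this variable: backtrack

print(backtrack({}, DOMAINS))
```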

Question 4: A- Use naïve Bayesian classification on the data given in the table below to calculate the class probability of the last sample. Please note that we are only interested in the maximum posterior probability and not in the exact answer. (13)

The naïve Bayes assumption states

$$P(X \mid C_i) = \prod_{k=1}^{n} P(x_k \mid C_i),$$

the class prior probabilities can be estimated by

$$P(C_i) = \frac{S_i}{S},$$

and the Bayes rule is stated as

$$P(C_i \mid X) = \frac{P(X \mid C_i)\,P(C_i)}{P(X)}.$$

| Sample | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 |
|--------|---|---|---|---|---|---|---|---|---|----|----|
| A      | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1  | 0  |
| B      | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0  | 0  |
| C      | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1  | 1  |
| D      | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 1  | 1  |
| CLASS  | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0  | ?  |
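The following Python sketch runs the calculation from part A on the ten labelled samples to classify the eleventh (A=0, B=0, C=1, D=1). It is only a check on the hand computation: the Laplace smoothing (+1 in the numerator, +2 in the denominator) is an assumption added here to avoid zero counts, and log-probabilities are used, which is also one standard answer to part C below.

```python
import math

A     = [0, 0, 1, 0, 0, 1, 1, 1, 1, 1]
B     = [1, 0, 0, 0, 0, 0, 1, 0, 1, 0]
C     = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
D     = [0, 1, 0, 1, 0, 1, 1, 0, 1, 1]
CLASS = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
query = {"A": 0, "B": 0, "C": 1, "D": 1}   # the last (unlabelled) sample
data  = {"A": A, "B": B, "C": C, "D": D}

def log_posterior(ci):
    rows = [i for i, c in enumerate(CLASS) if c == ci]
    score = math.log(len(rows) / len(CLASS))              # log P(Ci) = log(Si / S)
    for attr, value in query.items():
        count = sum(1 for i in rows if data[attr][i] == value)
        score += math.log((count + 1) / (len(rows) + 2))  # Laplace smoothing (assumption)
    return score                                          # log[ P(Ci) * prod_k P(x_k | Ci) ]

for ci in (0, 1):
    print("class", ci, "log-score:", round(log_posterior(ci), 4))
# The class with the larger score is the MAP answer; P(X) is the same for both
# classes, so it can be ignored when only the maximum posterior is needed.
```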
B- What is the class independence assumption and why do we use it in naïve Bayesian classification? (4)

C- In order to calculate a class's posterior probability we have to multiply all the class-conditional probabilities, which could result in a very small number. Please suggest a mathematical solution to overcome this problem. (3)

Question 5: A- Choose the appropriate answer. (3)

I) In constructing decision trees for rule extraction: A. We always prefer overfitting B. We never prefer overfitting C. We sometimes prefer overfitting

II) Post-pruning requires more computation but gives more reliable and accurate results: A. True B. False

III) The leaves of a decision tree bear: A. A Function B. An Attribute C. A Rule D. A Class

B- Explain in your own words what overfitting is and how, in your opinion, overfitting can cause problems in machine learning. Discuss. (5)

C- In decision trees, when a tree is fully grown, sometimes in order to avoid overfitting we have to prune the branches of the tree. Name the types of tree pruning techniques and discuss them briefly. (5)

D- Discuss the role and use of entropies in building decision trees.
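As a companion to part D, here is a minimal sketch of the entropy and information-gain computation used when choosing a split attribute. The tiny dataset is hypothetical and only illustrates that a perfectly separating attribute has gain 1 while an uninformative one has gain close to 0.

```python
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

def information_gain(values, labels):
    """Entropy of the whole set minus the weighted entropy after splitting on `values`."""
    total = len(labels)
    split = {}
    for v, lab in zip(values, labels):
        split.setdefault(v, []).append(lab)
    remainder = sum(len(part) / total * entropy(part) for part in split.values())
    return entropy(labels) - remainder

labels = [1, 1, 1, 0, 0, 0]           # hypothetical class column
attr_x = [1, 1, 1, 0, 0, 0]           # perfectly separates the classes
attr_y = [1, 1, 0, 1, 1, 0]           # carries no information about the class
print(information_gain(attr_x, labels))   # 1.0 -> chosen as the split attribute
print(information_gain(attr_y, labels))   # ~0  -> never chosen
```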

Question 6: A- Suppose that the task is to cluster the following 8 points, with (x, y) representing their locations, into three clusters: A1(2, 10), A2(2, 5), A3(8, 4), B1(5, 8), B2(7, 5), B3(6, 4), C1(1, 2), C2(4, 9). The distance function is Euclidean distance. Suppose we initially assign A1, B1 and C1 as the centre of each cluster, respectively. Use the k-means algorithm to show the 3 cluster centers after the FIRST round of execution. (10)

B- Differentiate between supervised and unsupervised classification. (2)

C- Select the appropriate answer. (3)

I) Clustering stops when the cluster centers are moved to the mean of the clusters: A. True B. False

II) Initial cluster centers have to be data points: A. True B. False

III) Clustering is an unsupervised classification technique: A. True B. False

D- What do you understand by cross validation? How can we improve the learning process using k-fold cross validation? (5)

Question 7: A- You are an allergist. One of your patients complains of a food allergy, but cannot determine its cause. You ask the patient to keep a journal of all the foods they eat for a week, and whether or not they had an allergic reaction on each day. Here is the journal. You have to construct a decision tree with zero error.

B- Please analyze critically the pros and cons of the nearest neighbour algorithm, especially in the perspective of noisy data. (5)

C- In machine learning, discuss the impact of memory, average and generalization on the learning process, with examples. (5)

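For Question 6 A, the following sketch carries out the first k-means round with the eight points and the initial centres (A1, B1, C1) listed in the question; it can be used to verify the hand-worked assignments and the recomputed centres.

```python
import math

points = {
    "A1": (2, 10), "A2": (2, 5), "A3": (8, 4), "B1": (5, 8),
    "B2": (7, 5),  "B3": (6, 4), "C1": (1, 2), "C2": (4, 9),
}
centres = {1: points["A1"], 2: points["B1"], 3: points["C1"]}   # initial centres

def euclidean(p, q):
    return math.dist(p, q)   # sqrt((px - qx)^2 + (py - qy)^2)

# Step 1: assign every point to its nearest centre.
clusters = {k: [] for k in centres}
for name, p in points.items():
    nearest = min(centres, key=lambda k: euclidean(p, centres[k]))
    clusters[nearest].append(name)

# Step 2: recompute each centre as the mean of its cluster.
new_centres = {
    k: (sum(points[n][0] for n in members) / len(members),
        sum(points[n][1] for n in members) / len(members))
    for k, members in clusters.items()
}
print("clusters after round 1:", clusters)
print("centres after round 1:", new_centres)
```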
Question 8: A- What are the two types of ambiguity? Find the ambiguity in the following sentences and explain how it can be removed. (5)

London had snow yesterday. It fell to a depth of 1 meter. It will continue cold today. It also had fog.

B- Write down the problems with top-down and bottom-up parsing. (5)

C- Parse the following relative clause: "The person that John called ran". A RelClause is a sentence having the word "that" followed by a missing NP gap. For the parsing process, the grammar is as follows: (10)

(S ?sg0 ?sg2) :- (NP ?sg0 ?sg1) (VP ?sg1 ?sg2)
(VP ?vpg0 ?vpg1) :- (Verb/tr) (NP ?vpg0 ?vpg1)
(VP ?vpg0 ?vpg0) :- (Verb/itr)
(NP ?npg0 ?npg0) :- (Det) (Noun) (RelClause)
(NP ?npg0 ?npg0) :- (Name)
(NP ((gap NP)) ()) :-
(RelClause) :- "that" (S ((gap NP)) ())
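To make the gap-threading mechanism concrete, here is a small recursive-descent sketch of the grammar above. It is only an illustration: the lexicon (categories for "the", "person", "John", "called", "ran") is an assumption, since the exam does not list one; the gap feature is passed around exactly as in the rules, with (NP ((gap NP)) ()) realized as an empty NP that consumes a pending gap.

```python
LEXICON = {            # assumed lexicon, not given in the exam
    "the": "Det",
    "person": "Noun",
    "john": "Name",
    "called": "Verb/tr",   # transitive: "John called <NP>"
    "ran": "Verb/itr",     # intransitive: "<NP> ran"
}

def cat(word):
    return LEXICON.get(word.lower())

# Each parser takes (words, i, gap) and yields (end position, outgoing gap).
# gap == "NP" means an NP gap is still waiting to be filled; None means no gap.

def parse_np(words, i, gap):
    # (NP ?npg0 ?npg0) :- (Det) (Noun) (RelClause)
    if i + 1 < len(words) and cat(words[i]) == "Det" and cat(words[i + 1]) == "Noun":
        for j in parse_relclause(words, i + 2):
            yield j, gap
    # (NP ?npg0 ?npg0) :- (Name)
    if i < len(words) and cat(words[i]) == "Name":
        yield i + 1, gap
    # (NP ((gap NP)) ()) :-   the empty NP that consumes a pending gap
    if gap == "NP":
        yield i, None

def parse_vp(words, i, gap):
    # (VP ?vpg0 ?vpg1) :- (Verb/tr) (NP ?vpg0 ?vpg1)
    if i < len(words) and cat(words[i]) == "Verb/tr":
        yield from parse_np(words, i + 1, gap)
    # (VP ?vpg0 ?vpg0) :- (Verb/itr)
    if i < len(words) and cat(words[i]) == "Verb/itr":
        yield i + 1, gap

def parse_s(words, i, gap):
    # (S ?sg0 ?sg2) :- (NP ?sg0 ?sg1) (VP ?sg1 ?sg2)
    for j, g in parse_np(words, i, gap):
        yield from parse_vp(words, j, g)

def parse_relclause(words, i):
    # (RelClause) :- "that" (S ((gap NP)) ())
    if i < len(words) and words[i].lower() == "that":
        for j, g in parse_s(words, i + 1, "NP"):
            if g is None:              # the gap must be consumed inside the clause
                yield j

def accepts(sentence):
    words = sentence.split()
    return any(j == len(words) and g is None for j, g in parse_s(words, 0, None))

print(accepts("The person that John called ran"))   # True
```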