
Code No: 55205/MT

NR
M.Tech. – II Semester Regular Examinations, September, 2008

PATTERN RECOGNITION & IMAGE PROCESSING


(Common to Computer Science & Engineering/ Computer Science)

Time: 3 hours                                                      Max. Marks: 60

Answer any FIVE questions


All questions carry equal marks
---

1.a) What is the Bayes error probability? Give an example for the two-class case (see the
     sketch after this question).
b) What is a mixture probability density function?
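
A minimal sketch for 1.a), assuming the standard two-class setting with priors P(ω1), P(ω2)
and class-conditional densities p(x|ω1), p(x|ω2); written in LaTeX notation and intended
only as an illustration of the quantity being asked about:

    P(\mathrm{error}) = \int \min\left[ P(\omega_1)\, p(x \mid \omega_1),\;
                                        P(\omega_2)\, p(x \mid \omega_2) \right] dx

At every x the Bayes rule picks the class with the larger posterior, so the smaller term is
the probability of error at that x. For example, with equal priors and p(x|ω1) ∼ N(0,1),
p(x|ω2) ∼ N(1,1), the decision boundary lies at x = 1/2.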

2. Consider a two-class classification problem with the following Gaussian class-conditional
   densities:
       p(x | ω1) ∼ N(0, 1)
       p(x | ω2) ∼ N(1, 2)
   Assume P(ω1) = P(ω2) and a 0-1 loss function.
   (a) Sketch the two densities on the same plot.
   (b) Suppose the following training sets D1 and D2 are available from classes ω1 and ω2
       respectively, to estimate μ1, μ2, σ1² and σ2². Find the maximum likelihood estimates
       μ̂1, μ̂2, σ̂1² and σ̂2².
       D1 = { 0.67, 1.19, -1.20, -0.02, -0.16 }
       D2 = { 1.00, 0.55, 2.55, -1.65, 1.61 }
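
A minimal numerical sketch for 2.(b), assuming the standard maximum-likelihood estimators
for a univariate Gaussian (sample mean, and variance normalised by n rather than n-1); the
helper name mle_gaussian and the use of NumPy are illustrative assumptions, not part of the
question.

import numpy as np

def mle_gaussian(samples):
    # ML estimates for a univariate Gaussian:
    # mean = sample average, variance = mean squared deviation (divide by n, not n-1).
    x = np.asarray(samples, dtype=float)
    mu_hat = x.mean()
    var_hat = ((x - mu_hat) ** 2).mean()
    return mu_hat, var_hat

D1 = [0.67, 1.19, -1.20, -0.02, -0.16]   # samples from class ω1
D2 = [1.00, 0.55, 2.55, -1.65, 1.61]     # samples from class ω2

print("class 1:", mle_gaussian(D1))
print("class 2:", mle_gaussian(D2))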

3. Explain the application of hidden Markov models for isolated speech unit recognition.
   Give illustrative examples.

4. Write short notes on the following:
   (a) Image transforms   (b) Contrast stretching   (c) Binary images.
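
A minimal sketch for 4.(b), assuming simple linear (min-max) contrast stretching of an 8-bit
grayscale image; the small synthetic array img is a placeholder introduced only for
illustration.

import numpy as np

def contrast_stretch(img):
    # Map the darkest pixel to 0 and the brightest to 255, linearly in between.
    f = img.astype(float)
    lo, hi = f.min(), f.max()
    if hi == lo:
        return np.zeros_like(img, dtype=np.uint8)   # flat image: nothing to stretch
    return ((f - lo) / (hi - lo) * 255).round().astype(np.uint8)

img = np.array([[100, 110], [120, 130]], dtype=np.uint8)   # narrow intensity range
print(contrast_stretch(img))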

5. Consider a two-class classification problem and assume that the classes are equally
   probable. Let the features X = (x1, x2, ..., xd) be binary valued (1 or 0). Let Pij denote
   the probability that feature xi takes the value 1 given class j. Let the features be
   conditionally independent for both classes. Finally, assume that d is odd and that
   Pi1 = P > 1/2 and Pi2 = 1 - P for all i. Show that the optimal Bayes decision rule
   becomes: decide class one if x1 + x2 + ... + xd > d/2, and class two otherwise.
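
A derivation sketch for question 5, using only the conditional-independence assumption
stated in the question (written in LaTeX notation; an outline, not the full required proof):

    p(\mathbf{x} \mid \omega_j) = \prod_{i=1}^{d} P_{ij}^{x_i} (1 - P_{ij})^{1 - x_i}

    \ln \frac{p(\mathbf{x} \mid \omega_1)}{p(\mathbf{x} \mid \omega_2)}
        = \sum_{i=1}^{d} (2 x_i - 1) \ln \frac{P}{1 - P}

With equal priors, the Bayes rule decides class one exactly when this log-likelihood ratio
is positive; since P > 1/2 the factor ln(P/(1-P)) is positive, so the condition reduces to
x1 + x2 + ... + xd > d/2, and because d is odd a tie cannot occur.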

6.a) Explain the principle of the Laplacian operator for edge detection (see the sketch
     after this question).
b) Describe the edge relaxation technique.
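
A minimal sketch for 6.a), assuming the common 4-neighbour discrete Laplacian kernel and
SciPy's ndimage.convolve; the synthetic test image is an illustrative assumption. Edges
correspond to zero-crossings in the filter response.

import numpy as np
from scipy.ndimage import convolve

# 4-neighbour discrete Laplacian: approximates d²f/dx² + d²f/dy².
laplacian_kernel = np.array([[0,  1, 0],
                             [1, -4, 1],
                             [0,  1, 0]], dtype=float)

# Synthetic image: a bright square on a dark background.
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0

response = convolve(img, laplacian_kernel, mode='reflect')
print(response)   # edge pixels show up where the response changes sign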

7. Write short notes on the following:
   (a) Similarity measures   (b) Loss functions   (c) Feature selection

8.a) Give a definition and mathematical description of a first-order Markov model (see the
     sketch after this question).
b) Briefly explain the basic problems of hidden Markov models.
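
A notational sketch for 8.a), assuming a discrete state sequence ω(1), ω(2), ..., ω(T)
(LaTeX notation; the transition-probability symbol a_{ij} is the conventional one,
introduced here only for illustration):

    P\bigl(\omega(t) \mid \omega(t-1), \ldots, \omega(1)\bigr) = P\bigl(\omega(t) \mid \omega(t-1)\bigr),
    \qquad a_{ij} = P\bigl(\omega(t) = j \mid \omega(t-1) = i\bigr), \quad \sum_{j} a_{ij} = 1.

That is, under a first-order Markov model the next state depends only on the current state,
and the model is fully specified by the initial state probabilities and the transition
matrix [a_{ij}].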

&_&_&_&
