Published by: SRINIVASA RAO GANTA on Sep 19, 2008
Copyright: Attribution Non-commercial

NR
Code No: 55205/MT
M.Tech. – II Semester Regular Examinations, September, 2008
PATTERN RECOGNITION & IMAGE PROCESSING
(Common to Computer Science & Engineering/ Computer Science)
Time: 3 hours
Max. Marks: 60
Answer any FIVE questions
All questions carry equal marks
- - -
1.a) What is the Bayes error probability? Give an example for the two-class case.
b) What is a mixture probability density function?
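As a worked illustration of question 1(a) (the densities N(0, 1) and N(2, 1) and the equal priors below are assumed for this example, not given in the paper), the Bayes error for two one-dimensional Gaussian classes is the integral of the pointwise minimum of the two weighted class densities, which a short numerical sketch can evaluate:

```python
# Numerical Bayes error for two equally likely 1-D Gaussian classes
# (example densities N(0,1) and N(2,1) are assumed for illustration).
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_error(mu1, mu2, sigma, prior1=0.5, lo=-10.0, hi=10.0, n=100000):
    """P(error) = integral of min(P(w1)p(x|w1), P(w2)p(x|w2)) dx, trapezoid rule."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        v = min(prior1 * normal_pdf(x, mu1, sigma),
                (1.0 - prior1) * normal_pdf(x, mu2, sigma))
        total += v * (0.5 if i in (0, n) else 1.0)
    return total * h

err = bayes_error(0.0, 2.0, 1.0)
```

With equal priors and equal variances the decision boundary lies midway between the means, so the error here equals the Gaussian tail probability Φ(−1) ≈ 0.159.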
2. Consider a two-class classification problem with the following Gaussian class-conditional densities:
P(x|ω1) ∼ N(0, 1) and P(x|ω2) ∼ N(1, 2).
Assume P(ω1) = P(ω2) and a 0-1 loss function.
(a) Sketch the two densities on the same plot.
(b) Suppose the following training sets D1 and D2, drawn from classes ω1 and ω2 respectively, are available to estimate μ1, μ2, σ1² and σ2². Find the maximum likelihood estimates μ̂1, μ̂2, σ̂1² and σ̂2².
D1 = { 0.67, 1.19, -1.20, -0.02, -0.16 }
D2 = { 1.00, 0.55, 2.55, -1.65, 1.61 }
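For part (b), the closed-form maximum likelihood estimates for a univariate Gaussian are the sample mean and the biased sample variance (dividing by n, not n − 1). A minimal sketch applied to the two training sets:

```python
# Closed-form ML estimates for a univariate Gaussian: the sample mean and
# the biased sample variance (divide by n, not n - 1).

def ml_gaussian(samples):
    n = len(samples)
    mu_hat = sum(samples) / n
    var_hat = sum((x - mu_hat) ** 2 for x in samples) / n
    return mu_hat, var_hat

D1 = [0.67, 1.19, -1.20, -0.02, -0.16]
D2 = [1.00, 0.55, 2.55, -1.65, 1.61]

mu1_hat, var1_hat = ml_gaussian(D1)   # sample mean of D1 is 0.096
mu2_hat, var2_hat = ml_gaussian(D2)   # sample mean of D2 is 0.812
```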
3. Explain the application of hidden Markov models for isolated speech unit recognition. Give illustrative examples.
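A toy sketch of the idea behind question 3 (the models, probabilities and two-word vocabulary below are invented for illustration): each vocabulary word gets its own HMM, and a vector-quantized observation sequence is assigned to the word whose model yields the highest forward-algorithm likelihood:

```python
# Toy isolated-word recognizer with discrete HMMs (all numbers invented).
# One HMM per vocabulary word; classify a quantized observation sequence
# by the model with the highest forward-algorithm likelihood P(obs | model).

def forward_likelihood(A, B, pi, obs):
    """P(obs | model) via the forward algorithm.
    A[i][j]: state-transition probability i -> j,
    B[i][k]: probability of emitting codebook symbol k in state i,
    pi[i]:   initial-state probability."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]          # initialization
    for t in range(1, len(obs)):                              # induction
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
                 for j in range(n)]
    return sum(alpha)                                         # termination

# Two hypothetical word models over a 2-symbol VQ codebook
word_models = {
    "yes": ([[0.7, 0.3], [0.0, 1.0]],    # A: left-to-right transitions
            [[0.9, 0.1], [0.2, 0.8]],    # B: emission probabilities
            [1.0, 0.0]),                 # pi: start in state 0
    "no":  ([[0.5, 0.5], [0.0, 1.0]],
            [[0.1, 0.9], [0.8, 0.2]],
            [1.0, 0.0]),
}

obs = [0, 0, 1, 1]  # quantized feature sequence for the test utterance
best_word = max(word_models, key=lambda w: forward_likelihood(*word_models[w], obs))
```

In practice each word model is trained on labelled utterances (e.g. by Baum-Welch), and recognition compares the per-model likelihoods exactly as the last line does.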
4.
Write short notes on the following
(a) Image transforms (b) Contrast stretching (c) Binary images.
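As a concrete companion to 4(b) (the pixel values and the [0, 255] target range are assumed for illustration), min-max contrast stretching linearly remaps the occupied intensity range of an image onto the full display range:

```python
# Min-max contrast stretching for an 8-bit grayscale image (toy 1-D strip;
# values and target range are assumed for illustration).

def contrast_stretch(pixels, lo=0, hi=255):
    """Linearly map the occupied range [min, max] onto [lo, hi]."""
    pmin, pmax = min(pixels), max(pixels)
    if pmax == pmin:
        return [lo] * len(pixels)          # flat image: nothing to stretch
    scale = (hi - lo) / (pmax - pmin)
    return [round(lo + (p - pmin) * scale) for p in pixels]

img = [90, 100, 110, 120, 130]             # low-contrast intensities
stretched = contrast_stretch(img)          # now spans the full 0..255 range
```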
5. Consider a two-class classification problem and assume that the classes are equally probable. Let the features X = (x1, x2, ..., xd) be binary valued (1 or 0). Let Pij denote the probability that feature xi takes the value 1 given class j. Let the features be conditionally independent for both classes. Finally, assume that d is odd and that Pi1 = P > 1/2 and Pi2 = 1 − P, for all i. Show that the optimal (minimum-error) decision rule becomes: decide class ω1 if x1 + x2 + ... + xd > d/2, and ω2 otherwise.
