Stprtool Pattern Recognition
Published by John Soldera
https://www.scribd.com/doc/74136767/Stprtool-Pattern-Recognition (06/21/2013)

The classified object is described by the vector of observations $x \in X \subseteq \mathbb{R}^n$ and a binary hidden state $y \in \{1,2\}$. The class conditional distributions $p_{X|Y}(x|y)$, $y \in \{1,2\}$, are known to be multi-variate Gaussian distributions. The parameters $(\mu_1,\Sigma_1)$ and $(\mu_2,\Sigma_2)$ of these class distributions are unknown. However, it is known that the parameters $(\mu_1,\Sigma_1)$ belong to a certain finite set of parameters $\{(\mu_i,\Sigma_i) : i \in I_1\}$. Similarly, $(\mu_2,\Sigma_2)$ belong to a finite set $\{(\mu_i,\Sigma_i) : i \in I_2\}$.

[Figure 2.4: Linear classifier based on the Fisher Linear Discriminant.]

Let $q\colon X \subseteq \mathbb{R}^n \to \{1,2\}$ be a binary linear classifier (2.2) with discriminant function $f(x) = w \cdot x + b$. The probability of misclassification is defined as
$$
\mathrm{Err}(w,b) = \max_{i \in I_1 \cup I_2} \varepsilon(w,b,\mu_i,\Sigma_i)\,,
$$

where $\varepsilon(w,b,\mu_i,\Sigma_i)$ is the probability that the Gaussian random vector $x$ with mean vector $\mu_i$ and covariance matrix $\Sigma_i$ satisfies $q(x) = 1$ for $i \in I_2$, or $q(x) = 2$ for $i \in I_1$. In other words, it is the probability that the vector $x$ will be misclassified by the linear rule $q$.

The Generalized Anderson's task (GAT) is to find the parameters $(w^*,b^*)$ of the linear classifier
$$
q(x) = \begin{cases} 1\,, & \text{for } f(x) = w^* \cdot x + b^* \geq 0\,, \\ 2\,, & \text{otherwise}\,, \end{cases}
$$
such that the error $\mathrm{Err}(w^*,b^*)$ is minimal:
$$
(w^*,b^*) = \mathop{\mathrm{argmin}}_{w,b} \mathrm{Err}(w,b) = \mathop{\mathrm{argmin}}_{w,b}\, \max_{i \in I_1 \cup I_2} \varepsilon(w,b,\mu_i,\Sigma_i)\,. \tag{2.14}
$$
The original Anderson's task is a special case of (2.14) when $|I_1| = 1$ and $|I_2| = 1$.
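The linear rule $q$ itself is straightforward to state in code. A minimal sketch in Python (the hyperplane parameters here are illustrative, not taken from the text):

```python
import numpy as np

def q(x, w, b):
    """Binary linear rule: class 1 on the non-negative side of the
    hyperplane w.x + b = 0, class 2 otherwise."""
    return 1 if w @ x + b >= 0 else 2

# Hypothetical hyperplane parameters (illustrative only).
w, b = np.array([1.0, 1.0]), -1.0

q(np.array([2.0, 2.0]), w, b)  # class 1, since 2 + 2 - 1 >= 0
q(np.array([0.0, 0.0]), w, b)  # class 2, since 0 + 0 - 1 < 0
```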

The probability $\varepsilon(w,b,\mu_i,\Sigma_i)$ is proportional to the reciprocal of the Mahalanobis distance $r_i$ between the $(\mu_i,\Sigma_i)$ and the nearest vector of the separating hyperplane $H = \{x \in \mathbb{R}^n : w \cdot x + b = 0\}$, i.e.,
$$
r_i = \min_{x \in H} \sqrt{(\mu_i - x) \cdot (\Sigma_i)^{-1} (\mu_i - x)} = \frac{w \cdot \mu_i + b}{\sqrt{w \cdot \Sigma_i\, w}}\,.
$$

The exact relation between the probability $\varepsilon(w,b,\mu_i,\Sigma_i)$ and the corresponding Mahalanobis distance $r_i$ is given by the integral
$$
\varepsilon(w,b,\mu_i,\Sigma_i) = \int_{r_i}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}t^2}\, dt\,. \tag{2.15}
$$
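Both the closed form for $r_i$ and the tail integral (2.15) are easy to evaluate numerically, since the integral equals $\tfrac{1}{2}\,\mathrm{erfc}(r_i/\sqrt{2})$. A minimal sketch in Python (the parameter values are illustrative, not from the text):

```python
import math
import numpy as np

def mahalanobis_to_hyperplane(w, b, mu, sigma):
    """Mahalanobis distance r_i from (mu, sigma) to the nearest
    point of the hyperplane {x : w.x + b = 0} (closed form above)."""
    return (w @ mu + b) / math.sqrt(w @ sigma @ w)

def misclassification_prob(r):
    """Gaussian tail integral (2.15): eps = int_r^inf N(t; 0, 1) dt,
    computed via the complementary error function."""
    return 0.5 * math.erfc(r / math.sqrt(2.0))

# Hypothetical model and hyperplane (illustrative only).
w = np.array([1.0, 1.0])
b = -1.0
mu = np.array([2.0, 2.0])
sigma = np.eye(2)

r = mahalanobis_to_hyperplane(w, b, mu, sigma)  # r = 3 / sqrt(2)
eps = misclassification_prob(r)                 # eps ~ 0.0169
```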

The optimization problem (2.14) can be equivalently rewritten as
$$
(w^*,b^*) = \mathop{\mathrm{argmax}}_{w,b} F(w,b) = \mathop{\mathrm{argmax}}_{w,b}\, \min_{i \in I_1 \cup I_2} \frac{w \cdot \mu_i + b}{\sqrt{w \cdot \Sigma_i\, w}}\,,
$$
which is more suitable for optimization. The objective function $F(w,b)$ is proven to be convex in the region where the probability of misclassification $\mathrm{Err}(w,b)$ is less than $0.5$. However, the objective function $F(w,b)$ is not differentiable.
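Since $\varepsilon$ decreases monotonically in $r_i$, maximizing the smallest distance $F(w,b)$ minimizes the worst-case error. A sketch of evaluating $F$ for candidate hyperplanes in Python (the models and candidates are hypothetical, and the class-2 parameters are assumed already sign-normalized so that a single expression covers both index sets):

```python
import math
import numpy as np

def F(w, b, models):
    """Max-min objective from the rewritten problem: the smallest
    Mahalanobis distance of any model (mu_i, Sigma_i) to the
    hyperplane w.x + b = 0.  Class-2 models are assumed to be
    sign-normalized (an assumption of this sketch)."""
    return min((w @ mu + b) / math.sqrt(w @ sigma @ w)
               for mu, sigma in models)

# Hypothetical finite set of Gaussian models (illustrative only).
models = [
    (np.array([2.0, 2.0]), np.eye(2)),
    (np.array([3.0, 1.0]), 0.5 * np.eye(2)),
]

# Compare two candidate hyperplanes; the maximizer of F is the
# better classifier in the worst-case sense of (2.14).
cand = [(np.array([1.0, 1.0]), -1.0),
        (np.array([1.0, 0.0]), -0.5)]
best_w, best_b = max(cand, key=lambda p: F(p[0], p[1], models))
```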

The STPRtool contains an implementation of the algorithm solving the original Anderson's task, as well as implementations of three different approaches to solving the Generalized Anderson's task, which are described below. An interactive demo of the algorithms solving the Generalized Anderson's task is implemented in demo_anderson.

References: The original Anderson's task was published in [1]. A detailed description of the Generalized Anderson's task and of all the methods implemented in the STPRtool is given in the book [26].
