
EF S5 - Dr. Pramuditha
Model Questions: Supervised Learning

Q2. The training data (a sales curve) are assumed to follow an exponential distribution.
a. Probability distribution for a single data point $y_i$:
   $P(y_i \mid \lambda) = \lambda e^{-\lambda y_i}$

b. Joint distribution (assuming IID conditions):
   $P(\mathbf{y} \mid \lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda y_i}$

c. Log likelihood function:
   $\ln P(\mathbf{y} \mid \lambda) = \sum_{i=1}^{n} (\ln \lambda - \lambda y_i) = n \ln \lambda - \lambda \sum_{i=1}^{n} y_i$
   At the maximum, $\frac{d}{d\lambda} \ln P(\mathbf{y} \mid \lambda) = \frac{n}{\lambda} - \sum_{i=1}^{n} y_i = 0$, so $\lambda = n \big/ \sum_{i=1}^{n} y_i$.
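A quick numerical check of this estimator (a sketch with synthetic data; the true rate 2.0 and the sample size are illustrative assumptions, not part of the question):

    import numpy as np

    # Draw synthetic "sales" data from an exponential distribution with rate lambda = 2.0
    rng = np.random.default_rng(0)
    y = rng.exponential(scale=1 / 2.0, size=10_000)  # NumPy's scale parameter is 1/lambda

    # MLE derived above: lambda = n / sum(y_i)
    lam_hat = len(y) / y.sum()
    print(lam_hat)  # close to 2.0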
d. Linear system: for a linear model, setting the derivative of the MSE
   $E = \frac{1}{2} \sum_{i=1}^{n} [y_i - f(x_i)]^2$
   to zero at the minimum gives a linear system
   $A \mathbf{w} = \mathbf{y}$.

e. If $A$ is invertible, $\mathbf{w} = A^{-1} \mathbf{y}$, so $\mathbf{w}$ can be found. Or else, multiplying both sides by $A^T$:
   $A^T \mathbf{y} = A^T A \mathbf{w}$, so $\mathbf{w} = (A^T A)^{-1} A^T \mathbf{y}$;
   using this pseudo-inverse, $\mathbf{w}$ can be found.
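A sketch of the second case in NumPy (the shapes are assumptions: $A$ is a tall $n \times d$ design matrix, so $A$ itself is not invertible and the pseudo-inverse route applies):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(100, 3))            # design matrix (tall, not invertible)
    w_true = np.array([0.5, -1.0, 2.0])
    y = A @ w_true + 0.01 * rng.normal(size=100)

    # Normal equations: w = (A^T A)^{-1} A^T y
    w_normal = np.linalg.solve(A.T @ A, A.T @ y)

    # Same solution via the pseudo-inverse
    w_pinv = np.linalg.pinv(A) @ y
    print(w_normal, w_pinv)                  # both close to w_true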

Q3. Observations: $y_1, \ldots, y_n$, assumed Gaussian.

a. Assuming IID conditions:
   $P(\mathbf{y} \mid \mu, \sigma) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(y_i - \mu)^2 / (2\sigma^2)}$

b. Log probability:
   $\ln P(\mathbf{y} \mid \mu, \sigma) = \sum_{i=1}^{n} \left[ -\ln(\sqrt{2\pi}\,\sigma) - \frac{(y_i - \mu)^2}{2\sigma^2} \right]$

c. At the maximum:
   $\frac{\partial}{\partial\mu} \ln P(\mathbf{y} \mid \mu, \sigma) = \sum_{i=1}^{n} \frac{y_i - \mu}{\sigma^2} = 0 \;\Rightarrow\; \mu = \frac{1}{n} \sum_{i=1}^{n} y_i$
   $\frac{\partial}{\partial\sigma} \ln P(\mathbf{y} \mid \mu, \sigma) = \sum_{i=1}^{n} \left[ -\frac{1}{\sigma} + \frac{(y_i - \mu)^2}{\sigma^3} \right] = 0 \;\Rightarrow\; \sigma^2 = \frac{1}{n} \sum_{i=1}^{n} (y_i - \mu)^2$
   (replacing $\mu$ by its estimate).
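A numerical check of these two estimators (a sketch; the true mean 5.0 and standard deviation 1.5 are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(2)
    y = rng.normal(loc=5.0, scale=1.5, size=10_000)

    mu_hat = y.mean()                        # mu = (1/n) sum(y_i)
    var_hat = ((y - mu_hat) ** 2).mean()     # sigma^2 = (1/n) sum((y_i - mu)^2)
    print(mu_hat, var_hat)                   # close to 5.0 and 1.5**2 = 2.25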
Q4.

a. Probability of a class for a datapoint $x$:
   $P(t = 1 \mid x, \mathbf{w}) = \mathrm{sigmoid}(\mathbf{w}^T x)$
   $P(t = 0 \mid x, \mathbf{w}) = 1 - \mathrm{sigmoid}(\mathbf{w}^T x)$

b. Likelihood for a single datapoint:
   $P(t_i \mid x_i) = [\mathrm{sigmoid}(\mathbf{w}^T x_i)]^{t_i} \, [1 - \mathrm{sigmoid}(\mathbf{w}^T x_i)]^{1 - t_i}$

c. Assuming IID conditions:
   $P(\mathbf{t} \mid \mathbf{x}) = \prod_{i=1}^{n} P(t_i \mid x_i) = \prod_{i=1}^{n} [\mathrm{sigmoid}(\mathbf{w}^T x_i)]^{t_i} \, [1 - \mathrm{sigmoid}(\mathbf{w}^T x_i)]^{1 - t_i}$
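Parts a-c as a sketch (the weight vector and datapoints are illustrative assumptions):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def joint_likelihood(w, X, t):
        """P(t | x) = prod_i s_i^{t_i} (1 - s_i)^{1 - t_i}, assuming IID datapoints."""
        s = sigmoid(X @ w)                   # P(t_i = 1 | x_i, w) for each row of X
        return np.prod(s ** t * (1 - s) ** (1 - t))

    X = np.array([[1.0, 2.0], [0.5, -1.0], [-2.0, 0.3]])
    t = np.array([1, 0, 0])
    w = np.array([0.2, -0.4])
    print(joint_likelihood(w, X, t))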
d. Negative log probability:
   $-\ln P(\mathbf{t} \mid \mathbf{x}) = -\sum_{i=1}^{n} \left[ t_i \ln \mathrm{sigmoid}(\mathbf{w}^T x_i) + (1 - t_i) \ln(1 - \mathrm{sigmoid}(\mathbf{w}^T x_i)) \right]$
   This is the binary cross entropy loss.
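The same quantity computed directly as a loss (a sketch; the clipping constant eps is an implementation detail added to avoid log(0)):

    import numpy as np

    def bce_loss(w, X, t, eps=1e-12):
        """Negative log likelihood = binary cross entropy, summed over datapoints."""
        s = 1.0 / (1.0 + np.exp(-(X @ w)))
        s = np.clip(s, eps, 1 - eps)         # guard against log(0)
        return -np.sum(t * np.log(s) + (1 - t) * np.log(1 - s))

In practice this sum is minimized instead of maximizing the likelihood itself, since the product in part c underflows for large n.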
e. Introducing a truncated loss to the loss function: it is a loss which clips the terms that cause higher errors than a certain threshold, so that such points (e.g. outliers) cannot dominate the total loss.
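One way such a truncated loss can be realized (a sketch; the threshold value is an arbitrary illustrative choice):

    import numpy as np

    def truncated_loss(per_point_losses, threshold=5.0):
        """Clip each datapoint's loss term at a threshold so that terms
        with errors above the threshold cannot dominate the total."""
        return np.sum(np.minimum(per_point_losses, threshold))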
Q5.

a. Output of the first layer:
   $a_j^{(1)} = \sigma(z_j^{(1)})$ for $j = 0, 1, 2, 3$.

b. Output of the second layer:
   $a^{(2)} = \sigma(z^{(2)})$, where $z^{(2)} = \sum_{j} w_{1,j}^{(2)} a_j^{(1)}$.
   (The sigmoid derivative $\sigma'(z) = \sigma(z)[1 - \sigma(z)]$ is used in the gradients below.)
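A sketch of this forward pass (shapes are assumptions read off the answer: four first-layer units j = 0..3 feeding a single sigmoid output):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x, W1, w2):
        z1 = W1 @ x          # first-layer pre-activations z_j^{(1)}, j = 0..3
        a1 = sigmoid(z1)     # first-layer outputs a_j^{(1)} = sigma(z_j^{(1)})
        z2 = w2 @ a1         # second-layer pre-activation z^{(2)}
        a2 = sigmoid(z2)     # network output a^{(2)} = sigma(z^{(2)})
        return z1, a1, z2, a2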
Classification of an observation $y$ as face or background.

i. Classification rule: by Bayes' rule, $P(\text{class} \mid y) \propto P(y \mid \text{class}) \, P(\text{class})$, so
   if $P(\text{background} \mid y) > P(\text{face} \mid y)$, i.e. if $P(y \mid \text{background}) P(\text{background}) > P(y \mid \text{face}) P(\text{face})$:
      $t = \text{background}$
   else:
      $t = \text{face}$
   Assuming the prior probabilities are the same, the rule reduces to:
   if $P(y \mid \text{background}) > P(y \mid \text{face})$: $t = \text{background}$, else $t = \text{face}$.
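The rule as a sketch (the two class-conditional densities here are illustrative Gaussians, not the ones from the question):

    from scipy.stats import norm

    p_face = norm(loc=1.0, scale=1.0).pdf    # assumed P(y | face)
    p_bg = norm(loc=-1.0, scale=1.0).pdf     # assumed P(y | background)

    def classify(y, prior_face=0.5, prior_bg=0.5):
        # t = background if P(y|background)P(background) > P(y|face)P(face)
        if p_bg(y) * prior_bg > p_face(y) * prior_face:
            return "background"
        return "face"

    print(classify(0.3))                     # "face" for these example densities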
ii. Let $R_b$ be the region where the decision is background and $R_f$ the region where the decision is face.

iii. False positive ratio: a false positive is a background point classified as face, so
   $P(\text{false positive}) = P(y \in R_f, \text{background}) = \int_{R_f} P(y \mid \text{background}) P(\text{background}) \, dy$
   and
   false positive ratio $= \dfrac{P(\text{false positive})}{P(\text{false positive}) + P(\text{true negative})} = \dfrac{\int_{R_f} P(y \mid \text{background}) P(\text{background}) \, dy}{\int_{R_f} P(y \mid \text{background}) P(\text{background}) \, dy + \int_{R_b} P(y \mid \text{background}) P(\text{background}) \, dy}$
   Assuming the prior probabilities are the same, $P(\text{background})$ cancels:
   false positive ratio $= \int_{R_f} P(y \mid \text{background}) \, dy = 0.25$ (for the densities given in the question).
iv. False negative ratio: a false negative is a face point classified as background, so
   $P(\text{false negative}) = P(y \in R_b, \text{face}) = \int_{R_b} P(y \mid \text{face}) P(\text{face}) \, dy$
   and
   false negative ratio $= \dfrac{P(\text{false negative})}{P(\text{false negative}) + P(\text{true positive})} = \dfrac{\int_{R_b} P(y \mid \text{face}) P(\text{face}) \, dy}{\int_{R_b} P(y \mid \text{face}) P(\text{face}) \, dy + \int_{R_f} P(y \mid \text{face}) P(\text{face}) \, dy}$
   Assuming the same prior probabilities:
   false negative ratio $= \int_{R_b} P(y \mid \text{face}) \, dy$.
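Both ratios can be checked numerically for given densities (a sketch; the example Gaussians and the decision boundary at y = 0 are illustrative assumptions, so the resulting numbers are not the question's 0.25):

    import numpy as np
    from scipy.stats import norm

    p_face = norm(loc=1.0, scale=1.0).pdf    # assumed P(y | face)
    p_bg = norm(loc=-1.0, scale=1.0).pdf     # assumed P(y | background)

    y = np.linspace(-10, 10, 20001)
    dy = y[1] - y[0]
    in_Rf = y > 0.0                          # R_f: decision is face; R_b is the rest

    fp_ratio = np.trapz(p_bg(y)[in_Rf], dx=dy)      # integral over R_f of P(y|background)
    fn_ratio = np.trapz(p_face(y)[~in_Rf], dx=dy)   # integral over R_b of P(y|face)
    print(fp_ratio, fn_ratio)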
Update equation:
   $p_{k+1} = p_k - \alpha \left.\dfrac{\partial E}{\partial p}\right|_{p_k}$
   where $p_k$ is the $k$-th iteration of the parameter $p$ and $\alpha$ is the learning rate.
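The update as a sketch (the objective E(p) = (p - 3)^2 and the learning rate value are illustrative assumptions):

    def gd_step(p, grad_E, alpha=0.1):
        """One iteration: p_{k+1} = p_k - alpha * dE/dp evaluated at p_k."""
        return p - alpha * grad_E(p)

    p = 5.0
    grad_E = lambda p: 2 * (p - 3.0)         # derivative of E(p) = (p - 3)^2
    for _ in range(100):
        p = gd_step(p, grad_E)
    print(p)                                 # converges toward the minimum at 3.0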
Since it is a binary classification problem, let the loss function be the binary cross entropy loss:
   $E = -\left[ t \ln a^{(2)} + (1 - t) \ln(1 - a^{(2)}) \right]$, with $a^{(2)} = \sigma(z^{(2)})$,
   so that
   $\dfrac{\partial E}{\partial a^{(2)}} = -\dfrac{t}{a^{(2)}} + \dfrac{1 - t}{1 - a^{(2)}}$.

Considering the $w_{1,j}^{(2)}$ parameters, and using $\sigma'(z) = \sigma(z)[1 - \sigma(z)]$:
   $\dfrac{\partial E}{\partial w_{1,j}^{(2)}} = \dfrac{\partial E}{\partial a^{(2)}} \, \sigma(z^{(2)})[1 - \sigma(z^{(2)})] \, a_j^{(1)} = (a^{(2)} - t) \, a_j^{(1)}$

Considering the $w_{j,i}^{(1)}$ parameters:
   $\dfrac{\partial E}{\partial w_{j,i}^{(1)}} = (a^{(2)} - t) \, w_{1,j}^{(2)} \, \sigma(z_j^{(1)})[1 - \sigma(z_j^{(1)})] \, x_i$
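The two gradients as a sketch (same assumed shapes as the forward pass above: W1 is 4 x d, w2 has 4 entries, and the output is a single sigmoid unit):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def gradients(x, t, W1, w2):
        """Backprop for the two-layer sigmoid network with binary cross entropy E."""
        z1 = W1 @ x
        a1 = sigmoid(z1)
        z2 = w2 @ a1
        a2 = sigmoid(z2)
        delta2 = a2 - t                       # dE/dz2: BCE + sigmoid simplify to a2 - t
        dE_dw2 = delta2 * a1                  # dE/dw_{1,j}^{(2)} = (a2 - t) a_j^{(1)}
        delta1 = delta2 * w2 * a1 * (1 - a1)  # dE/dz_j^{(1)}
        dE_dW1 = np.outer(delta1, x)          # dE/dw_{j,i}^{(1)} = delta1_j * x_i
        return dE_dw2, dE_dW1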
