1. Suppose that you have trained a logistic regression classifier, and it outputs on a new example x a prediction h_θ(x) = 0.4. This means (check all that apply):

[ ] Our estimate for P(y = 0 | x; θ) is 0.4.
[ ] Our estimate for P(y = 0 | x; θ) is 0.6.
[ ] Our estimate for P(y = 1 | x; θ) is 0.4.
[ ] Our estimate for P(y = 1 | x; θ) is 0.6.

2. Suppose you have the following training set, and fit a logistic regression classifier h_θ(x) = g(θ_0 + θ_1 x_1 + θ_2 x_2).

[Training-set table of (x_1, x_2, y) values and accompanying scatter plot]

Which of the following are true? Check all that apply.

[ ] Adding polynomial features (e.g., instead using h_θ(x) = g(θ_0 + θ_1 x_1 + θ_2 x_2 + θ_3 x_1^2 + θ_4 x_1 x_2 + θ_5 x_2^2)) could increase how well we can fit the training data.
[ ] At the optimal value of θ (e.g., found by fminunc), we will have J(θ) ≥ 0.
[ ] Adding polynomial features (e.g., instead using h_θ(x) = g(θ_0 + θ_1 x_1 + θ_2 x_2 + θ_3 x_1^2 + θ_4 x_1 x_2 + θ_5 x_2^2)) would increase J(θ) because we are now summing over more terms.
[ ] If we train gradient descent for enough iterations, for some examples x^(i) in the training set it is possible to obtain h_θ(x^(i)) > 1.

3. For logistic regression, the gradient is given by ∂J(θ)/∂θ_j = (1/m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) x_j^(i). Which of these is a correct gradient descent update for logistic regression with a learning rate of α? Check all that apply.

[ ] θ_j := θ_j − α (1/m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) x_j^(i) (simultaneously update for all θ_j).
[ ] θ_j := θ_j − α (1/m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) x^(i) (simultaneously update for all θ_j).
[ ] θ_j := θ_j − α (1/m) Σ_{i=1}^{m} (1/(1 + e^(−θ^T x^(i))) − y^(i)) x_j^(i) (simultaneously update for all θ_j).
[ ] θ := θ − α (1/m) Σ_{i=1}^{m} (θ^T x^(i) − y^(i)) x^(i).

4. Which of the following statements are true? Check all that apply.

[ ] The sigmoid function g(z) is never greater than one (g(z) ≤ 1).
[ ] The cost function J(θ) for logistic regression trained with m ≥ 1 examples is always greater than or equal to zero.
[ ] Linear regression always works well for classification if you classify by using a threshold on the prediction made by linear regression.
[ ] For logistic regression, sometimes gradient descent will converge to a local minimum (and fail to find the global minimum). This is the reason we prefer more advanced optimization algorithms such as fminunc (conjugate gradient/BFGS/L-BFGS/etc.).

5. Suppose you train a logistic classifier h_θ(x) = g(θ_0 + θ_1 x_1 + θ_2 x_2). Suppose θ_0 = 6, θ_1 = 0, θ_2 = −1. Which of the following figures represents the decision boundary found by your classifier?

[Figure: candidate decision-boundary plots]

Answer: Lecture 6, Slide 10.
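The gradient descent update in question 3 is easy to sanity-check in code. Below is a minimal NumPy sketch (not part of the quiz; the names X, y, theta, and alpha are illustrative) of the vectorized rule θ := θ − α (1/m) X^T (g(Xθ) − y), which performs the per-component update θ_j := θ_j − α (1/m) Σ_i (h_θ(x^(i)) − y^(i)) x_j^(i) for every j simultaneously.

```python
import numpy as np

# Illustrative sketch of the gradient descent step from question 3.
# X, y, theta, alpha are assumed names, not taken from the quiz.

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z)); always strictly between 0 and 1.
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(theta, X, y, alpha):
    # Vectorized form of:
    #   theta_j := theta_j - alpha * (1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i),
    # applied to every component j at once.
    m = len(y)
    h = sigmoid(X @ theta)          # h_theta(x^(i)) for all i
    grad = (X.T @ (h - y)) / m      # (1/m) * sum_i (h - y) * x^(i)
    return theta - alpha * grad

# Tiny usage example with made-up data; the first column of X is the intercept term x_0 = 1.
X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 2.5]])
y = np.array([0.0, 0.0, 1.0])
theta = np.zeros(2)
for _ in range(1000):
    theta = gradient_descent_step(theta, X, y, alpha=0.1)
print(theta, sigmoid(X @ theta))    # predictions stay strictly between 0 and 1
```

Because g(Xθ) is always strictly between 0 and 1, no amount of training can produce h_θ(x^(i)) > 1, which is the point behind the last option of question 2 and the first option of question 4.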

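Questions 2 and 4 also rely on the fact that the logistic regression cost J(θ) = −(1/m) Σ_i [ y^(i) log h_θ(x^(i)) + (1 − y^(i)) log(1 − h_θ(x^(i))) ] is an average of non-negative terms, so J(θ) ≥ 0 for any θ and any m ≥ 1. A small sketch with made-up data (the arrays below are illustrative, not the quiz's training set):

```python
import numpy as np

# J(theta) = -(1/m) * sum_i [ y_i*log(h_i) + (1-y_i)*log(1-h_i) ].
# Each term is -log(p) for some p in (0, 1), hence >= 0, so J(theta) >= 0.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    h = sigmoid(X @ theta)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

# Made-up training data: intercept column plus one feature.
X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

for theta in [np.zeros(2), np.array([6.0, -1.0]), np.array([-4.0, 2.5])]:
    print(theta, cost(theta, X, y))   # every printed cost is >= 0
```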
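For question 5 the boundary can also be derived directly rather than read off a figure: with θ_0 = 6, θ_1 = 0, θ_2 = −1 the hypothesis is h_θ(x) = g(6 − x_2), which equals 0.5 exactly where 6 − x_2 = 0. The decision boundary is therefore the horizontal line x_2 = 6, with y = 1 predicted for x_2 ≤ 6. The sketch below (illustrative values only, not the quiz's answer key) checks this numerically:

```python
import numpy as np

# With theta = (6, 0, -1), h_theta(x) = g(6 - x2), so the h = 0.5 boundary
# is the horizontal line x2 = 6; y = 1 is predicted below it.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

theta = np.array([6.0, 0.0, -1.0])    # theta_0, theta_1, theta_2 from question 5

for x2 in [0.0, 5.9, 6.0, 6.1, 10.0]:
    x = np.array([1.0, 3.0, x2])      # x1 is irrelevant since theta_1 = 0; 3.0 is arbitrary
    h = sigmoid(theta @ x)
    print(f"x2 = {x2:4.1f} -> h_theta(x) = {h:.3f} -> predict y = {int(h >= 0.5)}")
```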