Shallow Neural Networks · Aug 23, 2:59 PM
Congratulations! You passed! Grade received: 98.75%. To pass: 80% or higher.

1. Which of the following are true? (Check all that apply.)

- a^[2](12) denotes the activation vector of the 12th layer on the 2nd training example. (Not selected)
- X is a matrix in which each row is one training example. (Not selected)
- a^[2](12) denotes the activation vector of the 2nd layer for the 12th training example. ✓ Correct
- a_4^[2] is the activation output by the 4th neuron of the 2nd layer. ✓ Correct
- X is a matrix in which each column is one training example. ✓ Correct
- a_4^[2] is the activation output of the 2nd layer for the 4th training example. (Not selected)
- a^[2] denotes the activation vector of the 2nd layer. ✓ Correct

2. The tanh activation is not always better than the sigmoid activation function for hidden units, because the mean of its output is closer to zero, and so it centers the data, making learning complex for the next layer. True/False?

✓ False. Correct. As seen in lecture, the output of tanh is between -1 and 1; it thus centers the data, which makes the learning simpler for the next layer.

3. Which of these is a correct vectorized implementation of forward propagation for layer l, where 1 ≤ l ≤ L?

✓ Z^[l] = W^[l] A^[l-1] + b^[l], A^[l] = g^[l](Z^[l]). Correct.

4. You are building a binary classifier for recognizing cucumbers (y = 1) vs. watermelons (y = 0). Which one of these activation functions would you recommend using for the output layer?

- Leaky ReLU
- ✓ sigmoid
- tanh
- ReLU

Correct. Sigmoid outputs a value between 0 and 1, which makes it a very good choice for binary classification: you can classify as 0 if the output is less than 0.5 and as 1 if the output is more than 0.5. It can be done with tanh as well, but it is less convenient as the output is between -1 and 1.

5. Consider the following code:

A = np.random.randn(4, 3)
B = np.sum(A, axis=1, keepdims=True)

What will be B.shape? (If you're not sure, feel free to run this in python to find out.)

✓ (4, 1). Correct. We use keepdims=True to make sure that B.shape is (4, 1) and not (4, ); it makes our code more robust.
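The vectorized step from question 3 and the keepdims behavior from question 5 can be checked with a short NumPy sketch (the `forward_layer` helper and the example sizes are illustrative, not part of the quiz):

```python
import numpy as np

np.random.seed(0)

# Question 5: keepdims=True keeps the reduced axis as length 1,
# giving a (4, 1) column vector instead of a rank-1 (4,) array.
A = np.random.randn(4, 3)
B = np.sum(A, axis=1, keepdims=True)
print(B.shape)  # (4, 1)

# Question 3: vectorized forward propagation for one layer,
# Z[l] = W[l] A[l-1] + b[l], A[l] = g[l](Z[l]), with g = tanh here.
def forward_layer(W, b, A_prev, g=np.tanh):
    Z = W @ A_prev + b          # b (n, 1) broadcasts across the m columns
    return g(Z), Z

W = np.random.randn(4, 2)       # 4 units, 2 input features
b = np.zeros((4, 1))
A_prev = np.random.randn(2, 5)  # m = 5 examples, one per column
A1, Z1 = forward_layer(W, b, A_prev)
print(A1.shape)  # (4, 5)
```

Because examples are stored as columns, the same two lines work for any layer width and any batch size m.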
6. Suppose you have built a neural network. You decide to initialize the weights and biases to be zero. Which of the following statements is true?

- Each neuron in the first hidden layer will perform the same computation in the first iteration. But after one iteration of gradient descent they will learn to compute different things because we have "broken symmetry."
- ✓ Each neuron in the first hidden layer will perform the same computation. So even after multiple iterations of gradient descent each neuron in the layer will be computing the same thing as the other neurons.
- Each neuron in the first hidden layer will compute the same thing, but neurons in different layers will compute different things; thus we have accomplished "symmetry breaking" as described in lecture.
- The first hidden layer's neurons will perform different computations from each other even in the first iteration; their parameters will thus keep evolving in their own way.

Correct.

7. Logistic regression's weights w should be initialized randomly rather than to all zeros, because if you initialize to all zeros, then logistic regression will fail to learn a useful decision boundary because it will fail to "break symmetry." True/False?

✓ False. Correct. Logistic regression doesn't have a hidden layer. If you initialize the weights to zeros, the first example x fed into logistic regression will output zero, but the derivatives of logistic regression depend on the input x (because there's no hidden layer), which is not zero. So at the second iteration, the weights' values follow x's distribution and are different from each other if x is not a constant vector.

8. You have built a network using the tanh activation for all the hidden units. You initialize the weights to relatively large values, using np.random.randn(.., ..) * 1000. What will happen?

- This will cause the inputs of the tanh to also be very large, causing the units to be "highly activated" and thus speed up learning compared to if the weights had to start from small values.
- This will cause the inputs of the tanh to be very large, thus causing gradients to also become large. You therefore have to set α to be very small to prevent divergence; this will slow down learning.
- It doesn't matter. So long as you initialize the weights randomly, gradient descent is not affected by whether the weights are large or small.
- ✓ This will cause the inputs of the tanh to be very large, thus causing gradients to be close to zero. The optimization algorithm will thus become slow.

Correct.
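Questions 6 and 8 can both be demonstrated numerically. The sketch below uses hand-picked illustrative values (not from the quiz) to show the symmetry problem with zero initialization and the vanishing tanh gradient with very large weights:

```python
import numpy as np

X = np.array([[1.0, -2.0, 2.0],
              [0.5,  1.0, -1.5]])    # 2 features, 3 examples

# Question 6: with all-zero weights and biases, every hidden unit
# computes the same activation, so their gradient updates stay
# identical too -- gradient descent never breaks the symmetry.
W1 = np.zeros((4, 2))
b1 = np.zeros((4, 1))
A1 = np.tanh(W1 @ X + b1)
print(np.allclose(A1, A1[0]))       # True: all 4 rows are identical

# Question 8: weights of order 1000 push tanh onto its flat tails,
# where the local gradient 1 - tanh(z)^2 is vanishingly small.
W_big = np.full((4, 2), 1000.0)     # stands in for randn(..,..) * 1000
Z = W_big @ X + b1
local_grad = 1 - np.tanh(Z) ** 2
print(local_grad.max())             # ~0: optimization slows to a crawl
```

`np.full` is used instead of `np.random.randn` only to keep the demonstration deterministic; the saturation effect is the same with random large weights.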
Yes: tanh becomes flat for large values; this leads its gradient to be close to zero. This slows down the optimization algorithm.

9. Consider the following 1 hidden layer neural network:

[Figure: inputs x1 and x2 feeding a hidden layer of 4 units, which feeds a single output unit.]

Which of the following statements are True? (Check all that apply.)

- ✓ b^[1] will have shape (4, 1). Correct
- W^[1] will have shape (2, 4).
- ✓ W^[1] will have shape (4, 2). Correct
- b^[1] will have shape (2, 1).
- ✓ b^[2] will have shape (1, 1). Correct
- W^[2] will have shape (4, 1).
- b^[2] will have shape (4, 1).
- ✓ W^[2] will have shape (1, 4). Correct

You didn't select all the correct answers.

10. In the same network as the previous question, what are the dimensions of Z^[1] and A^[1]?

- Z^[1] and A^[1] are (1, 4).
- Z^[1] and A^[1] are (4, 2).
- ✓ Z^[1] and A^[1] are (4, m). Correct
- Z^[1] and A^[1] are (4, 1).
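The shapes in questions 9 and 10 all follow one convention: a layer-l quantity has n^[l] rows, and activations have one column per example. A minimal check for the 2-input, 4-hidden-unit, 1-output network (the small-scale initialization and m = 5 are assumptions for illustration):

```python
import numpy as np

# n_x = 2 input features, n_h = 4 hidden units, n_y = 1 output, m examples
n_x, n_h, n_y, m = 2, 4, 1, 5

W1 = np.random.randn(n_h, n_x) * 0.01   # (4, 2)
b1 = np.zeros((n_h, 1))                 # (4, 1)
W2 = np.random.randn(n_y, n_h) * 0.01   # (1, 4)
b2 = np.zeros((n_y, 1))                 # (1, 1)

X = np.random.randn(n_x, m)             # each column is one example
Z1 = W1 @ X + b1                        # (4, m)
A1 = np.tanh(Z1)                        # (4, m)
print(Z1.shape, A1.shape)               # (4, 5) (4, 5)
```

If any shape were wrong, the matrix products above would raise a broadcasting error, which makes this a handy sanity check when implementing a network from scratch.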
