Machine Learning — Introduction

Definitions:
- Arthur Samuel: "Field of study that gives computers the ability to learn without being explicitly programmed."
- Tom Mitchell: "A computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E."

Machine learning algorithms:
- Supervised learning
- Unsupervised learning
- Others: recommender systems, reinforcement learning

Supervised learning:
- Regression: predict a continuous-valued output. Example: predict house price from size.
- Classification: predict a discrete-valued output (0 or 1). Example: breast cancer (malignant vs. benign). Here we considered tumor size as one attribute; with more than one attribute the same idea applies, and in principle there can be an infinite number of features.

Unsupervised learning:
- Data comes with no labels; the algorithm finds structure in the data on its own (e.g. clustering).

Linear regression (one variable):
Hypothesis: hθ(x) = θ0 + θ1·x
Cost function: J(θ0, θ1) = (1/2m) · Σ_{i=1..m} (hθ(x^(i)) − y^(i))²

Outline: start with some θ0, θ1 (say θ0 = 0, θ1 = 0), then keep changing θ0, θ1 to reduce J(θ0, θ1) until we hopefully end up at a minimum.

Gradient descent algorithm:
repeat until convergence {
    θj := θj − α · ∂/∂θj J(θ0, θ1)     (for j = 0 and j = 1)
}

Correct (simultaneous) update:
    temp0 := θ0 − α · ∂/∂θ0 J(θ0, θ1)
    temp1 := θ1 − α · ∂/∂θ1 J(θ0, θ1)
    θ0 := temp0
    θ1 := temp1
Incorrect update: computing temp0, assigning θ0 := temp0, and only then computing temp1 — temp1 would then use the already-updated θ0.

α is the learning rate:
- If α is too small, gradient descent can be slow.
- If α is too large, gradient descent can overshoot the minimum; it may fail to converge, or even diverge.

Gradient descent for linear regression:
    ∂/∂θ0 J(θ0, θ1) = (1/m) · Σ (hθ(x^(i)) − y^(i))
    ∂/∂θ1 J(θ0, θ1) = (1/m) · Σ (hθ(x^(i)) − y^(i)) · x^(i)
"Batch" gradient descent: each step uses all m training examples.

Linear algebra review: matrices and vectors — operations like addition can't be performed unless the matrices have the same dimensions.
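The update rules above can be sketched in code. This is a minimal illustration, not part of the original notes; the function name and the small dataset in the usage example are made up for demonstration.

```python
def gradient_descent(x, y, alpha=0.1, iters=1500):
    """Batch gradient descent for h(x) = theta0 + theta1 * x.

    Note the simultaneous update: both temps are computed from the
    old (theta0, theta1) before either parameter is overwritten."""
    m = len(x)
    theta0, theta1 = 0.0, 0.0          # start at (0, 0), as in the notes
    for _ in range(iters):
        # errors h(x^(i)) - y^(i) over all m examples ("batch": uses every example)
        errors = [theta0 + theta1 * xi - yi for xi, yi in zip(x, y)]
        temp0 = theta0 - alpha * sum(errors) / m
        temp1 = theta1 - alpha * sum(e * xi for e, xi in zip(errors, x)) / m
        theta0, theta1 = temp0, temp1  # simultaneous assignment
    return theta0, theta1
```

For data generated by y = 1 + 2x, e.g. x = [0, 1, 2, 3], y = [1, 3, 5, 7], the returned parameters converge to approximately (1, 2).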
Matrix-vector multiplication trick: with house sizes 2104, 1416, 1534, 852 and hypothesis hθ(x) = −40 + 0.25·x, all four predictions can be computed in one matrix-vector product:

    [1 2104]              [−40 + 0.25·2104]
    [1 1416]   [ −40]     [−40 + 0.25·1416]
    [1 1534] × [0.25]  =  [−40 + 0.25·1534]
    [1  852]              [−40 + 0.25· 852]

Multivariate linear regression:
Hypothesis: hθ(x) = θ0 + θ1·x1 + θ2·x2 + ... + θn·xn
For convenience of notation, define x0 = 1. Then, with parameter vector θ = [θ0, θ1, ..., θn] and feature vector x = [x0, x1, ..., xn]:
    hθ(x) = θᵀx

Gradient descent for multiple features:
repeat {
    θj := θj − α · (1/m) · Σ_{i=1..m} (hθ(x^(i)) − y^(i)) · xj^(i)
}   (simultaneously update θj for j = 0, ..., n)

Gradient descent in practice I — feature scaling:
- Make sure features are on a similar scale. e.g. x1 = size (0–2000 feet²), x2 = number of bedrooms (1–5). If you implement this, gradient descent takes a much more direct path to the global minimum.
    x1 = size/2000,  x2 = (# bedrooms)/5,  so that 0 ≤ x1 ≤ 1 and 0 ≤ x2 ≤ 1
- Aim for every feature to be roughly in the range −1 ≤ xi ≤ 1.
- Mean normalization: replace xi with (xi − μi)/si so features have approximately zero mean (do not apply to x0 = 1). e.g. x1 = (size − 1000)/2000, x2 = (#bedrooms − 2)/5. Here μi is the mean and si is the range (max − min) or the standard deviation; these need not be exact.

Gradient descent in practice II — learning rate:
- How to make sure gradient descent is working correctly: plot J(θ) (y-axis) against the number of iterations (x-axis); J(θ) should decrease after every iteration.
- For sufficiently small α, J(θ) should decrease on every iteration. But if α is too small, gradient descent is slow to converge.
Summary:
- If α is too small: slow convergence.
- If α is too large: J(θ) may not decrease on every iteration; may not converge.
- To choose α, try a range of values, e.g. ..., 0.001, 0.01, 0.1, 1, ...

Logistic regression:
Classification — discrete output y ∈ {0, 1}.
    If hθ(x) ≥ 0.5, predict y = 1; if hθ(x) < 0.5, predict y = 0.
Logistic regression keeps 0 ≤ hθ(x) ≤ 1.

Hypothesis representation:
    hθ(x) = g(θᵀx),  where g(z) = 1/(1 + e^(−z))   (the sigmoid / logistic function)
Interpretation: hθ(x) = estimated probability that y = 1 on input x, parameterized by θ:
    hθ(x) = P(y = 1 | x; θ)
    P(y = 0 | x; θ) + P(y = 1 | x; θ) = 1, so P(y = 0 | x; θ) = 1 − P(y = 1 | x; θ)
Predict y = 1 if hθ(x) ≥ 0.5; predict y = 0 if hθ(x) < 0.5.
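The sigmoid hypothesis and the 0.5-threshold prediction rule can be sketched as below. This is an illustrative sketch, not from the notes; θ = (−3, 1, 1) in the test is just one parameter choice, which yields the boundary x1 + x2 = 3.

```python
import math

def sigmoid(z):
    # logistic function g(z) = 1 / (1 + e^(-z)); output is always in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def hypothesis(theta, x):
    # h_theta(x) = g(theta^T x); x must include the bias feature x0 = 1
    return sigmoid(sum(t * xi for t, xi in zip(theta, x)))

def predict(theta, x):
    # predict y = 1 iff h_theta(x) >= 0.5, equivalently iff theta^T x >= 0
    return 1 if hypothesis(theta, x) >= 0.5 else 0
```

Because g(0) = 0.5, thresholding the probability at 0.5 is the same as checking the sign of θᵀx.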
Decision boundary:
Since g(z) ≥ 0.5 exactly when z ≥ 0, "predict y = 1 if hθ(x) ≥ 0.5" is the same as "predict y = 1 if θᵀx ≥ 0". Example: hθ(x) = g(θ0 + θ1·x1 + θ2·x2) with θ0 = −3, θ1 = 1, θ2 = 1 predicts y = 1 when x1 + x2 ≥ 3; the line x1 + x2 = 3 is the decision boundary. With higher-order features (x1², x2², ...) the boundary can be non-linear.

Logistic regression cost function:
Training set: {(x^(1), y^(1)), ..., (x^(m), y^(m))}, m examples; x = [x0, x1, ..., xn] with x0 = 1; y ∈ {0, 1}.
    hθ(x) = 1/(1 + e^(−θᵀx))
How to choose θ?
    Cost(hθ(x), y) = −log(hθ(x))       if y = 1
                     −log(1 − hθ(x))   if y = 0
Intuition (figure above): if y = 1 and hθ(x) = 1, the cost is 0; but as hθ(x) → 0, the cost → ∞. That is, if the hypothesis predicts P(y = 1 | x; θ) = 0 but it turns out that y = 1, we penalize the learning algorithm with a very large cost.

Simplified cost function and gradient descent:
    J(θ) = −(1/m) · Σ_{i=1..m} [ y^(i)·log(hθ(x^(i))) + (1 − y^(i))·log(1 − hθ(x^(i))) ]
Gradient descent:
repeat {
    θj := θj − α · (1/m) · Σ (hθ(x^(i)) − y^(i)) · xj^(i)
}   (simultaneously update all θj)
The update rule looks identical to linear regression's, but hθ(x) is now the sigmoid of θᵀx.

Advanced optimization:
Optimization algorithms: gradient descent, conjugate gradient, BFGS, L-BFGS.
- Advantages: no need to manually pick α; often faster than gradient descent.
- Disadvantage: more complex.

Multiclass classification — one-vs-all:
We convert the problem into several binary classification problems: train a logistic regression classifier hθ^(i)(x) for each class i to predict the probability that y = i. On a new input x, pick the class i that maximizes hθ^(i)(x).

The problem of overfitting:
Example — linear regression on housing prices (price vs. size):
- θ0 + θ1·x: underfitting ("high bias")
- θ0 + θ1·x + θ2·x²: just right
- θ0 + θ1·x + θ2·x² + θ3·x³ + θ4·x⁴: overfitting ("high variance")
Overfitting: if we have too many features, the learned hypothesis may fit the training set very well (J(θ) ≈ 0), but fail to generalize to new examples.

Regularization:
- Small values for the parameters θ0, θ1, ..., θn give a "simpler" hypothesis that is less prone to overfitting.
- Housing example: features x1, ..., x100; parameters θ0, θ1, ..., θ100.
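The simplified cost function J(θ) from the logistic regression section above can be sketched directly. A minimal illustration, not from the notes; the function name and the tiny dataset in the test are made up.

```python
import math

def logistic_cost(theta, X, y):
    """J(theta) = -(1/m) * sum( y*log(h) + (1-y)*log(1-h) ),
    where h = sigmoid(theta^T x). Each row of X includes x0 = 1."""
    m = len(y)
    total = 0.0
    for xi, yi in zip(X, y):
        z = sum(t * xj for t, xj in zip(theta, xi))
        h = 1.0 / (1.0 + math.exp(-z))          # h_theta(x^(i))
        total += yi * math.log(h) + (1 - yi) * math.log(1 - h)
    return -total / m
```

Sanity check: with θ = 0 the hypothesis outputs 0.5 for every example, so the cost is −log(0.5) = log 2 ≈ 0.693 regardless of the labels.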
