
Departamento de Física Médica - Centro Atómico Bariloche - IB

Cost Function and Optimization

Ariel Hernán Curiale (ariel.curiale@cab.cnea.gov.ar)
Most of the slides were adapted from Fei-Fei Li, J. Johnson and S. Yeung, cs231n, Stanford 2017.

UNCUYO - Universidad Nacional de Cuyo
Image Classification

❖ Nearest Neighbors
❖ K-Nearest Neighbors
❖ Linear Classifier (still pending):
❖ Support Vector Machine
❖ Softmax Classifier
Linear Classification

f(x, W) = Wx = Ŵx + b

Using the bias trick (append a constant feature x_0 = 1 to every input), the bias b is absorbed into the weight matrix, so the classifier reduces to f(x, W) = Wx (see the sketch at the end of this slide).

[Figure: each row of W multiplies the flattened image x to produce one score per class (cat, deer, dog, frog, horse).]

How do we compute the weights?  S = f(x, W)

Why not use least squares?
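A minimal sketch of the bias trick (illustrative, not from the slides; the names are assumed). Both forms of f produce the same scores:

import numpy as np

# Bias trick: Ŵx + b equals Wx once a constant 1 is appended to x.
def scores(x, W_hat, b):
    return W_hat.dot(x) + b                  # f(x, W) = Ŵx + b

def scores_bias_trick(x, W_hat, b):
    x1 = np.append(x, 1.0)                   # append the constant feature x_0 = 1
    W = np.hstack([W_hat, b[:, None]])       # b becomes one extra column of W
    return W.dot(x1)                         # same result as Ŵx + b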


Loss Function, Cost Function, Objective

Cost function

❖ Cost function: also called loss function or objective.
❖ It tells us how well we are solving the problem, given a dataset {(x_i, y_i)}_{i=1}^N:

loss = (1/N) Σ_i L(f(x_i, W), y_i)

❖ There are different ways to define this cost function, and they depend strongly on the type of problem.
Support Vector Machine

❖ General idea of the Support Vector Machine: find a maximum-margin hyperplane.

w^T x_i + b > 0        Solution: w^T x + b = 0

Classification function:
f(x) = sign(w^T x + b)

[Figure: feature 1 vs. feature 2; the hyperplane w^T x + b = 0, with normal vector w, separates the region w^T x_i + b > 0 from the region w^T x_i + b < 0.]

❖ SVM is a linear classifier that optimizes the hinge loss with an L2 regularization (a minimal sketch of the decision function follows below).

Reminder (speaker note): come back to problems that are not linearly separable, via kernels, at the very end.
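As a minimal sketch (illustrative, not from the slides) of the binary decision function f(x) = sign(w^T x + b):

import numpy as np

def svm_predict(x, w, b):
    # +1 on one side of the hyperplane w^T x + b = 0, -1 on the other
    return np.sign(w.dot(x) + b)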
Hinge loss: Multi-Class SVM Loss

❖ Hinge loss: it asks that the correct class of the image obtain a score higher than that of each incorrect class, by at least a small margin:

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

where S = f(x_i, W) and s_{y_i} is the score obtained for the true class.
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples, with Δ = 1.

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

For the first example (true-class score 3.2, remaining scores 5.1 and −1.7):

L = max(0, 5.1 − 3.2 + 1) + max(0, −1.7 − 3.2 + 1)
  = max(0, 2.9) + max(0, −3.9)
  = 2.9 + 0
  = 2.9

L: 2.9
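A minimal sketch of this computation (assuming `scores` holds the class scores and `y` indexes the true class):

import numpy as np

def svm_loss_single(scores, y, delta=1.0):
    margins = np.maximum(0, scores - scores[y] + delta)
    margins[y] = 0.0                                    # the j = y_i term is excluded
    return margins.sum()

print(svm_loss_single(np.array([3.2, 5.1, -1.7]), 0))   # 2.9, as above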
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

L: 2.9   0.0   12.9        loss = (2.9 + 0.0 + 12.9)/3 = 5.27
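A vectorized sketch of the full-dataset loss (shapes are assumptions, not code from the slides); the demo matrix holds the three CS231n example scores whose per-example losses are 2.9, 0.0 and 12.9:

import numpy as np

def svm_loss(scores, y, delta=1.0):
    # scores: (N, C) class scores; y: (N,) true-class indices
    N = scores.shape[0]
    correct = scores[np.arange(N), y][:, None]           # s_{y_i} per example
    margins = np.maximum(0, scores - correct + delta)
    margins[np.arange(N), y] = 0.0                       # drop the j = y_i terms
    return margins.sum() / N

scores = np.array([[3.2, 5.1, -1.7],
                   [1.3, 4.9,  2.0],
                   [2.2, 2.5, -3.1]])
print(svm_loss(scores, np.array([0, 1, 2])))             # ~5.27 (= 15.8 / 3)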


Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

Do we use Δ = 1 or Δ = 100? What does that depend on?

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

L: 2.9   0.0   12.9        loss = 5.27
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

What happens to the value of the loss if the scores for the car example change slightly?

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

L: 2.9   0.0   12.9        loss = 5.27
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

What minimum and maximum values can the loss take?

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

L: 2.9   0.0   12.9        loss = 5.27
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

If Δ = 1 and W is small, then s ≈ 0. That being so, what value does the loss take?

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

L: 2.9   0.0   12.9        loss = 5.27
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

What happens if the sum runs over all classes (including j = y_i)?

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

L: 2.9   0.0   12.9        loss = 5.27
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

What happens if we use the mean over classes instead of the sum?

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

L: 2.9   0.0   12.9        loss = 5.27
Support Vector Machine

❖ Suppose we have only 3 classes and 3 examples.

Img. source: Stanford CS231n 2017

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i)

What happens if we use this metric instead (the squared hinge)?

L = Σ_{j≠y_i} max(0, s_j − s_{y_i} + 1)²

L: 2.9   0.0   12.9        loss = 5.27
Regularization

loss = (1/N) Σ_i L(f(x_i, W), y_i)

Img. source: Stanford CS231n 2017
Regularization

loss = (1/N) Σ_i L(f(x_i, W), y_i) + R(W)

Img. source: Stanford CS231n 2017

Regularization

loss = (1/N) Σ_i Σ_{j≠y_i} max(0, f(x_i, W)_j − f(x_i, W)_{y_i} + Δ) + R(W)

❖ L2:  R(W) = Σ_{k,l} W_{k,l}²
❖ L1:  R(W) = Σ_{k,l} |W_{k,l}|
❖ Elastic net (L1 + L2):  R(W) = Σ_{k,l} (β W_{k,l}² + |W_{k,l}|)
❖ Max norm regularization
❖ Dropout
❖ Batch normalization also has a regularizing effect

A sketch of the L1 and L2 penalties in code follows below.
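A minimal sketch (illustrative; `reg` is a hyperparameter, not a value from the slides):

import numpy as np

def l2_penalty(W):
    return np.sum(W * W)            # R(W) = sum of squared weights

def l1_penalty(W):
    return np.sum(np.abs(W))        # R(W) = sum of absolute weights

def total_loss(data_loss, W, reg=1e-3):
    # full objective: data term plus the weighted regularization term
    return data_loss + reg * l2_penalty(W)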
Image Classification

❖ Nearest Neighbors
❖ K-Nearest Neighbors
❖ Linear Classifier:
❖ Support Vector Machine
❖ Softmax Classifier (still pending)
Softmax Classifier

❖ As with SVM, in Softmax the activation function is linear:

f(x, W) = Wx

❖ What changes, compared with SVM, is its cost function.
❖ Softmax is the generalization of the classifier known as Binary Logistic Regression.
❖ The Softmax cost function is also known as Categorical Cross Entropy.
Softmax Classifier

❖ Interpreting the scores as unnormalized log probabilities for each class, we have

P(Y = j | X = x_i) = e^{f_j(x_i)} / Σ_k e^{f_k(x_i)}

softmax:  f_i(z) = e^{z_i} / Σ_j e^{z_j}

❖ We now want to maximize the log likelihood; equivalently, we minimize

L = −log P(Y = y_i | X = x_i)
Softmax Classifier

❖ As with SVM it is still a linear classifier (Wx), but instead of the hinge cost function we use the negative log likelihood:

L = −log( e^{f_{y_i}} / Σ_j e^{f_j} ) = −f_{y_i} + log Σ_j e^{f_j}

loss = (1/N) Σ_i L(f(x_i, W), y_i) + R(W)
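The identity above gives a stable way to evaluate the loss without forming the probability ratio explicitly; a minimal sketch (names are assumed):

import numpy as np

def softmax_loss_single(f, y):
    f = f - np.max(f)                          # stability shift (cancels out)
    return -f[y] + np.log(np.sum(np.exp(f)))   # -f_{y_i} + log sum_j e^{f_j}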
Softmax Classifier

L = −log( e^{f_{y_i}} / Σ_j e^{f_j} ) = −f_{y_i} + log Σ_j e^{f_j}

Img. source: Stanford CS231n 2017

Softmax Classifier

❖ Information theory: the cross entropy between the true distribution p and the estimated distribution q is defined as

H(p, q) = −Σ_x p(x) log q(x)

❖ The softmax classifier minimizes the cross entropy between the estimated probabilities (the normalized log probabilities) and the true distribution (i.e. p = (0, …, 1, …, 0)).
❖ In other words, this cost function pushes the estimated distribution to concentrate its mass on the correct class.
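A small illustrative sketch showing that with a one-hot p the cross entropy reduces to −log q[y], i.e. exactly the softmax loss:

import numpy as np

def cross_entropy(p, q):
    return -np.sum(p * np.log(q))              # H(p, q)

f = np.array([1.0, 2.0, 3.0])                  # class scores
q = np.exp(f) / np.exp(f).sum()                # softmax probabilities
p = np.array([0.0, 0.0, 1.0])                  # one-hot true distribution
print(cross_entropy(p, q), -np.log(q[2]))      # the two values coincide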
SVM vs Softmax

❖ SVM: only cares that the correct class wins by the margin, regardless of the differences among the other classes.
❖ Softmax: tries to improve the relationship among all class scores, since it pushes the probability of the correct class towards 1 and the probabilities of the incorrect classes towards 0.
❖ So far we have defined three ways to classify images: KNN, SVM and Softmax.
❖ SVM and Softmax are both linear; their main difference lies in the cost function they use: Hinge vs. Categorical Cross Entropy.
❖ We also saw the effect of regularization and possible ways of applying it (L1, L2, dropout, …).

How do we compute the weights or parameters of our linear classification model?
Optimization

[Figure: optimization landscape. Copyright: Michael Werlberger]
Optimization

❖ Gradient descent

w^{(1)} = w^{(0)} − α ∇_w f(w)

[Figure: a curve f(w) with a local minimum and the absolute minimum; starting at w^{(0)}, each step moves by −α ∇_w f(w).]
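A minimal sketch of the update rule (toy example, not from the slides), using f(w) = w² so the gradient is known in closed form:

def grad_f(w):
    return 2.0 * w                  # gradient of f(w) = w^2

w = 5.0                             # w^(0)
alpha = 0.1                         # learning rate
for _ in range(100):
    w = w - alpha * grad_f(w)       # w^(t+1) = w^(t) - alpha * grad f(w^(t))
print(w)                            # approaches the minimum at w = 0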


Optimization

❖ We want to minimize the cost function

loss = (1/N) Σ_i L(f(x_i, W), y_i) + R(W)

❖ SVM:  L = Σ_{j≠y_i} max(0, f_j − f_{y_i} + Δ)
Optimization

[Figure: each row w_j^T of W multiplies the image x_i to give the class scores, e.g. 190.2 (cat), 117.1 (deer) and 135.3 (dog).]

L = Σ_{j≠y_i} max(0, w_j^T x_i − w_{y_i}^T x_i + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i) + R(W)

With N = 2 examples, y_1 = 2 and y_2 = 3:

loss = (1/2) [ (max(0, w_1^T x_1 − w_2^T x_1 + Δ) + max(0, w_3^T x_1 − w_2^T x_1 + Δ))
             + (max(0, w_1^T x_2 − w_3^T x_2 + Δ) + max(0, w_2^T x_2 − w_3^T x_2 + Δ)) ] + R(W)

where w_2 in the first bracket corresponds to the weights of the true class of the first example (y_1 = 2).

❖ Gradient descent: W = W − α ∇_W loss. Note that each gradient entry scales the image in a particular way: (x_1)_1 denotes the first component of image "1".

∇_W loss = (1/2) (∇_W L_1 + ∇_W L_2) + ∇_W R(W)

Writing I(·) for the indicator that the margin is positive (rows = classes, columns = input components):

∇_W L_1 = [  I(w_1^T x_1 − w_2^T x_1 + Δ > 0) (x_1)_1   ⋯   I(w_1^T x_1 − w_2^T x_1 + Δ > 0) (x_1)_3
            −(I(w_1^T x_1 − w_2^T x_1 + Δ > 0) + I(w_3^T x_1 − w_2^T x_1 + Δ > 0)) (x_1)_1   ⋯
             I(w_3^T x_1 − w_2^T x_1 + Δ > 0) (x_1)_1   ⋯   I(w_3^T x_1 − w_2^T x_1 + Δ > 0) (x_1)_3 ]
Optimization

[Figure: the same worked example — the rows w_j^T of W times the image x_i give the class scores 190.2 (cat), 117.1 (deer) and 135.3 (dog).]

L = Σ_{j≠y_i} max(0, w_j^T x_i − w_{y_i}^T x_i + Δ)

loss = (1/N) Σ_i L(f(x_i, W), y_i) + R(W)

❖ Gradient descent: W = W − α ∇_W loss

We can avoid a for loop with a matrix product!

∇_W loss = (1/2) (∇_W L_1 + ∇_W L_2) + ∇_W R(W)

# ---- Avoid loops
# Use the dot product to sum over the examples, each one scaled as the
# gradient dictates; the indicator values are stored in `binary`
dW = x_batch.T.dot(binary)
dW /= x_batch.shape[0]
# add the gradient of the regularization term
dW += reg * self.W
Optimization

❖ Gradient descent:  W = W − α ∇_W loss

∇_W loss = (1/N) Σ_{k=1}^N ∇_W L_k

L = Σ_{j≠y_i} max(0, w_j^T x_i − w_{y_i}^T x_i + Δ)

[Figure: the weight matrix W times the image x_i, giving one score per class (cat, deer, dog, frog, horse).]

Each entry of the gradient scales an input component: ∂L/∂w_{jk} is proportional to (x_i)_k. Row by row,

(∇_W L)_{y_i} = −( Σ_{j≠y_i} I(w_j^T x_i − w_{y_i}^T x_i + Δ > 0) ) x_i

(∇_W L)_j = I(w_j^T x_i − w_{y_i}^T x_i + Δ > 0) x_i ,   for j ≠ y_i
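Combining the row formulas above with the matrix product from the previous slide, here is a hedged sketch of how the `binary` indicator matrix might be built (shapes are assumptions: x_batch is (N, D), W is (D, C), y is (N,)):

import numpy as np

def svm_grad(x_batch, y, W, reg, delta=1.0):
    scores = x_batch.dot(W)                          # (N, C) class scores
    N = x_batch.shape[0]
    correct = scores[np.arange(N), y][:, None]       # s_{y_i} per example
    margins = scores - correct + delta
    binary = (margins > 0).astype(float)             # indicator I(margin > 0)
    binary[np.arange(N), y] = 0.0                    # exclude j = y_i
    binary[np.arange(N), y] = -binary.sum(axis=1)    # true-class row: minus the sum of indicators
    dW = x_batch.T.dot(binary) / N                   # the matrix product from the slide
    dW += reg * W                                    # regularization gradient
    return dW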
Batch Gradient Descent (BGD)

loss = (1/N) Σ_i L(f(x_i, W), y_i) + R(W)

❖ Instead of taking the whole training set, we split it into mini-batches (32/64/128 examples); the helper functions are sketched below:

# ------ Naive SGD implementation
tol = 1e-3
delta = np.inf
while delta > tol:
    data_batch = sample_data(training_data, batch_size)
    old = loss_fun(data_batch, weights)
    w_grads = evaluate_gradient(loss_fun, data_batch, weights)
    weights += - step_size * w_grads
    delta = np.abs(old - loss_fun(data_batch, weights))
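The helpers sample_data, loss_fun and evaluate_gradient are used but not defined in the slides; as an illustration (assuming training_data is a pair of arrays (X, Y)), the sampling step might look like:

import numpy as np

def sample_data(training_data, batch_size):
    X, Y = training_data
    idx = np.random.choice(X.shape[0], batch_size, replace=False)
    return X[idx], Y[idx]                    # a random mini-batch of examples and labels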
Stochastic Gradient Descent (SGD)

loss = (1/N) Σ_i L(f(x_i, W), y_i) + R(W)

❖ mini-batch = 1
❖ It is common to call it SGD even when what is actually run is BGD (mini-batches larger than one example).

The loop is the same naive implementation shown on the previous slide, with batch_size = 1.
Recall

❖ Relationship among the different AI techniques

[Figure (flowchart from Goodfellow et al.): rule-based systems, classical machine learning, representation learning and deep learning, all nested within AI. Each column goes from input to output: rule-based systems use hand-designed programs; classical machine learning uses hand-designed features (e.g. logistic regression, knowledge bases); representation learning learns simple features (e.g. autoencoders); deep learning adds additional layers of increasingly abstract features (e.g. MLPs).]

I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT, 2016.
Feature extraction

Img. source: Stanford CS231n 2017