Unit 9 - Week 7

Assignment 7
Due on 2019-09-16, 23:59 IST. Each question carries 1 point.

1) What are the advantages of initializing an MLP with pre-trained autoencoder weights?
a. Faster convergence & avoids overfitting
b. Faster convergence & simpler hypothesis
c. Faster convergence, avoids overfitting & simpler hypothesis
d. None of these

2) In a linear autoencoder (without regularizer), if the number of hidden-layer perceptrons is equal to the number of input-layer perceptrons, then the encoder and decoder weights are induced to learn:
a. Optimal representations
b. Sparse representations
c. The identity matrix
d. None of the above

3) The role of the regularizer in the cost function is to:
a. Avoid overfitting
b. Induce sparsity
c. Simplify the hypothesis
d. All of the above

4) Identify the techniques which can be used for training autoencoders:
a. Training one layer at a time
b. Training the encoder first and then the decoder
c. End-to-end training

5) Which of these cannot be a procedure for adding noise during training of a denoising autoencoder?
a. Adding random Gaussian noise to the input and output
b. Adding random Gaussian noise to the input
c. Swapping input variables
d. Adding random white noise to the input

6) Regularization of a contractive autoencoder is imposed on:
a. Activations
b. Weights
c. Weights and activations
d. Does not use regularization

7) Which of the following is not a purpose of the cost function in training a denoising autoencoder?
a. Dimension reduction
b. Error minimisation
c. Weight regularization
d. Image denoising

8) What is the KL divergence between two equal distributions?

9) Find the value of h[n] * δ[n-1], δ[n] being the delta function.
a. h[n]
b. δ[n]
c. h[n-1]
d. δ[n-1]

10) Find the value of h[n] * δ[n+1], δ[n] being the delta function.
a. h[n]
b. δ[n]
c. h[n+1]
d. δ[n+1]
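The fact behind question 8 can be verified with a short sketch. The function name `kl_divergence` and the sample distribution are my own illustration, not from the course; the discrete definition D(p || q) = Σ p_i log(p_i / q_i) is standard.

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    Terms with p_i == 0 contribute 0 by convention (lim x log x = 0).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.2, 0.5, 0.3]
print(kl_divergence(p, p))  # equal distributions -> 0.0
```

When p == q, every ratio p_i / q_i is 1 and log 1 = 0, so the divergence is exactly 0.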
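Questions 9 and 10 rest on the sifting property of convolution: convolving h[n] with a shifted delta δ[n-k] shifts the signal to h[n-k]. A minimal sketch (the helper `convolve` is my own plain-Python implementation, assumed signals start at n = 0):

```python
def convolve(x, h):
    """Discrete linear convolution y[n] = sum_k x[k] * h[n - k]."""
    y = [0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

h = [1, 2, 3]            # some signal h[n] for n = 0, 1, 2
delta_shifted = [0, 1]   # delta[n - 1]: zero at n = 0, one at n = 1
print(convolve(h, delta_shifted))  # [0, 1, 2, 3], i.e. h delayed by one sample
```

Convolving with δ[n-1] delays h by one sample (h[n-1]); by the same argument, convolving with δ[n+1] advances it (h[n+1]).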
