Assessment submitted.

NPTEL (https://swayam.gov.in/explorer?ncCode=NPTEL) » Deep Learning Part 1 (IITM)

Unit 14 - Week 11

Assignment 11
Thank you for taking the Assignment 11.
Your last recorded submission was on 2019-10-16, 19:20 IST. Due date: 2019-10-16, 23:59 IST.
1) State True or False. (1 point)
In the context of the state equations of the LSTM, we have seen that h_t = o_t ⊙ σ(s_t), where h_t, o_t, s_t ∈ ℝ^n. Does the derivative of h_t w.r.t. s_t result in a square matrix?

True
False
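The claim in question 1 can be checked numerically: since h_t = o_t ⊙ σ(s_t) acts element-wise, the Jacobian ∂h_t/∂s_t = diag(o_t ⊙ σ(s_t) ⊙ (1 − σ(s_t))) is an n × n (square) diagonal matrix, with exactly n² − n zero entries (which is also the quantity asked about in question 5). A minimal sketch in Python with NumPy; the vectors o_t and s_t below are arbitrary illustrative values, not from the course:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n = 4
o_t = np.array([0.2, 0.9, 0.5, 0.7])   # output gate activations (illustrative)
s_t = np.array([0.1, -0.3, 0.8, 0.0])  # cell states (illustrative)

# h_t = o_t ⊙ σ(s_t) is element-wise, so the Jacobian ∂h_t/∂s_t is diagonal:
# entry (i, i) is o_t[i] * σ'(s_t[i]), and all off-diagonal entries are zero.
jacobian = np.diag(o_t * sigmoid(s_t) * (1.0 - sigmoid(s_t)))

assert jacobian.shape == (n, n)         # square matrix
assert np.count_nonzero(jacobian) == n  # only the n diagonal entries are nonzero
```

The off-diagonal entries vanish because component i of h_t depends only on component i of s_t, leaving n² − n zeros.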

2) What is the difference between back-propagation algorithm and back-propagation through time 1 point
Week 5
(BPTT) algorithm?

Week 6 Unlike back-propagation, in BPTT we add the gradients for corresponding weight for each time step
Unlike back-propagation, in BPTT we subtract the gradients for corresponding weight for each time
Week 7 step
No difference
Week 8
None of the above
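The summation in BPTT can be sketched directly: because the same weight is reused at every time step, its total gradient is the sum of the per-step contributions. A toy sketch for a scalar linear recurrence h_t = w·h_{t−1} + x_t (the recurrence, values, and loss L = h_T are illustrative, not the quiz's LSTM equations), verified against a numerical derivative:

```python
def forward(w, xs, h0=0.0):
    # Unroll the scalar recurrence h_t = w*h_{t-1} + x_t and record all states.
    hs = [h0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    return hs

w = 0.5
xs = [1.0, 2.0, -1.0]
hs = forward(w, xs)
T = len(xs)

# BPTT: the gradient of h_T w.r.t. the SHARED weight w is the SUM over time
# steps of (∂h_T/∂h_t) * h_{t-1}, where ∂h_T/∂h_t = w^(T-t) for this recurrence.
grad = 0.0
for t in range(T, 0, -1):
    grad += (w ** (T - t)) * hs[t - 1]

# Numerical check via central differences.
eps = 1e-6
numerical = (forward(w + eps, xs)[-1] - forward(w - eps, xs)[-1]) / (2 * eps)
assert abs(grad - numerical) < 1e-6
```

Ordinary back-propagation on a feed-forward net has no such sum over time, since each weight appears only once in the computation graph.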
3) What technique is used to deal with the problem of exploding gradients in Recurrent Neural Networks (RNNs)? (1 point)

Parameter Tying
Gradient clipping
Using modified architectures like LSTMs and GRUs
Using dropout
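Gradient clipping, the standard remedy for exploding gradients, simply rescales the gradient vector whenever its norm exceeds a threshold, leaving its direction unchanged. A minimal sketch (the threshold of 1.0 and the gradient values are illustrative):

```python
import numpy as np

def clip_by_norm(grad, max_norm):
    # If ||grad|| exceeds max_norm, rescale so the norm equals max_norm;
    # otherwise return the gradient unchanged.
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = np.array([3.0, 4.0])                  # norm 5.0: would "explode"
clipped = clip_by_norm(g, 1.0)
assert np.isclose(np.linalg.norm(clipped), 1.0)

small = np.array([0.3, 0.4])              # norm 0.5: below threshold
assert np.allclose(clip_by_norm(small, 1.0), small)  # left untouched
```

Clipping bounds the size of each update step but does not help with vanishing gradients, which is why the modified-architecture option (LSTMs/GRUs) addresses a different problem.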
4) State True or False. (1 point)
LSTMs provide more controllability and better results compared to RNNs, but also come with more complexity and operating cost.

True
False
5) What is the number of zeros in the derivative of h_t w.r.t. s_t, where h_t, s_t ∈ ℝ^n? (1 point)

n − 1
n
n² − n
0

6) State True or False. (1 point)
In LSTM, during forward propagation, the gates control the flow of information.

True
False
7) Which of the following is part of the sequence learning problem? (1 point)

Successive inputs are not independent
The length of the input is fixed
Requirement of future inputs to make current predictions
None of the above
8) State True or False. (1 point)
RNNs can be used with convolutional layers to extend the effective pixel neighborhood.

True
False
9) Which of the following statements is true with respect to GRUs? (1 point)

Units with short-term dependencies have a very active reset gate
Units with long-term dependencies have a very active update gate
Both of the above
None of the above
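The roles of the two GRU gates in question 9 can be seen in a minimal cell: the reset gate r_t controls how much of the previous state enters the candidate, while the update gate z_t interpolates between the previous state and that candidate. A sketch under one common gating convention (h_t = (1 − z_t) ⊙ h_{t−1} + z_t ⊙ h̃_t; conventions vary across texts), with random illustrative weights rather than a trained model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n, d = 3, 2  # hidden size, input size (illustrative)
Wz, Uz = rng.normal(size=(n, d)), rng.normal(size=(n, n))
Wr, Ur = rng.normal(size=(n, d)), rng.normal(size=(n, n))
Wh, Uh = rng.normal(size=(n, d)), rng.normal(size=(n, n))

def gru_step(h_prev, x):
    z = sigmoid(Wz @ x + Uz @ h_prev)              # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate state
    # z near 0 keeps the old state (long-term memory); z near 1 overwrites it.
    return (1 - z) * h_prev + z * h_tilde

h = gru_step(np.zeros(n), np.ones(d))
assert h.shape == (n,)
```

Under this convention an active (large) z_t overwrites the state, so units tracking short-term dependencies rely on an active reset gate, and units holding long-term dependencies rely on the update gate's interpolation.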
10) Why is an RNN (Recurrent Neural Network) used for machine translation, say translating English to French? (1 point)

It is applicable when the input/output is a sequence (e.g., a sequence of words)
It is strictly more powerful than a Convolutional Neural Network (CNN)
RNNs do not have the problem of vanishing gradients
None of the above

You may submit any number of times before the due date. The final submission will be considered for grading.

Submit Answers
