
Assignment 3

1. Discuss the differences between optimization algorithms such as Stochastic Gradient Descent (SGD), Adam, and Adagrad.
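
For reference, a minimal PyTorch sketch (the linear model, data, and hyperparameters are illustrative assumptions) showing how the three optimizers are constructed and stepped; each applies a different update rule to the same gradients:

```python
import torch
import torch.nn as nn

def make_optimizer(name, params):
    if name == "SGD":
        return torch.optim.SGD(params, lr=0.01)    # one global, fixed step size
    if name == "Adam":
        return torch.optim.Adam(params, lr=0.001)  # per-parameter steps from 1st/2nd moment estimates
    return torch.optim.Adagrad(params, lr=0.01)    # step size shrinks as squared gradients accumulate

x, y = torch.randn(64, 10), torch.randn(64, 1)     # toy regression data (assumption)
for name in ["SGD", "Adam", "Adagrad"]:
    torch.manual_seed(0)                           # identical initialization for a fair comparison
    model = nn.Linear(10, 1)
    opt = make_optimizer(name, model.parameters())
    for _ in range(100):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    print(f"{name}: final loss {loss.item():.4f}")
```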

2. Explain the concept of momentum in optimization algorithms and how it affects the convergence of neural network training.
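
A toy sketch of the classical momentum update (the objective and coefficients are assumptions for illustration):

```python
# Classical momentum on the toy objective f(w) = w**2, whose minimum is at w = 0.
def grad(w):
    return 2.0 * w                     # derivative of f(w) = w**2

w, v = 5.0, 0.0
lr, mu = 0.1, 0.9                      # step size and momentum coefficient (assumed values)
for _ in range(100):
    v = mu * v - lr * grad(w)          # velocity: a decaying accumulation of past gradients
    w += v                             # the smoothed direction damps oscillation
print(f"w after 100 steps: {w:.4f}")   # close to 0
```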

3. Discuss the role of regularization techniques such as weight decay and dropout
in optimization, and how they can prevent overfitting in deep learning models.
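
A minimal PyTorch sketch (a hypothetical small classifier) showing both techniques: weight_decay adds an L2 penalty inside the optimizer's update rule, and Dropout is active only in training mode:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeroes half the activations during training
    nn.Linear(64, 2),
)
# weight_decay applies an L2 penalty to the weights inside the update rule
opt = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

x = torch.randn(8, 20)
model.train()                             # dropout active: two forward passes differ
print(torch.equal(model(x), model(x)))    # False (almost surely)
model.eval()                              # dropout disabled at inference
print(torch.equal(model(x), model(x)))    # True
```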

4. Explain the concept of normalization in deep learning and its importance in improving model performance. Discuss at least two popular normalization techniques.
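
For illustration, a short PyTorch sketch contrasting two common techniques, batch normalization and layer normalization (shapes are assumptions):

```python
import torch
import torch.nn as nn

x = torch.randn(32, 64)   # batch of 32 samples with 64 features

bn = nn.BatchNorm1d(64)   # batch norm: normalizes each feature across the batch
ln = nn.LayerNorm(64)     # layer norm: normalizes across features within each sample

print(bn(x).mean(dim=0).abs().max())   # per-feature means ~0 after batch norm
print(ln(x).mean(dim=1).abs().max())   # per-sample means ~0 after layer norm
```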

5. Why is it essential to normalize the input data in deep learning models? Explain
the impact of unnormalized input data on model performance.
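
A minimal NumPy sketch of input standardization (the feature scales are fabricated for illustration); note that test data must be scaled with the training statistics:

```python
import numpy as np

rng = np.random.default_rng(0)
scales = np.array([1.0, 10.0, 100.0, 1_000.0, 10_000.0])  # wildly different feature scales
X_train = rng.random((1000, 5)) * scales
X_test = rng.random((200, 5)) * scales

mean, std = X_train.mean(axis=0), X_train.std(axis=0)     # statistics from training data only
X_train_norm = (X_train - mean) / std                     # zero mean, unit variance per feature
X_test_norm = (X_test - mean) / std                       # reuse training stats: no leakage
```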

Assignment 4

6. What is a recurrent neural network (RNN), and how does it differ from other
types of neural networks? Explain the fundamental architecture of an RNN and its
applications.
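
A minimal PyTorch sketch (sizes are illustrative) showing the defining property of an RNN, a hidden state carried across time steps:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
x = torch.randn(4, 10, 8)   # batch of 4 sequences, 10 time steps, 8 features

output, h_n = rnn(x)        # the hidden state is threaded through all 10 steps
print(output.shape)         # (4, 10, 16): hidden state at every time step
print(h_n.shape)            # (1, 4, 16): final hidden state only
```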

7. How does the Long Short-Term Memory (LSTM) architecture address the
problem of vanishing gradients in RNNs? Explain the key components of an
LSTM cell and their roles in capturing long-term dependencies.
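
A hand-written single LSTM step (with random placeholder weights) that makes the forget, input, and output gates and the cell state explicit; nn.LSTM packs the same math into one module:

```python
import torch

d_in, d_h = 8, 16                                  # assumed sizes for illustration
x_t = torch.randn(1, d_in)                         # current input
h_prev, c_prev = torch.zeros(1, d_h), torch.zeros(1, d_h)

W = torch.randn(d_in + d_h, 4 * d_h) * 0.1         # fused weights for the four gates
b = torch.zeros(4 * d_h)

z = torch.cat([x_t, h_prev], dim=1) @ W + b
f, i, o, g = z.chunk(4, dim=1)
f, i, o = torch.sigmoid(f), torch.sigmoid(i), torch.sigmoid(o)
g = torch.tanh(g)                                  # candidate memory content

c_t = f * c_prev + i * g    # forget gate scales old memory; input gate admits new content
h_t = o * torch.tanh(c_t)   # output gate exposes part of the cell state
```

The additive update of c_t is what preserves gradients over long spans: when the forget gate stays near 1, the cell state passes through nearly unchanged.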

8. Explain the concept of attention mechanisms in RNNs and their applications in machine translation and image captioning. How do attention mechanisms help in focusing on relevant parts of the input sequence while processing it?
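
A minimal scaled dot-product attention step (shapes and tensors are illustrative assumptions): the decoder query scores each encoder state, and the softmax weights form the context vector:

```python
import torch
import torch.nn.functional as F

encoder_states = torch.randn(1, 12, 32)   # 12 source positions, 32-dim states
decoder_state = torch.randn(1, 1, 32)     # current decoder hidden state as the query

scores = decoder_state @ encoder_states.transpose(1, 2)   # (1, 1, 12) similarity scores
weights = F.softmax(scores / 32 ** 0.5, dim=-1)           # distribution over source positions
context = weights @ encoder_states                        # (1, 1, 32) weighted sum of states
print(weights.sum())                                      # 1.0: a proper attention distribution
```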

9. Explain the concept of autoencoders in deep learning and their applications. Discuss the fundamental architecture of an autoencoder and the different types of autoencoders used.
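
A minimal fully connected autoencoder sketch in PyTorch (layer sizes are assumptions) showing the encoder-bottleneck-decoder structure and the reconstruction objective:

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, d_in=784, d_latent=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(d_in, 128), nn.ReLU(),
                                     nn.Linear(128, d_latent))   # compress to the bottleneck
        self.decoder = nn.Sequential(nn.Linear(d_latent, 128), nn.ReLU(),
                                     nn.Linear(128, d_in))       # reconstruct the input

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder()
x = torch.rand(16, 784)                       # e.g. flattened 28x28 images (assumption)
loss = nn.functional.mse_loss(model(x), x)    # train to reproduce the input itself
```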

10. What is a convolutional autoencoder (CAE), and how does it differ from a
regular autoencoder? Explain the application of CAEs in image compression and
denoising.
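
A minimal convolutional autoencoder sketch (architecture and sizes are assumptions) set up for denoising: the model receives a noisy input but is trained to reconstruct the clean image:

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 28x28 -> 14x14
            nn.Conv2d(16, 8, 3, stride=2, padding=1), nn.ReLU(),   # 14x14 -> 7x7
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(8, 16, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),                                              # 7x7 -> 14x14
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1),
            nn.Sigmoid(),                                           # 14x14 -> 28x28
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = ConvAutoencoder()
x = torch.rand(4, 1, 28, 28)                             # clean images (assumption)
noisy = (x + 0.3 * torch.randn_like(x)).clamp(0, 1)      # corrupt the input
loss = nn.functional.mse_loss(model(noisy), x)           # reconstruct the clean target
```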
