[MINI] Dropout
From Data Skeptic
Length:
16 minutes
Released:
Jan 13, 2017
Format:
Podcast episode
Description
Deep learning models are prone to overfitting, which is especially frustrating given how much time and computational resource is often required for them to converge. One technique for fighting overfitting is dropout: during each training iteration, a randomly selected subset of the network's neurons is set to zero. The core idea is that no particular input to a given layer is always available, so no single signal can be relied on too heavily.
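The procedure the episode describes can be sketched in a few lines of NumPy. This is a minimal illustration, not the episode's code; the function name, the `drop_prob` parameter, and the inverted-dropout scaling (dividing survivors by the keep probability so activations keep the same expected value at test time) are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, drop_prob=0.5, training=True):
    # During training, zero each activation independently with
    # probability drop_prob, so no single input can be relied on.
    if not training or drop_prob == 0.0:
        return activations
    mask = rng.random(activations.shape) >= drop_prob
    # Inverted dropout: scale survivors so the expected value of
    # each activation is unchanged, letting inference skip dropout.
    return activations * mask / (1.0 - drop_prob)

layer_output = np.ones((4, 3))
dropped = dropout(layer_output, drop_prob=0.5)
```

At inference time (`training=False`) the layer passes activations through unchanged, since the scaling already accounted for the dropped units.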
Titles in the series (100)
[MINI] Cross Validation: This mini-episode discusses the technique called cross validation - a process by which one randomly divides a dataset into numerous small partitions. Next, (typically) one is held out, and the rest are used to train some model. The hold out set can... by Data Skeptic