Neural Networks
Deep Learning
● Time Series
● Sequences
● Section Overview
○ RNN Theory
○ LSTMs and GRU Theory
○ Basic Implementation of RNN
○ Time Series with an RNN
○ Exercise and Solution
Let’s get started!
Recurrent Neural Networks Theory
● Examples of Sequences
○ Time Series Data (Sales)
○ Sentences
○ Audio
○ Car Trajectories
○ Music
[Diagram: a single neuron. The inputs are aggregated (Σ) and passed through an activation function to produce the output.]
● Recurrent Neuron
[Diagram: a recurrent neuron feeds its output back to itself. The output at t-1 is aggregated (Σ) together with the input at t; unrolling along the time axis shows outputs at t-1, t, and t+1.]
[Diagram: training data as input sequences X paired with target values y.]
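As a concrete sketch of how such X/y training pairs can be built from a raw time series, here is a sliding-window helper; the function name and the window length of 5 are my own illustrative choices, not from the lecture:

```python
# Hypothetical sketch: turn a univariate time series into (X, y)
# training pairs with a sliding window.
import numpy as np

def make_windows(series, window=5):
    """Each X row is `window` consecutive values; y is the next value."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i : i + window])
        y.append(series[i + window])
    return np.array(X), np.array(y)

series = np.arange(10, dtype=float)      # 0.0, 1.0, ..., 9.0
X, y = make_windows(series, window=5)
print(X.shape, y.shape)                  # (5, 5) (5,)
print(X[0], y[0])                        # [0. 1. 2. 3. 4.] 5.0
```

Each row of X is one short sequence the network will see, and y holds the value it should predict next.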
● “Unrolled” layer
[Diagram: a layer of recurrent neurons unrolled through time, with outputs at t=0, t+1, and t+2.]
[Diagrams: feeding a full sequence X0 X1 X2 X3 X4 into the network, feeding a single input X0 followed by zeros, and versions of the network with hidden layers in between.]
● “Leaky” ReLU
[Plots of activations applied to z = wx + b: an output clipped at 0, the “Leaky” ReLU allowing small negative outputs, and an output ranging from -1 to 1.]
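A minimal NumPy sketch of these activations; the alpha slope for the “Leaky” ReLU is an illustrative choice:

```python
# Activation functions applied elementwise to z = wx + b.
import numpy as np

def relu(z):
    # Negative inputs are clipped to exactly 0.
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # Unlike ReLU, negative z yields a small alpha-scaled output
    # instead of exactly 0, so the gradient never fully dies.
    return np.where(z > 0, z, alpha * z)

z = np.array([-2.0, 0.0, 3.0])
print(relu(z))          # negatives clipped to 0
print(leaky_relu(z))    # -2.0 becomes -0.02
print(np.tanh(z))       # squashed into (-1, 1), as used inside RNN cells
```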
● A typical RNN
[Diagram: a recurrent cell unrolled through time, producing outputs at t-1, t, and t+1.]
● RNN
[Diagram: at each step the cell takes the input Xt and the previous hidden state Ht-1 and produces the new hidden state Ht.]
The update rule is:
Ht = tanh(W[Ht-1, Xt] + b)
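The update Ht = tanh(W[Ht-1, Xt] + b) can be sketched directly in NumPy; the layer sizes and random weights below are illustrative, not the lecture's code:

```python
# One recurrent cell unrolled over a short input sequence:
# Ht = tanh(W [Ht-1, Xt] + b)
import numpy as np

rng = np.random.default_rng(42)
n_inputs, n_units, n_steps = 3, 4, 5

# A single weight matrix acting on the concatenation [Ht-1, Xt], plus a bias.
W = rng.normal(scale=0.1, size=(n_units, n_units + n_inputs))
b = np.zeros(n_units)

H = np.zeros(n_units)                    # initial hidden state
xs = rng.normal(size=(n_steps, n_inputs))

for x in xs:                             # unroll through time
    H = np.tanh(W @ np.concatenate([H, x]) + b)

print(H.shape)                           # (4,)
```

Note that the same W and b are reused at every time step; only the hidden state H changes.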
● LSTM
[Diagram: the LSTM cell takes the input at time t together with the short-term memory, and passes them through gates (input gate and update gate, built from sigmoid σ layers) to produce a new short-term memory.]
An LSTM Cell
[Diagram: the complete LSTM cell, with the cell state Ct-1 → Ct running along the top, the hidden state ht-1 → ht along the bottom, the input xt, and four internal layers: σ, σ, tanh, σ.]
Here we can see the entire LSTM cell. Let’s go through the process!
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
An LSTM Cell
The first step is the forget gate ft, a sigmoid layer over ht-1 and xt that decides what to discard from the cell state:
ft = σ(Wf · [ht-1, xt] + bf)
Next, the input gate it decides which values to update, while a tanh layer proposes candidate values C̃t:
it = σ(Wi · [ht-1, xt] + bi)
C̃t = tanh(WC · [ht-1, xt] + bC)
The old cell state Ct-1 is then updated into Ct by forgetting and adding:
Ct = ft * Ct-1 + it * C̃t
Finally, the output gate ot decides what to emit, and the new hidden state ht is a filtered, tanh-squashed version of the cell state:
ot = σ(Wo · [ht-1, xt] + bo)
ht = ot * tanh(Ct)
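A NumPy sketch of a single LSTM step, following the standard gate formulation (forget ft, input it, candidate C̃t, output ot) from the colah post linked above; the weight shapes and random initialization are illustrative:

```python
# One LSTM step: gates are sigmoids over [h_{t-1}, x_t].
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, C_prev, Wf, Wi, WC, Wo, bf, bi, bC, bo):
    hx = np.concatenate([h_prev, x])      # [h_{t-1}, x_t]
    f = sigmoid(Wf @ hx + bf)             # forget gate
    i = sigmoid(Wi @ hx + bi)             # input gate
    C_tilde = np.tanh(WC @ hx + bC)       # candidate cell state
    C = f * C_prev + i * C_tilde          # forget old, add new
    o = sigmoid(Wo @ hx + bo)             # output gate
    h = o * np.tanh(C)                    # new hidden state
    return h, C

rng = np.random.default_rng(0)
n_in, n_units = 3, 4
Ws = [rng.normal(scale=0.1, size=(n_units, n_units + n_in)) for _ in range(4)]
bs = [np.zeros(n_units) for _ in range(4)]

h, C = np.zeros(n_units), np.zeros(n_units)
h, C = lstm_step(rng.normal(size=n_in), h, C, *Ws, *bs)
print(h.shape, C.shape)                   # (4,) (4,)
```

The cell state C flows through mostly unchanged (only elementwise multiply and add), which is what lets gradients survive over long sequences.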
An LSTM Cell with “peepholes”
[Diagram: the same LSTM cell, but the gate layers also receive the cell state as input, the so-called “peephole” connections.]
Gated Recurrent Unit (GRU)
[Diagram: the GRU merges the cell state and hidden state into a single ht, using a reset gate rt and an update gate zt.]
rt = σ(Wr · [ht-1, xt])
zt = σ(Wz · [ht-1, xt])
h̃t = tanh(W · [rt * ht-1, xt])
ht = (1 - zt) * ht-1 + zt * h̃t
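A matching NumPy sketch of one GRU step, with the reset gate rt and update gate zt from the diagram above; shapes and weights are again illustrative (biases omitted, as in the simplified equations):

```python
# One GRU step: a single state h, gated by reset r and update z.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x, h_prev, Wr, Wz, Wh):
    hx = np.concatenate([h_prev, x])
    r = sigmoid(Wr @ hx)                         # reset gate
    z = sigmoid(Wz @ hx)                         # update gate
    # Candidate state sees a reset-scaled copy of the old state.
    h_tilde = np.tanh(Wh @ np.concatenate([r * h_prev, x]))
    return (1.0 - z) * h_prev + z * h_tilde      # blend old and new

rng = np.random.default_rng(1)
n_in, n_units = 3, 4
Wr, Wz, Wh = (rng.normal(scale=0.1, size=(n_units, n_units + n_in))
              for _ in range(3))
h = gru_step(rng.normal(size=n_in), np.zeros(n_units), Wr, Wz, Wh)
print(h.shape)                                   # (4,)
```

Compared to the LSTM, the GRU has one state instead of two and three weight matrices instead of four, so it is cheaper while often performing comparably.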