RNN 1: Recurrent Neural Networks (RNN)
Time-indexed data points
The time-indexed data points may be:
[1] Equally spaced samples from a continuous real-world process.
Examples include:
● The still images that comprise the frames of a video
● The discrete amplitudes, sampled at fixed intervals, that comprise an audio recording
● Daily values of a currency exchange rate
● Rainfall measurements on successive days (at a given location)
[2] Ordinal time steps, with no exact correspondence to durations.
● Natural language (word sequences)
● Nucleotide base pairs in a strand of DNA
Traditional Language Models
RECURRENT NEURAL NETWORK (RNN)
Recurrent: the same task is performed for every element of a sequence.
The output depends on:
● previous computations, as well as
● new inputs.
RNNs have a “memory” of the past!
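The recurrence described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the full model: the sizes, weights, and inputs are all hypothetical, and biases are kept but the output layer is omitted.

```python
import numpy as np

# Illustrative dimensions: input size 3, hidden size 4 (both hypothetical).
rng = np.random.default_rng(0)
U = rng.standard_normal((4, 3)) * 0.1  # input-to-hidden weights
W = rng.standard_normal((4, 4)) * 0.1  # hidden-to-hidden weights (the "memory")
b = np.zeros(4)

def rnn_step(x_t, h_prev):
    """The new state h_t depends on the new input x_t AND the previous state h_prev."""
    return np.tanh(U @ x_t + W @ h_prev + b)

# The same function (same U, W, b) is applied at every time step of the sequence.
h = np.zeros(4)
for x_t in rng.standard_normal((5, 3)):  # a toy sequence of 5 inputs
    h = rnn_step(x_t, h)
```

Note that the weights are shared across time steps; only the state `h` changes, which is what gives the network its memory of earlier inputs.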
[Figure: activation function applied at each step, unrolled over time]
Examples of Sequences
Application of RNN - LSTM
image sequence named entity
captioning classification translation
recognition
CHARACTER-LEVEL LANGUAGE MODEL
One-hot vector inputs for a word sequence
Could indices be used instead of one-hot vectors?
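The answer to the question above can be seen with a small sketch (the vocabulary and embedding matrix here are hypothetical): multiplying a one-hot vector by a weight matrix simply selects one row, so an integer index with a row lookup gives the same result without the wasted multiplications.

```python
import numpy as np

# Toy vocabulary and embedding matrix (both hypothetical).
chars = ['h', 'e', 'l', 'o']
char_to_ix = {c: i for i, c in enumerate(chars)}

def one_hot(index, size):
    v = np.zeros(size)
    v[index] = 1.0
    return v

E = np.arange(20.0).reshape(4, 5)  # 4 vocabulary items x 5-dim rows

i = char_to_ix['l']
# The one-hot product zeroes out every row of E except row i,
# so storing just the index i and reading E[i] is equivalent and cheaper.
via_one_hot = one_hot(i, len(chars)) @ E
via_index = E[i]
```

This is exactly why embedding layers in practice take integer indices rather than one-hot vectors.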
CHARACTER-LEVEL LANGUAGE MODEL
(Generative Model)
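A generative character-level model produces text by sampling a character from the softmax output distribution and feeding it back in as the next input. The sketch below uses untrained random weights (so the generated string is gibberish), but the sampling loop itself is the real mechanism; the vocabulary and all sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
chars = ['h', 'e', 'l', 'o']          # toy vocabulary (hypothetical)
n_vocab, n_hidden = len(chars), 8

# Untrained random weights, for illustrating the sampling loop only.
U = rng.standard_normal((n_hidden, n_vocab)) * 0.1   # input-to-hidden
W = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # hidden-to-hidden
V = rng.standard_normal((n_vocab, n_hidden)) * 0.1   # hidden-to-output

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def sample(seed_ix, n):
    """Generate n characters, feeding each sampled char back as the next input."""
    h = np.zeros(n_hidden)
    x = np.zeros(n_vocab); x[seed_ix] = 1.0
    out = []
    for _ in range(n):
        h = np.tanh(U @ x + W @ h)
        p = softmax(V @ h)                 # distribution over next character
        ix = rng.choice(n_vocab, p=p)      # sample, don't just take argmax
        x = np.zeros(n_vocab); x[ix] = 1.0
        out.append(chars[ix])
    return ''.join(out)

text = sample(0, 10)
```

Sampling (rather than always taking the most likely character) is what lets the model generate varied text from the same seed.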
Simple and Real RNNs (Number of Parameters)
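As a rough sketch, the parameter count of a simple RNN language model can be tallied from the three weight matrices on the slides (U: input-to-hidden, W: hidden-to-hidden, V: hidden-to-output) plus biases. The sizes used below are illustrative.

```python
def rnn_param_count(vocab_size, hidden_size):
    """Parameters of a simple RNN: U, W, V, and the two bias vectors."""
    U = hidden_size * vocab_size        # input-to-hidden
    W = hidden_size * hidden_size       # hidden-to-hidden
    V = vocab_size * hidden_size        # hidden-to-output
    biases = hidden_size + vocab_size   # hidden and output biases
    return U + W + V + biases

# Toy sizes: 4 characters, 8 hidden units -> 32 + 64 + 32 + 12 = 140
n_small = rnn_param_count(vocab_size=4, hidden_size=8)

# A more realistic word-level setup (hypothetical sizes) shows how quickly
# the vocabulary terms dominate the count.
n_large = rnn_param_count(vocab_size=10000, hidden_size=512)
```

Note that for a word-level model, the two vocabulary-sized matrices (U and V) dwarf the recurrent matrix W, which is one motivation for character-level models and smaller vocabularies.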
Generated text: trained on Wikipedia
Generated text: trained on C source code
BACKPROPAGATION THROUGH TIME (BPTT)
Calculate the gradients of the error with respect to the parameters U, V, and W.
Remember: the output layer uses the softmax function.
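A minimal BPTT sketch for the gradients with respect to U, V, and W is below. Everything here is a toy illustration (random weights, random one-hot inputs, biases omitted); the key points it shows are that with softmax plus cross-entropy the output-layer gradient is simply p − y, and that the hidden-state gradient flows backward through every earlier time step via `dh_next`.

```python
import numpy as np

# U: input-to-hidden, W: hidden-to-hidden, V: hidden-to-output, as on the slides.
rng = np.random.default_rng(2)
n_vocab, n_hidden, T = 4, 5, 3
U = rng.standard_normal((n_hidden, n_vocab)) * 0.1
W = rng.standard_normal((n_hidden, n_hidden)) * 0.1
V = rng.standard_normal((n_vocab, n_hidden)) * 0.1

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

xs = [np.eye(n_vocab)[rng.integers(n_vocab)] for _ in range(T)]  # one-hot inputs
ys = [int(rng.integers(n_vocab)) for _ in range(T)]              # target indices

# Forward pass, caching hidden states and output probabilities.
hs = {-1: np.zeros(n_hidden)}
ps = {}
for t in range(T):
    hs[t] = np.tanh(U @ xs[t] + W @ hs[t - 1])
    ps[t] = softmax(V @ hs[t])

# Backward pass (BPTT): dL/dz_t = p_t - y_t for softmax + cross-entropy,
# and each step also receives gradient from later steps through dh_next.
dU, dW, dV = np.zeros_like(U), np.zeros_like(W), np.zeros_like(V)
dh_next = np.zeros(n_hidden)
for t in reversed(range(T)):
    dz = ps[t].copy(); dz[ys[t]] -= 1.0   # p - y
    dV += np.outer(dz, hs[t])
    dh = V.T @ dz + dh_next               # from the output AND from step t+1
    draw = (1.0 - hs[t] ** 2) * dh        # back through tanh
    dU += np.outer(draw, xs[t])
    dW += np.outer(draw, hs[t - 1])
    dh_next = W.T @ draw
```

Because each `dh` accumulates a contribution from every later time step, the gradients dU, dW, dV sum over the whole sequence, which is exactly what "through time" in BPTT refers to.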