
Version: 2.02

Sequence 06
Sequential and/or temporal data.
Recurrent Neural Networks (RNN)
https://fidle.cnrs.fr

SAISON 2122
Le Réseau des informaticiens du SillonAlpin
SIMaP

This session will be recorded.
Find us on our YouTube channel

https://fidle.cnrs.fr/youtube

Before getting serious

Available resources on: https://fidle.cnrs.fr

Slides · GitLab · Datasets · Videos

For doctoral students, we can provide attendance certificates:

● Remember to use Zoom rather than YouTube
● Log in with your academic email address
● For any question: attestation@fidle.cnrs.fr

Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)


https://creativecommons.org/licenses/by-nc-nd/4.0/
Resources

You can also subscribe to:

Fidle information list: http://fidle.cnrs.fr/listeinfo
To receive all FIDLE information

devlog@services.cnrs.fr: https://listes.services.cnrs.fr/wws/info/devlog
List of ESR* developers

calcul@listes.math.cnrs.fr: https://listes.math.cnrs.fr/wws/info/calcul
List of the ESR* « calcul » group

To contact us: Contact@fidle.cnrs.fr


https://fidle.cnrs.fr
(*) ESR: Enseignement Supérieur et Recherche, French universities and public academic research organizations
Roadmap

Full description on https://fidle.cnrs.fr/programme

6.1 Sequence data
➔ Recurrent Neural Network
➔ LSTM and GRU

6.2 Example 1: Ladybug1
➔ Prediction of a virtual trajectory

6.3 Example 2: SYNOP1/3
➔ Weather prediction at 3h and 12h
What if the world was just one big sequence?
From classic to recurrent neuron

Classical neuron: the output depends only on the current input.
Simple recurrent neuron: the previous output is fed back as an additional input.

Where:
wy : is a scalar
Wx : is a vector
b : is a scalar
y : is a scalar
Recurrent Layer / Cell

Where:
Wy : is a tensor
Wx : is a tensor
b : is a vector
Y : is a vector

Wx shape is : (nb units, x size)
Wy shape is : (nb units, nb units)
y shape is : (nb units,)
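The shapes above can be checked with a minimal NumPy sketch of the layer's forward pass (an illustration only, not the course's implementation; tanh is assumed as activation):

```python
import numpy as np

def rnn_layer_forward(x_seq, Wx, Wy, b):
    """Forward pass of a simple recurrent layer over one sequence.

    x_seq: (seq_len, x_size) input sequence
    Wx:    (nb_units, x_size) input weights
    Wy:    (nb_units, nb_units) recurrent weights
    b:     (nb_units,) bias
    Returns the hidden state at every time step: (seq_len, nb_units).
    """
    y = np.zeros(b.shape[0])        # initial state
    outputs = []
    for x_t in x_seq:
        y = np.tanh(Wx @ x_t + Wy @ y + b)
        outputs.append(y)
    return np.stack(outputs)

rng = np.random.default_rng(0)
seq_len, x_size, nb_units = 20, 8, 16
out = rnn_layer_forward(
    rng.normal(size=(seq_len, x_size)),
    rng.normal(size=(nb_units, x_size)),
    rng.normal(size=(nb_units, nb_units)),
    np.zeros(nb_units),
)
print(out.shape)  # (20, 16)
```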
Simple RNN limits...

But...
● Slow convergence
● Short memory
● Vanishing / exploding gradients

In short: it doesn't work!
Long Short-Term Memory (LSTM)

Long-term state and short-term state.

Long short-term memory (LSTM)¹
Gated recurrent unit (GRU)²

¹ Sepp Hochreiter, Jürgen Schmidhuber (1997) [LSTM]
² Kyunghyun Cho et al. (2014) [GRU]
Long Short-Term Memory (LSTM)

Series to vector:

inputs = tf.random.normal([32, 20, 8])

lstm = tf.keras.layers.LSTM(16)
output = lstm(inputs)

Inputs shape is : (32, 20, 8)
Output shape is : (32, 16)

Series to series:

lstm = tf.keras.layers.LSTM(18, return_sequences=True, return_state=True)
output, memory_state, carry_state = lstm(inputs)

Output shape : (32, 20, 18)
Memory state : (32, 18)
Carry state : (32, 18)

More about:
https://keras.io/api/layers/recurrent_layers/lstm/
https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM
Recurrent Neural Network (RNN)

Series to series. Example: time series prediction
Series to vector. Example: sentiment analysis
Vector to series. Example: image annotation
Encoder-decoder. Example: language translation
How to predict a sequence?

Input sequence : X = [ X1, X2, …, Xn ]
Known sequence : [ X1, X2, …, Xn, Xn+1 ]
Expected output : Y = [ Xn+1 ]

Series to vector: the objective will be to train our RNN to predict the (n+1)-th vector of our input sequence.
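This windowing can be sketched as follows (`make_windows` is a hypothetical helper, for illustration only):

```python
import numpy as np

def make_windows(series, sequence_len):
    """Slice a series into (X, y) training pairs.

    Each X is a window of `sequence_len` consecutive vectors;
    each y is the vector that immediately follows it.
    """
    X, y = [], []
    for i in range(len(series) - sequence_len):
        X.append(series[i : i + sequence_len])
        y.append(series[i + sequence_len])
    return np.array(X), np.array(y)

series = np.arange(100, dtype=float).reshape(50, 2)  # toy 2D series
X, y = make_windows(series, sequence_len=20)
print(X.shape, y.shape)  # (30, 20, 2) (30, 2)
```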
Preparation of sequence data

● The distribution of data between train and test series must be comparable.
● Can the past explain the future?
● Note that a data generator can be very useful!
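A data generator builds batches of windows on the fly instead of materializing every (X, y) pair in memory. A minimal sketch (a hypothetical generator, not the course's implementation):

```python
import numpy as np

def batch_generator(series, sequence_len, batch_size, rng=None):
    """Yield (X, y) batches of windows drawn at random from a series.

    Windows are sliced lazily, so the full set of training pairs
    never has to be stored in memory at once.
    """
    rng = rng or np.random.default_rng()
    n_windows = len(series) - sequence_len
    while True:
        idx = rng.integers(0, n_windows, size=batch_size)
        X = np.stack([series[i : i + sequence_len] for i in idx])
        y = np.stack([series[i + sequence_len] for i in idx])
        yield X, y

series = np.sin(np.linspace(0, 30, 500))[:, None]  # toy 1D series
gen = batch_generator(series, sequence_len=16, batch_size=32)
X, y = next(gen)
print(X.shape, y.shape)  # (32, 16, 1) (32, 1)
```

Keras can consume such a generator directly via `model.fit`, or the equivalent can be built with `tf.keras.utils.timeseries_dataset_from_array`.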
Time series with RNN
Notebook: [LADYB1]

Objective:
Prediction of a 2D ladybug trajectory with an RNN
Dataset:
Generated trajectory
LADYB1 (< 60 s)

Objective: prediction of a 2D ladybug* trajectory with an RNN.

Parameters:
sequence_len = 20
predict_len = 5
scale = 1
train_prop = .8
batch_size = 32
epochs = 5

Step 1: Import and init
Step 2: Generation of a dataset (train / test)
Step 3: Create a model
Step 4: Compile and train
Step 5: Try a trajectory prediction

*Disclaimer: No ladybugs were harmed during the making of this notebook.
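The generated dataset step can be sketched like this (a hypothetical trajectory built from sinusoids; the notebook's actual generator differs):

```python
import numpy as np

def make_ladybug(n_points=1000):
    """A smooth, pseudo-random 2D trajectory standing in for the
    notebook's virtual ladybug (hypothetical equations)."""
    t = np.linspace(0, 20 * np.pi, n_points)
    x = np.sin(t) + 0.5 * np.sin(2.1 * t)
    y = np.cos(0.9 * t) + 0.5 * np.cos(3.1 * t)
    return np.stack([x, y], axis=1)          # shape (n_points, 2)

traj = make_ladybug()
train_prop = 0.8
split = int(len(traj) * train_prop)
train, test = traj[:split], traj[split:]     # time-ordered split
print(train.shape, test.shape)  # (800, 2) (200, 2)
```

Note the split is time-ordered rather than shuffled, so that the test series follows the training series in time.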
Time series with RNN
Notebook: [SYNOP1-3]

Objective:
Guess what the weather will be like!
Dataset:
SYNOP meteorological data*.
Data from LYS airport for the period 2010-2020

* Observational data from international surface observation messages (SYNOP) circulating on the World Meteorological Organization's (WMO) Global Telecommunication System (GTS).
https://public.opendatasoft.com/explore/dataset/donnees-synop-essentielles-omm/information/?sort=date
Time series with RNN
Notebook: [SYNOP1-3]

Episode 1: Data analysis and creation of a usable dataset
Episode 2: Training session and first predictions
Episode 3: Attempt to predict in the longer term
SYNOP1 | SYNOP2 | SYNOP3

SYNOP1 (< 3 s!)

Objective: data analysis and preparation of a usable dataset.

Step 1: Import and init
Step 2: Parameters (no need to change)
Step 3: Retrieve the SYNOP dataset
Step 4: Prepare and enhance the dataset (Pandas)
Step 5: About our enhanced dataset
Step 6: Save it (./data)
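The Pandas preparation step typically involves indexing by date, filling gaps, and converting units. A minimal sketch with hypothetical column names (the notebook's actual SYNOP columns differ):

```python
import numpy as np
import pandas as pd

# Hypothetical raw extract standing in for the SYNOP data
df = pd.DataFrame({
    "date": pd.date_range("2010-01-01", periods=8, freq="3h"),
    "temperature": [275.1, np.nan, 276.3, 277.0,
                    np.nan, 278.2, 279.0, 278.5],
})

df = df.set_index("date").sort_index()
df["temperature"] = df["temperature"].interpolate()  # fill missing readings
df["temperature"] -= 273.15                          # Kelvin -> Celsius
print(df["temperature"].isna().sum())  # 0
```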
SYNOP1 | SYNOP2 | SYNOP3

SYNOP2 (± 1'50s)

Objective: first weather prediction using LSTM. Attempt at 3h.

Parameters:
scale = 1
train_prop = .8
sequence_len = 16
batch_size = 32
epochs = 10

Step 1: Import and init
Step 2: Read and prepare dataset (./data, train / test)
Step 3: Create a model
Step 4: Compile and train
Step 5: Predict
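A series-to-vector model for the 3h prediction could look like this minimal sketch (the layer sizes and `n_features` are assumptions; the notebook's actual architecture may differ):

```python
import tensorflow as tf

sequence_len, n_features = 16, 7   # n_features is hypothetical

# One LSTM reads the 16-step window, one Dense layer
# emits the next weather vector
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(sequence_len, n_features)),
    tf.keras.layers.LSTM(100),
    tf.keras.layers.Dense(n_features),
])
model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
print(model.output_shape)  # (None, 7)
```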
SYNOP1 | SYNOP2 | SYNOP3

SYNOP3 (< 3s!)

Objective: first weather prediction using LSTM. Attempt at 12h.

Parameters:
iterations = 4
scale = 1
train_prop = .8
sequence_len = 16
batch_size = 32
epochs = 10

Step 1: Import and init
Step 2: Read and prepare dataset (./data, train / test)
Step 5: Predict
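Predicting further ahead (4 iterations of 3h to reach 12h) works by feeding each prediction back as input. A sketch, with `predict` standing in for any trained series-to-vector model:

```python
import numpy as np

def predict_iterative(predict, window, iterations):
    """Predict `iterations` steps ahead: each new prediction is
    appended to the window and the oldest step is dropped."""
    window = np.array(window, dtype=float)
    preds = []
    for _ in range(iterations):
        y = predict(window)                  # next-step prediction
        preds.append(y)
        window = np.vstack([window[1:], y])  # slide the window
    return np.array(preds)

# Toy stand-in model: predicts the mean of the window
fake_model = lambda w: w.mean(axis=0)

window = np.ones((16, 7))          # sequence_len=16, 7 features
preds = predict_iterative(fake_model, window, iterations=4)
print(preds.shape)  # (4, 7)
```

Note that errors accumulate at each iteration, which is why the 12h attempt is harder than the 3h one.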
Little things and concepts to keep in mind

– Can the past explain the future?
– Understand the data, again and again!
– Beware of overfitting
– Remember that Pandas is good for you!
– JSON files are cool, too
– Preparing your data can cost 70% of the work
– Think about data generators
– Matplotlib is also very good for you!
– There is a lot of sequential data
– Not everything can use GPUs...
Next, on Fidle:

Thursday, January 27
Episode 7:
"Attention Is All You Need",
when Transformers change the game
● A new evolution of models
● Fine-tuning
● Use cases: BERT, GPT
Duration: 2h00
Thank you!

To be continued...
Contact@fidle.cnrs.fr
https://fidle.cnrs.fr
https://fidle.cnrs.fr/youtube
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
https://creativecommons.org/licenses/by-nc-nd/4.0/
