Version: 2.02
Sequence 06
Sequential and/or temporal data.
Recurrent Neural Networks (RNN)
https://fidle.cnrs.fr
Season 21-22
Le Réseau des informaticiens du SillonAlpin
SIMaP
This session will be recorded.
Find us on our YouTube channel:
https://fidle.cnrs.fr/youtube
Before getting serious

devlog@services.cnrs.fr (https://listes.services.cnrs.fr/wws/info/devlog)
Mailing list of ESR* developers
calcul@listes.math.cnrs.fr (https://listes.math.cnrs.fr/wws/info/calcul)
Mailing list of the ESR* "calcul" group

Full description on https://fidle.cnrs.fr/programme
Roadmap

Sequence data
RNN
6.3 Example 2: SYNOP1/3
➔ Weather prediction at 3h and 12h
What if the world was just one big sequence?
From classical to recurrent neuron

Classical neuron:
y = f(Wx · x + b)

Recurrent neuron: the neuron's previous output y(t-1) is fed back as an extra input at the next time step.
Simple recurrent neuron

y(t) = f(Wx · x(t) + wy · y(t-1) + b)

where:
wy is a scalar
Wx is a vector
b is a scalar
y is a scalar
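The simple recurrent neuron above can be sketched in a few lines of NumPy (all sizes and weight values here are illustrative, and the activation f is taken to be tanh):

```python
import numpy as np

# Simple recurrent neuron: y(t) = tanh(Wx . x(t) + wy * y(t-1) + b)
rng = np.random.default_rng(0)

x_size = 8                      # size of each input vector x(t)
Wx = rng.normal(size=x_size)    # input weights   : a vector
wy = 0.5                        # recurrent weight: a scalar
b  = 0.1                        # bias            : a scalar

def step(x_t, y_prev):
    """One time step of the recurrent neuron."""
    return np.tanh(Wx @ x_t + wy * y_prev + b)

# Unroll over a sequence of 20 time steps
xs = rng.normal(size=(20, x_size))
y = 0.0                         # initial state
for x_t in xs:
    y = step(x_t, y)            # the output is fed back at the next step

print(float(y))                 # the final output is a single scalar
```

Note how the same weights (Wx, wy, b) are reused at every time step: unrolling the loop is what turns one neuron into a network over time.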
Recurrent Layer / Cell

A recurrent layer (cell) generalizes the recurrent neuron to a vector of units:

Y(t) = f(Wx · X(t) + Wy · Y(t-1) + b)

where:
Wy is a tensor
Wx is a tensor
b is a vector
Y is a vector
Recurrent Layer / Cell: shapes

Wx shape is: (nb units, x size)
Wy shape is: (nb units, nb units)
y shape is: (nb units,)

But… a simple recurrent cell forgets quickly over long sequences. LSTM cells address this by maintaining two states: a long-term state and a short-term state.
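The shapes above can be checked with a small NumPy sketch of one layer step (nb units = 16 and x size = 8 are arbitrary choices for illustration):

```python
import numpy as np

# Recurrent layer step: Y(t) = tanh(Wx . X(t) + Wy . Y(t-1) + b)
nb_units, x_size = 16, 8
rng = np.random.default_rng(0)

Wx = rng.normal(size=(nb_units, x_size))     # (nb units, x size)
Wy = rng.normal(size=(nb_units, nb_units))   # (nb units, nb units)
b  = np.zeros(nb_units)                      # (nb units,)

def cell(x_t, y_prev):
    """One time step of the whole layer: returns a vector of nb_units outputs."""
    return np.tanh(Wx @ x_t + Wy @ y_prev + b)

y = np.zeros(nb_units)                       # initial state
for x_t in rng.normal(size=(20, x_size)):    # a sequence of 20 steps
    y = cell(x_t, y)

print(y.shape)   # (16,) : one output vector, fed back at each step
```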
LSTM in Keras, series to vector:

lstm = tf.keras.layers.LSTM(16)
output = lstm(inputs)

With inputs of shape (32, 20, 8), i.e. (batch size, time steps, features), the output shape is (32, 16).

More about:
https://keras.io/api/layers/recurrent_layers/lstm/
https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM
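Assuming TensorFlow is available, these shapes can be checked directly; the second layer also shows the series-to-series variant via the standard `return_sequences=True` option of the Keras LSTM API:

```python
import numpy as np
import tensorflow as tf

# Batch of 32 sequences, 20 time steps, 8 features per step.
inputs = np.random.random((32, 20, 8)).astype("float32")

# Series to vector: only the output of the last time step is returned.
lstm = tf.keras.layers.LSTM(16)
out_vec = lstm(inputs)
print(out_vec.shape)            # (32, 16)

# Series to series: one output per time step.
lstm_seq = tf.keras.layers.LSTM(16, return_sequences=True)
out_seq = lstm_seq(inputs)
print(out_seq.shape)            # (32, 20, 16)
```

The series-to-series form is what you need when stacking recurrent layers, since the next layer expects a sequence as input.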
Recurrent Neural Network (RNN)
How to predict a sequence?

Series to vector: the objective is to train our RNN to predict the (n+1)-th vector of our input sequence.
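A common way to build such (sequence, next vector) training pairs is a sliding window over the series. This is a generic sketch, not the notebooks' exact code; the series and helper name are illustrative:

```python
import numpy as np

sequence_len = 20                        # length of each input sub-sequence
series = np.random.random((1000, 2))     # e.g. a 2D trajectory over time

def make_dataset(series, sequence_len):
    """Slice the series into n consecutive vectors (input) and the n+1-th (target)."""
    X, y = [], []
    for i in range(len(series) - sequence_len):
        X.append(series[i : i + sequence_len])   # the input sub-sequence
        y.append(series[i + sequence_len])       # the vector to predict
    return np.array(X), np.array(y)

X, y = make_dataset(series, sequence_len)
print(X.shape, y.shape)   # (980, 20, 2) (980, 2)
```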
Preparation of sequence data
Time series with RNN

Notebook: [LADYB1]
Objective: prediction of a 2D ladybug trajectory with an RNN
Dataset: generated trajectory
LADYB1 (runtime: < 60 s)

Objectives: prediction of a 2D ladybug* trajectory with an RNN.

Workflow: set parameters, prepare train/test data, create a model, train, then try a trajectory prediction.

Parameters:
sequence_len = 20
predict_len = 5
scale = 1
train_prop = .8
batch_size = 32
epochs = 5

*Disclaimer: no ladybugs were harmed during the making of this notebook.
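A minimal Keras sketch of such a model, using the notebook's sequence_len; the GRU(200) architecture below is an illustrative guess, not necessarily the notebook's exact model:

```python
import numpy as np
import tensorflow as tf

sequence_len = 20   # length of the input sub-sequences (notebook parameter)
features     = 2    # (x, y) position of the ladybug

# Series-to-vector model: read a sub-sequence, predict the next position.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(sequence_len, features)),
    tf.keras.layers.GRU(200),            # series to vector
    tf.keras.layers.Dense(features),     # predict the next (x, y)
])
model.compile(optimizer="rmsprop", loss="mse")

# One dummy batch, just to check the shapes:
x = np.random.random((32, sequence_len, features)).astype("float32")
print(model(x).shape)   # (32, 2)
```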
Roadmap

Sequence data
RNN
6.3 Example 2: SYNOP1/3
➔ Weather prediction at 3h and 12h
Time series with RNN

Notebooks: [SYNOP1-3]
Objective: guess what the weather will be like!
Dataset: SYNOP meteorological data*, from LYS airport for the period 2010-2020
SYNOP1 | SYNOP2 | SYNOP3

SYNOP1 (runtime: < 3 s!)

Objectives: data analysis, and preparation of a usable enhanced dataset.

Workflow: import and init, set parameters (no need to change), retrieve the dataset (Pandas), explore the enhanced dataset, data analysis, prepare the dataset and save it to ./data.
SYNOP2 (runtime: ± 1'50 s)

Objective: first weather prediction, attempt at 3h.

Workflow: import and init, create a model, train, predict.

Parameters:
scale = 1
train_prop = .8
sequence_len = 16
batch_size = 32
epochs = 10
SYNOP3 (runtime: < 3 s!)

Objective: weather prediction, attempt at 12h.

Workflow: import and init, read and prepare the dataset, predict.

Parameters:
Iterations = 4
scale = 1
train_prop = .8
sequence_len = 16
batch_size = 32
epochs = 10
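The 12h attempt works by iterating the 3h model, feeding each prediction back in as input (Iterations = 4, i.e. 4 × 3h = 12h). A generic sketch of this loop; `model_predict` is a hypothetical stand-in for the trained model, here a naive persistence forecast:

```python
import numpy as np

sequence_len = 16         # notebook parameter
features     = 8          # illustrative number of weather features

def model_predict(seq):
    """Stand-in for the trained 3h model: predicts 'same weather as last step'."""
    return seq[-1]

def predict_ahead(seq, iterations=4):
    """Iterate the 3h model: each prediction becomes part of the next input."""
    seq = seq.copy()
    for _ in range(iterations):
        nxt = model_predict(seq[-sequence_len:])  # predict t + 3h
        seq = np.vstack([seq, nxt])               # feed it back as input
    return seq[-1]                                # the t + 12h prediction

history = np.random.random((sequence_len, features))
print(predict_ahead(history).shape)   # (8,)
```

In practice the real trained model replaces `model_predict`; the price of iterating is that prediction errors accumulate at each step, which is why the 12h forecast is harder than the 3h one.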
Little things and concepts to keep in mind
Next on Fidle:

Thursday, January 27, Episode 7:
"Attention Is All You Need", when Transformers change the game

"Attention Is All You Need"
A new evolution of models
Fine-tuning
Transformers
Use cases: BERT, GPT

Duration: 2h00
Thank you!
To be continued...

Contact@fidle.cnrs.fr
FIDLE: https://fidle.cnrs.fr
YouTube: https://fidle.cnrs.fr/youtube

License: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
https://creativecommons.org/licenses/by-nc-nd/4.0/