
Photonic computing

Introduction & Theory

Piotr Antonik

January 5, 2022
Summary & Logistics

Plenary lectures with a colloquium format

Syllabus (for today)


• Why photonic computing?
• General overview of artificial neural networks (ANN)
• Introduction to reservoir computing
• Applications

Exam / evaluation
• Review of a recent scientific paper on a related topic, plus a personal written report (5-10 pages)

Photonics for computing Neural networks Reservoir computing Applications


Course support

1. Photonic Reservoir Computing: Optical Recurrent Neural Networks. D. Brunner, M. C. Soriano, G. Van der Sande. De Gruyter (2019)
2. A Practical Guide to Applying Echo State Networks. Mantas Lukoševičius (2012)
3. Recent advances in physical reservoir computing: A review. G. Tanaka et al. (2019)



Table of contents

1 Photonics for computing

2 Neural networks

3 Reservoir computing

4 Applications



Antikythera mechanism



Zuse Z3

Image source: Wikipedia



Electronics vs Photonics

                            Electrons    Photons
Spin                        Fermion      Boson
Charge                      Yes          No
Interaction                 Yes          No
Nonlinear transformations   Strong       Weak
Information transport       Weak         Strong



Switching energies of different nonlinear optical effects



Optical bistability



Array of bistable optical elements



Table of contents

1 Photonics for computing

2 Neural networks

3 Reservoir computing

4 Applications



Analogue neuron

Average spiking rate:

\[ a = f\left( \sum_i w_i x_i \right) \]



Perceptrons

Perceptron decision rule:

\[ y(\mathbf{x}) = \begin{cases} 1 & \text{if } \mathbf{w} \cdot \mathbf{x} + w_0 > 0, \\ -1 & \text{otherwise} \end{cases} \qquad \mathbf{w} \cdot \mathbf{x} = \sum_{i=1}^{n} w_i x_i \]



Perceptron learning algorithm

For the set \(\mathcal{E}\) of misclassified samples \((\mathbf{x}_j, y_j)\):

\[ E(\mathbf{w}) = - \sum_{j \in \mathcal{E}} y_j \left( \mathbf{w} \cdot \mathbf{x}_j + w_0 \right) \]

\[ \frac{\partial E}{\partial w_0} = - \sum_{j \in \mathcal{E}} y_j, \qquad \frac{\partial E}{\partial w_{i \neq 0}} = - \sum_{j \in \mathcal{E}} y_j x_{j,i} \]

Gradient-descent update for a misclassified sample \((\mathbf{x}_j, y_j)\):

\[ \begin{pmatrix} w_0 \\ w_1 \\ \vdots \\ w_n \end{pmatrix} \leftarrow \begin{pmatrix} w_0 \\ w_1 \\ \vdots \\ w_n \end{pmatrix} + \lambda \begin{pmatrix} y_j \\ y_j x_{j,1} \\ \vdots \\ y_j x_{j,n} \end{pmatrix} \]
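The update rule above amounts to nudging the weights toward each misclassified sample until none remain. A minimal NumPy sketch (function name, variable names, and the toy data are mine, not from the course):

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Train a perceptron with the update w <- w + lr * y_j * x_j
    applied to misclassified samples. X: (m, n) inputs, y: labels in {-1, +1}."""
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])  # prepend bias input x_0 = 1
    w = np.zeros(n + 1)                   # [w_0, w_1, ..., w_n]
    for _ in range(epochs):
        errors = 0
        for xj, yj in zip(Xb, y):
            if yj * (w @ xj) <= 0:        # misclassified: sign disagrees
                w += lr * yj * xj         # the slide's update rule
                errors += 1
        if errors == 0:                   # converged on separable data
            break
    return w

# Linearly separable toy data: label = sign of the single feature
X = np.array([[1.0], [2.0], [-1.0], [-2.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
```

On linearly separable data the loop terminates with zero errors; the learned `w` then classifies every training sample correctly.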



Feedforward neural networks

\[ g(\mathbf{x}, \mathbf{w}) = \sum_{i=1}^{N_h} w_{N_h+1,\,i} \tanh\left( \sum_{j=1}^{n} w_{ij} x_j + w_{i0} \right) + w_{N_h+1,\,0} = \sum_{i=1}^{N_h} w_{N_h+1,\,i} \tanh\left( \sum_{j=0}^{n} w_{ij} x_j \right) + w_{N_h+1,\,0}, \]

where the second form absorbs the biases \(w_{i0}\) via \(x_0 = 1\).
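The two-layer formula transcribes almost directly into NumPy. A sketch, assuming hidden weights are stored row-wise with the bias column absorbed as x_0 = 1 (shapes and names are my choices):

```python
import numpy as np

def ffnn(x, W_hidden, w_out):
    """Single-hidden-layer network from the slide:
    g(x) = sum_i w_{Nh+1,i} * tanh(sum_{j=0..n} w_ij x_j) + w_{Nh+1,0},
    with x_0 = 1 absorbing the hidden biases w_i0."""
    xb = np.concatenate([[1.0], x])        # x_0 = 1 (bias input)
    h = np.tanh(W_hidden @ xb)             # (Nh,) hidden activations
    return w_out[0] + w_out[1:] @ h        # output bias + weighted sum

rng = np.random.default_rng(0)
n, Nh = 3, 5
W_hidden = rng.standard_normal((Nh, n + 1))  # w_ij, j = 0..n (column 0 = bias)
w_out = rng.standard_normal(Nh + 1)          # [w_{Nh+1,0}, w_{Nh+1,1..Nh}]
g = ffnn(rng.standard_normal(n), W_hidden, w_out)
```

Because tanh is bounded in [-1, 1], the output can never stray further from the output bias than the sum of the absolute output weights.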



Recurrent neural networks

\[ \mathbf{x}(n) = A\,\mathbf{x}(n-1) + B\,\mathbf{u}(n-1), \]
\[ \mathbf{g}(n) = C\,\mathbf{x}(n-1) + D\,\mathbf{u}(n-1). \]



Canonical form



Optical perceptron learning
Early photonic implementation



Table of contents

1 Photonics for computing

2 Neural networks

3 Reservoir computing

4 Applications



A more relaxed model of computation

• Echo State Networks


• Liquid State Machines



Reservoir computing with a reservoir



Echo-state networks

\[ \tilde{\mathbf{x}}(n) = \tanh\left( W_{\mathrm{in}} [1; \mathbf{u}(n)] + W \mathbf{x}(n-1) \right) \]

\[ \mathbf{x}(n) = (1 - \alpha)\,\mathbf{x}(n-1) + \alpha\,\tilde{\mathbf{x}}(n) \]

\[ \mathbf{y}(n) = W_{\mathrm{out}} [1; \mathbf{u}(n); \mathbf{x}(n)] \]
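These update equations map line-for-line onto NumPy. A minimal sketch; the matrix shapes, random initialisation, and spectral-radius rescaling are my assumptions, not prescribed by the slides:

```python
import numpy as np

def esn_step(u, x_prev, W_in, W, alpha):
    """One leaky-integrator ESN update:
    x~(n) = tanh(W_in [1; u(n)] + W x(n-1))
    x(n)  = (1 - alpha) x(n-1) + alpha x~(n)"""
    ub = np.concatenate([[1.0], np.atleast_1d(u)])  # [1; u(n)]
    x_tilde = np.tanh(W_in @ ub + W @ x_prev)
    return (1.0 - alpha) * x_prev + alpha * x_tilde

rng = np.random.default_rng(42)
N, K = 50, 1                                 # reservoir size, input channels
W_in = rng.uniform(-0.5, 0.5, (N, K + 1))
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # set spectral radius to 0.9

x = np.zeros(N)
for n in range(100):                         # drive with a sine input
    x = esn_step(np.sin(0.1 * n), x, W_in, W, alpha=0.3)
```

Since tanh keeps x~ in (-1, 1) and the leaky update is a convex combination, every state component stays bounded in [-1, 1].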



How to train and optimise a reservoir computer

Linear regression layer:

\[ \mathrm{MSE}(y, y_{\mathrm{target}}) = \frac{1}{T} \sum_{n=1}^{T} \left( y(n) - y_{\mathrm{target}}(n) \right)^2 \]

Minimise the MSE with:

\[ W_{\mathrm{out}} = \left( X^T X \right)^{-1} X^T y_{\mathrm{target}} \]

Ridge regression (regularisation):

\[ \mathrm{MSE}_{\mathrm{ridge}} = \mathrm{MSE} + \lambda\, W_{\mathrm{out}}^T W_{\mathrm{out}} \]
Hyper-parameters of the model:
• input scaling β
• spectral radius ρ
• leak rate α
• regularisation term λ
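Training the readout thus reduces to a single regularised least-squares solve. A sketch of ridge regression in the common closed form (X^T X + λI)^{-1} X^T Y; shapes (rows of X are concatenated state vectors) and names are my assumptions:

```python
import numpy as np

def train_readout(X, Y, lam=1e-6):
    """Ridge-regression readout: rows of X are the feature vectors
    [1; u(n); x(n)]^T, rows of Y are the targets.
    Solves (X^T X + lam I) W = X^T Y."""
    n_feat = X.shape[1]
    A = X.T @ X + lam * np.eye(n_feat)
    return np.linalg.solve(A, X.T @ Y)   # more stable than an explicit inverse

# Toy check: recover a known linear readout from noiseless data
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
W_true = rng.standard_normal((10, 1))
W_out = train_readout(X, X @ W_true, lam=1e-9)
```

With noiseless data and tiny λ, the recovered weights match the generating ones almost exactly; larger λ trades that fit for robustness against overfitting.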

Practical guide to ESN

1. For challenging tasks use as big a reservoir as you can afford computationally.
2. Select global parameters with smaller reservoirs, then scale to
bigger ones.
3. N should be at least equal to the estimate of independent real
values the reservoir has to remember from the input to solve
its task.
4. Connect each reservoir node to a small fixed number of other
nodes (e.g., 10) on average, irrespective of the reservoir size.
Exploit this reservoir sparsity to speed up computation.
5. ρ(W) < 1 ensures echo state property in most situations.
6. The spectral radius should be greater in tasks requiring longer
memory of the input.
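Tip 5's condition ρ(W) < 1 is typically enforced by rescaling a randomly generated W. A common recipe (the target radius 0.95 is an arbitrary illustration, not a value from the slides):

```python
import numpy as np

def scale_spectral_radius(W, rho=0.95):
    """Rescale W so its spectral radius (largest |eigenvalue|) equals rho.
    Scaling a matrix by a scalar scales all its eigenvalues by that scalar."""
    current = max(abs(np.linalg.eigvals(W)))
    return W * (rho / current)

rng = np.random.default_rng(7)
W = scale_spectral_radius(rng.standard_normal((100, 100)), rho=0.95)
```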



Practical guide to ESN

7. Scale the whole Win uniformly to have few global parameters in the ESN. However, to increase performance:
• scale the first column of Win (i.e., the bias inputs) separately;
• scale other columns of Win separately if channels of u(n)
contribute differently to the task.
8. It is advisable to normalize the data; it may also help to keep the inputs u(n) bounded, avoiding outliers (e.g., apply tanh(·) to u(n) if it is unbounded).
9. The input scaling regulates:
• the amount of nonlinearity of the reservoir representation x(n)
(also increasing with ρ(W));
• the relative effect of the current input on x(n) as opposed to
the history (in proportion to ρ(W)).
10. Set the leaking rate α to match the speed of the dynamics of
u(n) and/or ytarget (n).



Practical guide to ESN

11. The three main parameters to optimize in an ESN reservoir are:
• input scaling(-s);
• spectral radius;
• leaking rate(-s).
12. The most pragmatic way to evaluate a reservoir is to train the
output and measure its error.
13. To eliminate the random fluctuation of performance, keep the
random seed fixed and/or average over several reservoir
samples.
14. When manually tuning the reservoir parameters, change one
parameter at a time.
15. Always plot samples of reservoir activation signals x(n) to
have a feeling of what is happening inside the reservoir.



Practical guide to ESN

16. The most generally recommended way to learn linear output weights from an ESN is ridge regression.
17. Extremely large Wout values may be an indication of a very
sensitive and unstable solution.
18. Use regularization whenever there is a danger of overfitting or
feedback instability.
19. Select λ for a concrete ESN using validation, without
rerunning the reservoir through the training data.
20. Use direct pseudoinverse (Moore-Penrose) to train ESNs with
high precision and little regularization when memory and run
time permit.
21. Averaging outputs from multiple reservoirs increases the
performance.



Practical guide to ESN

22. For long sequences, discard the initial time steps of activations x(n) that are affected by the initial transient before training.
23. To classify sequences, train and use readouts from the time-averaged activations x̄, instead of x(n).
24. Different powerful classification methods for static data can be employed as the readout from the time-averaged activations x̄.
25. Use output feedbacks to the reservoir only if they are
necessary for the task.
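Tips 22-23 combine naturally: run the reservoir over a sequence, drop the transient, and average the remaining states into one fixed-size feature vector for any static classifier. A sketch; the shapes, scaling, and washout length are chosen for illustration:

```python
import numpy as np

def averaged_state(inputs, W_in, W, alpha=0.3, washout=10):
    """Run a leaky-tanh reservoir over one input sequence and return the
    time-averaged state, discarding the first `washout` transient steps."""
    N = W.shape[0]
    x = np.zeros(N)
    states = []
    for n, u in enumerate(inputs):
        ub = np.concatenate([[1.0], np.atleast_1d(u)])   # [1; u(n)]
        x = (1 - alpha) * x + alpha * np.tanh(W_in @ ub + W @ x)
        if n >= washout:
            states.append(x)
    return np.mean(states, axis=0)       # one fixed-size feature vector

rng = np.random.default_rng(3)
W_in = rng.uniform(-0.5, 0.5, (30, 2))
W = rng.uniform(-0.5, 0.5, (30, 30)) * 0.05
feat = averaged_state(np.sin(0.2 * np.arange(50)), W_in, W)
```

The resulting vector can be fed to logistic regression, an SVM, or any other static classifier, as tip 24 suggests.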



Physical reservoir computing



What makes a good physical reservoir computer?

1. High dimensionality
2. Nonlinearity
3. Fading memory
4. Separation property



Table of contents

1 Photonics for computing

2 Neural networks

3 Reservoir computing

4 Applications



Applications of RC

