CS 476: Networks of Neural Computation
WK4 – Radial Basis Function Networks

Dr. Stathis Kasderidis
Dept. of Computer Science
University of Crete

Contents:
•Time Series Prediction
•TS & NNs
•RBF Model
•Conclusions
•Prediction
•Control
•Availability of data:
  •Enough in number;
  •Quality;
  •Resolution.
5. Testing a model:
  •Testing the model on out-of-sample data;
  •Re-iterate the modelling procedure until we produce a model with which we are satisfied;
  •Compare different classes of models in order to find the best one.
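The out-of-sample testing step can be sketched in Python; the synthetic sine-plus-noise series and the naive persistence baseline below are illustrative assumptions, not from the lecture:

```python
# Sketch of out-of-sample testing: fit/evaluate on temporally disjoint data.
import numpy as np

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.standard_normal(200)

# Respect temporal order: train on the first 80%, test on the held-out tail.
split = int(0.8 * len(series))
train, test = series[:split], series[split:]

# Naive persistence baseline: predict x[t+1] = x[t].
pred = test[:-1]
mse = np.mean((test[1:] - pred) ** 2)
print(f"out-of-sample MSE: {mse:.4f}")
```

Any candidate model should beat such a baseline on the held-out segment before it is accepted.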
TS & NNs

•Representing variables:
  •Continuous or discrete;
  •Semantics of variables (i.e. probabilities, categories, data points, etc.);
  •Distributed or atomic representation;
  •Variables with little information content can be harmful in generalisation;
  •In Bayesian estimation the method of Automatic Relevance Determination can be used for selecting variables.
•Selecting features;
•Capturing of causal relations.
•Selecting an architecture:
  •Type of training;
  •Family of models;
  •Transfer function;
  •Memory;
  •Network topology;
  •Other parameters in network specification.
•Model selection:
  •See discussion in WK3.
•Additional Literature:
  1. Masters T. (1995). Neural, Novel & Hybrid Algorithms for Time Series Prediction. Wiley.
  2. Pawitan Y. (2001). In All Likelihood: Statistical Modelling and Inference Using Likelihood. Oxford University Press.
  3. Chatfield C. (1989). The Analysis of Time Series: An Introduction. 4th Ed. Chapman & Hall.
  4. Harvey A. (1993). Time Series Models. Harvester Wheatsheaf.
  5. Efron B., Tibshirani R. (1993). An Introduction to the Bootstrap. Chapman and Hall.
RBF Model

•We train the network by minimising the instantaneous error function

  E = (1/2) Σ_{j=1}^{N} e_j²

where N is the size of the training sample used to do the learning, and e_j is the error signal defined by:

  e_j = d_j − F(x_j)
      = d_j − Σ_{i=1}^{M} w_i G(||x_j − t_i||_{C_i})
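As a numerical check on the definitions above, here is a minimal sketch of the network output F(x) and the error signal e_j, assuming a Gaussian basis function G and an identity weighting matrix C; the centers, weights, and training pair are made-up illustrative values:

```python
# Sketch of F(x) = sum_i w_i G(||x - t_i||_C) and e_j = d_j - F(x_j).
import numpy as np

def rbf_output(x, centers, weights, C, sigma=1.0):
    """F(x) with Gaussian basis G(r) = exp(-r^2 / (2 sigma^2))."""
    r = np.linalg.norm((x - centers) @ C.T, axis=1)    # weighted norms ||x - t_i||_C
    return weights @ np.exp(-r**2 / (2 * sigma**2))

C = np.eye(1)                          # identity weighting matrix (assumption)
centers = np.array([[0.0], [1.0]])     # centers t_i, M = 2 (illustrative)
weights = np.array([0.5, -0.3])        # linear weights w_i (illustrative)
x_j, d_j = np.array([0.2]), 0.1        # one training pair (x_j, d_j), made up

e_j = d_j - rbf_output(x_j, centers, weights, C)   # e_j = d_j - F(x_j)
print(f"e_j = {e_j:.4f}")
```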
•We use a weighted norm matrix C when the individual elements of x belong to different classes.
•To calculate the update equations we use gradient descent on the instantaneous error function E. We get the following update rules for the free parameters:
1. Linear weights (output layer):

  w_i(n+1) = w_i(n) − η₁ ∂E(n)/∂w_i(n),   i = 1, 2, …, m₁

2. Positions of centers (hidden layer):

  ∂E(n)/∂t_i(n) = 2 w_i(n) Σ_{j=1}^{N} e_j(n) G′(||x_j − t_i(n)||_{C_i}) Σ_i⁻¹ [x_j − t_i(n)]

  t_i(n+1) = t_i(n) − η₂ ∂E(n)/∂t_i(n),   i = 1, 2, …, m₁
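The gradient-descent learning law can be sketched as a batch update step. A Gaussian basis, an identity Σ_i⁻¹, and the data, learning rates η₁, η₂, and iteration count below are all illustrative assumptions (gradients are averaged over the N samples for numerical stability):

```python
# Sketch of batch gradient descent on E = 1/2 * sum_j e_j^2 for an RBF network,
# updating both the linear weights w_i and the center positions t_i.
import numpy as np

def rbf_step(X, d, t, w, sigma=1.0, eta1=0.2, eta2=0.1):
    """One gradient step; gradients averaged over the N training samples."""
    diff = X[:, None, :] - t[None, :, :]                   # x_j - t_i, shape (N, M, dim)
    G = np.exp(-np.sum(diff**2, axis=2) / (2 * sigma**2))  # G(||x_j - t_i||)
    e = d - G @ w                                          # e_j = d_j - F(x_j)
    N = len(X)
    grad_w = -(G.T @ e) / N                                # dE/dw_i
    # dE/dt_i = -(w_i / sigma^2) * mean_j e_j G_ji (x_j - t_i)  (Gaussian basis)
    grad_t = -(w[:, None] / sigma**2) * np.einsum('j,ji,jid->id', e, G, diff) / N
    return w - eta1 * grad_w, t - eta2 * grad_t

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 1))           # inputs x_j (synthetic)
d = np.sin(X[:, 0])                        # targets d_j (synthetic)
t = np.array([[-1.0], [0.0], [1.0]])       # initial centers t_i, m1 = 3
w = np.zeros(3)                            # initial weights w_i
for _ in range(500):
    w, t = rbf_step(X, d, t, w)
G = np.exp(-np.sum((X[:, None, :] - t[None, :, :])**2, axis=2) / 2)
print(f"training MSE: {np.mean((d - G @ w)**2):.4f}")
```

In practice the centers move much less than the weights, which is why separate learning rates η₁ and η₂ are used for the two parameter groups.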
CS 476: Networks of Neural Computation, CSD, UOC, 2009
Learning Law for Radial Basis Networks IV
Conclusions

•NNs can be used as models in time series prediction.