Table Of Contents

Dedication
Preface
Acknowledgements
Author's declaration
1. Introduction
2. Background to work
2.1 Review of fundamental concepts
2.1.1 Chaotic series
Figure 1, The first 500 points of the logistic equation
Figure 3, The first thousand points of the Mackey-Glass equation
2.1.2 Embedding as a means of unravelling chaotic series
Figure 5, Embedded logistic equation and Tent map
2.2 Chaos theory based measurements
2.2.1 The Lyapunov exponent: characteristic divergence in time associated with chaos
2.2.2 The Hurst exponent
2.3.1 Empirical methods
Figure 7, Genes and Chromosomes
Figure 8, Genetic Algorithm processes
Figure 9, The Crossover operator
Figure 10, Generating a new generation
2.3.2 Analytical methods
2.4 Supervised Learning as non-parametric modelling
2.5 Neural Networks
2.5.1 Simulated Neurons
2.5.2 Simulated Synapses
2.5.3 Network topologies
2.5.4 The Error Surface
2.5.5 Symmetry and multiple solutions
2.5.6 Practical Neural Learning Algorithms
2.5.7 Neural Network Minimisation complexity
2.5.8 Cross Validation
Figure 13, In- and out-of-sample training performance
2.6 Surface modelling techniques
2.7 Trivial Predictors
2.8 Object Orientation of Design
3.1 Development and run time platform
3.2 Reasons for the changes in design
3.3 Previous versions in detail
4.1 The time series database
4.1.1 Formatting and storing input data
4.1.4 Data normalisation
4.1.5 Quantization of price data
4.1.6 Working with first differences
4.2 Qualitative measurements performed on the data
4.2.1 Lyapunov Exponent Implementation
4.2.2 Hurst exponent implementation
4.2.3 The effects of quantization on qualitative measures
4.3 Embedding Analysis of the data
4.3.1 Auto-Mutual Information
4.3.2 False Nearest Neighbours
4.4 Training pattern generation
Figure 17, Embedding a time series using a FIFO
4.5 Supervised learning of the training patterns
4.5.1 Local approximation algorithm
Figure 18, Local approximation interpolation
4.6 Conversion and formatting of predictions
4.7 Iterated Predictions
4.8 Evaluation of predictions
4.8.1 Validation data
4.8.2 Metrics
4.8.3 Similar day information
Figure 20, Graphic representation of similar day data
4.9 Performance of the predictor on artificial time series
5.1 Foreign exchange data
5.2 False nearest neighbour and auto-Mutual Information responses
Figure 35, 3 month $ deposit False Nearest Neighbours
Figure 39, Effects of perturbing £/$ D separation
5.4 Discussion of predictability of financial series
6.1 Background to the analysis
6.1.1 Measurement of Biomass
6.1.2 Culturing Yeast in a fermentor
Figure 42, Measured capacitance during the experiment
6.2 The data supplied
6.3 Analysis of the data
6.4 Predictions generated from the data
6.5 Conclusions drawn from the results
7. Conclusions and further work
8. References
Time Series Prediction Using Supervised Learning and Tools From Chaos Theory
Published by ipixtlan on Jan 18, 2012
Copyright: Attribution Non-commercial
