Training Feedforward Neural Networks Using Genetic Algorithms
had two operators, MUTATE-NODES and CROSSOVER-NODES, and was thus a variation on a standard genetic algorithm.
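The two operators can be sketched as follows. This is a minimal illustration, assuming a network's weights are grouped per node as the vector of that node's incoming weights; the mutation distribution, noise scale, and number of mutated nodes here are illustrative assumptions, not the paper's exact settings:

```python
import random

def mutate_nodes(weights, n_nodes=2, scale=1.0):
    """MUTATE-NODES (sketch): pick a few nodes at random and add noise
    to every incoming weight of each selected node.
    `weights` maps node id -> list of that node's incoming weights."""
    child = {node: list(w) for node, w in weights.items()}
    for node in random.sample(list(child), n_nodes):
        child[node] = [w + random.uniform(-scale, scale) for w in child[node]]
    return child

def crossover_nodes(parent_a, parent_b):
    """CROSSOVER-NODES (sketch): for each node, the child inherits that
    node's entire incoming-weight vector from one parent or the other,
    so weights that feed the same node travel together."""
    return {node: list(random.choice((parent_a, parent_b))[node])
            for node in parent_a}
```

Keeping each node's incoming weights together is the point of operating node-wise rather than weight-wise: the weights feeding a single node form a functional unit that crossover should not split.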
When comparing them, we considered two iterations of the genetic algorithm to be equivalent to one iteration (i.e. one cycle through all the training data) of backpropagation. To see why, observe that backpropagation consists of looping through all training data doing 1) forward propagation and calculation of errors at the outputs and 2) error backward propagation and adjusting of weights. The second step requires more computation in our network and almost all other networks of interest. The evaluation function of the genetic algorithm performs the same calculations as step 1). The operators MUTATE-NODES and CROSSOVER-NODES do very little computation. Hence, one iteration of backpropagation requires more than twice as much computation as one iteration of the genetic algorithm.
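The accounting above can be made concrete with a toy operation counter. The per-example costs are illustrative assumptions (the backward pass costed somewhat above the forward pass, the genetic operators at roughly zero), not measurements from the paper:

```python
def backprop_iteration_cost(n_examples, fwd=1.0, bwd=1.5):
    """One backpropagation iteration: for every training example, a
    forward pass (step 1) plus a backward pass (step 2); the backward
    pass costs more, per the argument in the text."""
    return n_examples * (fwd + bwd)

def ga_iteration_cost(n_examples, fwd=1.0, operator=0.01):
    """One genetic-algorithm iteration: evaluate one candidate network,
    i.e. the same forward-pass/error computation as step 1, plus a
    nearly free application of the operators."""
    return n_examples * fwd + operator
```

Under any costing with `bwd > fwd`, one backpropagation iteration exceeds twice the cost of one genetic-algorithm iteration, which is why two GA iterations are counted against one backpropagation iteration.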