
Genetic Algorithms and Machine Learning

John J. Grefenstette
Navy Center for Applied Research in Artificial Intelligence
Naval Research Laboratory
Washington, DC 20375, USA
GREF@AIC.NRL.NAVY.MIL

Abstract

One approach to the design of learning systems is to extract heuristics from existing adaptive systems. Genetic algorithms are heuristic learning models based on principles drawn from natural evolution and selective breeding. Some features that distinguish genetic algorithms from other search methods are:

- A population of structures that can be interpreted as candidate solutions to the given problem;

- The competitive selection of structures for reproduction, based on each structure's fitness as a solution to the given problem;

- Idealized genetic operators that alter the selected structures in order to create new structures for further testing.

In many applications, these features enable the genetic algorithm to rapidly improve the average fitness of the population and to quickly identify the high-performance regions of very complex search spaces. In practice, genetic algorithms may be combined with local search techniques to create a high-performance hybrid search algorithm. This talk provides a survey of recent advances in the application of genetic algorithms to problems in machine learning.
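
To make these features concrete, the following minimal sketch evolves a population of bit strings under fitness-proportionate selection, one-point crossover, and mutation. The bit-string encoding, the parameter settings, and the ones-counting fitness function are illustrative assumptions only, not any of the specific systems surveyed here.

    import random

    # Minimal genetic algorithm sketch: a population of bit strings evolves
    # under fitness-proportionate selection, one-point crossover and mutation.
    # The encoding, parameters and ones-counting fitness are assumptions.

    def fitness(bits):
        return sum(bits)                          # stand-in for a real evaluation

    def select(population, fitnesses):
        # Competitive selection of a parent, biased toward higher fitness.
        return random.choices(population, weights=fitnesses, k=1)[0]

    def crossover(p1, p2):
        # Idealized one-point crossover creates a new structure from two parents.
        point = random.randrange(1, len(p1))
        return p1[:point] + p2[point:]

    def mutate(bits, rate=0.01):
        return [b ^ 1 if random.random() < rate else b for b in bits]

    def genetic_algorithm(pop_size=50, length=20, generations=40):
        population = [[random.randint(0, 1) for _ in range(length)]
                      for _ in range(pop_size)]
        for _ in range(generations):
            fitnesses = [fitness(ind) for ind in population]
            population = [mutate(crossover(select(population, fitnesses),
                                           select(population, fitnesses)))
                          for _ in range(pop_size)]
        return max(population, key=fitness)

    best = genetic_algorithm()
    print("best structure:", best, "fitness:", fitness(best))
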
Although many genetic algorithm applications have been in the areas of function optimization, parameter tuning, scheduling and other combinatorial problems [1], genetic algorithms have also been applied to many traditional machine learning problems, including concept learning from examples, learning weights for neural nets, and learning rules for sequential decision problems. At NRL, we investigate many aspects of genetic algorithms, ranging from the study of alternative selection policies [6] and crossover operators [3, 12], to performance studies of genetic algorithms for optimization in non-stationary environments [8]. Much of our effort has been devoted to the development of practical learning systems that use genetic algorithms to learn strategies for sequential decision problems [5]. In our SAMUEL system [7], the "chromosome" of the genetic algorithm represents a set of condition-action rules for controlling an autonomous vehicle or a robot. The fitness of a rule set is measured by evaluating the performance of the resulting control strategy on a simulator. This system has successfully learned highly effective strategies for several tasks, including evading a predator, tracking a prey, seeking a goal while avoiding obstacles, and defending a goal from threatening agents.

As these examples show, we have a high level of interest in learning in multi-agent environments in which the behavior of the external agents is not easily characterized by the learner. We have found that genetic algorithms provide an efficient way to learn strategies that take advantage of subtle regularities in the behavior of opposing agents. We are now beginning to investigate the more general case in which the behavior of the external agents changes over time. In particular, we are interested in learning competitive strategies against an opponent that is itself a learning agent. This is, of course, the usual situation in natural environments in which multiple species compete for survival. Our initial studies lead us to expect that genetic learning systems can successfully adapt to changing environmental conditions.
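
The following toy sketch illustrates this idea under purely hypothetical assumptions: a "chromosome" is a small set of condition-action rules, and its fitness is the score of the induced control strategy on a simple simulated one-dimensional evasion task. The rule encoding, the simulator, and all parameters are invented for illustration and do not reproduce the actual SAMUEL representation.

    import random

    # Hypothetical rule-set "chromosome" (NOT the actual SAMUEL encoding):
    # each rule is (low, high, action) and fires when low <= distance < high,
    # where distance is the gap between the learner (an evader) and a pursuer.

    def decide(rules, distance):
        for low, high, action in rules:
            if low <= distance < high:
                return action
        return 0  # default action when no rule matches

    def fitness(rules, steps=50):
        # Fitness of a rule set = performance of the induced control strategy
        # on a simple simulated pursuit task: count steps spent safely away.
        evader, pursuer, score = 0, -5, 0
        for _ in range(steps):
            evader += decide(rules, evader - pursuer)   # action in {-1, 0, +1}
            pursuer += 1 if pursuer < evader else -1    # pursuer closes the gap
            score += 1 if abs(evader - pursuer) > 3 else 0
        return score

    def random_rule():
        return (random.randint(-10, 5), random.randint(6, 15),
                random.choice([-1, 0, 1]))

    rule_set = [random_rule() for _ in range(5)]        # one candidate "chromosome"
    print("simulated fitness of random rule set:", fitness(rule_set))
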
While the range of applications of genetic algorithms continues to grow more rapidly each year, the study of the theoretical foundations is still in an early stage. Holland's early work [9] showed that a simple form of genetic algorithm implicitly estimates the utility of a vast number of distinct subspaces, and allocates future trials accordingly. Specifically, let H be a hyperplane in the representation space. For example, if the structures are represented by six binary features, then the hyperplane denoted by H = 0#1### consists of all structures in which the first feature is absent and the third feature is present. Holland showed that the expected number of samples (offspring) allocated to a hyperplane H at time t + 1 is given by:

    M(H, t+1) ≥ M(H, t) * [f(H, t) / f̄] * (1 - p_d(H))

In this expression, M(H, t) is the expected proportion of the population in hyperplane H at time t, f(H, t) is the average fitness of the current samples allocated to H, f̄ is the average fitness of the current population, and p_d(H) is the probability that the genetic operators will be "disruptive" in the sense that the children produced will not be members of the same subspace as their parents.¹ The usual interpretation of this result is that subspaces with consistently higher than average payoffs will be allocated exponentially more trials over time, while those subspaces with below average payoffs will be allocated exponentially fewer trials. This implicit parallelism can be shown to arise in any genetic algorithm that satisfies certain minimal conditions [6].
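
A small numerical check of this bound, using assumed values only (a six-bit representation, a made-up population and fitnesses, and an assumed disruption probability), shows how an above-average hyperplane such as H = 0#1### is predicted to receive a growing share of the population:

    # Numerical illustration of the bound above, under assumed values: none
    # of these numbers come from the text; they only show how the quantities
    # in the bound combine.

    def in_hyperplane(bits, schema="0#1###"):
        # A structure belongs to H if it matches every fixed position of the schema.
        return all(s == "#" or s == b for s, b in zip(schema, bits))

    population = {                      # structure -> assumed fitness
        "001011": 8.0, "011100": 6.0, "101110": 3.0,
        "000001": 2.0, "001111": 7.0, "110101": 4.0,
    }

    members = [f for s, f in population.items() if in_hyperplane(s)]
    M_H_t = len(members) / len(population)       # proportion of population in H
    f_H_t = sum(members) / len(members)          # average fitness of samples in H
    f_bar = sum(population.values()) / len(population)
    p_d = 0.1                                    # assumed disruption probability

    bound = M_H_t * (f_H_t / f_bar) * (1 - p_d)
    print("M(H,t) =", M_H_t, " f(H,t) =", f_H_t, " population average =", f_bar)
    print("lower bound on M(H,t+1):", round(bound, 3))
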
Holland also provided an analysis of how genetic algorithms balance two conflicting requirements: the need to explore the search space in search of information, and the need to exploit the information gained to focus future activity. The goal of strengthening the characterization of the behavior of genetic algorithms is an active research area. Results have been presented at an ongoing series of workshops [10, 13]. The COLT community may be particularly interested in [11], in which Ros presents an initial PAC analysis of a class of genetic concept learners. There are many remaining opportunities for formal analysis of genetic algorithms. Some crucial open questions include:

- How quickly do genetic algorithms converge to an approximately optimal solution for various classes of problems? Behavior in the limit is known, but concrete convergence results are known only for trivial classes of problems and for the simplest forms of genetic algorithms.

- For which class of problems does recombination (e.g., crossover) provide a measurable advantage over mutation alone?

- Given a fixed amount of computational resources, what are the optimal trade-offs among population size, number of generations, and (for probabilistic problems) evaluation accuracy?

- How much noise in the fitness functions can genetic algorithms tolerate?

- How well do genetic algorithms track non-stationary environments?

- What is the role of mating restrictions, e.g., mating with similar structures or among a spatially segregated subpopulation, in promoting robust search and learning?

For more details on these and other topics related to genetic algorithms, the references listed below should provide good starting points for further reading.

¹ The effects of mutation are generally insignificant in practice and may be neglected in a first-order analysis. Considerable attention has been given to estimating the probability that a particular application of crossover will be disruptive [3].

References

[1] L. Davis (Ed.), Handbook of Genetic Algorithms, Van Nostrand Reinhold, 1991.

[2] K. A. De Jong, Genetic-algorithm-based learning. In Machine Learning: An Artificial Intelligence Approach, Vol. 3, Y. Kodratoff and R. Michalski (Eds.), Morgan Kaufmann, 1990.

[3] K. A. De Jong and W. M. Spears, A formal analysis of the role of multi-point crossover in genetic algorithms. Annals of Mathematics and Artificial Intelligence 5(1), 1-26, 1992.

[4] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, 1989.

[5] J. J. Grefenstette, Credit assignment in rule discovery systems based on genetic algorithms. Machine Learning 3(2/3), 225-245, 1988.

[6] J. J. Grefenstette, Conditions for implicit parallelism. In Foundations of Genetic Algorithms, G. Rawlins (Ed.), Morgan Kaufmann, 1991.

[7] J. J. Grefenstette, C. L. Ramsey and A. C. Schultz, Learning sequential decision rules using simulation models and competition. Machine Learning 5(4), 355-381, 1990.

[8] J. J. Grefenstette, Genetic algorithms for changing environments. Proceedings of Parallel Problem Solving from Nature 2, R. Maenner and B. Manderick (Eds.), North-Holland, 137-144, 1992.

[9] J. H. Holland, Adaptation in Natural and Artificial Systems. Ann Arbor: University of Michigan Press, 1975.

[10] G. Rawlins (Ed.), Foundations of Genetic Algorithms, Morgan Kaufmann, 1991.

[11] J. Ros, Learning Boolean functions with genetic algorithms: A PAC analysis. In Foundations of Genetic Algorithms 2, D. Whitley (Ed.), Morgan Kaufmann, 1993.

[12] W. M. Spears and K. A. De Jong, An analysis of multi-point crossover. In Foundations of Genetic Algorithms, G. Rawlins (Ed.), Morgan Kaufmann, 1991.

[13] D. Whitley (Ed.), Foundations of Genetic Algorithms 2, Morgan Kaufmann, 1993.
