
Nonstationary Function Optimization using the Structured Genetic Algorithm.
Dipankar Dasgupta and Douglas R. McGregor.
Dept. of Computer Science, Uni. of Strathclyde, Glasgow G1 1XH, U.K.
In proceedings of the Parallel Problem Solving From Nature (PPSN-2) Conference, 28-30 September 1992, Brussels (Belgium), pp. 145-154.

Abstract
In this paper, we describe the application of a new type of genetic algorithm, called the Structured Genetic Algorithm (sGA), to function optimization in nonstationary environments. The novelty of this genetic model lies primarily in its redundant genetic material and a gene activation mechanism which utilizes a multi-layered structure for the chromosome. In adapting to nonstationary environments of a repeated nature, genes of long-term utility can be retained for rapid future deployment when favourable environments recur. The additional genetic material preserves optional solution spaces and works as a long-term distributed memory within the population structure. This paper presents important aspects of the sGA which enable it to exploit the repeatability of many nonstationary function optimization problems. Theoretical arguments and an empirical study suggest that the sGA can solve complex problems more efficiently than has been possible with simple GAs. We also note that the sGA exhibits implicit genetic diversity and viability, as in biological systems.
1. Introduction.
Genetic Algorithms [16] represent a class of general purpose adaptive problem solving
techniques, based on the principles of population genetics and natural selection. The
workings of simple GAs have been described elsewhere [13][17].
Genetic Algorithms are finding increasing applications in a variety of problems across a spectrum of disciplines [8][9]. Despite their empirical success, as their usage has grown there has been a long-standing objection to the use of simple GAs on complex problems, where they have been criticized for poor performance. Specifically, environments that vary over time present special challenges to genetic algorithms, since they cannot adapt to changing functionality once converged, owing to the lack of genetic variation in the chromosome. A number of authors (as mentioned in [15]) have used the mechanisms of dominance and diploidy from biological genetics to improve the performance of simple GAs in nonstationary environments, with some success. Recently, Goldberg et al. developed messy Genetic Algorithms (mGA) [12][14], which can solve many complex and deceptive problems.
Our proposed Structured Genetic Algorithm is a possible alternative approach. The basic concept of the model is drawn from the biological system's flexible strategy of evolution (for genetic variation) and adaptation.
Many real-world applications present a time-varying situation: the optimum fitness criterion changes in some way over time (typically with a change in an external environment), and the population must adapt to survive, which may require rapid re-optimization. In many situations there may also be the problem that an apparent multi-element change is required to escape from a local maximum of the fitness function, and instances of the required form may not be present in the population. In this situation an apparent multi-element mutation is required; but in a simple GA, multi-element mutations are extremely unlikely to result in viable offspring.
In this paper, we briefly describe the mechanism of the Structured Genetic Algorithm (sGA) and its implementation for a nonstationary function optimization (0-1 knapsack) problem; we then present our experimental results and, finally, based on the empirical study, give our conclusions.
2. The Structured Genetic Algorithm.
2.1. Basic principle.
The Structured Genetic model (sGA) [6][7] allows large variations in the phenotype while maintaining high viability by allowing multiple simultaneous genetic changes. It is therefore able to function well in complex changing environments. The central feature of the sGA is its use of genetic redundancy (as in biological systems [2]) and hierarchical genomic structures in its chromosome. The primary mechanism for resolving the conflict of redundancy is through regulatory genes [3], which act as switching (or dominance) operators to turn genes on (active) and off (passive). This is analogous to the controlled regulation of structural genes [18], which rely on promoter and repressor genes for their expression during biological evolution. So, as in biological systems, the genotype-phenotype difference in the sGA is vast: the genotype is embodied in the chromosomes, whereas the phenotype is the expression of the chromosomal information depending on the environment.
In the sGA, a chromosome is represented as a set of binary strings. It also uses conventional genetic operators and the survival-of-the-fittest principle. However, it differs considerably from simple Genetic Algorithms in how it encodes genetic information in the chromosome, and in its phenotypic interpretation.
The fundamental differences are as follows:
i) Structured Genetic Algorithms utilise chromosomes with a multi-level genetic structure (a directed graph or tree). As an example, an sGA with a two-level structure of genes is shown in figure 1(a), and the chromosomal representation of this structure is shown in figure 1(b).
ii) Genes at any level can be either active or passive.
iii) High-level genes activate or deactivate sets of lower-level genes. So the dynamic behaviour of genes at a level - i.e. whether or not they will be expressed phenotypically - is governed by the higher-level genes.
Thus a change in a gene value at a higher level represents multiple changes at lower levels in terms of which genes are active.

[Figure 1: A representation of the Structured Genetic Algorithm.
 (a) A 2-level structure: level-1 genes a1, a2, a3 control the level-2 gene sets
     (a11 a12 a13), (a21 a22 a23) and (a31 a32 a33) respectively.
 (b) The encoding: the chromosome ( a1 a2 a3 a11 a12 a13 a21 a22 a23 a31 a32 a33 )
     and an example binary coding ( 0 1 0 1 0 0 1 1 0 0 0 1 ).]

Genes which are not active (passive genes) do not disappear; they remain in the chromosome structure and are carried in a neutral and apparently redundant form to subsequent generations along with the individual's string of genes. Since the sGA is highly structured, a single change at a higher level of the network produces an effect on the phenotype that could only be achieved in a simple GA by a sequence of many random changes. The probability of such a sequence in the simple GA model is vanishingly small unless, as Richard Dawkins [10] has pointed out, every single step results in improved viability (our hunch is that this, too, has much too low a probability to be regarded as an effective mechanism for large change).
One school of thought (Darwinian) believes that evolutionary changes are gradual; an-
other (Punctuated Equilibria) postulates that evolutionary changes go in sudden bursts,
punctuating long periods of stasis when no evolutionary changes take place in a given
lineage. The new model provides a good framework for carrying out studies that could
bridge these two theories.
The sGA also differs from the recent messy genetic model (mGA) in the following main respects:
1. The mGA uses variable-length strings, whereas sGA coding is of fixed length and may be regarded as a 'neat' GA type.
2. The mGA uses cut and splice operators, in contrast to the sGA, which uses conventional genetic operators along with a gene activation mechanism (switching operator).
3. The mGA applies two phases of evolutionary processing, namely primordial and juxtapositional, whereas the sGA has a single evolutionary process.
4. The mGA deals with a variable-size population, but the sGA works with a fixed population size.

For searching a space, the high-level genes can explore the potential areas of the space (by long-jump mutations) while sets of low-level genes continue to exploit each sub-space. The sGA also has the advantage of being able to retrieve previously expressed good building blocks, whereas a simple GA with the dominance and diploidy mechanisms used so far can only store or retrieve one allele independently. Thus the sGA works as a form of long-term distributed memory that stores information, particularly genes once highly selected for fitness. This memory permits faster adaptation to environmental changes.
2.2. A Mathematical Outline of the Proposed Model.
In a two-level Structured Genetic Algorithm, a genotype may be of the form A = <S1, S2>, where A represents an ordered set which consists of two strings S1 and S2; the length of S2 is an integer multiple of the length of S1 (i.e. |S1| = s and |S2| = sq), and there is a genetic mapping S1 → S2 defined below.
In other words,

    A = ( [a_i], [a_ij] ),   a_i ∈ {0,1}, i = 1…s;   a_ij ∈ {0,1}, i = 1…s, j = 1…q,

and the order of the symbols in the string S2 is obtained by arranging subscripts in row-major fashion.
The mapping S1 → S2 implies that each element a_i of S1 is mapped onto the unique substring [a_ij] of S2, j = 1…q.
Now let

    B_i = a_i ⊗ [a_i1 a_i2 … a_iq],   i = 1…s,

where ⊗ is called a genetic switch or activator and is defined as

    B_i = a_i ⊗ [a_ij] = [a_ij]   if a_i = 1,
                       = λ        if a_i = 0,

where λ is the empty substring.
The B_i constitute the parameter spaces of the individual, whose phenotypic interpretation is as follows. The appearance (phenotype) of each individual A is expressed by the concatenation of all its activated substrings B_i. This means that the length of an expressed chromosome is less than the physical length of the chromosome. Hence, the observable characteristics of an individual do not always indicate the particular genes that are present in the genetic composition or genotype.
So the total population of individuals is

    { A_p | 1 ≤ p ≤ Popsize },

where each individual consists of a binary string A_p = <S1_p, S2_p> ∈ {0,1}^l, and the physical length of the chromosome, with the notation above, is l = s + qs.
If f is a real-valued fitness (objective) function, it maps this population into R+, the set of positive real numbers.
In general, a multi-level structured string may be represented as

    A_p = ( [a_i], [a_ij], [a_ijk], … ),

where the genetic mappings [a_i] → [a_ij] → [a_ijk], and so on, are generalized in the obvious way.
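To make the two-level mapping concrete, the following short Python sketch (an illustration of the definitions above, not the authors' implementation; all function and variable names are ours) expresses the phenotype of the example chromosome of figure 1(b), where s = 3 high-level genes each switch a block of q = 3 low-level genes.

    def express_phenotype(chromosome, s, q):
        """Concatenate the low-level blocks B_i whose high-level switch a_i is 1."""
        high = chromosome[:s]            # S1: the genetic switches
        low = chromosome[s:]             # S2: the structural genes (row-major blocks)
        phenotype = []
        for i, a_i in enumerate(high):
            if a_i == 1:                 # switch on: block i is expressed
                phenotype.extend(low[i * q:(i + 1) * q])
            # switch off: block i stays passive (the empty substring)
        return phenotype

    # The binary coding of figure 1(b): a = (0, 1, 0), so only the second
    # low-level block (1, 1, 0) appears in the phenotype.
    chrom = [0, 1, 0,  1, 0, 0,  1, 1, 0,  0, 0, 1]
    print(express_phenotype(chrom, s=3, q=3))    # -> [1, 1, 0]

As the example shows, the expressed string (length 3) is shorter than the physical chromosome (length 12), in line with the discussion above.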


3. Nonstationary function optimization.
In order to investigate the adaptability of the Structured Genetic Algorithm in a time-varying environment, we selected nonstationary 0-1 knapsack problems in which the weight constraint was varied in time as a periodic step function. The experimental aim was temporal optimization in fluctuating environments.
The knapsack problem in Operational Research is an NP-complete problem in which we have to find a feasible combination of objects so that the total value of the objects (selected from n objects) put in the knapsack is maximized, subject to some capacity or weight constraint.
Mathematically: let W be the weight limitation (i.e. the maximum permissible weight of the knapsack), let the integers 1, 2, …, n denote the n available types of objects, and let v_i and w_i be the value (or profit) and the weight of the i-th object type. Then the knapsack problem can be expressed as

    maximize   Σ_{i=1..n} v_i x_i

subject to the weight constraint

    Σ_{i=1..n} w_i x_i ≤ W,

where x_i represents the number of objects of type i which are selected. In the 0-1 knapsack problem, only one object of each type is available, so that

    x_i = 1 if object i is chosen,
        = 0 otherwise,   for i = 1, 2, …, n.

Table 1 (also used in [15]) and Table 2 (taken from [4]) show the values and weights of the objects, along with optimal solutions, for two example problems of different size. One problem has two, and the other has three, temporal optima. The sGA had no knowledge of the problem parameters or structure, and was forced to infer good knapsack solutions from the codings and fitness of previous trials. The fitness function adopted for this study was the penalized value function, where any weight-constraint violation was squared, multiplied by a constant (here 20), and subtracted from the total value of the selected objects (Σ v_i x_i). To test the adaptability of the sGA in discontinuous nonstationary environments, the weight constraint was varied as a step function among the values shown in the tables, and this was done after every fifteenth generation.
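As an illustration, a minimal Python sketch of the penalized fitness and the step-varying weight constraint might look as follows (this is our own sketch, not the authors' code; the penalty constant 20 and the 15-generation period are taken from the text, everything else is our naming):

    def penalized_fitness(x, values, weights, W, penalty_const=20):
        """Total value of the selected objects, minus penalty_const * (violation)^2."""
        total_value = sum(v for v, xi in zip(values, x) if xi)
        total_weight = sum(w for w, xi in zip(weights, x) if xi)
        violation = max(0, total_weight - W)      # amount of overweight, if any
        return total_value - penalty_const * violation ** 2

    def weight_limit(generation, limits, period=15):
        """Step function: cycle through the weight limits every `period` generations."""
        return limits[(generation // period) % len(limits)]

    # Example with the limits of Table 2: W = 90, 50, 20 in turn.
    for g in (0, 15, 30, 45):
        print(g, weight_limit(g, [90, 50, 20]))   # -> 90, 50, 20, 90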
3.1. Experimental details.
To specify the working of the sGA more precisely for experimental purposes, a two-level sGA was adopted. It was assumed that the number of levels in the sGA depends on the level of complexity of the search space; if the problem has a single level of search space, then a two-level sGA works efficiently, with the high-level genes activating alternative solution spaces. The first few bits of the chromosome (a measure of redundancy and a determining factor, like other GA parameters) were high-level bits, each of which activated only one solution space from the optional solution spaces in the lower level of the chromosome.
                                      Variant weight constraints
  Object    Object    Object     W = 0.5 * Σ_{i=1}^{17} w_i    W = 0.82 * Σ_{i=1}^{17} w_i
  Number i  Value v_i Weight w_i       Optimal x_i                   Optimal x_i
    1          2         12                 0                             0
    2          3          5                 1                             1
    3          9         20                 0                             1
    4          2          1                 1                             1
    5          4          5                 1                             1
    6          4          3                 1                             1
    7          2         10                 0                             0
    8          7          6                 1                             1
    9          8          8                 1                             1
   10         10          7                 1                             1
   11          3          4                 1                             1
   12          6         12                 1                             1
   13          5          3                 1                             1
   14          5          3                 1                             1
   15          7         20                 0                             1
   16          8          1                 1                             1
   17          6          2                 1                             1
  Total:      91        122                13                            15
                                     Σ v_i x_i = 71                Σ v_i x_i = 87
                                     Σ w_i x_i = 60                Σ w_i x_i = 100

Table 1. The 17-object 0-1 knapsack problem parameters used here, with optimal solutions.
                                  Variant weight constraints
  Object    Object    Object        W = 90         W = 50         W = 20
  Number i  Value v_i Weight w_i  Optimal x_i    Optimal x_i    Optimal x_i
    1         70         31            1              1              0
    2         20         10            1              0              0
    3         39         20            1              0              1
    4         37         19            1              1              0
    5          7          4            1              0              0
    6          5          3            0              0              0
    7         10          6            1              0              0
  Total:     188         93            6              2              1
                               Σ v_i x_i = 183  Σ v_i x_i = 107  Σ v_i x_i = 39
                               Σ w_i x_i = 90   Σ w_i x_i = 50   Σ w_i x_i = 20

Table 2. The 7-object 0-1 knapsack problem parameters used here, with temporal optimal solutions.
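For the 7-object problem, the optima reported in Table 2 can be checked independently by exhaustive enumeration; the short Python sketch below (ours, purely for verification) searches all 2^7 subsets for each weight limit.

    from itertools import product

    values = [70, 20, 39, 37, 7, 5, 10]    # v_i from Table 2
    weights = [31, 10, 20, 19, 4, 3, 6]    # w_i from Table 2

    for W in (90, 50, 20):
        feasible = (x for x in product((0, 1), repeat=7)
                    if sum(w * xi for w, xi in zip(weights, x)) <= W)
        best = max(feasible, key=lambda x: sum(v * xi for v, xi in zip(values, x)))
        print(W, best, sum(v * xi for v, xi in zip(values, best)))
        # -> best values 183, 107 and 39, matching the table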

For these two problems, we tested with four and six optional solution spaces respectively, where each solution space consisted of the total set of objects, i.e. each bit represented one object.
Our experiments required a specified number of high-level genes to be active (one here) in a chromosome at any one time, according to the number of parameters (or solution spaces) in the problem domain under consideration. This could not be assumed to hold if the high-level genes were subject to random mutations: the result would tend towards a situation in which more than the required number of high-level bits were active, giving a phenotype bit string that was too long for the problem solution. As an ad hoc approach, we generated the initial population in such a way that the high-level section had exactly one active bit set, and we restricted mutations in the high-level bits to shifts of the active bit to the left or right (alternatively, a local mutation swapping the positions of two high-level bits). It is acknowledged that this is biologically unrealistic, since it undermines the normal assumption of the statistical independence of point mutations, but it is an equivalent, computationally efficient approach. A more biologically realistic mechanism would be to allow mutations that activate multiple high-level bits, and to use the fitness function to exclude the chimerical phenotypes that result from breeding.
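A minimal sketch of the restricted high-level mutation described above (our own naming; whether the shift wraps around at the ends of the high-level section is our assumption):

    import random

    def mutate_high_level(high_bits):
        """Shift the single active high-level bit one position left or right.

        Keeps exactly one solution space active at any time, as the experiment
        requires (the wrap-around at the ends is assumed).
        """
        pos = high_bits.index(1)                  # currently active solution space
        new_pos = (pos + random.choice((-1, 1))) % len(high_bits)
        new_bits = [0] * len(high_bits)
        new_bits[new_pos] = 1
        return new_bits

    # Example: six optional solution spaces (problem 2), space 2 currently active.
    print(mutate_high_level([0, 0, 1, 0, 0, 0]))  # e.g. -> [0, 1, 0, 0, 0, 0]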
In our computer simulations, different GA parameter sets were tested throughout the experiments; the results reported here used the following GA parameters:

    crossover probability, p_c = 0.75
    mutation probability, p_m = 0.002
    population size, N = 150


The experiments used a two-point crossover operator along with the stochastic remainder selection strategy [1]. Only improved offspring replaced their parents to become members of the population in the next generation. The results presented here were averaged over 10 independent runs for each problem.
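For completeness, a sketch of the two-point crossover and the improved-offspring replacement policy mentioned above (our own code; the stochastic remainder selection operator [1] is omitted):

    import random

    def two_point_crossover(p1, p2):
        """Exchange the segment between two random cut points of the two parents."""
        a, b = sorted(random.sample(range(1, len(p1)), 2))
        return p1[:a] + p2[a:b] + p1[b:], p2[:a] + p1[a:b] + p2[b:]

    def maybe_replace(parent, child, fitness):
        """Only an improved offspring replaces its parent in the next generation."""
        return child if fitness(child) > fitness(parent) else parent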
3.2. Observation and Results.
In each generation the best and the average fitness are reported as performance measures. The two problems were run independently. Figure 2 and figure 3 show the best-of-generation and the generation-average results for the first problem (Table 1), and figure 4 and figure 5 give the corresponding results for the second problem (Table 2). For the first problem the weight constraint oscillated between two values, and in the second problem there were three temporal optima. These graphs show that the sGA can adapt quickly to abrupt changes in the environment. Goldberg & Smith [15] also reported that the diploid GA with evolving dominance is efficient on the nonstationary knapsack problem of Table 1. We have compared our results with the best results they reported, and our results exhibit improved performance over the previous methods. Goldberg & Smith's experimental results [15] showed a drastic performance failure at generation 135 and in the last three sets of cycles, due to convergence of the whole population to one or other of the optima. The results produced by the sGA never showed such poor performance: although the population converged towards an optimum on many occasions during the run cycles, it always produced uniform results after the initial cycle. Redundant genetic material in the chromosomes preserved solutions learnt in previous cycles in a passive state, which helps the species adapt to environmental change.

[Figure 2. Best-of-generation results of problem 1 (generation-best fitness vs. generation number).]
[Figure 3. Generation average results of problem 1 (average fitness vs. generation number).]
[Figure 4. Best-of-generation results of problem 2 (generation-best fitness vs. generation number).]
[Figure 5. Generation average results of problem 2.]
Environmental shift causes changes (hypermutation) in the high-level bits, activating an alternate low-level solution space and resulting in rapid discovery of the other temporal optimum.
The results demonstrate that this new GA model shows improved robustness in retaining and quickly rediscovering time-varying optima. Our results also show that not once during the experimental runs did the algorithm converge to any local sub-optimum. Thus the sGA provides a long-term distributed memory which permits faster adaptation to a varying environment.
4. Conclusion.
We have presented a new genetic search approach (the sGA) for temporal optimization in nonstationary environments. Initial experimental results indicate that this model is more successful in adapting to cyclically changing environments than simple GAs. The results are very promising, and we expect that the sGA can be used as a practical tool in real-world applications of a time-varying nature.
The Structured Genetic Approach offers significant improvements over the simple genetic model:
1. able to achieve optimization inaccessible to incremental genetic algorithms.
2. not easily trapped within local optima, since a single high-level bit change can bring
the phenotype into an area which would otherwise have required multiple changes.
3. unlike multiple random low-level changes, the high-level change results in higher
guaranteed viability, as the search is restricted to the solution space of integral low-level
genes.
4. able to adapt rapidly to the selective pressure of its changing environment.
5. biological plausibility is one of the most attractive points of this model.
We also note that, in comparison to the sGA, the recent mGA model does not have the ability to adapt to changing fitness landscapes once it has converged to a global optimum. However, Deb in his dissertation [11] suggested, but did not simulate, a triallelic scheme similar to the evolving dominance mechanism (as used with the simple GA) and a dominance shift operation for optimizing nonstationary functions using the mGA. Our study shows that the well-adapted population structure of the sGA, with less complexity, may be a worthy competitor of the mGA in solving nonstationary optimization problems.
We conclude that this genetic model (sGA) is a novel idea, and the empirical studies show that it is an efficient function optimizer [5], though it requires more memory space for carrying apparently redundant material. We believe that the Structured Genetic model will play an important role in ongoing research into the improvement of genetic algorithms.
Acknowledgement.
The first author gratefully acknowledges the support given by the Government of Assam (India) for awarding a State Overseas Scholarship. The authors also wish to thank
Dr. Robert E. Smith and the late Gunar E. Liepins for their valuable comments on the
draft version of this paper.

References
[1] L. B. Booker. Intelligent behavior as an adaptation to the task environment. PhD thesis, Computer Science, University of Michigan, Ann Arbor, USA, 1982.
[2] R. M. Brady. Optimization strategies gleaned from biological evolution. Nature, 317:804-806, October 1985.
[3] T. A. Brown. GENETICS - a molecular approach. Van Nostrand Reinhold Int., first edition, 1989.
[4] N. Christofides, A. Mingozzi, P. Toth, and C. Sandi. Combinatorial Optimization. John Wiley & Sons Ltd., June 1979.
[5] Dipankar Dasgupta and D. R. McGregor. Engineering optimizations using the structured genetic algorithm. Proceedings of ECAI, Vienna (Austria), August 1992.
[6] Dipankar Dasgupta and D. R. McGregor. Species adaptation to nonstationary environments: A structured genetic algorithm. Presented at the Artificial Life III workshop, Santa Fe, New Mexico, 15-19 June 1992.
[7] Dipankar Dasgupta and D. R. McGregor. A Structured Genetic Algorithm: The model and the first results. Technical Report No. IKBS-2-91.
[8] Yuval Davidor. Genetic Algorithms and Robotics. World Scientific, first edition, 1991.
[9] Lawrence Davis. Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York, first edition, 1991.
[10] Richard Dawkins. The Blind Watchmaker. Penguin Books Ltd., 1986.
[11] Kalyanmoy Deb. Binary and Floating-point Function Optimization using Messy Genetic Algorithms. PhD thesis, Dept. of Engineering Mechanics, University of Alabama, Tuscaloosa, Alabama, USA, March 1991.
[12] D. E. Goldberg, K. Deb, and B. Korb. Messy genetic algorithms revisited: Studies in mixed size and scale. Complex Systems, 4(4):415-444, 1990.
[13] David E. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley, first edition, 1989.
[14] David E. Goldberg, Bradley Korb, and Kalyanmoy Deb. Messy genetic algorithms: Motivation, analysis and first results. Complex Systems, 3:493-530, May 1990.
[15] David E. Goldberg and Robert E. Smith. Nonstationary function optimization using genetic algorithms with dominance and diploidy. Proc. of ICGA, pages 59-68, 1987.
[16] John H. Holland. Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, 1975.
[17] K. A. De Jong. Analysis of the behavior of a class of genetic adaptive systems. PhD thesis, Dept. of Computer and Comm. Science, University of Michigan, USA, 1975.
[18] Mark Ptashne. How gene activators work. Scientific American, pages 41-47, January 1989.
