IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, VOL. 43, NO. 5, OCTOBER 1996

Genetic Algorithms: Concepts and Applications
K. F. Man, Member, IEEE, K. S. Tang, and S. Kwong, Member, IEEE

Abstract—This paper introduces genetic algorithms (GA) as a complete entity, in which knowledge of this emerging technology can be integrated together to form the framework of a design tool for industrial engineers. An attempt has also been made to explain "why" and "when" GA should be used as an optimization tool.

I. INTRODUCTION

THE USE of genetic algorithms (GA) for problem solving is not new. The pioneering work of J. H. Holland in the 1970's proved to be a significant contribution for scientific and engineering applications. Since then, the output of research work in this field has grown exponentially, although the contributions have been, and still are, largely initiated from academic institutions world-wide. It is only very recently that we have been able to acquire some material that comes from industry. The reason for this is somehow not clearly understood. However, the obvious obstacle that may drive engineers away from using GA is the difficulty of speeding up the computational process, as well as the intrinsic nature of randomness that leads to a problem of performance assurance. Nevertheless, GA development has now reached a stage of maturity, thanks to the effort made in the last few years by academics and engineers all over the world. It has blossomed rapidly due to the easy availability of low-cost but fast small computers. Problems once considered "hard" or even "impossible" in the past are no longer a problem as far as computation is concerned. Therefore, solutions to complex and conflicting problems that require simultaneous treatment, which in the past were considered deadlocked problems, can now be obtained with GA. Furthermore, GA is not a mathematically guided algorithm. The optima obtained are evolved from generation to generation without stringent mathematical formulation such as the traditional gradient-type of optimizing procedure. In fact, GA is very different in that context: it is merely a stochastic, discrete-event, and nonlinear process. The obtained optima are an end product containing the best elements of previous generations, where the attributes of a stronger individual tend to be carried forward into the following generation. The rule of the game is "survival of the fittest will win." In this sphere, there is an endless supply of literature describing the use of GA. The sheer number of references quoted in this paper is an apt indicator of the extensive work being done in this domain, and this does not include the index findings from [1]. The technical knowledge of "what" GA is and "how" it works is well reported. This paper tries not to cover the same ground. Rather, there is room for the introduction of
Manuscript received September 21, 1995; revised November 26, 1995. The authors are with the City University of Hong Kong, Hong Kong. Publisher Item Identifier S 0278-0046(96)03295-9. 0278-0046/96$05.00 © 1996 IEEE.


GA as a complete entity, in which knowledge of this emerging technology can be integrated together to form the framework of a design tool for industrial engineers. Moreover, a brave attempt has also been made to explain "why" and "when" we should use GA as an optimization tool. It is anticipated that sufficient material is generated in this paper to support this claim. This paper starts by giving a simple example of GA, as described in Section II, in which the basic framework of GA is outlined. This example forms the cornerstone of the architecture of this paper. For the benefit of newcomers to this particular field, the essential schema theory and building block hypothesis of genetic algorithms are briefly given in Section III. What makes GA work and how it improves its evolution are the essence of GA. There are a number of variations used to achieve these tasks, and each has its own merit. In Section IV, a range of structural modifications of GA to improve its performance is thus recommended. Since so much has already been published about what can be done with GA, a short list of items cataloguing the advantages of using GA is given in Section V. In Section VI, an account of what GA "cannot do" is given; the well-known phenomena of deception and genetic drift are described, and the problems concerning the real-time behavior and adaptiveness of GA are also reported. As this paper is targeted at a specific audience, a collection of practical systems being implemented is introduced in Section VII, whereas Section VIII outlines the possibility of integrating GA into emerging technologies such as neural networks and fuzzy systems. Finally, the conclusions are reached in Section IX, and recommendations for future work are also given.

II. BASIC CONCEPTS OF GENETIC ALGORITHMS

The basic principles of GA were first proposed by Holland [66]. Thereafter, a series of literature [33], [52], [89] and reports [10], [11], [102], [118] became available. GA is inspired by the mechanism of natural selection, a biological process in which stronger individuals are likely to be the winners in a competing environment. Here, GA uses a direct analogy of such natural evolution. It presumes that a potential solution of a problem is an individual and can be represented by a set of parameters. These parameters are regarded as the genes of a chromosome and can be structured by a string of values in binary form. A positive value, generally known as the fitness value, is used to reflect the degree of "goodness" of the chromosome for solving the problem, and this value is closely related to its objective value.
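As a minimal illustration of this encoding idea (not taken from the paper; the parameter ranges, bit widths, and the placeholder objective surface are assumptions made purely for demonstration), a chromosome can be realized as a binary string that decodes into a pair of real parameters and receives a fitness value:

    import random
    from math import sin, cos

    BITS = 8                                   # resolution per parameter (assumption)

    def decode(chromosome):
        """Map a 16-bit string to (x, y) in [-1, 1] x [-1, 1]."""
        to_real = lambda bits: -1.0 + 2.0 * int(bits, 2) / (2 ** BITS - 1)
        return to_real(chromosome[:BITS]), to_real(chromosome[BITS:])

    def objective(x, y):
        # placeholder multimodal surface; the paper's f(x, y) is not reproduced here
        return sin(4.0 * x) * cos(3.0 * y) + 0.5 * x

    def fitness(chromosome):
        # the objective value, shifted so that the fitness stays positive
        return objective(*decode(chromosome)) + 2.0

    chromosome = ''.join(random.choice('01') for _ in range(2 * BITS))
    print(chromosome, decode(chromosome), fitness(chromosome))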

In a practical application of GA, a population pool of chromosomes has to be installed, and it can be randomly set initially. The size of this population varies from one problem to the other, although some guidelines are given in [83]. Throughout a genetic evolution, a fitter chromosome has the tendency to yield good-quality offspring, which means a better solution to the problem.

In each cycle of genetic operation, termed an evolving process, a subsequent generation is created from the chromosomes in the current population. This can only be successful if a group of those chromosomes, generally called "parents" or, as a collective term, the "mating pool," is selected via a specific selection routine. The genes of the parents are mixed and recombined for the production of offspring in the next generation. It is expected that from this process of evolution (manipulation of genes), the "better" chromosome will create a larger number of offspring and thus has a higher chance of surviving in the subsequent generation, emulating the survival-of-the-fittest mechanism in nature. A scheme called roulette wheel selection [33] is one of the most commonly used techniques in such a proportionate selection mechanism. To illustrate this further, the selection procedure is listed in Table I.

TABLE I
ROULETTE WHEEL PARENT SELECTION
• Sum the fitness of all the population members, named the total fitness (N).
• Generate a random number (n) between 0 and the total fitness N.
• Return the first population member whose fitness, added to the fitness of the preceding population members, is greater than or equal to n.

The cycle of evolution is repeated until a desired termination criterion is reached. This criterion can also be set by the number of evolution cycles (computational runs), the amount of variation of individuals between different generations, or a predefined value of fitness. The standard genetic algorithm is summarized in Fig. 1.

Fig. 1. Standard genetic algorithm:
    Standard Genetic Algorithm
    {
        // start with an initial time
        t := 0;
        // initialize a usually random population of individuals
        initpopulation P(t);
        // evaluate fitness of all initial individuals of population
        evaluate P(t);
        // test for termination criterion (time, fitness, etc.)
        while not done do
            // increase the time counter
            t := t + 1;
            // select a sub-population for offspring production
            P' := selectparents P(t);
            // recombine the "genes" of selected parents
            recombine P'(t);
            // perturb the mated population stochastically
            mutate P'(t);
            // evaluate its new fitness
            evaluate P'(t);
            // select the survivors from actual fitness
            P := survive P, P'(t);
        od
    }
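A minimal sketch of the Table I procedure (illustrative only; the function name and the sample population are invented for demonstration) might look as follows:

    import random

    def roulette_wheel_select(population, fitnesses):
        """Pick one parent with probability proportional to its fitness."""
        total = sum(fitnesses)                  # step 1: total fitness N
        n = random.uniform(0.0, total)          # step 2: random number in [0, N]
        running = 0.0
        for individual, fit in zip(population, fitnesses):
            running += fit
            if running >= n:                    # step 3: first member reaching n
                return individual
        return population[-1]                   # guard against rounding error

    # usage: fitness values are assumed to be positive
    pop = ['0110', '1011', '0001', '1110']
    fit = [1.2, 3.4, 0.5, 2.1]
    parents = [roulette_wheel_select(pop, fit) for _ in range(2)]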
In order to facilitate the GA evolution cycle, two fundamental operators, crossover and mutation, are required, although the selection routine can be termed as the other operator. To further illustrate the operational procedure, a one-point crossover mechanism is depicted in Fig. 2. A crossover point is randomly set, and the portions of the two chromosomes beyond this cut-off point to the right are exchanged to form the offspring. An operation rate (pc), with a typical value between 0.6 and 1.0, is normally used as the probability of crossover.

Fig. 2. Example of one-point crossover (the parents exchange the segments to the right of the crossover point to form the offspring).

For mutation, however, the process is applied to each offspring individually after the crossover exercise. It alters each bit randomly with a small probability (pm), with a typical value of less than 0.1. An example is shown in Fig. 3.

Fig. 3. Bit mutation on the fourth bit (original chromosome and new chromosome).

The choice of pm and pc as the control parameters can itself be a complex, nonlinear optimization problem. Furthermore, their settings are critically dependent upon the nature of the objective function. This selection issue still remains open to suggestion, although some guidelines have been introduced by [36] and [59]:
• for a large population, size (100): crossover rate 0.6 and mutation rate 0.001;
• for a small population, size (30): crossover rate 0.9 and mutation rate 0.01.
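The two operators just described can be sketched as follows (an illustrative Python fragment, not the paper's code; the default rates are taken from the typical values quoted above, and the two parent strings are the first-generation chromosomes of the example in Section II-A):

    import random

    def one_point_crossover(parent_a, parent_b, pc=0.85):
        """Exchange the tails of two equal-length bit strings after a random point."""
        if random.random() > pc:
            return parent_a, parent_b           # no crossover this time
        point = random.randint(1, len(parent_a) - 1)
        return (parent_a[:point] + parent_b[point:],
                parent_b[:point] + parent_a[point:])

    def bit_mutation(chromosome, pm=0.01):
        """Flip each bit independently with probability pm."""
        return ''.join(
            ('1' if bit == '0' else '0') if random.random() < pm else bit
            for bit in chromosome)

    child1, child2 = one_point_crossover('1100110110101000', '0101010110110101')
    child1, child2 = bit_mutation(child1), bit_mutation(child2)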

A. Example of GA for Optimization

There is no better way to show how GA works than by going through a real, but simple, example to demonstrate its effectiveness.

1) Problem: To search for the global maximum point of the following multimodal objective function (see Fig. 4):

    z = f(x, y),   x, y ∈ [-1, 1]    (1)

Fig. 4. A multimodal problem f(x, y).

2) Implementation: The chromosome is formed by a 16-bit binary string representing the x and the y coordinates, with 8-bit resolution each. One-point crossover and bit mutation are applied, with a crossover rate of 0.85 and a small mutation rate. The population size is set to four (in general, this number should be much larger), so that only two offspring are generated in each evolution cycle. A number of software packages can be utilized for this exercise (see the Appendix); here, it is conducted via simulation based on MATLAB with the Genetic Toolbox [19].

Fig. 5 shows the typical genetic operations and the changes of the population from the first generation to the second generation. Fig. 6 clearly demonstrates that GA is capable of escaping from local maxima to find the global maximum point.

Fig. 5. An example of GA: the four steps of parent selection, crossover, mutation, and reinsertion take the first population of four chromosomes to the second population, together with the objective value of each chromosome.

III. THEORY AND HYPOTHESIS

Having established the fundamental principles of GA in structure arrangement and operational procedure, we can now carry on to gain a deeper understanding of GA. Thus far, the necessary justification of how GA works has yet to be illustrated. On this front, there are two schools of thought for explanation: schema theory and the building block hypothesis.

A. Schema Theory

The design methodology of GA relies heavily on Holland's notion of schemata. Here, schemata are sets of strings (the encoded form of the chromosome) that have one or more features in common. A schema is built by introducing a "don't care" symbol "#" into the alphabet of genes. A schema represents all strings (a hyperplane, or subset, of the search space) which match it on all positions other than "#." For example, the set matched by the schema #1101#0 is {1110110, 1110100, 0110110, 0110100}. It is clear that every schema matches exactly 2^r strings, where r is the number of don't care symbols "#" in the schema template.
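The matching relation between schemata and strings can be sketched as follows (illustrative Python, not from the paper; the sample population is invented):

    def matches(schema, string):
        """A string matches a schema if it agrees on every non-'#' position."""
        return all(s in ('#', c) for s, c in zip(schema, string))

    def schema_count(schema, population):
        """xi(S, t): number of population strings matched by schema S."""
        return sum(matches(schema, s) for s in population)

    pop = ['1110110', '0110100', '1010101', '0110110']
    print(schema_count('#1101#0', pop))   # three of the four strings match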

Generation to generation, the fate of a schema can be quantified. Let ξ(S, t) be the number of strings matched by schema S in the tth generation, δ(S) be the defining length of schema S, defined as the distance between its outermost fixed positions, o(S) be the order of the schema, which is the number of fixed positions present in the schema, L be the length of the chromosome, f(S, t) be the fitness value of schema S, defined as the average fitness of all strings in the population matched by schema S, and F(t) be the average fitness of the population. Taking the effects of proportionate selection, one-point crossover, and mutation into account, a reproductive schema growth equation is thus obtained and expressed in (2):

    ξ(S, t+1) ≥ ξ(S, t) · [f(S, t)/F(t)] · [1 - pc δ(S)/(L-1) - o(S) pm]    (2)

where pc and pm are the operation rates of crossover and mutation, respectively. A detailed derivation of this equation can be referred to in [89]. It simply states that short, low-order, above-average schemata receive exponentially increasing trials in subsequent generations of a genetic algorithm [89]. The implicit parallelism lower bound derived by Holland provides a figure for the number of schemata that are processed in a single cycle, which is of the order of N^3, where N is the population size.

Despite this formulation, the schema growth equation has its weaknesses. First, using the average fitness is only relevant to the first population [60]: the value of f(S, t) in the current population may differ significantly from its value in the next, since schemata can interfere with one another. Secondly, as the generations proceed, the sampling of strings becomes biased, and this inexactness makes it impossible to predict computational behavior, so that the predictions of the GA could be useless or misleading on some problems [63].

B. Building Block Hypothesis

A genetic algorithm seeks near-optimal performance through the juxtaposition of short, low-order, high-performance schemata, called the building blocks [89]. The genetic operators that we normally refer to, crossover and mutation, have the ability to generate, promote, and juxtapose (place side by side) building blocks to form the optimal strings. Crossover tends to conserve the genetic information present in the strings: it tends to bias toward building blocks with higher fitness values and, at the end, ensures their representation from generation to generation. However, when the strings selected for crossover are similar, its capacity to generate new building blocks diminishes. Mutation, on the other hand, is not a conservative operator but is capable of generating new building blocks radically. It is therefore not difficult to conceive that a simple structure of GA can be devised for many practical applications as an optimizer, and it is also without doubt that a standard GA is capable of solving a difficult problem with which the conventional gradient-based or hill-climbing techniques might have trouble.

IV. STRUCTURE MODIFICATION OF GENETIC ALGORITHMS

Despite the above, a standard GA has many limitations that lead to the restriction of its use. As the problem becomes complicated, the structure of a GA has to be altered to suit the requirement; otherwise, its performance may degrade, even to the extent of causing a complete failure or breakdown in computation. There are many facets of operational modification that can be introduced on the chromosomes, the operators, and the implementation. The following operations are recommended for appropriate modifications of a GA.

A. Chromosome Representations

Bit string encoding [66] is the most classical approach used by GA researchers because of its simplicity and traceability. A minor modification is the use of Gray code in the binary coding. Reference [67] investigated the use of GA for optimizing functions of two variables and claimed that a Gray code representation worked slightly better than the normal binary representation. String-based representation may, however, pose difficult and sometimes unnatural answers to some optimization problems. Recently, the direct manipulation of real-valued chromosomes [71], [122] has raised considerable interest. This representation was introduced especially to deal with real-parameter problems. The work currently taking place by [71] indicates that the floating-point representation is faster in computation and more consistent on a run-to-run basis. However, the opinion given by [54] suggests that a real-coded GA would not necessarily yield good results in some situations, despite the fact that many practical problems have been solved by using real-coded GA; so far, there is insufficient consensus to be drawn on this argument. The use of other encoding techniques, such as order-based representation [33] (for bin-packing and graph coloring), embedded lists [89] (for factory scheduling problems), variable element lists [89] (for semiconductor layout), and even LISP S-expressions [77], can thus be explored.
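For readers unfamiliar with Gray coding, the sketch below shows the standard reflected-binary Gray code conversion that such a representation assumes (illustrative Python; not part of the paper):

    def binary_to_gray(bits):
        """Adjacent integers differ in exactly one Gray-coded bit."""
        n = int(bits, 2)
        return format(n ^ (n >> 1), '0{}b'.format(len(bits)))

    def gray_to_binary(gray):
        n = int(gray, 2)
        mask = n >> 1
        while mask:
            n ^= mask
            mask >>= 1
        return format(n, '0{}b'.format(len(gray)))

    assert gray_to_binary(binary_to_gray('01101001')) == '01101001'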

B. Objective and Fitness Value

The objective function of a problem is the main source providing the mechanism for evaluating the status of each chromosome. It takes the chromosome as input and produces a number, or a list of numbers (the objective value, generally in least-squares form), as a measure of the chromosome's performance. This is an important link between GA and the system, which is neither governed by differential equations nor behaves like a continuous function. However, its range of values varies from problem to problem. To maintain uniformity over various problem domains, the objective value is rescaled to a fitness value [52].

1) Linear Scaling: The fitness value f_i of chromosome i has a linear relationship with its objective value O_i,

    f_i = a O_i + b    (3)

where a and b are chosen to enforce the equality of the average objective value and the scaled average fitness value, and to cause the maximum scaled fitness to be a specified multiple of the average fitness.

2) Power Law Scaling: The actual fitness value is taken as the kth power of the objective value,

    f_i = O_i^k    (4)

where k is in general problem dependent and may even vary during the run.

3) Sigma Truncation: The fitness value is calculated according to

    f_i = O_i - (Ō - c σ)    (5)

where c is a small integer, Ō is the mean of the objective values, and σ is the standard deviation of the population. To prevent negative values, any negative result f_i < 0 is arbitrarily set to zero.

C. Selection Mechanisms

To generate good offspring, a proficient parent selection mechanism is necessary. Selection is the process used to determine the number of trials for one particular individual used in reproduction; the chance of selecting one chromosome as a parent should be directly proportional to the number of offspring produced. Reference [8] presented three measures of performance of selection algorithms: bias, spread, and efficiency. Bias defines the absolute difference between the actual and expected selection probabilities of individuals. Spread is the range in the possible number of trials that an individual may achieve. Efficiency is related to the overall time complexity of the algorithm.

Roulette wheel selection tends to give zero bias but potentially inclines to spread unlimitedly; it can generally be implemented with a time complexity of the order of N log N, where N is the population size. Stochastic universal sampling (SUS) is another single-phase sampling algorithm, with minimum spread, zero bias, and a time complexity of the order of N [8]. Other methods can also be used, such as the ranking scheme [7], in which the chromosomes are selected proportionally to their rank rather than their actual evaluation values. This introduces an alternative to proportional fitness assignment and has been shown to help in the avoidance of premature convergence and to speed up the search when the population approaches convergence [117].

D. Crossover Operations

Although one-point crossover was inspired by biological processes, it has one major drawback in that certain combinations of schemata cannot be combined in some situations [89]. A multipoint crossover can be introduced to overcome this problem; an example of this operation is depicted in Fig. 7, where multiple crossover points are randomly selected. Reference [35] concluded that two points seemed to be an optimal number for multipoint crossover. This has since been contradicted by [101], as two-point crossover could perform poorly in a situation where the population has largely converged, because of reduced crossover productivity. This low crossover productivity problem can be resolved by the proposal of the reduced surrogate crossover [12]; as a result, the performance of generating offspring is greatly improved.

Fig. 7. Example of multipoint crossover.

Another approach is the uniform crossover. This generates offspring from the parents based on a randomly generated crossover mask; the operation is demonstrated in Fig. 8. The resultant offspring contains a mixture of genes from each parent. The number of effective crossing points is not fixed, but will be averaged at L/2, where L is the chromosome length. Since the uniform crossover exchanges bits rather than segments, it can combine features regardless of their relative locations. This ability may outweigh the disadvantage of destroying building blocks and make uniform crossover a superior operator for some problems [104].

Fig. 8. Example of uniform crossover with a randomly generated mask.

Some other problem-based crossover techniques have been proposed. Reference [52] described a partially matched crossover (PMX) for order-based problems, and [28] designed an "analogous crossover" for robotic trajectory generation. The preference of which crossover technique to use is arguable and is very much problem oriented. Reference [39] reports on several experiments for various crossover operators; a general comment is that each of these crossovers is particularly useful for some classes of problems and quite poor for others, except that one-point crossover is indicated as a "loser" experimentally. All in all, there is no unified view on this front.

E. Reordering/Inversion

As stated in the building block hypothesis, the order of genes on a chromosome is critical. The purpose of reordering is to attempt to find gene orders which have better evolutionary potential, and techniques for reordering the positions of genes on the chromosome have been suggested. Reference [52] proposed reversing the order of the genes between two randomly chosen positions within the chromosome; such a technique is known as inversion.
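The inversion operator just described can be sketched as follows (illustrative Python; not the paper's code):

    import random

    def inversion(chromosome):
        """Reverse the gene order between two randomly chosen positions."""
        i, j = sorted(random.sample(range(len(chromosome)), 2))
        return chromosome[:i] + chromosome[i:j + 1][::-1] + chromosome[j + 1:]

    print(inversion('abcdefgh'))   # e.g. 'abfedcgh' when positions 2 and 5 are drawn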

F. Reinsertion

After generating the subpopulation of offspring, several representative strategies can be proposed for replacing the old generation. In the case of generational replacement, the chromosomes in the current population are completely replaced by the offspring [59], so a population of size N generates an equal number of offspring. Since the best chromosome of the population may then fail to reproduce offspring in the next generation, this strategy is usually combined with an elitist strategy, such that one or a number of the best chromosomes are copied into the succeeding generation. The elitist strategy may increase the speed of domination of the population by a super chromosome, but on balance it appears to improve performance. On the contrary, when only a small number of offspring are generated in each evolution cycle, usually the worst chromosomes are replaced when the new chromosomes are inserted into the population. Sometimes, a direct replacement of the parents by the corresponding offspring may also be adopted.

G. Probability Rates Setting

The choice of optimal probability operation rates for crossover and mutation is another controversial debate for both analytical and empirical investigations. Increasing the crossover probability causes the recombination of building blocks to rise, but it also increases the disruption of good chromosomes. On the other hand, should the mutation probability increase, this tends to transform the genetic search into a random search, but it helps to reintroduce lost genetic material. Davis [30] suggested a linear variation in crossover and mutation probability, with a decreasing crossover rate during the run while the mutation rate increases. Syswerda [105] imposed a fixed schedule for both cases, but Booker [12] utilized a dynamically variable crossover rate which is dependent upon the spread of fitness. References [32] and [33] modified the operator probabilities according to the success of generating good offspring. Despite all these suggested methods, and since each operator probability may vary through the generations, the recommendation made by [36] and [59] remains the yardstick to apply.

H. Techniques for Changing Environments

There are many instances where GA may be used to optimize time-variant system characteristics, particularly in control and signal processing areas, where many control parameters are required to be adjusted in order to keep a system running in an optimal way. GA has to cope with the requirement that the time-dependent optima are reached. Should it be adaptive to dynamic signal behavior, and/or should it be able to sustain environmental disturbance? Whichever it is, this is not easy for a standard GA to handle. To ensure that GA responds properly to a changing environment, two basic strategies have been developed.

The first strategy is to expand the memory of the GA in order to build up a repertoire of ready responses for environmental conditions. A typical example is the triallelic representation [51], which consists of a diploid chromosome and a third allelic structure for deciding dominance. The recessive genes provide additional information for the changing environment.

The random immigrants mechanism [62], the triggered hypermutation mechanism [22], and statistical process control [108] are grouped as another type of strategy. The random immigrants mechanism replaces a fraction of the GA's population by randomly generated new individuals; this approach increases diversity in the population to compensate for changes encountered in the environment. The triggered hypermutation mechanism is an adaptive, mutation-based mechanism for adopting environmental change: it temporarily increases the mutation rate to a high value whenever the best time-average performance of the population deteriorates. It works well in environments where there are occasional, large changes in the location of the optimum. Statistical process control can also be devised to monitor the best performance of the population, such that the GA-based optimization system adapts to a continuous, time-dependent, nonstationary environment.

I. Parallel GA

One of the major criticisms of using GA must be the time spent in computation. To use GA for practical applications, the problem of computing overrun time requires much attention before it can be resolved. This is understandable, as GA is not a mathematically guided solution to a problem; rather, it is merely a stochastic, discrete, nonlinear, and highly dimensional search algorithm. Considering that GA already has an intrinsic parallel architecture, however, it requires no extra effort to construct a parallel computational framework, and GA can thus be fully exploited in its parallel structure to obtain the speed required for practical uses. There are a number of GA-based parallel methods capable of enhancing the computational speed [15], [18], [23]. The methods of parallelization can be classified as global, migration, and diffusion. These categories reflect different ways in which parallelism can be exploited in GA, as well as the nature of the population structure and recombination mechanisms used.

Global GA treats the entire population as a single breeding mechanism. It can be implemented on a shared memory multiprocessor or on a distributed memory computer. On a shared memory multiprocessor, the chromosomes are stored in the shared memory, and each processor evaluates its particular assigned chromosomes. On a distributed memory computer, the implementation is based on the master-slave relationship shown in Fig. 9: the master handles selection and fitness assignment, while each slave carries out recombination, mutation, and function evaluation for its assigned chromosomes. One disadvantage of this method is that a slave may sit idle while the master is handling its own job. A successful application of this global GA approach can be found in [29] and [37].

Fig. 9. Global GA: the GA master (selection, fitness assignment) dispatches recombination, mutation, and function evaluation to the slave processors.
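A minimal sketch of the master-slave idea is given below (illustrative Python using a process pool; only the fitness evaluation is farmed out to the slaves here, and the stand-in fitness function is an assumption made for demonstration):

    from multiprocessing import Pool

    def fitness(chromosome):
        # stand-in objective: count of '1' bits (assumption for demonstration)
        return chromosome.count('1')

    def evaluate_population(population, workers=4):
        with Pool(processes=workers) as pool:        # the slaves
            return pool.map(fitness, population)     # the master gathers the results

    if __name__ == '__main__':
        pop = ['0110', '1011', '0001', '1110']
        print(evaluate_population(pop))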

Migration GA (coarse-grained parallel GA) divides the population into a number of subpopulations, each of which is treated as a separate breeding unit under the control of a conventional GA. To encourage the proliferation of good genetic material throughout the whole population, migration between the subpopulations occurs from time to time; a pseudo code is expressed in Table II. Figs. 10-12 show three different migration topologies. Fig. 10 shows the ring migration topology, where individuals are transferred between directionally adjacent subpopulations. A similar strategy, known as neighborhood migration (Fig. 12), allows migration between nearest neighbors bidirectionally. The unrestricted migration topology is depicted in Fig. 11, in which individuals may migrate from any subpopulation to another. In each case, the individual migrants are determined according to an appropriate selection strategy.

TABLE II
PSEUDO CODE OF MIGRATION GA
    -- each node (a GA over one subpopulation)
    WHILE not finished
      SEQ
        Selection
        Reproduction
        Evaluation
        PAR
          Send emigrants
          Receive immigrants

Fig. 10. Ring migration topology.
Fig. 11. Unrestricted migration topology.
Fig. 12. Neighborhood migration topology.

The topology model of the migration GA is well suited to parallel implementation on multiple instruction multiple data (MIMD) machines. Given the range of possible population topologies and migration paths between them, efficient communications networks should thus be possible on most parallel architectures. This applies to small multiprocessor platforms or even to a cluster of networked workstations. The architectures of hypercubes [24], [112] and rings [56] are commonly used for this purpose, and some other topologies have also been studied in this area [3], [9].

Diffusion GA (fine-grained parallel GA), as indicated in Fig. 13, considers the population as a single continuous structure. Here, each individual is assigned to a processing unit placed on a 2-D grid topology, and the individuals are allowed to breed only with individuals contained in a small local neighborhood. This neighborhood is usually chosen from the immediately adjacent individuals on the population surface and is motivated by the practical communication restrictions of parallel computers. Reference [85] introduces a massively parallel GA architecture with the population distributed on a 2-D mesh topology; selection and mating are only possible with neighboring individuals, and the pseudo code is listed in Table III. In addition, [57] and [91] introduce an asynchronous parallel GA, the ASPARAGOS system. In this configuration, the GA was implemented on a connected ladder network using transputers, with one individual per processor, and the same technique has been implemented to tackle a job shop scheduling problem [106]. The practical application of diffusion GA to solve a 2-D bin packing problem has been reported in [78].

TABLE III
PSEUDO CODE OF DIFFUSION GA
    -- each node (i, j)
    Initialize
    WHILE not finished
      SEQ
        Evaluate
        PAR
          Send self to neighbours
          Receive neighbours
        Select mate
        Reproduce

Fig. 13. Diffusion GA.
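As a concrete illustration of the migration model, the sketch below performs one ring-migration step between subpopulations (illustrative Python; the policy of copying each island's best individual to its clockwise neighbour, replacing that neighbour's worst member, is an assumption made for demonstration):

    def ring_migration(islands, fitness_fn):
        """islands: list of lists of chromosomes; returns islands after one migration."""
        emigrants = [max(isl, key=fitness_fn) for isl in islands]
        for i, isl in enumerate(islands):
            incoming = emigrants[(i - 1) % len(islands)]   # from the previous island
            worst = min(range(len(isl)), key=lambda j: fitness_fn(isl[j]))
            isl[worst] = incoming
        return islands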

J. Structured GA

The structured genetic algorithm (SGA) was first proposed by [26]. The chromosome is formulated in a hierarchical structure: higher-level nodes in the structure govern the activation of the lower-level genes. The deactivated genes provide additional information that allows it to react to a changing environment [27]. SGA has been further developed as a structure optimizer. It has been used for the optimization of structure, such as the number of membership sets or rules in fuzzy logic, the order of the transfer function [110], and the topology in neural networks (NN) [109]. To maintain population diversity and enable searching between different structures, the population is formulated by a number of subgroups [109], each of which stores up one structure. Fig. 14 illustrates how SGA can be used for NN topology optimization.

Fig. 14. SGTNN structure. (a) Control genes. (b) Topology of the network governed by the control genes (active and inactive neurons). (c) Associated weighting connection genes.

K. Multiobjective GA

In everyday life, we seldom experience the luxury of optimizing a single and perfect solution from one objective function. Sometimes, a set of solutions requires a trade-off between different objective performances. The solution set of a multiobjective optimization problem consists of all those vectors such that their components cannot all be simultaneously improved. This is known as the concept of Pareto-optimality, and the solution set is known as the Pareto-optimal set [42]. Pareto-optimal solutions are also called nondominated, or noninferior, solutions.

Of course, a simple way to handle a multiobjective optimization problem is to combine and aggregate the objectives; the most noted examples of such applications are the simple weighted-sum approach [70] and the target vector optimization [120]. Nevertheless, one of the nicer niches of using GA is its capability of solving multiobjective problems directly [44]: GA has demonstrated its power in obtaining the Pareto-optimal set instead of a single solution [42]. This is particularly useful for meeting system design specifications that are complex, conflicting, and sometimes mathematically difficult, and several applications using this approach have been reported [44], [69]. The multiple objective approach can also be incorporated into SGA (MOSGA), which adds the flexibility of structure optimization.
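A minimal sketch of the Pareto-dominance test underlying such nondominated sets is given below (illustrative Python; it assumes all objectives are to be minimized, and the sample objective vectors are invented):

    def dominates(a, b):
        """a dominates b if it is no worse in every objective and better in one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_set(objective_vectors):
        return [v for v in objective_vectors
                if not any(dominates(u, v) for u in objective_vectors if u is not v)]

    print(pareto_set([(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]))
    # (3.0, 3.0) is dominated by (2.0, 2.0); the other three form the Pareto set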

V. ADVANTAGES OF GA: WHY IS GA USED?

After discussing the modifications of GA, some insight into why GA has become more and more popular should be given. GA is attractive for a number of reasons:

• It is a very easy-to-understand technique that requires very little (or even no) mathematics.
• Since it is a technique independent of the error surface, it is ready to solve multimodal, noncontinuous, nondifferentiable, or even NP-complete problems [47].
• It can be easily interfaced to existing simulations and models.
• It can handle problem constraints by simply embedding them into the chromosome coding.
• Multiobjective problems can be addressed by means of the approach stated in Section IV.
• Structured GA provides a tool to optimize the topology or structure of the solution in parallel with its parameters for a particular problem.

VI. SHORTCOMINGS OF GENETIC ALGORITHMS: PROBLEMS AND DIFFICULTY

Of course, there are some things that GA just cannot do, or finds difficult to do. The following three phenomena are often encountered.

A. Deception

Some objective functions may be very difficult to optimize by GA. The building block hypothesis assumes that juxtaposing good building blocks yields better chromosomes, which is not always the case: very bad chromosomes can be generated by combining good building blocks, and this causes the failure of the building block hypothesis. Such functions are referred to as GA-deceptive functions [34]. To illustrate this point in a mathematical fashion, assume that (1, 1) represents the optimal solution of a two-bit objective function, i.e., f(1, 1) > f(0, 0), f(1, 1) > f(0, 1), and f(1, 1) > f(1, 0). The schema (0, #) does not contain the optimal string (1, 1); if this schema nevertheless appears fitter than its competitor, the GA takes (0, #) as an instance, and that leads the GA away from the solution (1, 1). This is the minimal deceptive problem [52], which is considered to be only a partially deceptive function; it is also the simplest GA-deceptive function. In a fully deceptive problem, all low-order schemata containing a suboptimal solution are better than the other competing schemata [34].

Three approaches were proposed to deal with deception [89]. The first one is to modify the coding in an appropriate way; this method assumes prior knowledge of the objective function. The second approach uses the inversion operation. The last one is the use of the messy genetic algorithm [53].

B. Genetic Drift (Bias)

There is no guarantee of obtaining the global optimal point by using GA, although it has the tendency to do so. As GA seeks out a suboptimal point, the population may converge toward this point and premature convergence occurs. Since GA has the property of exploring the searching domain, the global optimal solution can then only be obtained by the exploration of mutation in the genetic operations. This possibility is reduced if there is a loss of population diversity. Such a phenomenon is known as genetic drift [12] and can easily occur when the GA is set with a small population size. Moreover, this problem is also confronted when GA is used for solving multiobjective problems for which a single optimal solution is sought instead of a set of Pareto solutions. A number of techniques have been proposed to limit the effect of genetic drift and maintain population diversity. These include preselection [82], crowding [35], and fitness sharing [45].
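Fitness sharing, one of the diversity-preserving techniques cited above, can be sketched as follows (illustrative Python; the triangular sharing function, the niche radius, and the Hamming distance metric are assumptions made for demonstration):

    def shared_fitness(population, fitnesses, distance, sigma_share=3.0):
        """Divide each fitness by its niche count to de-emphasize crowded regions."""
        shared = []
        for ind_i, fit_i in zip(population, fitnesses):
            niche = 0.0
            for ind_j in population:
                d = distance(ind_i, ind_j)
                if d < sigma_share:
                    niche += 1.0 - d / sigma_share   # triangular sharing function
            shared.append(fit_i / niche)             # niche >= 1 (self-distance is 0)
        return shared

    hamming = lambda a, b: sum(x != y for x, y in zip(a, b))
    print(shared_fitness(['0101', '0100', '1111'], [3.0, 2.5, 4.0], hamming))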
C. Real-Time and On-Line Issues

Similar to the other artificial intelligence (AI) techniques and heuristics, the GA is not suited for analyses that would provide guaranteed response times [39]. In the evolution cycle, a fitness value is evaluated for each offspring; the population may become better and better, but the same cannot be said about any one individual, and the performance of one particular offspring is not guaranteed. The variance of the response time of a GA system is therefore much larger than that of conventional methods. This unfavorable nature limits the use of GA in real-time problems. Another characteristic that limits GA's application in on-line control is its randomness: it is unwise to apply GA directly to a real system without any simulation model.

VII. APPLICATIONS

Thus far, the pros and cons of using GA have been addressed. Now we can proceed, with our aim of drawing the reader's attention by bringing out some practical examples for which GA has been successfully implemented in the area of industrial electronics. This is by no means an exhaustive list, but merely an indicator of the trend and potential growth in the future.

A. Parameter and System Identification

In the system-design phase for both control engineering and signal processing, an adequate but efficient mathematical model is desirable, and research work in the area of parameter and system identification has been very active for the last 30 years.

The required model, generally in the form of a transfer function or infinite impulse response (IIR) filter, should be optimally determined. Because the multimodal error surface of the IIR filter and its system stability criterion are to be met simultaneously, the gradient type of optimization algorithms would have problems obtaining an adequate solution, whereas the use of GA is found to be an ideal solution for this problem. This is because of the flexibility of structuring the filter's parameters into the form of a chromosome that can be arranged in a prefixed manner. The IIR filter in cascade, parallel, or lattice form can be assigned for optimization using GA. In the case of the parallel or cascade of second-order forms, the coefficients must lie inside the stability triangle [100] for stability assurance, whereas, for a stable IIR filter in lattice form, the searching domain of the parameters should be set within the range of +1 to -1 [116]. One further advantage of using GA for identification is its capability of finding an optimal order of the IIR filter [110]; GA can be applied simultaneously for optimizing the order and the coefficients without much effort to alter the chromosome structure.

The use of GA for the purpose of identification is not limited to IIR design. The power of GA also contributes to nonlinear system identification problems [16]. Reference [43] applied GA for term selection in nonlinear autoregressive moving average with exogenous inputs (NARMAX) models. An N-ary but order-independent chromosome representation is used for the generation of a nonlinear model with a fixed number of terms; each chromosome consists of a number of integers, each of which represents a particular term. This approach of system identification is considered to be a difficult problem for the gradient type of optimization schemes. Moreover, the GA used for determining the parameters of Chua's chaotic oscillator [20] proved to be a sound tactic, as the temporal series of the state variables and parameters of another Chua's oscillator are inherently known. Synchronization between these two chaotic systems is achieved and is desired for the use of secure communication systems [21].

B. Control

For its use in control systems engineering, GA can be applied to a number of control methodologies for the improvement of the overall system performance. In most controller designs, some parameters are required to be optimized in order to give a better overall control performance. In classical proportional-integral-derivative (PID) control problems, despite the availability of the Ziegler-Nichols (ZN) ultimate-cycle tuning scheme, the required three-term parameters, known as the proportional, integral, and derivative gains, can be optimally obtained by GA, as GA has the ability to solve multimodal problems with relative ease [93]. A number of successful applications have been reported in this area. Successful implementations are found in the pH neutralization process [115], the distillation column [110], and the heat exchanger system [72], [121]. Noticeable contributions are also found in magnetically levitated (maglev) vehicles [25], [76] and the benchmark problem [58].

In the case of robust control, the principle of the H-infinity loop shaping design procedure (LSDP) [88] is used to design a precompensator (W1) and a postcompensator (W2) to shape the nominal plant (G) so that the singular values of the shaped plant (Gs = W2 G W1) have a desired open-loop shape. GA may be incorporated into the LSDP for searching the shaping function space in order to find a suitable robust controller so that an explicit closed-loop performance is met. The advantages of GA in robust controller design are twofold. First, the configurations of the shaping functions W1 and W2 do not necessarily have to be predefined. This offers the flexibility to search for the parameters of controllers of different degrees of complexity and, at the same time, provides a means of achieving a certain optimality. Secondly, the configuration or the order of the controller may be optimized to reduce the system complexity; the optimization of both the order and the parameters of the shaping function can be undergone simultaneously [48]. Furthermore, the multiple objective approach can also be adopted to further enhance the degree of success of meeting the design criteria of extreme plants [110]. It has also been reported that GA is being used for designing a sliding mode control system [90]; this is another approach to address the problem of obtaining a static gain matrix in the reaching phase, which was vividly described in [64].
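A minimal sketch of GA-based PID tuning in this spirit is given below (illustrative Python; the gain ranges, the first-order plant model, and the integral-of-absolute-error fitness are assumptions made for demonstration, not taken from the cited work):

    def decode_gains(chromosome, ranges=((0, 10), (0, 5), (0, 1)), bits=8):
        """Split a 24-bit string into three gains scaled into the given ranges."""
        gains = []
        for k, (lo, hi) in enumerate(ranges):
            field = chromosome[k * bits:(k + 1) * bits]
            gains.append(lo + (hi - lo) * int(field, 2) / (2 ** bits - 1))
        return gains

    def pid_fitness(chromosome, setpoint=1.0, steps=200, dt=0.01):
        """Negative integral of absolute error for an assumed first-order plant."""
        kp, ki, kd = decode_gains(chromosome)
        y, integral, prev_err, iae = 0.0, 0.0, setpoint, 0.0
        for _ in range(steps):
            err = setpoint - y
            integral += err * dt
            u = kp * err + ki * integral + kd * (err - prev_err) / dt
            y += dt * (-y + u)           # assumed plant: dy/dt = -y + u
            prev_err = err
            iae += abs(err) * dt
        return -iae                       # larger fitness = smaller tracking error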

C. Robotics

Thus far, to the best of the authors' knowledge, the application of GA in robotics has only found a use in robot navigation systems. Reference [28] demonstrated the application of GA to the trajectory of a three-arm-configuration robot, where the end effector's path is of interest. Equation (6) shows the string structure of a trajectory consisting of l arm configurations,

    (n-tuple_1, n-tuple_2, ..., n-tuple_l)    (6)

where each n-tuple represents an arm configuration. The final trajectory consists of a series of ordered arm configurations. Since the number of arm configurations used is unknown until the solution is found, the coding of GA must be modified to accommodate variable-length and order-dependent strings.

Michalewicz [89] adopted the order-based coding in his evolutionary navigator (EN). Considering that such navigation is the art of directing a course for a mobile robot traversing a restricted environment, this navigation scheme should be designed to cope with such constraints, so that the robot is capable of reaching its desired destination without getting lost or crashing into any objects. EN unifies off-line and on-line planning with a simple map of high fidelity and efficient planning algorithms. The off-line planner searches for the optimal global path from the start to the desired destination, whereas the on-line planner is responsible for handling possible collisions or previously unknown objects by replacing a part of the original global path with an optimal subtour. A chromosome in EN is an ordered list of path nodes. Each of the path nodes, apart from the pointer to the next node, consists of the x and y coordinates of an intermediate knot point along the path and a Boolean variable b indicating whether the given node is feasible or not. Since problems of this kind can even be NP-complete, GA has been demonstrated to be a powerful tool for them.

D. Speech Recognition

In an automatic speech recognition system, the spoken speech patterns (test patterns) are usually identified against prestored speech patterns (reference patterns). The comparison of speech signals has a number of difficulties, as the variations in time and the time scales among them are not fixed. Therefore, time registration of the test and the reference patterns is one of the fundamental problems in the area of automatic isolated word recognition. The technique of dynamic time warping (DTW) has been developed for this task. However, the nonlinear time alignment of DTW, in which the conflicting issues of optimizing the stringent rule on slope weighting, the nontrivial computation to obtain the K-best paths, and the endpoint constraint relaxation arise, is not easily solved. In this task, the use of GA is most apt. The performance is good and accurate; GA has a higher level of confidence in identifying confused input utterances than the conventional approach of DTW, especially for confusable words [79].

E. Engineering Designs

It is interesting to note that GA is not applied only to solve problems that are mathematically oriented in the strictest sense. As engineering design is very much an art form, the use of GA to create enhanced designs is an objective to pursue, and GA could thus be used as a tool to aid designers for this purpose. A typical but innovative design is an airfoil shape optimization problem solved by the inverse numerical method [94]. Initially, the associated target pressure distribution is acquired by a Navier-Stokes solver. GA is then applied to optimize the target pressure distribution so that a supercritical airfoil shape is obtained via design samples from this method. Real-number encoding is adopted in this design. It is anticipated that this method of design is an up-and-coming technology for the creation of futuristic object designs.

On a similar front, [2] introduced GA to find the best state machine assignment for implementing a synchronous sequential circuit, which is considered to be an important step toward reducing silicon area or chip count in digital designs. Another example is the simultaneous optimization of global cell placement and routing in a very large scale integration (VLSI) layout [98]. This is in contrast to the traditional method of layout design for VLSI chips, where the cells and the global routes are first determined separately; the GA approach maintains a global view of the layout surface. The end result is very encouraging, and there is room for future development.

Furthermore, a Faceprint system was designed at New Mexico State University [14] for reproducing the features of a suspected criminal's face. A binary chromosome is used to code the five facial features (mouth, hair, eyes, nose, and chin) of a face. Initially, 20 suspects' faces are generated on a computer screen. A witness can then rate each face on a 10-point subjective scale. GA takes that information and follows the evolution cycle to obtain the optimal solution.

F. Pattern Recognition

The use of GA for pattern recognition has been widely studied. The applications can be generalized and grouped into two categories.

1) Extraction: GA is applied to generate the image filters for a two-stage target recognition system in such a way that the target image is vividly classified from the background clutter in the imaging data [75]. A linear filter is found in the first stage (screener) to select subimages, while a filter bank (a set of filters) is generated in the last stage (classifier) for the class decision. Such an approach provides a means to generate suitable filters rapidly with a minimal need for human intervention.

In addition, [96] applied GA, based on a minimal subset representation, to perform primitive extraction from geometric sensor data. A novel chromosome representation of a primitive by a minimal subset of member points is proposed. Such a subset is the smallest number of points necessary to define a unique instance of a primitive; for example, three points are used to define a unique circle, whereas five points are used for a unique conic. If there are n geometric data points in the input with an associated index, say from 1 to n, then a minimal subset is identified by the indices of its member points. A fixed-size template is applied to the geometric data to count the total number of points inside the template, and the fitness value is then assigned; a "better" primitive is one which has more geometric data points within the template. Experiments have indicated that the GA approach yields good results which are often comparable to, or better than, those using established heuristics that embody extensive domain knowledge.

2) Recognition: A planar object taken from two viewing positions can be considered as a relationship governed by an affine transformation matrix. The transformation parameters can be calculated from a triplet of matched dominant point sets on the boundaries of the images. However, certain classes of object shapes, a circle for instance, lack prominent dominant points, and the factors of different environments, apparatus, motions, noise interference, occlusion, and distortions lead to insufficient resolution, which in turn causes a dramatic change in the number and distribution of the dominant points. Reference [114] therefore suggested using a standard GA for the calculation of the transformation parameters to replace the current exhaustive blind searching. Experimental results have shown that the GA approach achieves a very close recognition performance.

G. Planning and Scheduling

This is another area where GA can be comfortably applied.

Optimization is often required for the planning of actions, motions, and tasks, especially when time and parts are interacting with each other. GA is an ideal method for solving the constrained problems that arise from the complex nature of this kind of situation. The applications of GA to the travelling salesman problem [51], [92], job-shop scheduling [40], [119], [123], pump scheduling in the water industry [81], and the production-inventory-distribution system [84] are typical examples and have been well recorded. An interesting maintenance-scheduling problem may also be considered for GA, that is, the activities to be scheduled within a planned maintenance period, which requires that they be optimally set; here the conflicting issue arises of which parts should be taken out for maintenance at a particular time. There are certain to be many more uses to come, as GA is inherently well suited to this type of application.

H. Classifier System

In a large, complex system, where many control parameters are required to be adjusted in order to keep the system running in an optimal way, the classifier system is generally applied. Such a system involves encoding production rules in string form (classifiers) and learning such rules on-line during their interaction with an arbitrary environment. The application of GA will try to evolve a set of rules to deal with the particular situation. The fitness of a set of rules may be assessed by judging their performance either on the real system itself or on a computer model of it. Rules have been developed to control complex plants in this way. Fogarty [41] used the former method to develop rules for controlling the optimum gas/air mixture in furnaces, whereas Goldberg modeled a gas-pipeline system to determine a set of rules for pipeline control and gas-leak detection [52]. Davis and Coombs used a similar approach to design communication network links [31]. There are other successful applications of GA in this area, like game playing [6] and maze solving [5].

VIII. EMERGING TECHNOLOGY WITH GENETIC ALGORITHMS

GA is considered as one type of emerging technology, but the possibility of integrating GA with the other emerging technologies has yet to be mentioned. It is only natural to engineers that an integration of GA with either a neural network, or a fuzzy logic system, or both, be used to accomplish a so-called intelligent system. The reason for this is simple: in both neural networks (NN) and fuzzy logic, the intelligence is merely trying to emulate human brain or behavioral patterns in a mathematically recognizable fashion, and an optimal model is critical in order to achieve the ultimate goal. Because of its efficiency in modeling such a complex human system, GA may contribute its usefulness in obtaining an optimal model. This section attempts to outline some applications of GA in this domain.

A. Integration of Genetic Algorithms and Neural Networks

The use of neural networks (NN) for industrial control has been well accepted. The most noticeable applications are in the areas of telecommunication, speech recognition, pattern recognition, active noise control, process control, prediction and financial analysis, as well as political and economic modeling [46]. Combining GA and NN [97] can generally be divided into two broad categories: supportive and collaborative integration. In supportive integration, GA can assist neural networks in

• selecting features or transforming the feature space used by a neural net classifier [13];
• selecting the learning rule or the parameters that control learning in a neural net; and
• analyzing a neural net [103].

In collaborative integration, GA can be used to optimize the NN weight parameters and/or topology. Using GA as a replacement for back propagation for weight optimization does not seem to be very competitive, especially where computational speed is concerned. However, it may be a promising learning method for the reinforcement training of a fully recurrent NN, or for training networks whose neurons have nondifferentiable transfer functions; some initial work has already been given in [17], [80]. The applications of GA to neural networks are shown by the optimization of recurrent neural networks [4] and also of feedforward networks [89]. Sometimes, GA is used to evolve the network topology while other local learning methods, or GA itself, fine tune the weights. The contribution of GA in optimizing a global NN topology is also feasible [109], where both the topology and the weight coefficients are globally optimized.
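A minimal sketch of such collaborative weight optimization is given below (illustrative Python, not from the cited works; the 2-2-1 tanh network, the XOR task, and the truncation-selection GA are assumptions made for demonstration):

    import math, random

    DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    N_WEIGHTS = 9                        # 2x2 hidden weights + 2 biases + 2 + 1 bias

    def forward(w, x):
        h = [math.tanh(w[0] * x[0] + w[1] * x[1] + w[2]),
             math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])]
        return math.tanh(w[6] * h[0] + w[7] * h[1] + w[8])

    def fitness(w):
        return -sum((forward(w, x) - y) ** 2 for x, y in DATA)   # negative squared error

    def evolve(pop_size=40, generations=200, pm=0.2):
        pop = [[random.uniform(-2, 2) for _ in range(N_WEIGHTS)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]                         # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
                children.append([g + random.gauss(0, 0.3) if random.random() < pm else g
                                 for g in child])                     # Gaussian mutation
            pop = parents + children
        return max(pop, key=fitness)

    best_weights = evolve()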
B. Integration of Genetic Algorithms and Fuzzy Logic

In principle, there is no general rule or method to construct the fuzzy rules and to define the membership functions. In a fuzzy design, the rules and membership functions are defined by skilled operators, and these, obviously and understandably, may not be optimal. GA, working as a global optimizer, is hence applied [73]. References [95] and [107] apply GA to optimize the fuzzy membership functions, while [68] uses GA to optimize both the fuzzy membership functions and the fuzzy rules simultaneously. Fuzzy controllers optimized by GA have been designed for pH control [74] and for a water pump system [107].
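As a concrete but deliberately tiny example of letting GA tune membership functions, the sketch below evolves the three centres of triangular membership functions for a one-input controller against a handful of recorded input/output samples. The rule base, the sample data, and every identifier are invented for illustration; the setup is not that of [68], [74], [95], or [107].

import random

# Hypothetical setup: a one-input fuzzy controller with three fixed rules
# (error NEGATIVE -> action DOWN, ZERO -> HOLD, POSITIVE -> UP).  The GA tunes
# only the centres of the triangular membership functions; the rules are fixed.
RULE_OUTPUT = [-1.0, 0.0, 1.0]     # crisp consequent of each rule
SAMPLES = [(-0.8, -0.9), (-0.2, -0.3), (0.0, 0.0), (0.3, 0.4), (0.9, 1.0)]

def tri(x, left, centre, right):
    # Triangular membership value of x.
    if x <= left or x >= right:
        return 0.0
    return (x - left) / (centre - left) if x < centre else (right - x) / (right - centre)

def controller(centres, error):
    # Weighted-average defuzzification over the three rules.
    c1, c2, c3 = sorted(centres)
    mu = [tri(error, -2.0, c1, c2), tri(error, c1, c2, c3), tri(error, c2, c3, 2.0)]
    s = sum(mu) or 1e-9
    return sum(m * out for m, out in zip(mu, RULE_OUTPUT)) / s

def fitness(centres):
    # Negative squared error against the recorded desired actions.
    return -sum((controller(centres, e) - a) ** 2 for e, a in SAMPLES)

def tune_membership_functions(pop_size=40, generations=200):
    pop = [sorted(random.uniform(-1, 1) for _ in range(3)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [(x + y) / 2 + random.gauss(0, 0.05) for x, y in zip(a, b)]
            children.append(sorted(child))
        pop = survivors + children
    return max(pop, key=fitness)

print("tuned membership centres:", [round(c, 2) for c in tune_membership_functions()])

Encoding the rule consequents alongside the centres in the same chromosome would extend this sketch toward the simultaneous rule-and-membership optimization reported in [68].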

IX. CONCLUSION

An attempt to outline the features of GA in terms of the genetic functionality of its operators, the inherent capability of GA for solving complex and conflicting problems, as well as its various practical applications, has been given in this paper. The paper's purpose is to introduce this emerging technology to engineers who may have little or no knowledge of GA, so that GA may be implemented to solve their practical problems. The theme is oriented from an industrial-application basis, and it is reasonable to believe that this article contains sufficient, useful material to encourage and awaken the interest of newcomers.

What remains as a challenge to GA is undoubtedly the real-time and adaptive capability. The real-time issue is not so much hinged on the computing speed, since the parallelism of GA should improve the computational speed quite considerably and comfortably; rather, it is the unpredictability of GA that is the main cause of concern. As GA is a stochastic, discrete, and nonlinear process, guaranteed response times are not applicable. Due to the factor of population convergence, GA also has difficulty in reacting promptly to a changing environment, and applying GA in a time-varying system is still in its infancy. Some initial work has already been given in [80]. One possible approach to this problem is to monitor the GA process via some intelligent supervisory scheme, and research work in this direction should be encouraged.

The integration of GA with other emerging technologies, such as neural networks and fuzzy logic systems, could be another challenging area. Different combinations may offer a fruitful result in intelligent system design, and the combination may not only involve applying GA as a helper to these two technologies, but could also result in the emerging technologies being able to assist GA applications. All in all, the knowledge generated from GA over the last 20 or so years has now become mature, the prospect of applying GA to practical applications is good, and a considerable growth in the application of GA, particularly in the field of industrial electronics, is anticipated for the future.

APPENDIX
GENETIC OPTIMIZATION TOOLS

The study and evaluation of GA are essentially nonanalytic and largely depend on simulation. While they are strongly application independent, GA software packages therefore have a potentially very broad spectrum of applications. Part of the common software is briefly introduced below, and more information can be found in [65].

GENESIS
The GENEtic Search Implementation System (GENESIS) was developed by John Grefenstette [61]. It is a function optimization system based on genetic search techniques. As the first widely available GA program, GENESIS has been very influential in stimulating the use of GA, and several other GA packages were generated because of its capability. Unix and DOS versions are available.

GENEsYs
GENEsYs [113] is a GENESIS-based GA implementation which includes extensions and new features for experimental purposes. Different selection schemes, like linear ranking, Boltzmann selection, (μ, λ)-selection, and general extinctive selection variants, are included. Crossover operators and self-adaptation of mutation rates are also possible. There are additional data-monitoring facilities, such as recording the average, variance, and skew of object variables and mutation rates, and creating bitmap dumps of the population.

TOLKIEN
TOLKIEN (TOoLKIt for gENetics-based applications) version 1.1 [111] is a C++ class library named in memory of J. R. R. Tolkien. A collection of reusable objects has been developed for genetics-based applications. For portability, no compiler-specific or class-library-specific features are used, and the current version has been compiled successfully using Borland C++ Version 3.1 and GNU C++. TOLKIEN contains a number of useful extensions to the generic GA, for example:
• chromosomes of user-definable types: binary, character, integer, and floating-point chromosomes are provided;
• diploidy;
• gray code encoding and decoding;
• multipoint and uniform crossover;
• various selection schemes, such as tournament selection and linear ranking; and
• linear fitness scaling and sigma truncation.
A small stand-alone sketch of several of these operators follows the list.
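To give a flavour of what operators such as these do, here is a small self-contained sketch of tournament selection, uniform crossover, bit-flip mutation, and sigma-truncation fitness scaling applied to the classic OneMax toy objective. It is written from scratch for illustration only and is not code from TOLKIEN, GENESIS, or GENEsYs; all function names are invented.

import random
import statistics

def uniform_crossover(a, b):
    # Each gene is taken from either parent with equal probability.
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(ind, rate=0.02):
    # Independent bit-flip mutation.
    return [1 - g if random.random() < rate else g for g in ind]

def tournament_select(pop, scores, k=3):
    # Return the individual with the best scaled fitness among k random picks.
    picks = random.sample(range(len(pop)), k)
    return pop[max(picks, key=lambda i: scores[i])]

def sigma_truncation(raw, c=2.0):
    # Scaled fitness f' = f - (mean - c * sigma), truncated at zero, which keeps
    # selection pressure reasonable when raw fitness values are close together.
    mean, sigma = statistics.mean(raw), statistics.pstdev(raw)
    return [max(0.0, f - (mean - c * sigma)) for f in raw]

def one_max_ga(length=20, pop_size=30, generations=50):
    # Toy objective: maximise the number of ones in a bit string.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        scaled = sigma_truncation([sum(ind) for ind in pop])
        pop = [mutate(uniform_crossover(tournament_select(pop, scaled),
                                        tournament_select(pop, scaled)))
               for _ in range(pop_size)]
    return max(pop, key=sum)

print("best bit string:", "".join(map(str, one_max_ga())))

Sigma truncation rescales raw fitness relative to the population mean and standard deviation, which keeps tournament or roulette selection from stalling once most individuals have similar scores.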
GENOCOP
The GEnetic algorithm for Numerical Optimization for COnstrained Problems (GENOCOP) was developed by Zbigniew Michalewicz, and details can be obtained in [89]. The GENOCOP system was designed to find the global optimum of a function subject to additional linear equality and inequality constraints.

Genetic Algorithm Toolbox in MATLAB
A GA Toolbox was developed [19] for MATLAB [87]. It provides a platform for GA modeling, design, and simulation within an interactive environment and the associated graphical facility. This is GA software that is easy to use, practical, and efficient.

REFERENCES

[1] J. Alander, "An indexed bibliography of genetic algorithms: Years 1957-1993," Dept. of Information Technology and Production Economics, University of Vaasa, Finland, Rep. 94-1, 1994.
[2] J. N. Amaral, K. Tumer, and J. Ghosh, "Designing genetic algorithms for the state assignment problem," IEEE Trans. Syst., Man, Cybern., vol. 25, no. 4, Apr. 1995.
[3] E. J. Anderson and M. C. Ferris, "A genetic algorithm for the assembly line balancing problem," Univ. of Wisconsin-Madison, Tech. Rep. TR 926, 1990.
[4] P. J. Angeline, G. M. Saunders, and J. B. Pollack, "An evolutionary algorithm that constructs recurrent neural networks," IEEE Trans. Neural Networks, vol. 5, no. 1, pp. 54-65, Jan. 1994.
[5] K. Asakawa and H. Takagi, "Neural networks in Japan," Commun. ACM, vol. 37, no. 3, pp. 106-112, Mar. 1994.
[6] R. Axelrod, "The evolution of strategies in the iterated prisoner's dilemma," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed. New York: Pitman, 1987, pp. 32-41.
[7] J. E. Baker, "Adaptive selection methods for genetic algorithms," in Proc. 1st Int. Conf. Genetic Algorithms, J. J. Grefenstette, Ed. Lawrence Erlbaum Associates, 1985, pp. 101-111.
[8] __, "Reducing bias and inefficiency in the selection algorithm," in Proc. 2nd Int. Conf. Genetic Algorithms, J. J. Grefenstette, Ed. Lawrence Erlbaum Associates, 1987, pp. 14-21.
[9] S. Baluja, "Structure and performance of fine-grain parallelism in genetic search," in Proc. 5th Int. Conf. Genetic Algorithms, 1993.
[10] D. Beasley, D. R. Bull, and R. R. Martin, "An overview of genetic algorithms: Part 1-Fundamentals," Univ. Comput., vol. 15, no. 2, pp. 58-69, 1993.
[11] __, "An overview of genetic algorithms: Part 2-Research topics," Univ. Comput., vol. 15, no. 4, pp. 170-181, 1993.
[12] L. B. Booker, "Improving search in genetic algorithms," in Genetic Algorithms and Simulated Annealing, L. Davis, Ed. New York: Pitman, 1987, pp. 61-73.
[13] F. Z. Brill, D. E. Brown, and W. N. Martin, "Fast genetic selection of features for neural network classifiers," IEEE Trans. Neural Networks, vol. 3, no. 2, pp. 324-328, Mar. 1992.

vol. and the TSP. IL. 38-47. "Deception considered harmful. Genetic Algorithms. pp. "Genetic algorithms for changing environments. Sheffield. G. Bifurcation. M. 422-427. 5th Int. Sept.. Int.. Ann Arbor. E. Kocarev. 1996. and first results. 11-1'). 3. 1st Int. University of Michigan. [22] H. May 1995. pp. 165-168. Beasley. and block. U. Genetic Algorithms. Johnston. . 1993. J. Macfarlane. J. Eshelman. 1975. Coventry. . 2nd Conf..K." in Proc. "Nonstationary function optimization using genetic dominance and diploidy." in IFAC World Congr. Electron. timedependent nonstationary environments." in lsi lEE/IEEE Int. Sept. T. "Nonlinear model term selection with genetic algorithms. E. J. "Simultaneous design of membership functions and rule sets for fuzzy controllers using genetic algorithms. N. 93005. pp. Dejong. vol. "Multi-layer genetic algorithms in robust control system design." IEEE Trans. E. L. 10 19. Fleming. 2.0. New York: Pitman. 1989.. [48] S." Int. "A genetic algorithm applied to robot trajectory generation. Grefenstette. Amsterdam: lOS Press. Chua. Chang and R. Deb and D. [46] S." in Foundations of Genetic Algorithms. 6760. no. Rep. "Rule-based optimization of combustion in multiple burner furnaces and boiler plants. G. 2. Can}: Genetic Algorithms. J. C. dissertation. 1993. 523-530. Reading. Goh. H. pp. R. 1993." in Proc. Cybem. Grefenstette." in Proc. U. 1992. NO. D. pp. University of Sheffield. 1993. M. B." in 3rd Int. Genetic Algorithms. 170-174. and M. Holland." in Trunsputing '91. 1995. 3. Durfee. [53] __ . pp. Systems Engineering. S. [35] K. 797-803. "Real-coded genetic algorithms. 45-52. Univ. Lippmann. pp.. Sept. and D. [20] L. no.K. Horn and N. 2nd Int. Conf. and J. 375-382. [Online]. Conf. "Analyzing deception in trap functions. J. 144-165. pp. 1991. 1st Int. Marland. and C. S. pp. and B. [25] N. "A genetic algorithm toolbox for MATLAB. 3. Goldberg. 177-183. Genetic Algorithms. FiJlh Int. 1987. The Hitch-Hiker's Guide to Evolutionary Computation: A list of Frequently Asked Questions (FAQ). E. 90001. S.K. 1989. pp.K. [521 D.. Garey and D. Practice. Fan. S. 2. 1995. Cobb. [51] D. "Tracking a criminal suspect through face-space with a genetic algorithm. Handbook of Genetic Algorithms. Nov. 1986." Eng. M. Workshop on Innovative Approaches to Planning. Naval Res. "An analysis of the interacting roles of population size and crossover in genetic algorithms. and K. 1990. Whitley. pp. SMC-16. Fang. 1992. Rep.. "Genetic algorithms for multiobjecitve optimization: Formulation. 4th Int. J. Goldberg and R. 2. R. A. Jan. A. "Application of genetic algorithms to task planning and learning. 166-185. GA:s in Engineering Systems: Innovations and Applications. 1992. J. pp. virtual alphabets. 518. Johnson. Conf." Parallel Problem Solving from Nature. 41ft Int. nonlinear dynamical systems. Chaos. D. Carmi-Paz. Pohlheim. 145-154. [47] M. New York: Van Nostrand Reinhold. E. pp. Whitley. pp." in Proc. "Job shop scheduling with genetic algorithms. pp.. Davis. 1991. 3rd Int. Rep." Dept. Grefenstette. Fonseca and P. Fortuna. 137-144. Conf. [23] H. Jakob. Conf. "Messy genetic algorithms: Motivation. Tech. pp. IlliGAL Rep. 416-421. Hollstien. E. "An overview of evol.. analysis. vol. O. Man. [26] D. 1993. Sheffield. Genetic Algorithms in Search. K.utionary Algorithms in multiobjective optimization. Fonseca and P. "Multiobjecitvc genetic algorithms made easy: Selection.. 1991. Urbana. 291-300." Int. D. Deb. 74-88. J. Ed. 2.genetic. Rep. 416-423. 3rd Int. 1991. vol. [27] __ . "Do not worry. Fleming." 
in Proc. control. Conf Genetic Algorithms. 1993. M. 1989. Eds. J. P. 1991. Davis. 1994.. L. J. 136-140.. Fogarty. GA's in Engineering Systems: Innovations and Applications. Ed. TlIigal. Yallapragada." NRL Memo. P. "A cooperative approach to planning for real-time control" in Proc. Res. McCormick. Cobb and J. E. New York: Van Nostrand Reinhold. 1992. J. Conf. 1991.. [63] __ . 1991. 277-283. 5.. dissertation. discussion. Ed. Eng. 1989. Rep. P. [36] K. 1985. Essex. 1971. "Adapting operator probabilities in genetic algorithms." in PIOC. "Artificial genetic adaptation in computer control systems. [34] K. K. Sheffield. and Machine Learning. Chipperfield. [611 J. J. [431 c. "ASPARAGOS an asynchronous parallel genetic optimization strategy. Fonseca. L. 1991. Goldberg. Cambridge. P. 1987. Mendes. CA: Morgan Kaufmann. 3. Spears. P. "Multiobjective optimization using the niched Pareto genetic algorithm. Ed. [68] A. J. 1990. 1l7] E. Ed. and D. Davies and T. U. Goldberg. 122-128. Illinois at Urbana-Champaign. U. no.D. Chipperfield. Fleming. [65] J." in Genetic Algorithms: Proc. "How genetic algorithms work: A critical look at implicit parallelism. 3. [56] V." Contr. "Alleles. "Chaotic system identification via genetic algorithm. Gorges-Schleuter." in Proc. 61-69. 1990. M. Caruna. [18] A. 1994. [66] J. 1991. 4th Int. Homaifar and E. 1975. [67] R. L. Con}: Genetic Algorithm. l11iGal 91009." in Proc. "Parallel genetic algorithms: A snrvey. J. 705-708. pp. pp. Artificiallntell. 1993.\ User's Guide to GENESIS v5. L. Dasgupta and D. "Numerical methods to deci gn the reaching phase of output feedback variable structure [37] [38] [39] [40] [41J Verlag. 46. 141-145. [33] __ ." Complex Systems. pp." in Proc. "Using genetic algorithms to improve pattern classification performance. J." Illinois Genetic Algorithms Lab. Heck. W Gu. [15] E. Grefenstette. [31] L." Parallel Problem Solving from Nature." in 1st IEEIIEEE Int. ACSE Res. D." Automatica. "Optimization of control parameters for genetic algorithms. H. & open-shop scheduling problems. Clarke. "Optimization of artificial neural network structure using genetic techniques implemented on multiple transputers. 3. Illinois." in Genetic Algorithms and Stimulated Annealing. G.K." IEEE Trans. Eckert. Conf. McGregor. Amsterdam: North Holland. no. 1990. Chua." in Proc. Davis. Lab. 1989." Univ. May 1994. Davis and S. Genetic Algorithms. Caldwell and V. "Genetic algorithms and communication link speed design: Theoretical considerations. (1994). 1991. [70] W. J." Univ. Ed.el Problem SnJving from Nature Berlin: Springer [42] C. pp. Michigan. [55] D. Chelmsford. L. J. Communicat. Genetic Algorithms. 154-159." in Proc. Dejong and W. "Experimental chaos synchronization in Chua's circuit. 1990.-L. Davis. 2. pp." in Proc. 27/1-27/8 [44] C." in Proc. U. D. "Biases in the crossover landscape. I'roc. [59] J." Univ. [64] B. Sept. vol." in Workshop on Natural Algorithms in Signal Processing. Graebe. 5th Int. pp. [62] __ . J. E. Davidor. [60] J.532 IEEE TRAl\SACTIONSON INDUSTRIALELECTRONICS. Lj.K. Res. Martin. Grefenstette and J. vol. "The genesis of Chua's circuit. pp. pp. 1993. vol. [69] J. 1995. Conf. [50] __ . pp. pp. Sheffield. Xibilia. M. Man." Tech. Corne." Control '96.. 95007. 3rd Int.. First Wnrkshnp Pa mll. S. [29] R. San Mateo. locis.C. Manganaro. IlliGAL Rep." Parallel Problem Solving [rom Nature. be messy." in Proc. San Mateo. and Control. pp. CA: Freeman 1979. 687-700. Heitkoetter and D. "Parallel implementation of a genetic algorithm. V. Amsterdam: North Holland. F. 
"A multi-population genetic algorithm for solving the k-partition problem on hyper-cubes. Fleming. [54] __ . J. J. and M. Illinois at Urbana-Champaign. pp. Caponetto. 527. "Serial and parallel genetic algorithms as function optimizer. [21] L. J.K. and models of international security.VOL. Blume. 1989. Ross. Gorges-Schlcutcr. [28] Y. Conf.. "Hoo design of an EMS control system for a magle vehicle using evolutionary algorithms. U.. "A structured genetic algorithm: The model and first results. Fuzzy Syst. Itoh. Con}: Genetic Algorithms.ai. I. vol. Nov. 1985. Ann Arbor. 1988. of Automatic Control and Systems Eng.. G. 59-68. 6-8. Con}: GA' s in Engineering Systems: Innovations and Applications.. 1987. Baker. Rep. "A summary of research on parallel genetic algorithms. [161 R. vol. C." Univ. pp. MA: Addison-Wesley. Korb. Conf Genetic Algorithms. and S." in Proc. 1993. R. Forrest. and mating restriction. H. Richards. sharing. Ed. "Nonstationary function optimization using the structured genetic algorithm. Scheduling. 495-530." Ph. pp. Dodd. and M. Syst. OCTOBER 1996 [14] C./Feb. Washington. F. 4. 244-248. IKBS-2-91. Conf. [57] M. Smith. Coombs." in Handbook of Genetic Algorithms.. Schaffer." Advances in Neural Information Processing 3. Fleming. "Robust and adaptive control of an unknown plant: A beachmark of new format. Cohoon." in Handbook of Genetic Algorithms." in I st lEE/IEEE Int.. 203-209. Conf. Grefenstette. Applicat. [19] A. W. San Francisco. H. Optimization. "Genetic algorithms. N. [49] D. Goldberg. "Simple genetic algorithms and the minimal deceptive problem. pp. K. "The analysis and behavior of a class of genetic adaptive systems. Computers and Intractability: A Guide 10 the Theory of NP-Completeness. 1. pp. Conf Genetic Algorithms. 129-139. Univ. [45] __ . 5th Int. 75-91. "An investigation into the use of hypermutation as an adaptive operator in genetic algorithms having continuous. Adaption in Natural and Artificial Systems. MA: MIT Press. Strathclyde. rescheduling. Genetic Algorithms. CA: Morgan Kaufmann. [30] L. "Genetic algorithms for tracking changing environments. 43. Dakev and A. Billings. 191)5. Mayer-Kress. pp. Chipperfield and P. and generalization. Fonest and G. Dec. "A promising genetic algorithm approach to Job-shop scheduling. [24] J. [32] L. Avai1ahle: USRNRT:comp. pp. [58] S. O. Genetic Algorithms." PhD. S. U. Nafpliotis. 252-256. Davis. V. pp. and C. Gordon and D. and H. July 1995.

Ramanathan.. 17-26." Dept. Whitley. Manderick and P. [72] A. F. Halle. Hong Kong. Conf. 4th Int. Chan. scale job-shop problems. M.ps. vol. "A comparative study of natural algorithms for adaptive TTRfiltering. for a photograph and biography. Schwenderling." in Proc. Man. A. Evolution. Gentry. Sheffield. 2. Yamada and R. Tang." in 1st lEE/IEEE Int. Nov. Glover. R. 1994. K. pp. Sept. no. Fuzzy Syst. IlliGAL Rep. "Structured genetic algorithm for robust H ex» control system design. Thomas. Willson. D. Eds. L. Mackle. "Genetic design of VLSI-layouts..s in Engineering Systems: Innovations and Applications. Mahfoud. and D. pp. Cybern. Nov. vol. [83] __ . Genetic Algorithms. C. pp. June 6. C. [112] R. Univ. Chinese Univ." IEEE Trans. 151-185. [III] Y. Man. Harley. "Genetic algorithm for aerodynamic inverse optimization problems. Chelmsford. Berlin: Springer-Verlag. 1995. Michalewicz. 1989. and combinatorial optimization. Rawlins. Man (M'91).. Wilson and M. W. pp. Chelmsford. no. and P. B. Tech." Dept. "Design of integrated production-inventorydistribution systems using genetic algorithm. "A genetic algorithm applicable to large. Essex. [103] K. pp. D.K." IEEE Trans. Sci. 419--423. pp. [121] P. July 1991. pp. S. Langdon. Con! GA's in Engineering Systems: Innovations and Applications. pp. [119J B. Suzuki and Y. pp. 211-225. U. 3. "Genetic algorithms for fuzzy controllers.: GE"fETIC ALGORITHMS COI\CEPTS AND APPLICATIONS 533 [71J C.. [105] __ . [101] W. Genetic Algorithms. 16. G. pp. Information. pp. Sheffield. Colorado State Univ. Vomberger. 26-33. B. K. J. IlIiGAL Rep. Sheffield. "Optimal design of pid process controllers based on genetic algorithms.K" 1995. Rumelhart." in Workshop on Natural Algorithms in Signal Processing. pp. Hong Kong. O. Mar. Shin and P. Con! Genetic Algorithms.K" 1995. Con]." Int. Y. 1992. D. Kwong. 250-255. 1994. F. Karr and E. Syst. vol." in 1st lEE/IEEE Int. [123] T. Comput." in Proc. Genetic Algorithms. K. Flockton. Dortmund. Nov. 1995. 205-218.." in l st lEE/IEEE InT. 24. I. 1990. C . [80] W. [89J Z. 7-12. pp. 1995. Nakano. 1995. Hong Kong. 1991. L. Janikow and Z. C. Machine Intell. A. Kocarev. Rawlins. Berlin: Springer-Verlag.. [77] J. [106J H. Patnaik. "Pine-grained parallel genetic algorithms.. Learning. "Multicriteria target vector optimization of analytical procedures using a genetic algorithm. COGANN-92Inl. Con/ Control. Maniezzo. MA: MIT Press/Bradford Books." in 1st lEE/IEEE Int. 906-910. Walters. 709-713. Conf. "Fuzzy control of pH using genetic algorithms." in lst lEE/IEEE Int. 1989. Zinober. [93] R. F. Tolkien Reference Manual. [91] H. Wright. Machine Intell. Srinivas and L. Dec. Baltimore. "Robust controller design using normalized coprime factor plant descriptions. [76] Lj. Kowk.K . "Geometric primitive extraction using a genetic algorithm. Can! on GA's in Engineering Systems: Innovations and Applications. Jan. l11inois at Urbana-Champaign. [97] J. Syswerda. S. Conf. "Adaptive IIR filtering. pp. Con]. Jones and P.. U." in Proc. M. [75] A.uk:/genetic/papers/grid\_aisb-95. Park. Tang. J. J. 573-582." in Proc." in Parallel Genetic Algorithms: Theory and Applicatiom. E. Man. see this issue." Analvtica Chimica fl. 1731 c. 93-105. "Uniform crossover in genetic algorithms. "A genetic algorithm tutorial. and S. Genetic Algorithms.. San Mateo. 281-290. "Conventional genetic algorithms for Job-shop problems. Sci. 2nd Ed. GA's in Engineering Systems: Innovations and Applications. "Generating image filters for target recognition by genetic learning. 1991. White and S. 
454--460. Genetic Algorithms + Data Structures ~ Evolution Program. 39--47. Eng. Ind. F. U. B. pp. Gu. pp. D. "Improving local search in genetic algorithms for numerical global optimization using modified GRID-point search technique. J. Man. 1992. D. pp." Communicat. 22/1-22/8. 1994." in Proc. Reliability. 1991. J." IEEE Trans. Sci." in Int. "A genetic algorithm for affine invariant object shape recognition. W." in 1st lEE/IEEE Int.. Kakazu. Nambiar and P. Part I. Apr. Man. Whitley. Apr. 539-546. Dept. pp. [98] V. [118] __ . Tang. Bifurcation. CA: Morgan Kaufmann.K. 2. 37. S." in Parallelism. Schaffer. 1992. Sheffield. pp. McFarlane and K. "Neural networks: Applications in industry. [99] K. Spiessens. pp. Essex. [113] B." Parallel Problem Solving from Nature. Univ. A. "Genetic auto-tuning of PID controllers. vol." in Workshop on Natural Algorithms in Signal Processing. [104 J G. U. "Genetic algorithms: A survey.ac. and application to atomic emission spectroscopy. P. 2. 3rd Int. 20/1-20/10. Nakano. (1995). Ng. Ed. L." in Proc. [Online]. 1994. 1992." in Information and Control Sciences. F. Sheffield. 3. "Genetic-based new fuzzy reasoning models with application to fuzzy comrol. and L. "Real-time computing: A new discipline of computer science and engineering. [90J N. "Genetic evolution of the topology and weight distribution of neural networks.K. IlJlJ3.. pp.. 1992. 293-298. Chelmsford. 1994. 4/1--4/8. Widrow. 1989. H. G. vol. Sheffield. pp. and G.ucl.. J. E. vol. 265. "Adaptive IIR filtering using natural algorithms. J. 1989." in Handbook of Genetic Algorithms. 1991. vol." in Animals to Animals. 6. IlJ5J O. C. 434-439. D. 1995. Tamaki and Y Nichikawa. Wang and D. vol. "An analysis of multi-point crossover. "The GENITOR algorithm and selection pressure: why rank-based allocation of reproductive trials is best. Kroger. 2. 1993. pp. Con! on GA's in Engineering Systems: Innovations and Applications. 430--435. S. "Fuzzy control of water pressure using genetic algorithm.MAN et al. [100] J. 238-244. pp. 1993. Vornberger. 428--433. Tang.. 901-905. Con]: GA's in Engineering Systems: Innovations and Applications. K. Comput. Kwong. pp. U. U. 1994. . 116-121. Genetic Algorithms." in Proc. 1991. no. Pattern Anal. J. Mak and Y. 1995. 575-582. 94005. Tang. 1991. Comput. 1. 1991.. The Mathworks." in l st lEE/IEEE lilt." AI Expert. [79] S. GA's in Engineering Systems: Innovations and Applications. W. "Crowding and preselection revisited. Conf. "Genetic structure for nn topology and weights optimization." in Proc. no. Kateman. [102] M. June 5-9. "Combinations of genetic algorithms and neural networks: A survey of the state of the art. Con! GA. Moin. 1993. DcJong. Sept. Essex. numerical simulations. 4-21. and L. "Distributed genetic algorithms." Dept." IEEE Trans. MD. Chan. "Genetic algorithms for real parameter optimization. 1992. pp. 1995. ACM. [87] MATLAB User's Guide. 31-36. Katz and P. Conf. [74] C. "Experimental demonstration of secure communications via chaos synchronization. Schaffer. U.. Savic. "A paralleled genetic algorithm based on a neighborhood model and its application to job shop scheduling.K. [122J A. 400--405. pp. Karr. A. 1116J M. 4th Int. [114] P. Eckert. Lucasius. 1993. 19lJ5. Comput. 2. [81J G." in Foundations of Genetic Algorithms. "Evolution and co-evolution of computer programs to control independently-acting agents. Scheduling Planned Maintenance of the (UK. P.. 1994. pp. Milhlenbein. Michalewicz. Tsang. D. Comput. Ineger and S. [117J D. Y. Shynk. pp. pp." Computer. Z. H. [92] R. 
Mars. K.' Foundations of Genetic Algorithms. H. [107] K. Sheffield. vol. "Schedule optimization using genetic algorithms. Berlin: Springer-Verlag. p." rEEE Trans. 1994. U. 301-3/5. F. pp. no. [85J B. Koza. Parlitz. and Applications of Emerging Intelligent Control Technologies." in Workshop on Natural Algorithms in Signal Processing. Wong. this issue. 518. Conf. Langholz. [110] K. vol. and M. M. "GA approach to time-variant delay estimation. "Sliding mode control design using genetic algorithms. Available: cs. 1994. System Analysis Research Group. Aug. 2. Obayashi. Cambridge. [115] P.." in Proc. L. pp. pp. 3rd Int. pp. K. Macleod. 332-34lJ. "Application of genetic algorithms to pump scheduling for water supply. CS-93-103." IEEE ASSP Maf?. [94] S. Chaos.K. 398--406. Int. 31d Int. Pattern Anal. S. [120] D. June 1994. IEEE. Sci. 1993. [109] K. vol. E. K. J. Amsterdam: lOS Press. Levine. pp. 5. Chua. 3rd. 1994. 641-648. pp. 1992. Roth and M. on GA's in Engineering Systems: Innovations and Applications. U. Man. "An experimental comparison of binary and floating point representations in genetic algorithms. 46-53. I. [82] S. Sheffield. 1989. 15-20. Univ. [86J V. and G. 1989. pp. Rep. [84] K. IlJlJ1. and S. pp. Ed. S. Electron. Jan. and O. 39-53. Thrift. Theory. A..) National Grid. [96] G." IEEE Trans. Ed.K. 82. Tanse. [78] B. Neural Networks. "Parallel genetic algorithms. population genetics. and K. "Parallel genetic packing on transputers. S. Schnecke and O. IlIinois at Urbana-Champaign. and science. J. 2-9. 1." in 1st lEE/IEEE Int. Users Guide for GElVEsYs. Wienke. A." Contr. W. "Population sizing for sharing methods." Parallel Problem Solving From Nature.cta. Kwong. Eshelman. business. pp. Genetic Algorithms. and G. Sheffield. A. 16. [108] K. 92004. Conj. Can]: GA's in Engineering Systems: Innovations and Applications. S. 4th Int. [88] D. IFAC Workshop on Safety. Lehr. and C. pp. "Low implementation cost IIR digital filter design using genetic algorithms. 1994." in 1st lEE/IEEE Int. no. Practice. "An approach to the analysis of the basins of the associative memory model using genetic algorithms. Sci." in Proc. 141-145. Dept. Nov. De Moura Oliveira. Spears and K. Kandel. 4. 474-479. Workshops 011 Combination of Genetic Algorithms and Neural Networks.

K. S. Tang received the B.Sc. degree (Hons.) in electrical and electronics engineering from the University of Hong Kong in 1988 and the M.Sc. degree from City University of Hong Kong in 1992. He is currently working toward the Ph.D. degree in the Department of Electronic Engineering, City University of Hong Kong. His research interests include genetic algorithms and chaotic and nonlinear system control.

S. Kwong (M'93) received the M.A.Sc. degree in electrical engineering from the University of Waterloo, Canada, in 1985. He joined Control Data Canada as a diagnostic engineer and participated in the design and development of diagnostic software for the Cyber mainframe computer series. In 1987, he joined Bell Northern Research as a Member of Scientific Staff, responsible for the development of telecommunication systems in both voice and data networks. In 1989, he joined City University of Hong Kong as a Lecturer in the Department of Electronic Engineering and later moved to the Department of Computer Science. He is actively involved in both academic and industrial projects and publishes widely in the areas of speech processing, pattern recognition, computer systems, and Chinese computing.
