Particle Swarm, Genetic Algorithm, and their Hybrids:
Optimization of a Profiled Corrugated Horn Antenna
Jacob Robinson, Seelig Sinton, and Yahya Rahmat-Samii
Department of Electrical Engineering, University of California, Los Angeles
Los Angeles, California 90095-1594
robinson@ee.ucla.edu, rahmat@ee.ucla.edu

Introduction

Genetic Algorithms (GA) have proven to be a useful method of optimization for difficult and discontinuous multidimensional engineering problems. A new method of optimization, Particle Swarm Optimization (PSO), is able to accomplish the same goal as GA optimization in a new and faster way [1]. The purpose of this paper is to investigate the foundations and performance of the two algorithms when applied to the design of a profiled corrugated horn antenna. Also investigated is the possibility of hybridizing the two algorithms.

GA Background

GA is an evolutionary optimizer (EO) that takes a sample of possible solutions (individuals) and employs mutation, crossover, and selection as the primary operators for optimization [2]. For the profiled corrugated horn antenna, five parameters are optimized: a parameter relating to the length, the number of corrugations per wavelength, the ratio of tooth width to total corrugation width, a profile parameter, and a matching section parameter. The fitness function takes the values of each parameter and returns a single number representing how good the solution is. For this example the design parameters are fed into a horn simulation program that creates a horn cross section and simulates the far field pattern. A representative fitness function is defined as:

f = -0.5*Weight(lbs) - 0.2*XPol(dB) - |BWtarg - BW3dB| - S11(dB)    (1)

BWtarg is the desired beamwidth of the horn; in this case 34 degrees is chosen because that was the desired outcome for a particular project. Weight is the weight of the horn, and XPol is the peak cross-polarization within a reflector subtended angle of ±36 degrees.
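As an illustrative sketch (not part of the original design code), the fitness of Eq. (1) can be written as a small function; the argument names are hypothetical, and in practice the weight, cross-polarization, beamwidth, and S11 values would come from the horn simulation program. The S11 clamp at -30 dB described in the constraints is included:

```python
def horn_fitness(weight_lbs, xpol_db, bw_3db_deg, s11_db, bw_targ_deg=34.0):
    """Fitness per Eq. (1); higher values indicate better horn designs."""
    s11_db = max(s11_db, -30.0)              # S11 below -30 dB is counted as -30 dB
    return (-0.5 * weight_lbs                # lighter horns score higher
            - 0.2 * xpol_db                  # lower cross-polarization scores higher
            - abs(bw_targ_deg - bw_3db_deg)  # penalize beamwidth deviation from target
            - s11_db)                        # more negative S11 (better match) scores higher
```

For example, a 10 lb horn with -30 dB cross-polarization, an exactly on-target 34-degree beamwidth, and S11 of -20 dB scores -5 + 6 - 0 + 20 = 21.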
Additional constraints are made to discourage unrealistic designs. Any design with a tooth thickness of less than 1 mm is given a fitness value of -100 because 1 mm is the thinnest tooth that can be effectively manufactured. If the value of S11 is less than -30 dB, the fitness is evaluated using -30 as the value of S11.

PSO Background

PSO is in principle a much simpler algorithm. It operates on the principle that each solution can be represented as a particle (agent) in a swarm. Each agent has a position and velocity vector, and each position coordinate represents a parameter value. Thus for an n-dimensional optimization, each agent has a position in n-dimensional space that represents a solution [3]. In this case the position corresponds to the horn design parameters. Figure 1a shows a flow chart of the PSO algorithm. Like GA, PSO must also have a fitness evaluation function that takes the agent's position and assigns to it a fitness value. For consistency the fitness function is the same as for GA. The position with the highest fitness value in the entire run is called the global best (gbest). Each agent also keeps track of its own highest fitness value; the location of this value is called its personal best (pbest). Each agent is initialized with a random position and random velocity. The velocity in each of the n dimensions is accelerated toward the global best and the agent's own personal best based on the following equation:

vn = vn + c1*rand()*(pbest,n - xn) + c2*rand()*(gbest,n - xn)    (2)

Here rand() returns a number between 0 and 1. The acceleration constants c1 and c2 determine the relative pull of pbest and gbest: the higher the constant, the greater the acceleration toward the position it multiplies. In this case 2.0 is used for both constants.

0-7803-7330-8/02/$17.00 ©2002 IEEE
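A minimal sketch of the velocity update in Eq. (2), assuming positions and velocities are stored as plain Python lists with one entry per design parameter:

```python
import random

def update_velocity(v, x, pbest, gbest, c1=2.0, c2=2.0):
    """Eq. (2): accelerate each velocity component toward pbest and gbest.

    v, x, pbest, gbest are lists of equal length (one entry per parameter).
    """
    return [vd
            + c1 * random.random() * (pd - xd)   # pull toward the agent's personal best
            + c2 * random.random() * (gd - xd)   # pull toward the swarm's global best
            for vd, xd, pd, gd in zip(v, x, pbest, gbest)]
```

Note that when an agent sits exactly at both its personal best and the global best, both random pull terms vanish and the velocity passes through unchanged.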
The positions are updated based on their movement over a discrete time interval (Δt) as follows, with Δt usually set to 1:

xn = xn + vn*Δt    (3)

Then the fitness at each position is reevaluated. If any fitness is greater than gbest's, the new position becomes gbest and the velocities are accelerated toward that point. If an agent's fitness value is greater than its pbest's, pbest is replaced by the current position and the agent is accelerated toward that position. Another point, called the local best (lbest), is sometimes used: the position of the highest fitness value within a small group of agents, whose size is usually about 15% of the population. Agents are accelerated toward the lbest of their respective group. This technique, however, was not used in this application. Figure 1b shows a two-dimensional example of how the particles are accelerated. An additional factor, vmax, is the maximum velocity of an agent in any given dimension; here vmax is set to half the range of the given dimension, which helps keep the swarm under control. Another factor, known as the inertial weight (w), is decreased linearly from 0.9 to 0.4 over the length of the run. This slows the particles as the run progresses and helps them converge to gbest rather than oscillating around it. The new equation for the velocity is:

vn = w*vn + c1*rand()*(pbest,n - xn) + c2*rand()*(gbest,n - xn)    (4)

Results

The performance of these two algorithms was tested on the design of the corrugated horn for a spaceborne application, using a modified version of a GA developed by Dr. Pete Carroll of the University of Illinois at Urbana-Champaign and a Particle Swarm Optimizer recently developed at UCLA. Each algorithm was given a population size of 10 and allowed to make 600 calls to the fitness evaluator (the horn simulation program). The calls to the fitness evaluator constitute the majority of the runtime of the algorithm.
Thus an equal number of calls to the fitness evaluator indicates approximately the same runtime for both algorithms. For GA optimization, the crossover probability was 0.6 and the random mutation probability was 0.1. Fitness evaluations were reduced in GA optimization by not reevaluating the same horn design even if it reappeared in future generations. Evaluations were reduced in PSO by not evaluating particles that flew outside the limits of the allowed solution space. Each algorithm started with the same randomly generated initial population. Figure 2 shows the fitness of the best horn design found as a function of fitness evaluations made. As shown in Figure 2, GA improves more rapidly than PSO early in the run. However, PSO passes GA as improvement in GA levels off, and eventually PSO returns a better horn. The GA-PSO and PSO-GA hybrids combine the two algorithms with the hope of exploiting the qualities and uniqueness of each. This was done by taking the population of one of the algorithms when its improvement began to level off and using it as the starting population of the other algorithm. For the GA-PSO hybrid, the GA population was used to start the PSO after 323 fitness evaluations, and for the PSO-GA hybrid the population from the PSO was used to start the GA after 334 fitness evaluations. Table 1 shows the characteristics of the best horns designed by the four different algorithms. Figures 3a and 3b show the cross section and the far field pattern of the best horn designed by the PSO-GA hybrid, the best horn overall.

Conclusion

The use of a hybrid increased the final fitness value over GA and PSO alone, respectively; however, the PSO alone outperformed both the GA and the GA-PSO hybrid. The PSO-GA hybrid returned the best horn of all four algorithms. This asserts the value of PSO as a new and effective way to optimize difficult engineering problems.
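Pulling the update rules together, the following is a minimal self-contained sketch of the PSO variant described above (not the UCLA optimizer itself), with a toy one-dimensional fitness standing in for the horn simulator. It uses the paper's settings: c1 = c2 = 2.0, inertia weight decreasing linearly from 0.9 to 0.4, vmax equal to half the range of each dimension, a population of 10, roughly 600 fitness calls, and no evaluation of agents that fly outside the allowed solution space:

```python
import random

def pso(fitness, bounds, n_agents=10, n_iters=60, c1=2.0, c2=2.0):
    """Sketch of the PSO described above. bounds is a list of (low, high)
    pairs, one per dimension; fitness maps a position to a scalar (higher
    is better). Returns (best position, best fitness value)."""
    dim = len(bounds)
    vmax = [0.5 * (hi - lo) for lo, hi in bounds]        # half the range of each dimension
    x = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_agents)]
    v = [[random.uniform(-vm, vm) for vm in vmax] for _ in range(n_agents)]
    pbest = [xi[:] for xi in x]                          # personal bests
    pval = [fitness(xi) for xi in x]
    g = pval.index(max(pval))
    gbest, gval = pbest[g][:], pval[g]                   # global best
    for it in range(n_iters):
        w = 0.9 - (0.9 - 0.4) * it / max(n_iters - 1, 1) # inertia weight: 0.9 -> 0.4
        for i in range(n_agents):
            for d in range(dim):
                vd = (w * v[i][d]
                      + c1 * random.random() * (pbest[i][d] - x[i][d])
                      + c2 * random.random() * (gbest[d] - x[i][d]))   # Eq. (4)
                v[i][d] = max(-vmax[d], min(vmax[d], vd))              # clamp to vmax
                x[i][d] += v[i][d]                                     # Eq. (3), dt = 1
            if not all(lo <= xd <= hi for xd, (lo, hi) in zip(x[i], bounds)):
                continue          # outside the solution space: skip the evaluation
            f = fitness(x[i])
            if f > pval[i]:
                pval[i], pbest[i] = f, x[i][:]
            if f > gval:
                gval, gbest = f, x[i][:]
    return gbest, gval
```

With the defaults, 10 agents over 60 iterations make at most 600 fitness calls after initialization, mirroring the budget used in the experiments.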
Acknowledgement

This work has been supported in part by JPL contract 81202696.

[Figure 1b legend: original velocity, velocity toward gbest, velocity toward pbest, resultant velocity; one axis is the ratio of tooth width to total corrugation width.]

Figure 1: (a) Flow chart showing the PSO process. (b) Individual agents (1 and 2) are accelerated toward the location of the best solution, gbest, and the location of their own personal best, pbest, in a two-dimensional simplification of the problem.

Figure 2: The best fitness value returned versus the number of calls to the fitness evaluator for each of the four optimization algorithms.

Figure 3: (a) The horn cross section for the best overall horn, designed by the PSO-GA hybrid. (b) The far field pattern for the horn (theta in degrees). Plus or minus 36 degrees is the subtended angle for an offset reflector antenna illuminated by the horn.

Table 1: Characteristics of the best horn from the initial population and of the best horns optimized by the four different algorithms. Notice that the fitness value is highest for the PSO-GA hybrid.

References:

[1] J. Kennedy and W.M. Spears, "Matching Algorithms to Problems: An Experimental Test of the Particle Swarm and Some Genetic Algorithms on the Multimodal Problem Generator," Proceedings of the IEEE International Conference on Evolutionary Computation, 1998.
[2] Y. Rahmat-Samii and E. Michielssen, Electromagnetic Optimization by Genetic Algorithms. John Wiley & Sons, 1999.
[3] R.C. Eberhart and Y. Shi, "Particle Swarm Optimization: Developments, Applications and Resources," Proceedings of the 2001 Congress on Evolutionary Computation, vol. 1, 2001, pp. 81-86.