
PREPARED BY: VRAJESH CHOKSHI (08CE10)

ARTIFICIAL INTELLIGENCE

Nowadays the concepts of artificial intelligence have come into existence. Broadly, they are concerned with producing artificial systems that act like a human being, or that have memory and functionality similar to the human brain. Artificial intelligence covers many topics, including swarm optimization, soft computing, memetic algorithms, genetic algorithms, and more. One of the main algorithms in this area is PARTICLE SWARM OPTIMIZATION, which shows how the collective behaviour of real-world entities makes it possible to work efficiently.

SWARM INTELLIGENCE
- Swarms consist of many simple entities that have local interactions, including interacting with the environment.
- The emergence of complex, or macroscopic, behaviors and the ability to achieve significant results as a team result from combining simple, or microscopic, behaviors.
- Swarm intelligence techniques (note the difference from intelligent swarms) are population-based stochastic methods used in combinatorial optimization problems, in which the collective behavior of relatively simple individuals arises from their local interactions with their environment to produce functional global patterns.

PARTICLE SWARM OPTIMIZATION

Introduction
- Particle Swarm Optimization (PSO) is a relatively new, evolution-based search and optimization technique.
- PSO incorporates swarming behaviours observed in flocks of birds, schools of fish, or swarms of bees. In particular, PSO is mainly inspired by the social behaviour patterns of organisms that live and interact within large groups.
- PSO algorithms are especially useful for parameter optimization in continuous, multi-dimensional search spaces.
- We explain the PSO algorithm in detail and demonstrate its performance on one- and two-dimensional continuous search problems.

(Cont.)
- PSO is an established method for parameter optimization. It represents a population-based adaptive optimization technique whose convergence behaviour is influenced by several strategic parameters; choosing reasonable parameters depends on the optimization task.
- The term PSO refers to a relatively new family of algorithms that may be used to find optimal or near-optimal solutions to numerical and qualitative problems.
- It is implemented easily in most programming languages, since the core of the program can be written in a single line of code, and it has proven both very effective and quick when applied to a diverse set of optimization problems.

What does it basically mean?
- Particle swarm optimization (PSO) is one of the evolutionary computation techniques. Like the other evolutionary computation techniques, PSO is a population-based search algorithm and is initialized with a population of random solutions, called particles. Unlike in the other evolutionary techniques, each particle in PSO is also associated with a velocity.
- The idea of PSO is to have a swarm of particles flying through a multidimensional search space, looking for the global optimum. Each particle retains an individual (or cognitive) memory of the best position it has visited, as well as a global (or social) memory of the best position visited by all the particles in the swarm. A particle calculates its next position based on a combination of its last movement vector, the individual and global memories, and a random component. By exchanging information, the particles can influence each other's movements.
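The particle state described above (position, velocity, and a cognitive memory of the best visited position) can be sketched as a small data structure. This is an illustrative sketch, not code from the slides; the class and field names are our own, and the search-space bounds are arbitrary.

```python
import random
from dataclasses import dataclass

@dataclass
class Particle:
    position: list        # current coordinates in the search space
    velocity: list        # last movement vector
    pbest_position: list  # cognitive memory: best position visited so far
    pbest_fitness: float = float("inf")

# Seed a small swarm of 5 particles in a 2-dimensional search space.
swarm = []
for _ in range(5):
    pos = [random.uniform(-5, 5) for _ in range(2)]
    swarm.append(Particle(position=pos,
                          velocity=[random.uniform(-1, 1) for _ in range(2)],
                          pbest_position=list(pos)))
```

The global (social) memory, gbest, is not stored per particle; it is shared by the whole swarm and would live in the surrounding optimization loop.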

(Cont.)
- Particle Swarm Optimization (PSO) applies the concept of social interaction to problem solving.
- PSO is a global optimization algorithm for dealing with problems in which a point or surface in an n-dimensional space best represents a solution. Potential solutions are plotted in this space and seeded with an initial velocity. Particles move through the solution space, and certain fitness criteria evaluate them. Over time, particles accelerate toward those with better fitness values.
- In PSO, a swarm of n individuals communicate either directly or indirectly with one another along search directions (gradients).
- PSO is a simple but powerful search technique.
- It has been applied successfully to a wide variety of search and optimization problems.

Who basically found it?
Mr. James Kennedy and Mr. Russell Eberhart.
It was developed in 1995 by James Kennedy and Russ Eberhart [Kennedy, J. and Eberhart, R. (1995). Particle Swarm Optimization. Proceedings of the 1995 IEEE International Conference on Neural Networks, pp. 1942-1948. IEEE Press.]

Advantages
- The main advantage of such an approach over other global minimization strategies, such as simulated annealing, is that the large number of members that make up the particle swarm makes the technique impressively resilient to the problem of local minima.
- An advantage of PSO is its ability to handle optimization problems with multiple local optima reasonably well, and its simplicity of implementation, especially in comparison to related strategies like the genetic algorithm.
- In the field of chemoinformatics, PSO has successfully been applied to quantitative structure-activity relationship (QSAR) modeling, including k-nearest neighbour and kernel regression, minimum spanning tree for piecewise modeling, partial least squares modeling, and neural network training.

How it works!!!
- The basic idea behind the algorithm:
- The social metaphor that led to this algorithm can be summarized as follows:
- The individuals that are part of a society hold an opinion that is part of a belief space (the search space) shared by every possible individual.
- Individuals may modify this opinion state based on three factors:
  - The knowledge of the environment (its fitness value)
  - The individual's previous history of states (its memory)
  - The previous history of states of the individual's neighbourhood

Algorithm
1. Initialize the particles randomly in the search space.
2. Set the generation count.
3. While not terminated:
   a. Evaluate the fitness of individual particles (pbest).
   b. Find the best particle of the generation / keep track of the individual's highest fitness (gbest).
   c. Calculate particle velocities.
   d. Update particle positions/locations.
   e. Terminate if the condition is best.
4. Terminate if the generation count expires.
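The numbered steps above can be sketched as a minimal Python implementation. This is an illustration under our own assumptions: `sphere` is a stand-in objective, and the coefficients (c1 = c2 = 2, a velocity clamp of 1, bounds of ±5) are common defaults rather than values taken from the slides.

```python
import random

def sphere(x):
    """Toy objective: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def pso(fitness, dim=2, n_particles=30, max_gen=100,
        c1=2.0, c2=2.0, vmax=1.0, lo=-5.0, hi=5.0):
    # Step 1: initialize the particles randomly in the search space.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[random.uniform(-vmax, vmax) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    # Steps 2-4: loop until the generation count expires.
    for _ in range(max_gen):
        for i in range(n_particles):
            # Steps a-b: evaluate fitness, track pbest and gbest.
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
            # Steps c-d: calculate velocities (clamped) and update positions.
            for d in range(dim):
                vel[i][d] += (c1 * random.random() * (pbest[i][d] - pos[i][d])
                              + c2 * random.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, vel[i][d]))
                pos[i][d] += vel[i][d]
    return gbest, gbest_f

best, best_f = pso(sphere)
```

On the 2-dimensional sphere function this sketch reliably drives the global best close to the origin within a hundred generations.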

Working of algorithm
- The PSO algorithm is simple in concept, easy to implement, and computationally efficient. The original procedure for implementing PSO is as follows:
- Initialize a population of particles with random positions and velocities on D dimensions in the problem space.
- For each particle, evaluate the desired optimization fitness function in D variables.
- Compare the particle's fitness evaluation with its pbest. If the current value is better than pbest, then set pbest equal to the current value, and pi equal to the current location xi in D-dimensional space.
- Identify the particle in the neighborhood with the best success so far, and assign its index to the variable g.

(Cont.)
- Change the velocity and position of the particle according to the following equations:
  vid = vid + c1*rand()*(pid - xid) + c2*rand()*(pgd - xid)
  xid = xid + vid
- Loop to step b until a criterion is met, usually a sufficiently good fitness or a maximum number of iterations.
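These two update equations can be written out component by component. A minimal sketch, assuming list-valued vectors; the helper name `update` is ours, not the slides'.

```python
import random

def update(x, v, pbest, gbest, c1=2.0, c2=2.0):
    """Apply vid = vid + c1*rand()*(pid - xid) + c2*rand()*(pgd - xid),
    then xid = xid + vid, for each dimension d."""
    new_v = [vd + c1 * random.random() * (pd - xd)
                + c2 * random.random() * (gd - xd)
             for xd, vd, pd, gd in zip(x, v, pbest, gbest)]
    new_x = [xd + vd for xd, vd in zip(x, new_v)]
    return new_x, new_v

# One update step for a particle at the origin of a 2-D space.
x, v = update(x=[0.0, 0.0], v=[0.5, -0.5],
              pbest=[1.0, 1.0], gbest=[2.0, -1.0])
```

Note that rand() is drawn independently for the cognitive (pbest) and social (gbest) terms, and independently per dimension.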

Concepts to apply & how it works
- The connection to search and optimization is made by assigning direction vectors and velocities to each point in a multidimensional search space. Each point then moves, or flies, through the search space following its velocity vector, which is influenced by the directions and velocities of the other points in its neighborhood.
- These localized interactions with neighboring points propagate through the entire swarm of potential solutions.
- Each solution is represented as a series of coordinates in n-dimensional space. A number of particles are initialized randomly within the search space, each assigned to a potential solution.
- How much influence a particular point has on the other points is determined by its fitness, which captures how good it is compared to all other solutions.

(Cont.)
- Each particle has a very simple memory of its personal best solution, stored as pbest; the corresponding fitness value is also stored.
- The global best solution for each iteration is given as gbest.
- Each particle keeps track of its coordinates in the problem space which are associated with the best solution it has achieved; this value is pbest. When a particle takes the whole population as its topological neighbors, the best value is a global best, given as gbest.
- On each iteration, every particle is moved a certain distance from its current location, influenced by a random amount by the pbest and gbest values.

(Cont.)
- After finding these best values, the particle calculates its new velocity and position using formulas a & b respectively:
  a. v[] = v[] + c1*rand()*(pbest[] - present[]) + c2*rand()*(gbest[] - present[])
  b. present[] = present[] + v[]

APPLICATION
- Solving constrained optimization problems
- Solving min-max problems
- Solving multiobjective optimization problems
- Also used for solving the travelling salesman problem:

(Cont.)
- Here, in the travelling salesman problem, PSO is applied so as to get the minimum distance travelled by the salesman; one coloured dot is the beginning point while the others are destinations.
- Dynamic tracking
- Evolving weights and structure of neural networks
- Analysing human tremor
- Registering 3D-to-3D biomedical images

Conclusion
- Particle swarm optimization is an extremely simple algorithm that seems to be effective for optimizing a wide range of functions. Conceptually, it seems to lie somewhere between genetic algorithms and evolutionary programming. It is highly dependent on stochastic processes, like evolutionary programming.
- We view it as a mid-level form of A-life or biologically derived algorithm, occupying the space in nature between evolutionary search, which requires eons, and neural processing, which occurs on the order of milliseconds.
- In addition to its ties with A-life, particle swarm optimization has obvious ties with evolutionary computation. Social optimization occurs in the time frame of ordinary experience; in fact, it is ordinary experience.


Thank You!!!

