The Particle Swarm Optimization Algorithm

This document discusses the Particle Swarm Optimization (PSO) algorithm. PSO is a computational method that optimizes a problem by iteratively trying to improve candidate solutions. It was developed in 1995 and inspired by bird flocking behavior. Compared with other optimization techniques such as genetic algorithms, PSO offers faster convergence and fewer parameters to tune. Its basic principle is to move particles toward the best positions in the search space, guided by each particle's and the swarm's best known positions.

The Particle Swarm Optimization Algorithm

Nebojša Trpković
trx.lists@gmail.com
10th Dec 2010
Problem Definition

optimization of continuous nonlinear functions

finding the best solution in problem space



Example



Importance

function optimization

artificial neural network training

fuzzy system control



Existing Solutions

Ant Colony Optimization (ACO)
  designed for discrete problems

Genetic Algorithms (GA)
  slow convergence



Particle Swarm Optimization

In the simplest terms:

a computational method
that optimizes a problem
by iteratively trying to improve a candidate solution
with regard to a given measure of quality



Particle Swarm Optimization
Facts:

developed by Russell C. Eberhart and James Kennedy in 1995

inspired by the social behavior of bird flocking or fish schooling

similar to evolutionary techniques such as Genetic Algorithms (GA)



Particle Swarm Optimization
Benefits:

faster convergence

fewer parameters to tune

easier searching in very large problem spaces



Particle Swarm Optimization
Basic principle:

let the particle swarm move towards the best position in the search space, remembering each particle's best known position and the global (swarm's) best known position



Velocity Change
xi : specific particle's position
pi : particle's (personal) best known position
g : swarm's (global) best known position
vi : particle's velocity
φp, φg : cognitive and social acceleration coefficients
rp, rg : random numbers drawn uniformly from [0, 1]

vi ← vi + φp·rp·(pi − xi) + φg·rg·(g − xi)
     inertia    cognitive         social
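A minimal sketch of this update for a single particle, written in plain Python. The function name, the per-dimension random draws, and the default phi_p = phi_g = 2.0 (a classic textbook choice) are illustrative assumptions, not values given on the slide.

import random

def update_velocity(v, x, p_best, g_best, phi_p=2.0, phi_g=2.0):
    """One velocity update: inertia + cognitive pull + social pull."""
    new_v = []
    for d in range(len(x)):
        r_p, r_g = random.random(), random.random()   # rp, rg ~ U[0, 1]
        cognitive = phi_p * r_p * (p_best[d] - x[d])   # pull toward personal best
        social = phi_g * r_g * (g_best[d] - x[d])      # pull toward swarm best
        new_v.append(v[d] + cognitive + social)        # the old v[d] is the inertia term
    return new_v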



Position Change

xi : specific particle's position
vi : particle's velocity

xi ← xi + vi
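The position step, continuing the same sketch. The optional clamping to search-space bounds is a common practical addition and my own assumption, not something stated on the slide.

def update_position(x, v, lower=None, upper=None):
    """Move the particle by its velocity; optionally clamp to the search space."""
    new_x = [xi + vi for xi, vi in zip(x, v)]
    if lower is not None and upper is not None:
        new_x = [min(max(xi, lo), hi) for xi, lo, hi in zip(new_x, lower, upper)]
    return new_x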



Algorithm
For each particle
    Initialize particle
End

Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the particle's best fitness value in history,
            set the current value as the new personal best
    End

    Choose the particle with the best fitness value of all particles; if that fitness
    value is better than the current global best, set it as the new global best

    For each particle
        Calculate particle velocity according to the velocity change equation
        Update particle position according to the position change equation
    End
While maximum iterations or minimum error criterion is not attained
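Below is a minimal, self-contained Python sketch of this loop, using the sphere function f(x) = Σ xi² as an example objective. The inertia weight w is a common refinement that the velocity equation on the earlier slide does not show explicitly, and all parameter values here are illustrative defaults rather than values given in the presentation.

import random

def sphere(x):
    """Example objective: f(x) = sum(x_i^2); the minimum is 0 at the origin."""
    return sum(xi * xi for xi in x)

def pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0,
        w=0.7, phi_p=1.5, phi_g=1.5):
    # Initialization: random positions in [lo, hi] and random velocities
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[random.uniform(-(hi - lo), hi - lo) for _ in range(dim)]
          for _ in range(n_particles)]
    p_best = [x[:] for x in xs]              # personal best positions
    p_best_val = [f(x) for x in xs]          # personal best fitness values
    g_idx = min(range(n_particles), key=lambda i: p_best_val[i])
    g_best, g_best_val = p_best[g_idx][:], p_best_val[g_idx]

    for _ in range(iters):
        # 1. Evaluate fitness and update personal bests
        for i in range(n_particles):
            val = f(xs[i])
            if val < p_best_val[i]:
                p_best[i], p_best_val[i] = xs[i][:], val
        # 2. Update the global best from the personal bests
        g_idx = min(range(n_particles), key=lambda i: p_best_val[i])
        if p_best_val[g_idx] < g_best_val:
            g_best, g_best_val = p_best[g_idx][:], p_best_val[g_idx]
        # 3. Velocity and position updates (the two equations above)
        for i in range(n_particles):
            for d in range(dim):
                r_p, r_g = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + phi_p * r_p * (p_best[i][d] - xs[i][d])
                            + phi_g * r_g * (g_best[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
    return g_best, g_best_val

if __name__ == "__main__":
    best_x, best_val = pso(sphere, dim=5)
    print("best value found:", best_val)

Run as-is it prints the best objective value found; swapping sphere for another function with the same signature is all that is needed to optimize a different problem.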


Single Particle



Parameter Selection
Different ways to choose parameters:

proper balance between exploration and exploitation
(avoiding premature convergence to a local optimum while still ensuring a good rate of convergence to the optimum)

putting all attention on exploitation
(making it possible to search very large problem spaces)

automation by meta-optimization
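For orientation, the snippet below bundles one frequently cited "standard" parameter set (inertia weight of roughly 0.73 and acceleration coefficients of roughly 1.5, values commonly attributed to Clerc's constriction analysis) and feeds it to the pso sketch from the Algorithm slide. The dictionary and its names are my own illustration, not part of the presentation.

# A frequently used "standard PSO" setting; treat these as reasonable defaults
# to start tuning from, not as values prescribed by the slides.
PSO_PARAMS = {
    "n_particles": 30,   # typical swarm size for low-dimensional problems
    "w": 0.7298,         # inertia weight (constriction-derived)
    "phi_p": 1.49618,    # cognitive acceleration coefficient
    "phi_g": 1.49618,    # social acceleration coefficient
}

best_x, best_val = pso(sphere, dim=5,
                       n_particles=PSO_PARAMS["n_particles"],
                       w=PSO_PARAMS["w"],
                       phi_p=PSO_PARAMS["phi_p"],
                       phi_g=PSO_PARAMS["phi_g"])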



Avoiding Local Optima

adding a randomization factor to the velocity calculation

adding random momentum in specific iterations
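A sketch of the second idea under my own assumptions (the kick size and schedule are illustrative): a random perturbation is added to a particle's velocity only in specific iterations.

import random

def add_random_momentum(v, iteration, every=20, kick=0.5):
    """Occasionally add a random kick to a particle's velocity so the swarm
    can escape a local optimum (applied only in specific iterations)."""
    if iteration > 0 and iteration % every == 0:
        return [vd + random.uniform(-kick, kick) for vd in v]
    return v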



Swarm



Conclusion
"This algorithm belongs ideologically to that philosophical school
that allows wisdom to emerge rather than trying to impose it,
that emulates nature rather than trying to control it,
and that seeks to make things simpler rather than more complex."

James Kennedy, Russell Eberhart



References
Wikipedia
http://www.wikipedia.org/

Swarm Intelligence
http://www.swarmintelligence.org/

Jerome Onwunalu and Louis J. Durlofsky, "Application of a particle swarm optimization algorithm for determining optimum well location and type", 2009

James Kennedy and Russell Eberhart, "Particle Swarm Optimization", 1995
http://www.engr.iupui.edu/~shi/Coference/psopap4.html

thinkfluid, "Robot Swarm driven by Particle Swarm Optimization algorithm"
http://www.youtube.com/watch?v=RLIA1EKfSys
