SLIDE 1

Particle Swarm Optimization

Introduction Marco A. Montes de Oca

IRIDIA-CoDE, Université Libre de Bruxelles (U.L.B.)

May 7, 2007

Marco A. Montes de Oca Particle Swarm Optimization

SLIDE 2

Presentation overview

  • Origins
  • The idea
  • Continuous optimization
  • The basic algorithm
  • Main variants
  • Parameter selection
  • Research issues
  • Our work at IRIDIA-CoDE

SLIDE 3

Particle swarm optimization: Origins

How can birds or fish exhibit such coordinated collective behavior?

SLIDE 4

Particle swarm optimization: Origins

Reynolds [12] proposed a behavioral model in which each agent follows three rules:

  • Separation. Each agent tries to move away from its neighbors if they are too close.
  • Alignment. Each agent steers towards the average heading of its neighbors.
  • Cohesion. Each agent tries to go towards the average position of its neighbors.

SLIDE 5

Particle swarm optimization: Origins

Kennedy and Eberhart [6] included a ‘roost’ in a simplified Reynolds-like simulation so that:

  • Each agent was attracted towards the location of the roost.
  • Each agent ‘remembered’ where it was closer to the roost.
  • Each agent shared information with its neighbors (originally, all other agents) about its closest location to the roost.

SLIDE 6

Particle swarm optimization: The idea

Eventually, all agents ‘landed’ on the roost. What if the notion of distance to the roost is replaced by an unknown function? Will the agents ‘land’ on the minimum?

[Photos: J. Kennedy and R. Eberhart]

SLIDE 7

Continuous Optimization

The continuous optimization problem can be stated as follows:

[Figure: plots of a two-dimensional objective function.]

Find X∗ ⊆ X ⊆ R^n such that

X∗ = argmin_{x∈X} f(x) = {x∗ ∈ X : f(x∗) ≤ f(x) ∀x ∈ X}
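The definition above can be illustrated with a toy objective. A minimal sketch (the sphere function and the box X = [−4, 4]² are illustrative choices, not from the slides):

```python
import itertools

# Sphere function: f(x) = sum of squares; its global minimizer is the origin.
def f(x):
    return sum(xi * xi for xi in x)

# Crude grid search over X = [-4, 4]^2 as a stand-in for the argmin operator.
grid = [i * 0.5 for i in range(-8, 9)]   # 0.5-spaced grid on [-4, 4]
best = min(itertools.product(grid, repeat=2), key=f)
print(best)   # → (0.0, 0.0)
```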

SLIDE 8

Particle swarm optimization: The basic algorithm

1. Create a ‘population’ of agents (called particles) uniformly distributed over X.

SLIDE 9

Particle swarm optimization: The basic algorithm

2. Evaluate each particle’s position according to the objective function.

SLIDE 10

Particle swarm optimization: The basic algorithm

3. If a particle’s current position is better than its previous best position, update it.

SLIDE 11

Particle swarm optimization: The basic algorithm

4. Determine the best particle (according to the particles’ previous best positions).

SLIDE 12

Particle swarm optimization: The basic algorithm

5. Update particles’ velocities according to

v_i^{t+1} = v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (gb^t − x_i^t).

SLIDE 13

Particle swarm optimization: The basic algorithm

6. Move particles to their new positions according to

x_i^{t+1} = x_i^t + v_i^{t+1}.

SLIDE 14

Particle swarm optimization: The basic algorithm

7. Go to step 2 until the stopping criteria are satisfied.
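Putting steps 1–7 together, a minimal sketch of the basic algorithm on the sphere function (the objective, the values ϕ1 = ϕ2 = 2, and the velocity clamp are illustrative assumptions; the slides do not fix them):

```python
import random

def pso(f, bounds, n_particles=20, iters=200, phi1=2.0, phi2=2.0,
        vmax=1.0, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    # Step 1: particles uniformly distributed over X.
    x = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pb = [xi[:] for xi in x]              # previous best positions
    pb_val = [f(xi) for xi in x]          # step 2: evaluate
    g = min(range(n_particles), key=pb_val.__getitem__)
    gb, gb_val = pb[g][:], pb_val[g]      # step 4: best particle so far
    for _ in range(iters):                # step 7: iterate until done
        for i in range(n_particles):
            for d in range(dim):
                # Step 5: velocity update with random factors U1, U2.
                v[i][d] += (phi1 * rng.random() * (pb[i][d] - x[i][d])
                            + phi2 * rng.random() * (gb[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))  # clamp (assumption)
                x[i][d] += v[i][d]        # step 6: move
            val = f(x[i])                 # step 2: re-evaluate
            if val < pb_val[i]:           # step 3: update previous best
                pb[i], pb_val[i] = x[i][:], val
                if val < gb_val:          # step 4: update the best particle
                    gb, gb_val = x[i][:], val
    return gb, gb_val

sphere = lambda z: sum(zi * zi for zi in z)
best, best_val = pso(sphere, [(-5.0, 5.0)] * 2)
```

The velocity clamp stands in for the Vmax mechanism of early PSO implementations; without some damping the un-weighted update can diverge.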

SLIDE 22

Particle swarm optimization: Main variants

Almost all modifications vary the velocity-update rule in some way:

v_i^{t+1} = v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (gb^t − x_i^t)

SLIDE 23

Particle swarm optimization: Main variants

Almost all modifications vary the velocity-update rule in some way:

v_i^{t+1} = v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (gb^t − x_i^t)

Here the first term, v_i^t, is the inertia component.

SLIDE 24

Particle swarm optimization: Main variants

Almost all modifications vary the velocity-update rule in some way:

v_i^{t+1} = v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (gb^t − x_i^t)

Here the second term, ϕ1 U1^t (pb_i^t − x_i^t), is the personal-influence component.

SLIDE 25

Particle swarm optimization: Main variants

Almost all modifications vary the velocity-update rule in some way:

v_i^{t+1} = v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (gb^t − x_i^t)

Here the third term, ϕ2 U2^t (gb^t − x_i^t), is the social-influence component.

SLIDE 26

Particle swarm optimization: Different population topologies

Every particle i has a neighborhood N_i:

v_i^{t+1} = v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (lb_i^t − x_i^t)
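For instance, in a ring topology the neighborhood N_i of particle i contains only its two immediate neighbors (and, commonly, the particle itself), and lb_i is the best previous-best position within N_i. A hypothetical sketch:

```python
# Ring topology: particle i is connected to i-1 and i+1 (indices wrap around).
def ring_neighbors(i, n):
    return [(i - 1) % n, i, (i + 1) % n]

# lb_i is the index of the neighbor with the best (lowest) personal-best value.
def local_best(i, pb_val):
    return min(ring_neighbors(i, len(pb_val)), key=pb_val.__getitem__)

pb_val = [3.0, 0.5, 2.0, 0.1, 4.0]   # illustrative personal-best values
print(local_best(2, pb_val))          # → 3 (best value 0.1 in N_2 = {1, 2, 3})
```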

SLIDE 27

Particle swarm optimization: Inertia weight

It adds a parameter called the inertia weight, so that the modified rule is:

v_i^{t+1} = w v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (lb_i^t − x_i^t)

It was proposed by Shi and Eberhart [14].
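The effect of w can be seen in isolation: when the attraction terms vanish (pb = lb = x), the velocity decays geometrically for w < 1. A sketch with illustrative values (w = 0.7, ϕ1 = ϕ2 = 2 are assumptions, not taken from the slides):

```python
import random

# One inertia-weighted velocity-update step for a single dimension.
def velocity_update(v, x, pb, lb, w=0.7, phi1=2.0, phi2=2.0, rng=random):
    return (w * v
            + phi1 * rng.random() * (pb - x)
            + phi2 * rng.random() * (lb - x))

# With pb = lb = x, both attraction terms vanish and v shrinks by w each step.
v = 1.0
for _ in range(10):
    v = velocity_update(v, 0.0, 0.0, 0.0)
print(v)   # → 0.7 ** 10 ≈ 0.028
```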

SLIDE 28

Particle swarm optimization: Time-decreasing inertia weight

The value of the inertia weight is decreased during a run. It was proposed by Shi and Eberhart [15].

SLIDE 29

Particle swarm optimization: Canonical PSO

It is a special case of the inertia-weight variant, derived from

v_i^{t+1} = χ ( v_i^t + ϕ1 U1^t (pb_i^t − x_i^t) + ϕ2 U2^t (lb_i^t − x_i^t) ),

where χ is called a “constriction factor” and is fixed. It has been very influential since its proposal by Clerc and Kennedy [3].
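Clerc and Kennedy [3] derive χ from the acceleration coefficients; for ϕ = ϕ1 + ϕ2 > 4 the commonly used form is χ = 2 / |2 − ϕ − √(ϕ² − 4ϕ)|. A sketch (the popular setting ϕ1 = ϕ2 = 2.05 is an illustrative choice):

```python
import math

# Constriction factor for the case phi = phi1 + phi2 > 4.
def constriction(phi1, phi2):
    phi = phi1 + phi2
    if phi <= 4:
        raise ValueError("this form of the formula requires phi1 + phi2 > 4")
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

chi = constriction(2.05, 2.05)
print(round(chi, 4))   # → 0.7298
```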

SLIDE 30

Particle swarm optimization: Fully Informed PSO

In the Fully Informed PSO, a particle is attracted by every other particle in its neighborhood:

v_i^{t+1} = χ ( v_i^t + Σ_{p_k ∈ N_i} ϕ_k U_k^t (pb_k^t − x_i^t) ).

It was proposed by Mendes et al. [9].
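A sketch of the fully informed update for one particle and one dimension; splitting a total acceleration ϕ evenly among the neighbors is an assumption here (the slide leaves the ϕ_k unspecified):

```python
import random

def fips_update(v_i, x_i, neighbor_pbs, chi=0.7298, phi=4.1, rng=random):
    # Each neighbor k contributes an attraction toward its personal best pb_k.
    phi_k = phi / len(neighbor_pbs)
    acc = sum(phi_k * rng.random() * (pb_k - x_i) for pb_k in neighbor_pbs)
    return chi * (v_i + acc)

# All neighbors' bests lie below x_i = 1.0, so the new velocity is <= 0.
new_v = fips_update(0.0, 1.0, [0.5, 0.2, 0.8])
```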

SLIDE 31

Particle swarm optimization: Other variants

There are many other variants reported in the literature. Among others:

  • with dynamic neighborhood topologies (e.g., [16], [10])
  • with enhanced diversity (e.g., [2], [13])
  • with different velocity-update rules (e.g., [11], [8])
  • with components from other approaches (e.g., [1], [5])
  • for discrete optimization problems (e.g., [7], [18])
  • ...

SLIDE 32

Particle swarm optimization: Parameter selection

Consider a one-particle, one-dimensional particle swarm. This particle’s velocity-update rule is

v^{t+1} = a v^t + b1 U1^t (pb^t − x^t) + b2 U2^t (gb^t − x^t)

SLIDE 33

Particle swarm optimization: Parameter selection

Additionally, if we make E[U_∗^t(0, 1)] = 1/2, b = (b1 + b2)/2, pb^{t+1} = pb^t, gb^{t+1} = gb^t, and

r = (b1 / (b1 + b2)) pb^t + (b2 / (b1 + b2)) gb^t.

SLIDE 34

Particle swarm optimization: Parameter selection

Then, we can say that v^{t+1} = a v^t + b(r − x^t).
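This deterministic recurrence is easy to simulate. A sketch (the pair a = 0.6, b = 1.0 is an illustrative choice inside the convergence region analyzed by Trelea [17]):

```python
# Iterate v' = a*v + b*(r - x), x' = x + v' from an arbitrary start.
def simulate(a, b, r, x0=5.0, v0=0.0, steps=200):
    x, v = x0, v0
    for _ in range(steps):
        v = a * v + b * (r - x)
        x = x + v
    return x

print(round(simulate(0.6, 1.0, r=2.0), 6))   # → 2.0: x converges to r
```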

SLIDE 35

Particle swarm optimization: Parameter selection

It can be shown that this system will behave in different ways depending on the values of a and b.

Graph taken from Trelea [17].

SLIDE 36

Particle swarm optimization: Parameter selection

Some examples

Graph taken from Trelea [17].

SLIDE 37

Particle swarm optimization: Parameter selection

Factors to consider when choosing a particular variant and/or a parameter set:

  • The characteristics of the problem (“modality”, search ranges, dimension, etc.)
  • The available search time (wall-clock time or function evaluations)
  • The solution-quality threshold that defines a satisfactory solution

SLIDE 38

Particle swarm optimization: Research issues

A number of research directions are currently being pursued:

  • Matching algorithms (or algorithmic components) to problems
  • Application to different kinds of problems (dynamic, stochastic, combinatorial)
  • Parameter selection (how many particles? which topology?)
  • Identification of “state-of-the-art” PSO algorithms (comparisons)
  • New variants (modifications, hybridizations)
  • Theoretical aspects (particle behavior, stagnation)

SLIDE 39

Particle swarm optimization: Our work at IRIDIA-CoDE

We have been working on three of the previously mentioned directions:

  • Identification of “state-of-the-art” PSO algorithms (comparisons)
  • Matching algorithms (or algorithmic components) to problems
  • New variants (modifications, hybridizations)

SLIDE 40

Particle swarm optimization: Comparisons

We used run-time and solution-quality distributions [4] to compare several PSO variants.

[Figure: run-time distributions; axes: run-time [CPU sec], rel. soln. quality [%], P(solve).]

SLIDE 41

Particle swarm optimization: Comparisons

Rosenbrock function

SLIDE 42

Particle swarm optimization: Comparisons: Different Topologies

(20 particles, Rosenbrock): Fully connected topology

SLIDE 43

Particle swarm optimization: Comparisons: Different Topologies

(20 particles, Rosenbrock): Square topology

SLIDE 44

Particle swarm optimization: Comparisons: Different Topologies

(20 particles, Rosenbrock): Ring topology

SLIDE 48

Particle swarm optimization: Comparisons: Different Topologies

(20 particles, Rastrigin): Fully connected topology

SLIDE 49

Particle swarm optimization: Comparisons: Different Topologies

(20 particles, Rastrigin): Square topology

SLIDE 50

Particle swarm optimization: Comparisons: Different Topologies

(20 particles, Rastrigin): Ring topology

SLIDE 51

Particle swarm optimization: New variants (Frankenstein’s PSO)

Rastrigin (best configurations for speed)

SLIDE 52

Particle swarm optimization: New variants (Frankenstein’s PSO)

Rastrigin (best configurations for quality)

SLIDE 53

Thank you

(More) Questions?

http://iridia.ulb.ac.be/~mmontes/slidesCIL/slides.pdf

SLIDE 54

[1] Peter J. Angeline. Using selection to improve particle swarm optimization. In Proceedings of the 1998 IEEE Congress on Evolutionary Computation, pages 84–89, Piscataway, NJ, USA, 1998. IEEE Press.

[2] Tim M. Blackwell and Peter J. Bentley. Don’t push me: collision-avoiding swarms. In Proceedings of the 2002 IEEE Congress on Evolutionary Computation, pages 1691–1696, Piscataway, NJ, USA, 2002. IEEE Press.

[3] Maurice Clerc and James Kennedy. The particle swarm–explosion, stability, and convergence in a multidimensional complex space. IEEE Transactions on Evolutionary Computation, 6(1):58–73, 2002.

[4] Holger H. Hoos and Thomas Stützle. Stochastic Local Search: Foundations and Applications. Morgan Kaufmann, San Francisco, CA, USA, 2004.

SLIDE 55

[5] Mudassar Iqbal and Marco A. Montes de Oca. An estimation of distribution particle swarm optimization algorithm. In Marco Dorigo, Luca M. Gambardella, Mauro Birattari, Alcherio Martinoli, Riccardo Poli, and Thomas Stützle, editors, LNCS 4150. Ant Colony Optimization and Swarm Intelligence. 5th International Workshop, ANTS 2006, pages 72–83, Berlin, Germany, 2006. Springer-Verlag.

[6] James Kennedy and Russell Eberhart. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, pages 1942–1948, Piscataway, NJ, USA, 1995. IEEE Press.

[7] James Kennedy and Russell Eberhart. A discrete binary version of the particle swarm algorithm. In Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics, pages 4104–4108, Piscataway, NJ, USA, 1997. IEEE Press.

[8] Jing Liu, Wenbo Xu, and Jun Sun.

SLIDE 56

Quantum-behaved particle swarm optimization with mutation operator. In Proceedings of the 17th IEEE International Conference on Tools with Artificial Intelligence (ICTAI 05), pages 237–240, Piscataway, NJ, USA, 2005. IEEE Press.

[9] Rui Mendes, James Kennedy, and José Neves. The fully informed particle swarm: Simpler, maybe better. IEEE Transactions on Evolutionary Computation, 8(3):204–210, 2004.

[10] Arvind Mohais, Rui Mendes, Christopher Ward, and Christian Postoff. Neighborhood re-structuring in particle swarm optimization. In Shichao Zhang and Ray Jarvis, editors, LNCS 3809. Proceedings of the 18th Australian Joint Conference on Artificial Intelligence, pages 776–785, Berlin, 2005. Springer.

[11] Riccardo Poli, Cecilia Di Chio, and William B. Langdon. Exploring extended particle swarms: A genetic programming approach.

SLIDE 57

In Proceedings of the 2005 Conference on Genetic and Evolutionary Computation, pages 169–176, New York, NY, USA, 2005. ACM Press.

[12] Craig W. Reynolds. Flocks, herds, and schools: A distributed behavioral model. ACM Computer Graphics, 21(4):25–34, 1987.

[13] J. Riget and J. Vesterstroem. A diversity-guided particle swarm optimizer - the ARPSO. Technical Report 2002-02, Department of Computer Science, University of Aarhus, 2002.

[14] Yuhui Shi and Russell Eberhart. A modified particle swarm optimizer. In Proceedings of the IEEE International Conference on Evolutionary Computation, pages 69–73, Piscataway, NJ, USA, 1998. IEEE Press.

[15] Yuhui Shi and Russell Eberhart. Empirical study of particle swarm optimization.

SLIDE 58

In Proceedings of the 1999 IEEE Congress on Evolutionary Computation, pages 1945–1950, Piscataway, NJ, USA, 1999. IEEE Press.

[16] Ponnuthurai N. Suganthan. Particle swarm optimiser with neighbourhood operator. In Proceedings of the 1999 IEEE Congress on Evolutionary Computation, pages 1958–1962, Piscataway, NJ, USA, 1999. IEEE Press.

[17] Ioan C. Trelea. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Information Processing Letters, 85(6):317–325, 2003.

[18] Kang-Ping Wang, Lan Huang, Chun-Guang Zhou, and Wei Pang. Particle swarm optimization for traveling salesman problem. In Proceedings of the 2003 IEEE International Conference on Machine Learning and Cybernetics, pages 1583–1585, Piscataway, NJ, USA, 2003. IEEE Press.
