
Maurice.Clerc@WriteMe.com

Below are some drafts and source codes. See also the Particle Swarm Central (useful links about people, papers, etc.).
Note that some items are not specific to PSO, but are more generally about optimisation.
Last item added or modified (2021-06-19): Iterative optimisation - When a best algorithm does exist and other frustrating theorems
 
Each entry gives: title, comments, last update, download.
DEMO/Fun: A simplified adaptive version (C source). No parameters to "guess", and you can add your own objective function. NEW: you can now solve Graph Coloring Problems. Also, you can generate a data file for post-mortem animation. If you want, you can try some options like local queens or Guided Random Generation. 2001-11-12.
Some animations (zipped)
ANSI C source code + examples (zipped)
Cheap_PSO (Think locally, act locally) (.pdf + .doc, zipped) (not up to date)
Note: see Tribes for a more recent (and quite different) PSO version

DEMO/Animation: A small animation showing how a swarm generates subswarms around the suboptima of the Alpine function. 2000/10/24. Animated GIF (zipped)
Some math about Particle Swarm Optimization (PSO): Analytical view. Explicit and implicit representations. Particle in Complexland. Convergence and space of states, constriction. (Quite big; you should rather download the PDF file.) 1998/11/27. PDF file, 182 KB
Algebraic View: Basic case. Eigenvalues. Cycles. 1998/04/30. PDF file (zipped)
The Alpine Function: Many local optima, just one maximum. 1998/11/24. PDF file (zipped)
The Most Stupid Algorithm (MSA): Random search. Cost of the algorithm. 1998/11/24. PDF file (zipped)
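To make the "cost of the algorithm" concrete, here is a minimal random-search sketch in C. It is my own demo code, not the paper's: it draws points uniformly in the search space and keeps the best one. If each draw hits the target region with probability p, the expected number of draws before a hit is 1/p, which is the kind of cost the paper analyses.

```c
/* Minimal random search (demo code, not the paper's). */
#include <stdio.h>
#include <stdlib.h>

#define D 2   /* dimension (demo value) */

double f(double x[D])   /* example objective: sphere, minimum at 0 */
{
    double s = 0.0;
    int d;
    for (d = 0; d < D; d++) s += x[d] * x[d];
    return s;
}

int main(void)
{
    double xmin = -10.0, xmax = 10.0;
    double x[D], best[D] = {0.0}, fx, fbest = 1e30;
    int t, d;

    srand(12345);
    for (t = 0; t < 100000; t++) {
        /* draw a point uniformly in [xmin, xmax]^D */
        for (d = 0; d < D; d++)
            x[d] = xmin + (xmax - xmin) * ((double)rand() / RAND_MAX);
        fx = f(x);
        if (fx < fbest) {           /* keep the best point so far */
            fbest = fx;
            for (d = 0; d < D; d++) best[d] = x[d];
        }
    }
    printf("best f = %g at (%g, %g)\n", fbest, best[0], best[1]);
    return 0;
}
```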
Performance Maps: Deterministic PSO. Convergence time vs. two parameters. 1998/11/24. PDF file (zipped)
Mechanical Analogy (spring picture): Two parameters. Virtual springs. 1998/11/24. PDF file
The Simplest PSO: Deterministic, one equation, one parameter. No-hope/Re-hope method. Alpine and Banana functions. 1998/12/11. PDF file (zipped)
The Swarm & The Queen: A great improvement of the "Simplest PSO", particularly thanks to a gravity centre (the "queen") and a better adaptive No-hope/Re-hope method. The paper was presented at CEC99. 1999/09/22. Play with a worksheet (2D Alpine function, French Excel version, 440 KB).
Slides (1999/07/26): PowerPoint file (Macintosh, may work on some PCs) (zipped)
Online (Web pages)
Draft PDF version of the paper (slightly updated in 2001) (zipped)
A little fuzzy toolbox: Fuzzy set computation tools to prepare a fuzzy PSO. Arithmetic functions and some classical ones (exp, log, sin). 1999/03/28. ANSI C source code
The Particle Swarm: Explosion, Stability, and Convergence in a Multi-Dimensional Complex Space, by Maurice Clerc and James Kennedy. A big paper about the algebraic and analytical views, with examples. Published by IEEE in 2002, so only the draft can be given here. Winner of the 2005 IEEE Transactions on Evolutionary Computation award ($1,000 for Jim and me ;-) ). 1998 / 2002.
A preliminary paper
Draft (zipped MSWord file)
Discrete PSO: Traveling Salesman Problem (v. 4.2). PSO can work on finite state spaces and with non-continuous functions. Here is an example: looking for the shortest cycles in a weighted graph. The latest version is cleaner, and strictly written in ANSI C. 2000/02/29 / 2003-04-23.
.pdf file (4.2, zipped)
C sources (4.4, zipped)
ANSI C sources (zipped)
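The key idea behind a discrete PSO of this kind is to redefine "position minus position" as a sequence of transpositions, which then plays the role of a velocity. Here is a toy C illustration of that principle only; the structures and names are mine, not taken from the 4.x sources.

```c
/* "Tour difference" as a swap sequence: a list of transpositions
   that transforms tour a into tour b. Illustrative sketch. */
#include <stdio.h>

#define N 5   /* number of cities (demo value) */

/* Fill 'swaps' with at most N transpositions (i, j) such that
   applying them in order to 'a' yields 'b'. Returns their number,
   i.e. the "length" of this velocity. */
int tour_diff(int a[N], int b[N], int swaps[N][2])
{
    int i, j, k = 0, t[N];
    for (i = 0; i < N; i++) t[i] = a[i];
    for (i = 0; i < N; i++) {
        if (t[i] == b[i]) continue;          /* already in place */
        for (j = i + 1; j < N; j++)
            if (t[j] == b[i]) break;         /* find the city */
        swaps[k][0] = i; swaps[k][1] = j; k++;
        { int tmp = t[i]; t[i] = t[j]; t[j] = tmp; }
    }
    return k;
}

int main(void)
{
    int a[N] = {0, 1, 2, 3, 4}, b[N] = {2, 0, 1, 4, 3};
    int swaps[N][2], i, k = tour_diff(a, b, swaps);
    for (i = 0; i < k; i++)
        printf("swap positions %d and %d\n", swaps[i][0], swaps[i][1]);
    return 0;
}
```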
Discrete PSO: Fuzzy Combinatorial Black Box. When running a combinatorial black box with PSO, the problem is that, depending on the parameters, results can be quite good ... or worse than random or exhaustive search. If you have no way to guess what the best parameters could be, use fuzzy representations: you will never have very bad results (and almost never very good ones). 2000/04/28. PDF file (zipped)
Optimisation par essaim particulaire (Particle Swarm Optimisation, in French): Presentation given in Paris at JET5. 2000/05/05. Zipped PowerPoint files: OEP1er, OEP2
When Ant Colony Optimization does not need Swarm Intelligence: A very short paper formally proving that ACO, as defined by Marco Dorigo in 1999, does not really use swarm intelligence. The underlying idea is to analyze ACO in order to see if some principles could be useful to improve PSO. 2000/08/08. PDF file (zipped)
OEP et coloriage de graphe pour l'affectation de fréquences (PSO and Graph Coloring for Frequency Assignment) (in French, expurgated version): A hybrid system using an adaptive PSO and some more classical algorithms gives interesting results. 2001-07. PDF file (zipped)
PSO and Information: What kind of information does PSO use to find a solution, and how does it use it? First part (draft) of this study. 2001-11-16. MSWord file and PDF file (zipped)
A few words about constriction: A conjecture and a rule of thumb to find "good" parameters. 2001-12-11. MSWord file and PDF file (zipped)
The old bias and its solution: All PSO variants using randomness also use biased formulas. This does not prevent convergence, but it increases the variance of a set of runs. This short paper explains the bias and how to avoid it. 2001-12-12 / 2003-07-10. MSWord file and PDF file (zipped).
Warning: this version is also biased! See Uniform hyperspherical distribution below.
F(?)AQ: Not so F, but you quite often ask me some questions. Here are some answers. Don't hesitate to write to me, however. 2006-03-11
Think Locally, Act Locally: A Framework for Adaptive Particle Swarm Optimizers. In Particle Swarm Optimization, each particle moves in the search space and updates its velocity according to the best previous positions already found by its neighbors (and itself), trying to find an even better position. This approach has been proved powerful but needs parameters predefined by the user, like swarm size, neighborhood size, and some coefficients, and tuning these parameters for a given problem may in fact be quite difficult. This paper presents a framework for designing adaptive particle swarm optimizers without any "supervisor", in order to obtain autonomous methods in which the user has almost nothing to do but describe the problem. To illustrate the adaptation process, a possible implementation for a simple PSO version is given, so that we can examine how different, and globally better, the behavior of the swarm is on some classical test functions. 2002-07-01 / 2004-10-24. PDF
Tribes, a parameter free PSO: Particle Swarm Optimizers usually need some predefined parameters, i.e. numerical coefficients, swarm size, neighbourhood size and topology. This is not the case with the present version, thanks to two techniques: hyperspheres instead of hyperparallelepipeds for proximity areas, and adaptation of the swarm size as well as of the relationships between the particles. 2004-05-05.
.doc paper (zipped)
Warning: older than the C source code
Tribes and Strategies: Here is the last "public" source code. Different strategies depending on the "past" of the particle; non-uniform distributions for proximity areas (Gaussian, for example); fuzzification, noise, recursivity, etc.; a "no clamping and no evaluation" option, etc.
The C version contains a QAP (Quadratic Assignment Problem) option and a Traveling Salesman Problem (TSP) option. Not that good, but may be useful for small graphs (say 30 nodes). The multiobjective option can also be used to cope with indicative (soft) constraints.
Updates: 2006-11-10, 2006-10-26, 2006-09-21, 2006-01-03, 2005-03-25, 2004-10-04.
Download: Zipped C sources of the last public version.
Change notes (most recent first):
- Fixed some bugs in the simplified Java version. Results are now far better (in particular for Schwefel+noise, and Elliptic).
- Fixed a bug in the simplified Java version (about the information network).
- Slightly simplified Java version.
- Java version. It includes the "sunnyspell" method. I have added the 24 constrained problems of CEC 2006. Also, you can now safely launch several problems at the same time.
- Added 14 of the 25 functions of the CEC05 benchmark set. If you want to use them, you must also download this BM directory.
- Added a pseudo-gradient method to choose the best informer. Valid only if the search space is a metric one, so it is an option.
- This version now contains a correct hyperspherical distribution. The independent Gaussian distributions are quite good for small, partly combinatorial problems.
- A new distribution has been added (positive ellipsoidal sectors).
- You can now use any kind of discrete variable, by giving the list of acceptable values.
- Some "classical" examples have been added (gear train, pressure vessel, coil spring). At this date, results are slightly better than the best published ones, without any parameter to tune.
Tribes and MO: A small "paper" about multiobjective problems with Tribes, compared to MOPSO. Three examples are given, but the above source code contains far more (see myfunction.c). 2003-05-02. Obsolete, see TRIBES-D
Distorted Gaussian Distribution: In order to escape "the old bias", Tribes can use different kinds of distributions, for example spherical or Gaussian ones, but they are all symmetrical. This one is not: it is "distorted" along an axis, typically p_i <-> p_g (best previous position <-> best previous position in the informer group). Since it can easily be adapted to any PSO version, I give here just a "generic" C code, using two points A and B. Of course, in real use, you take only one point of the distribution. 2003-06-25. Zipped C source
Uniform Hyperspherical Distribution: Last week, Rui Mendes pointed out that my "unbiased distribution" was in fact biased. So I have written a new version that should be perfect (and faster, anyway). It is now included in Tribes. 2003-07-10. C source (zipped)
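For reference, a standard unbiased recipe for this draw is: take a Gaussian vector to get a uniform direction, then scale the radius by U(0,1)^(1/D). The following generic C sketch shows that technique; it is not necessarily the exact code of the linked source.

```c
/* Uniform sampling inside a hypersphere of centre c and radius r:
   a Gaussian vector gives a uniform direction, and the radius is
   scaled by U^(1/D) so points are uniform over the volume. */
#include <math.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

double alea01(void) { return (rand() + 1.0) / (RAND_MAX + 2.0); } /* U(0,1), never 0 */

double gauss01(void)  /* one N(0,1) sample (Box-Muller) */
{
    return sqrt(-2.0 * log(alea01())) * cos(2.0 * M_PI * alea01());
}

void alea_sphere(int D, const double *c, double r, double *x)
{
    double norm = 0.0, rad;
    int d;
    for (d = 0; d < D; d++) { x[d] = gauss01(); norm += x[d] * x[d]; }
    norm = sqrt(norm);
    rad = r * pow(alea01(), 1.0 / D);   /* uniform over the volume */
    for (d = 0; d < D; d++) x[d] = c[d] + rad * x[d] / norm;
}
```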
Semi-continuous challenge In order to improve Tribes, I need help to solve these six problems. What are the best PSO versions and the best parameters? Is it possible to define some rules to find these good parameters? For amatheurs, I have added a few words about "theoretical difficulty" and  "handling constraints by homeomorphism". 2004-02-20 .pdf version
OEP: You wanted it, here it is. The parametric PSO I used for the Semi-continuous challenge. Note that the comments are in French ... 2004-10-14. C source (zipped)
PSO_basic: Between the too_many_parameters OEP and the no_parameters_at_all Tribes, this simple version may be enough for some people. 2004-08-09. C source (zipped)
A peer-reviewed electronic paper (in French): M. Clerc et P. Siarry, "Une nouvelle métaheuristique pour l'optimisation : la méthode des essaims particulaires", J3eA, Vol. 3 (2004) 7. 2004-09-20. PDF file
Binary PSO: toolbox, derivations, and mathematical insights. Still a draft, and quite long (40 p.), but you may be interested. 2005-02-02. Open Archive HAL; C source (zipped)
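As a reminder of the classical starting point on which such a toolbox builds, the usual binary PSO rule (Kennedy and Eberhart, 1997) keeps the real-valued velocity update and turns each bit into a Bernoulli draw through a sigmoid. A generic C sketch, with parameter names of my own:

```c
/* Classical binary PSO update for one bit of one particle.
   x, p, g are bits (0/1); v is the real-valued velocity. */
#include <math.h>
#include <stdlib.h>

double alea(void) { return (double)rand() / RAND_MAX; }
double sigmoid(double v) { return 1.0 / (1.0 + exp(-v)); }

void update_bit(int *x, double *v, int p, int g,
                double w, double c1, double c2, double vmax)
{
    *v = w * (*v) + c1 * alea() * (p - *x) + c2 * alea() * (g - *x);
    if (*v >  vmax) *v =  vmax;   /* clamping keeps the probability */
    if (*v < -vmax) *v = -vmax;   /* away from exactly 0 and 1      */
    *x = (alea() < sigmoid(*v)) ? 1 : 0;   /* Bernoulli draw */
}
```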

L'optimisation par essaims particulaires. Versions paramétriques et adaptatives: A book in French, published by Hermès-Lavoisier, 260 pages. It contains quite a few things never published, neither here nor in journals, for example on possibility distributions or convergence criteria. Some technical appendices for "amatheurs". English version: Particle Swarm Optimization (Wiley book, 2006). 2005-01.
ERRATUM (2005-02-21):
- p. 138, 9.2.3.4: replace "K est un fait un maximum" with "K est en fait une valeur moyenne"
- 9.2.3.6: replace "Informatrices N" with "Exploratrices N"
- p. 239: replace the last formula with x(t+1) = x(t) + v(t+1) + (p - x(t)) (1 - χ')

Stagnation analysis: or "What happens when nothing happens". Thanks to this mathematical analysis, better confidence coefficients for parametric PSO are defined. 2006-08-05 / 2006-01-19 / 2005-06-19.
Version edited by Riccardo Poli (better English, but not complete): PDF here or on the XPS technical reports page (ref. CSM-460)
Complete PDF file on Open Archive HAL (see in particular Figure 4)
First version (not online anymore)
SunnySpell: Just a way to add a new particle that is no longer purely random. 2005-08-26
Particle Swarm Optimization, ISTE: This book contains some things never published in English, even here, for example about non-uniform distributions of possibilities, convergence criteria, rumor spreading, etc. This is a translation of the 2005 book in French, with a few updates. 2006-02
Confinements and Biases: Because of the confinement, PSO performance depends on where the solution point is (near the center of the search space, near a bound, etc.). Nine confinement methods (including no confinement at all) are studied. A more robust hybrid one is suggested. 2006-03-12. Open Archive HAL
When a particle leaves the search space, the "no clamping and no evaluation" method is the simplest one. However, it may happen that all particles then keep moving outside the search space, and the algorithm never stops. Also, even if they induce a bias, some confinement methods may be more effective.
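As an illustration, here are two common confinements, sketched in C for one coordinate. This is a simplified sketch of the general technique; the paper itself studies nine methods plus the hybrid one.

```c
/* Two common confinements for one coordinate. */

void confine_clamp(double *x, double *v, double xmin, double xmax)
{
    /* "absorbing" bound: put the particle on the bound, zero velocity */
    if (*x < xmin) { *x = xmin; *v = 0.0; }
    if (*x > xmax) { *x = xmax; *v = 0.0; }
}

void confine_bounce(double *x, double *v, double xmin, double xmax)
{
    /* reflecting bound: mirror the position, reverse the velocity
       (assumes the overshoot is smaller than the range) */
    if (*x < xmin) { *x = 2.0 * xmin - *x; *v = -(*v); }
    if (*x > xmax) { *x = 2.0 * xmax - *x; *v = -(*v); }
}
```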
Some ideas about PSO: Some of them are so crazy that they might work. Others seem reasonable but may not work. 2006-11-06. PDF file
Comparing two algorithms: How to compare two stochastic algorithms from a user's point of view. 2006-07-04. Link to the corresponding item on the Math page
Non-parametric optimisation for unimodal functions: Dichotomy. In order to improve Tribes whenever the fitness function seems to be unimodal, we need a specific strategy. Here is one (by dichotomy). 2006-08-14 / 2006-07-13.
New version, which includes a non-unimodality test: Scilab/MatLab code
Swissknife PSO: Contains a lot of options, in order to "simulate" most of the interesting PSO variants. 2007-03-01. C source
Back to Random topology: A small paper re-explaining the method used in some of my "basic" PSOs, showing why it is equivalent to some "new" ones, and suggesting how it could be improved. 2007-02-26. PDF file
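For reference, the method is simple: each particle informs itself and K other particles chosen at random (typically K = 3), and the links are redrawn whenever an iteration brings no global improvement. A C sketch, with names of my own:

```c
/* Random information topology as used in SPSO 2006/2007. */
#include <stdlib.h>

#define S 40          /* swarm size (typical value) */
#define K 3           /* random informees per particle */

int links[S][S];      /* links[i][j] = 1 if particle i informs j */

void draw_links(void)
{
    int i, k;
    for (i = 0; i < S; i++) {
        for (k = 0; k < S; k++) links[i][k] = 0;
        links[i][i] = 1;                  /* each particle informs itself */
        for (k = 0; k < K; k++)
            links[i][rand() % S] = 1;     /* K random targets (may repeat) */
    }
}
/* On average each particle is informed by about K+1 others, but the
   actual number varies from particle to particle, which is the point.
   Call draw_links() again after any iteration without improvement. */
```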
When Nearer is Better: Most iterative optimisation algorithms, if not all, assume that "nearer is better", more or less. A mathematical definition, and some fascinating derivations. 2007-03-22. Open archive HAL
TRIBES-D: A simplified version for real heterogeneous problems. It easily finds the best known solution of the classical "Pressure vessel" problem. For multiobjective problems, it is also far better than the previous TRIBES version. 2013-01-16 / 2008-02-01.
Just for fun, added a Magic square problem. C code and some results (ZIP file)
Simple Binary PSO: Coming partly from the binary toolbox, a C program easy to use and nevertheless quite effective. 2010-09-08 / 2008-05-11 / 2008-04-30. C source (zipped)
Bimetric Binary PSO: Coming from the previous one. Even simpler, and slightly better. 2008-11-04. C source (zipped)
Balanced PSO: Contrary to what is often claimed, Standard PSO does not ensure a good balance between exploration and exploitation. This small paper explains why, and then suggests an improved variant.
Successive additions: variable swarm size; more options (orthogonal walk, bi-strategy, etc.); more options (variable ring topology, orthogonal learning, etc.).
Updates: 2010-04-15, 2010-03-17, 2010-02-28, 2010-02-15, 2010-01-09, 2009-09-30, 2009-01-12, 2009-01-05, 2008-12-24, 2008-10-31.
C source (zipped), several versions, including the one for the Handbook of Swarm Intelligence.
PDF paper (2008-10-3, obsolete, there were a few bugs)
Initialisations: PSO can be significantly improved just by using a good initialisation scheme for the positions and the velocities. 2008-12-24 / 2008-12-14.
PDF paper
See Balanced PSO
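One scheme of this kind is the "half-diff" velocity initialisation used in SPSO 2007: the initial velocity is half the difference between a second uniform random point and the initial position, so a particle cannot leave the search space at its first step. A C sketch of that single scheme (the paper compares several):

```c
/* "Half-diff" initialisation of one particle (SPSO 2007 style). */
#include <stdlib.h>

double alea(double a, double b)   /* uniform in [a, b] */
{
    return a + (b - a) * ((double)rand() / RAND_MAX);
}

void init_particle(int D, const double *xmin, const double *xmax,
                   double *x, double *v)
{
    int d;
    for (d = 0; d < D; d++) {
        x[d] = alea(xmin[d], xmax[d]);               /* uniform position */
        v[d] = (alea(xmin[d], xmax[d]) - x[d]) / 2.0; /* half-difference  */
    }
}
```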
A method to improve standard PSO: Sometimes you know that your problem belongs to a given class. For example: quite low dimension (say smaller than 30) and weakly multimodal. In such a case, it is possible to design a specific PSO more efficient than the standard one. Here is a method. 2009-03-13. PDF paper

Universal Random Number Generator (in Maths Results, but useful for some PSO variants). 2009-12-22

A Mini-benchmark: Four carefully chosen problems, and some results with Standard PSO 2007 and Variable PSO (see below). Very useful to quickly estimate whether your own brand new algorithm is really promising or not. 2010-04-15. PDF paper
Variable PSO: By playing with Balanced PSO, I have found this simple and effective variant: like SPSO 2007, but with a variable ring topology and a variable swarm size. You may try various rules to generate/delete particles. 2011-01-20 / 2010-05-13 / 2010-04-15. C source (zipped) for each version
Ice Cream Cone PSO: Towards Standard PSO 201x? Almost as simple as SPSO 2007, but not biased. 2010-11-09. C source (zipped)
Standard PSO Descriptions: SPSO 2006, 2007, 2011, as available on the Particle Swarm Central. Informal and formal descriptions, with a few comments. 2012-09-13 / 2011-07-12. Open archive HAL
FAQ about PSO: Open Office presentation (ICSI 2011 seminar in Cergy). 2011-07-26. ODP file + TrueType font (zipped)
Let's build a PSO: Open Office presentation (ICSI 2011 seminar in Cergy). 2011-07-26. ODP file + TrueType font (zipped)
Linear DE: L-DE is a "geometrical" Differential Evolution variant whose behaviour does not depend on the coordinate system. 2011-11-04. C source (zipped)
Soft Computing doesn't need to be hard: SocPros 2011 presentation in Roorkee. Includes the presentation file (you need LibreOffice or OpenOffice to open it) and the text of the talk. 2011-12-26. LibreOffice presentation (zipped)

List Based PSO: When you often have to solve variants of the same kind of problem, the random number generator can be replaced by a short list of predefined numbers, with far better results. A short paper explains how to do it, along with the C code. Note that it includes a classical RNG, for tests and comparisons. So for the same price, i.e. nothing, you also get a pretty simple but not bad classical PSO. 2013-09-13 / 2012-02-18.
C source + lists (zipped), Version 2
C source + lists (zipped)
Open archive HAL
Specific SPSO 2011 as LBO (C source + lists)
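The mechanism itself fits in a few lines: every call to the U(0,1) generator is replaced by the next element of a short cyclic list. A C sketch with placeholder values; the tuned lists are in the downloadable sources.

```c
/* List-based "generator": a short cyclic list of numbers in [0,1)
   replaces the RNG. The values below are placeholders, not the
   tuned lists shipped with the source code. */
static double list[] = { 0.13, 0.47, 0.71, 0.29, 0.92, 0.05, 0.58, 0.84 };
static int rank = 0;

double next_number(void)
{
    double r = list[rank];
    rank = (rank + 1) % (int)(sizeof list / sizeof list[0]);
    return r;
}
/* Every call to the usual alea(0,1) in the PSO is then replaced by
   next_number(): runs become exactly reproducible, and with a
   well-chosen list the results can beat those of a classical RNG. */
```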
Simple Generators for List-based Optimisers: Paper, and talk presented at ICSIBO 2014. 2014-06 / 2014-05. Open archive (HAL); LibreOffice Impress file (zipped)
Randomness matters: Depending on the kind of "randomness" you use, performance can be very different. 2012-04-14. Open archive HAL
Cooperation Mechanisms in PSO: Five cooperation mechanisms, viz. Vicinity, Reciprocity, Kin, Reputation and Anybody, are studied. The appendix gives some details about the fair comparison of success rates, and about the concepts of valued topology and chains of information, which may be worth further investigation. 2013-01-18. Open archive HAL; C source (zipped)
Book (Wiley): Guided Randomness in Optimization. The performance of an algorithm depends on the random number generator used. This book focuses on the comparison of optimizers; it defines a stress-outcome approach from which all the classic criteria (median, average, etc.) and other more sophisticated ones can be derived. The source codes used for the examples are also presented, which allows a reflection on "superfluous chance", succinctly explaining why and how the stochastic aspect of optimization could be avoided in some cases. 2015
Total Memory Optimiser: Talk at ICSIBO 2016, Mulhouse, France. Talk (modified) at SIS 2016, Athens, Greece. 2016. LibreOffice Impress files (zipped)
Bare Bones PSO with Hyperspheres: bbPSOh. An over-simplified version of SPSO 2011. Not as good as the complete C version, but useful for understanding and tests. 2018. Matlab code (zipped)
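For context, the classical "bare bones" rule (Kennedy, 2003) drops the velocity entirely and samples each new coordinate from a Gaussian centred midway between the personal best and the local best, with standard deviation equal to their distance; bbPSOh replaces these per-dimension Gaussians by a hypersphere-based draw. The C sketch below shows the classical rule only; it is my own code, not the Matlab version.

```c
/* Classical bare-bones move: no velocity, just a Gaussian draw
   per dimension around the midpoint of p (personal best) and
   g (local best). */
#include <math.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

double alea01(void) { return (rand() + 1.0) / (RAND_MAX + 2.0); }

double gauss(double m, double s)   /* N(m, s^2), Box-Muller */
{
    return m + s * sqrt(-2.0 * log(alea01())) * cos(2.0 * M_PI * alea01());
}

void move_bare_bones(int D, double *x, const double *p, const double *g)
{
    int d;
    for (d = 0; d < D; d++)
        x[d] = gauss((p[d] + g[d]) / 2.0, fabs(p[d] - g[d]));
}
```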
The questionable balance mantra: Talk not given at SocPros 2019, because of a cancelled flight, but finally given (in a slightly modified version) at CIS 2020. 2019 / 2020. LibreOffice Impress + PDF (zipped)
Multi-Agents Multi-Strategies Optimiser: A tool to combine strategies coming from several optimisers, including PSO. 2021-03-11. Matlab/Octave code (zipped)
Iterative optimisation - When a best algorithm does exist and other frustrating theorems: When the benchmark is closed under permutations, there is no best algorithm on average (No Free Lunch Theorem). However, in practice, benchmarks are never c.u.p., and it can be proved that such an algorithm does exist. But the proof does not say how to find it. Worse, this best algorithm on a given benchmark can be the worst one on another benchmark (among a set of algorithms we compare). 2021-06-19