SLIDE 1
Khaled Rasheed
Computer Science Dept.
University of Georgia
http://www.cs.uga.edu/~khaled
SLIDE 2
} Genetic algorithms
} Parallel genetic algorithms
} Genetic programming
} Evolution strategies
} Classifier systems
} Evolutionary programming
} Related topics
} Conclusion
SLIDE 3
} Fitness = Height
} Survival of the fittest
SLIDE 4
SLIDE 5
} Maintain a population of potential solutions
} New solutions are generated by selecting, combining and modifying existing solutions
} Objective function = Fitness function
- Better solutions favored for parenthood
- Worse solutions favored for replacement
SLIDE 6
} Maximize 2x^2 - y + 5, where x ∈ [0,3] and y ∈ [0,3]
SLIDE 7
} Maximize 2x^2 - y + 5, where x ∈ [0,3] and y ∈ [0,3]
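As an illustration (not from the slides), the toy objective above can be attacked with a minimal real-valued GA. The population size, rates and operator choices below are illustrative assumptions:

```python
import random

def fitness(ind):
    # The slide's objective: maximize 2x^2 - y + 5 with x, y in [0, 3]
    x, y = ind
    return 2 * x**2 - y + 5

def random_individual():
    return [random.uniform(0, 3), random.uniform(0, 3)]

def mutate(ind, sigma=0.3):
    # Gaussian mutation, clipped back into [0, 3]
    return [min(3.0, max(0.0, g + random.gauss(0, sigma))) for g in ind]

def crossover(p1, p2):
    # Arithmetic crossover: child is the midpoint of its parents
    return [(a + b) / 2 for a, b in zip(p1, p2)]

def run_ga(pop_size=30, generations=60):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]   # better solutions favored for parenthood
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children        # worse solutions are replaced
    return max(pop, key=fitness)

best = run_ga()
# The optimum sits at x = 3, y = 0, where 2*9 - 0 + 5 = 23
```

Since arithmetic crossover always pulls children toward the interior of the box, the Gaussian mutation is what lets the search reach the boundary optimum.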
SLIDE 8
} Representation
} Fitness function
} Initialization strategy
} Selection strategy
} Crossover
} Mutation
SLIDE 9
} Representation
} Fitness function
} Initialization strategy
} Selection strategy
} Crossover operators
} Mutation operators
} Replacement strategy
SLIDE 10
} Proportional selection (roulette wheel)
- Selection probability of individual = individual's fitness / sum of fitness
} Rank-based selection
- Example: decreasing arithmetic/geometric series
- Better when fitness range is very large or small
} Tournament selection
- Virtual tournament between n randomly selected individuals using fitness
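Two of these schemes are short enough to sketch directly. The list-of-individuals representation is an assumption, and roulette-wheel selection presumes nonnegative fitnesses:

```python
import random

def roulette_select(pop, fitnesses):
    # Proportional selection: P(individual) = fitness / sum of fitness
    # (assumes all fitnesses are nonnegative)
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(pop, fitnesses):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def tournament_select(pop, fitnesses, n=3):
    # Virtual tournament between n randomly chosen individuals;
    # the fittest contender wins
    contenders = random.sample(range(len(pop)), n)
    return pop[max(contenders, key=lambda i: fitnesses[i])]
```

Tournament selection only compares fitnesses, never sums them, which is one reason it copes better with very large or very small fitness ranges.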
SLIDE 11
} Point crossover (classical)
- Parent1=x1,x2,x3,x4,x5,x6
- Parent2=y1,y2,y3,y4,y5,y6
- Child =x1,x2,x3,x4,y5,y6
} Uniform crossover
- Parent1=x1,x2,x3,x4,x5,x6
- Parent2=y1,y2,y3,y4,y5,y6
- Child =x1,x2,y3,x4,y5,y6
} Arithmetic crossover
- Parent1=x1,x2,x3
- Parent2=y1,y2,y3
- Child =(x1+y1)/2,(x2+y2)/2,(x3+y3)/2
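The three operators above map directly onto a few lines of Python (genes held in lists; as on the slide, only one child is produced per crossover):

```python
import random

def point_crossover(p1, p2):
    # Classical one-point crossover: cut both parents at the same point
    # and splice the pieces together
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def uniform_crossover(p1, p2):
    # Each gene is taken from either parent with equal probability
    return [random.choice(pair) for pair in zip(p1, p2)]

def arithmetic_crossover(p1, p2):
    # Real-valued genes only: child is the component-wise average
    return [(a + b) / 2 for a, b in zip(p1, p2)]
```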
SLIDE 12
} Change one component
} Let Child = x1, x2, P, x3, x4 ...
} Gaussian mutation:
- P ← P ± Δp
- Δp: (small) random normal value
} Uniform mutation:
- P ← P_new
- P_new: random uniform value
} Boundary mutation:
- P ← a boundary of the variable's range
} Binary mutation = bit flip
SLIDE 13
} Finds global optima
} Can handle discrete, continuous and mixed variable spaces
} Easy to use (short programs)
} Robust (less sensitive to noise, ill conditions)
SLIDE 14
} Relatively slower than other methods (not suitable for easy problems)
} Theory lags behind applications
SLIDE 15
SLIDE 16
SLIDE 17
SLIDE 18
} Coarse-grained GA at high level
} Fine-grained GA at low level
SLIDE 19
} Coarse-grained GA at high level
} Global parallel GA at low level
SLIDE 20
} Coarse-grained GA at high level
} Coarse-grained GA at low level
SLIDE 21
} Introduced (officially) by John Koza in his book (Genetic Programming, 1992)
} Early attempts date back to the 50s (evolving populations of binary object codes)
} Idea is to evolve computer programs
} Declarative programming languages usually used (Lisp)
} Programs are represented as trees
SLIDE 22
} A population of trees representing programs
} The programs are composed of elements from the FUNCTION SET and the TERMINAL SET
} These sets are usually fixed sets of symbols
} The function set forms the non-leaf nodes (e.g. +, -, *, sin, cos)
} The terminal set forms the leaf nodes (e.g. x, 3.7, random())
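A minimal sketch of this tree representation in Python, using nested tuples. The function and terminal sets below are cut-down versions of the slide's examples:

```python
import operator
import random

# Cut-down function and terminal sets (the slide also mentions sin, cos
# and a random() terminal)
FUNCTIONS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
TERMINALS = ["x", 3.7]

def evaluate(tree, x):
    # A tree is either a terminal or (function_name, left_subtree, right_subtree)
    if tree == "x":
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return FUNCTIONS[op](evaluate(left, x), evaluate(right, x))

def random_tree(depth=2):
    # Grow a random program: internal nodes come from the function set,
    # leaves from the terminal set
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(FUNCTIONS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

# The tuple ("+", ("*", "x", "x"), 3.7) encodes the program x*x + 3.7
```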
SLIDE 23
SLIDE 24
} Fitness is usually based on I/O traces
} Crossover is implemented by randomly swapping subtrees between individuals
} GP usually does not extensively rely on mutation (random nodes or subtrees)
} GPs are usually generational (sometimes with a generation gap)
} GP usually uses huge populations (1M individuals)
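Subtree-swap crossover can be sketched over nested-tuple program trees of the form (function, left, right). This path-based formulation is an illustration, not Koza's exact algorithm:

```python
import random

def subtree_nodes(tree, path=()):
    # Enumerate the path to every node in the tree
    yield path
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtree_nodes(child, path + (i,))

def get_subtree(tree, path):
    for i in path:
        tree = tree[i]
    return tree

def replace_subtree(tree, path, new):
    # Rebuild the tree with the subtree at `path` replaced by `new`
    if not path:
        return new
    i = path[0]
    return tree[:i] + (replace_subtree(tree[i], path[1:], new),) + tree[i + 1:]

def subtree_crossover(p1, p2):
    # GP crossover: pick a random node in each parent and graft the
    # chosen subtree of p2 into p1 at the chosen point
    path1 = random.choice(list(subtree_nodes(p1)))
    path2 = random.choice(list(subtree_nodes(p2)))
    return replace_subtree(p1, path1, get_subtree(p2, path2))
```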
SLIDE 25
SLIDE 26
} More flexible representation
} Greater application spectrum
} If tractable, evolving a way to make "things" is more useful than evolving the "things".
} Example: evolving a learning rule for neural networks (Amr Radi, GP98) vs. evolving the weights of a particular NN.
SLIDE 27
} Extremely slow
} Very poor handling of numbers
} Very large populations needed
SLIDE 28
} Genetic programming with linear genomes (Wolfgang Banzhaf)
- Kind of going back to the evolution of binary program codes
} Hybrids of GP and other methods that better handle numbers:
- Least squares methods
- Gradient based optimizers
- Genetic algorithms, other evolutionary computation methods
} Evolving things other than programs
- Example: electric circuits represented as trees (Koza, AI in Design 1996)
SLIDE 29
} Were invented to solve numerical optimization problems
} Originated in Europe in the 1960s
} Initially: two-member or (1+1) ES:
- one PARENT generates one OFFSPRING per GENERATION
- by applying normally distributed (Gaussian) mutations
- until offspring is better and replaces parent
- This simple structure allowed theoretical results to be obtained (speed of convergence, mutation size)
} Later: enhanced to a (µ+1) strategy which incorporated crossover
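The (1+1) ES loop described above is short enough to sketch directly (minimization form; the fixed mutation size here is a simplification, since real ESs adapt it):

```python
import random

def one_plus_one_es(f, x0, sigma=0.5, iterations=200):
    # (1+1) ES: one parent generates one Gaussian-mutated offspring per
    # generation; the offspring replaces the parent only when it is at
    # least as good.
    parent = list(x0)
    f_parent = f(parent)
    for _ in range(iterations):
        child = [g + random.gauss(0, sigma) for g in parent]
        f_child = f(child)
        if f_child <= f_parent:   # minimization
            parent, f_parent = child, f_child
    return parent, f_parent

# Minimize the sphere function as a toy example
sphere = lambda v: sum(g * g for g in v)
```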
SLIDE 30
SLIDE 31
} Schwefel introduced multi-membered ESs, now known as (µ+λ) and (µ,λ)
} (µ,λ) ES: The parent generation is disjoint from the child generation
} (µ+λ) ES: Some of the parents may be selected to "propagate" to the child generation
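The two schemes can be contrasted in a few lines (minimization is assumed; individuals are plain numbers here only for brevity):

```python
def comma_selection(parents, offspring, mu, f):
    # (µ, λ): the next parents are the µ best OFFSPRING only;
    # the parent generation is disjoint from the child generation
    return sorted(offspring, key=f)[:mu]

def plus_selection(parents, offspring, mu, f):
    # (µ + λ): parents compete with offspring, so a good parent
    # can "propagate" into the child generation
    return sorted(parents + offspring, key=f)[:mu]
```

(µ,λ) discards even an excellent parent, which helps when the fitness landscape changes over time; (µ+λ) never loses the best solution found so far.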
SLIDE 32
} Real-valued vectors consisting of two parts:
- Object variables: just like a real-valued GA individual
- Strategy variables: a set of standard deviations for the Gaussian mutation
} This structure allows "self-adaptation" of the mutation size
- Excellent feature for dynamically changing
fitness landscape
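A sketch of such a self-adaptive mutation: the individual carries its own step sizes, which are perturbed log-normally before being used. The single learning rate tau below is a simplification of the usual two-rate scheme:

```python
import math
import random

def self_adaptive_mutate(x, sigmas, tau=None):
    # Strategy variables (step sizes) mutate first, log-normally, then the
    # object variables are mutated using the new step sizes. Good step
    # sizes thus evolve along with good solutions.
    n = len(x)
    tau = tau if tau is not None else 1.0 / math.sqrt(2.0 * n)
    common = random.gauss(0, 1)
    new_sigmas = [s * math.exp(tau * common + tau * random.gauss(0, 1))
                  for s in sigmas]
    new_x = [xi + s * random.gauss(0, 1) for xi, s in zip(x, new_sigmas)]
    return new_x, new_sigmas
```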
SLIDE 33
} In machine learning we seek a good hypothesis
} The hypothesis may be a rule, a neural network, a program ... etc.
} GAs and other EC methods can evolve rules, NNs, programs ... etc.
} Classifier systems (CFS) are the most explicit GA-based machine learning tool.
SLIDE 34
} Rule and message system
- if <condition> then <action>
} Apportionment of credit system
- Based on a set of training examples
- Credit (fitness) given to rules that match the
example
- Example: Bucket brigade (auctions for
examples, winner takes all, existence taxes)
} Genetic algorithm
- evolves a population of rules or a population of entire rule systems
SLIDE 35
} Evolves a population of rules; the final population is used as the rule and message system
} Diversity maintenance among rules is hard
} If done well, converges faster
} Need to specify how to use the rules to classify
- what if multiple rules match example?
- exact matching only or inexact matching allowed?
SLIDE 36
} Each individual is a complete set of rules or a complete solution
} Avoids the hard credit assignment problem
} Slow because of the complexity of the space
SLIDE 37
} Classical EP evolves finite state machines (or similar structures)
} Relies on mutation (no crossover)
} Fitness based on training sequence(s)
} Good for sequence problems (DNA) and prediction in time series
SLIDE 38
SLIDE 39
} Add a state (with random transitions)
} Delete a state (reassign state transitions)
} Change an output symbol
} Change a state transition
} Change the start state
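Two of these five operators are easy to sketch. The FSM encoding below, a dict mapping (state, input) to (next_state, output), and the output alphabet are illustrative assumptions, not the slides' exact formulation:

```python
import random

def change_output_symbol(fsm, outputs=("a", "b")):
    # Pick one transition and rewrite its output symbol
    key = random.choice(list(fsm))
    next_state, _ = fsm[key]
    fsm = dict(fsm)
    fsm[key] = (next_state, random.choice(outputs))
    return fsm

def change_transition(fsm):
    # Pick one transition and redirect it to a random existing state
    states = sorted({s for s, _ in fsm})
    key = random.choice(list(fsm))
    _, output = fsm[key]
    fsm = dict(fsm)
    fsm[key] = (random.choice(states), output)
    return fsm
```

Add-state and delete-state work the same way but must also create or reassign all transitions touching the affected state.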
SLIDE 40
} No specific representation
} Similar to Evolution Strategies
- Most work in continuous optimization
- Self adaptation common
} No crossover ever used!
SLIDE 41
} Variable complexity linear representation
} Representations based on description of transformations
- instead of enumerating the parameters of the individual, describe how to change another (nominal) individual to be it.
- Good for dimension reduction, at the expense of optimality
} Surrogate-assisted evolution methods
- Good when the objective function is very expensive
- fit an approximation to the objective function and use it to speed up the evolution
} Differential Evolution
SLIDE 42
} Artificial life
- An individual's fitness depends on genes + lifetime experience
- An individual can pass the experience to offspring
} Co-evolution
- Several populations of different types of
individuals co-evolve
- Interaction between populations changes
fitness measures
SLIDE 43
} Ant Colony Optimization
- Inspired by the social behavior of ants
- Useful in problems that need to find paths to goals
} Particle Swarm Optimization
- Inspired by social behavior of bird flocking or fish schooling
- The potential solutions, called particles, fly through the problem space by following the current optimum particles
SLIDE 44
} All evolutionary computation models are getting closer to each other
} The choice of method is important for success
} EC provides a very flexible architecture
- easy to combine with other paradigms
- easy to inject domain knowledge
SLIDE 45
} Evolutionary Computation
} IEEE Transactions on Evolutionary Computation
} Genetic Programming and Evolvable Machines
} Others ...
SLIDE 46
} Genetic and Evolutionary Computation Conference (GECCO)
} Congress on Evolutionary Computation (CEC)
} Parallel Problem Solving from Nature (PPSN)
} Others: AI in Design, IJCAI, AAAI ...