An Introduction To Optimization Using Evolutionary Algorithms
Contents
Introduction to Optimization
Solving optimization problems using MATLAB
Global Optima versus Local Optima
Introduction to GA and SA
Solving GA and SA optimization problems using MATLAB
Multi-objective Optimization
Solving multi-objective optimization problems in MATLAB
Minimization Problems
Unconstrained
Constrained
Example
The Algorithm Description
Function Optimization
Optimization concerns the minimization or maximization of functions. The standard optimization problem:
$$\min_{x} f(x)$$
Subject to:
$$h_i(x) = 0, \qquad g_j(x) \le 0, \qquad x_k^L \le x_k \le x_k^U$$
Function Optimization
$f(x)$ is the objective function, which measures and evaluates the performance of a system. In a standard problem, we minimize the function. Maximization is equivalent to minimization of the negative of the objective function. $x$ is a column vector of design variables, which can affect the performance of the system.
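Because the toolbox routines minimize, a maximization problem is solved by minimizing the negative of the objective. A minimal MATLAB sketch (the function g below is a made-up illustration, not from the slides):

% Maximize g(x) = -(x-2)^2 + 3 by minimizing its negative
g    = @(x) -(x - 2).^2 + 3;        % hypothetical objective to maximize
negg = @(x) -g(x);                  % negated objective for the minimizer
[xmax, negval] = fminunc(negg, 0);  % start from x0 = 0
gmax = -negval;                     % maximum of g: 3, attained at x = 2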
Function Optimization
Constraints are limitations on the design space. They can be linear or nonlinear, explicit or implicit functions.

$$h_i(x) = 0 \quad \text{(equality constraints)}$$
$$g_j(x) \le 0 \quad \text{(inequality constraints; algorithms require the less-than form)}$$
$$x_k^L \le x_k \le x_k^U \quad \text{(side constraints)}$$
Optimization Toolbox
Is a collection of functions that extend the capability of MATLAB. The toolbox includes routines for:
Unconstrained optimization
Constrained nonlinear optimization, including goal attainment problems, minimax problems, and semi-infinite minimization problems
Quadratic and linear programming
Nonlinear least squares and curve fitting
Solving nonlinear systems of equations
Constrained linear least squares
Specialized algorithms for large-scale problems
Minimization Algorithm
Unconstrained Minimization
Consider the problem of finding a set of values $x = [x_1 \; x_2]^T$ that solves

$$\min_x f(x) = e^{x_1}\left(4x_1^2 + 2x_2^2 + 4x_1 x_2 + 2x_2 + 1\right)$$
Steps
Create an M-file that returns the function value (the objective function). Call it objfun.m:

function f = objfun(x)
% Objective function; the input argument x is the vector of design variables
f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
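The slides jump straight from the objective file to the results. A minimal sketch of the missing call to fminunc, assuming the starting point x0 = [-1, 1] (the slide does not state it):

% Initial guess (assumed, not given on the slide)
x0 = [-1, 1];
% Invoke the unconstrained minimizer on objfun.m
[xmin, feval, exitflag, output] = fminunc(@objfun, x0)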
Results
xmin =
    0.5000   -1.0000
(minimum point of the design variables)

feval =
    1.3028e-010
(objective function value)

exitflag =
    1
(exitflag tells whether the algorithm converged; if exitflag > 0, a local minimum has been found)

output =
       iterations: 7
        funcCount: 40
         stepsize: 1
    firstorderopt: 8.1998e-004
        algorithm: 'medium-scale: Quasi-Newton line search'
(some other information about the run)
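As a small sketch, the convergence status can also be checked programmatically (variable names follow the call above):

if exitflag > 0
    fprintf('Converged to a local minimum: f = %g\n', feval);
else
    disp('Not converged; inspect the output structure for details.');
end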
Example
$$\min_x f(x) = -x_1 x_2 x_3$$

Subject to:

$$2x_1^2 + x_2 \le 0$$
$$-x_1 - 2x_2 - 2x_3 \le 0, \qquad x_1 + 2x_2 + 2x_3 \le 72$$
$$0 \le x_1, x_2, x_3 \le 30$$

In matrix form, the linear inequality constraints $Ax \le B$ and the bounds $LB \le x \le UB$ use

$$A = \begin{bmatrix} -1 & -2 & -2 \\ 1 & 2 & 2 \end{bmatrix}, \quad B = \begin{bmatrix} 0 \\ 72 \end{bmatrix}, \quad LB = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}, \quad UB = \begin{bmatrix} 30 \\ 30 \\ 30 \end{bmatrix}$$
The nonlinear constraint $2x_1^2 + x_2 \le 0$ is supplied through a separate M-file (nonlcon.m):

function [C, Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);   % nonlinear inequality: C(x) <= 0
Ceq = [];              % remember to return a null matrix if the constraint does not apply
Example (Cont.)
Initial guess (3 design variables) and the linear constraint data:

x0 = [10; 10; 10];
A = [-1 -2 -2; 1 2 2];
B = [0 72]';
LB = [0 0 0]';
UB = [30 30 30]';
CAREFUL!!! The argument order matters. The call

[x, feval] = fmincon(@myfun, x0, A, B, [], [], LB, UB, @nonlcon)

follows the signature

fmincon(fun, x0, A, B, Aeq, Beq, LB, UB, NONLCON, options, P1, P2, ...)

so the empty matrices [] stand in for Aeq and Beq, which do not apply here.
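The call above references @myfun, which the slides never show. A plausible sketch, assuming the objective is the negated product from the problem statement (save as myfun.m):

function f = myfun(x)
% Minimizing -x1*x2*x3 is equivalent to maximizing x1*x2*x3
f = -x(1)*x(2)*x(3);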
Conclusion
Easy to use! But we do not know what is happening behind the routine, so it is still important to understand the limitations of each routine.
Basic steps:
1. Recognize the class of optimization problem
2. Define the design variables
3. Create the objective function
4. Recognize the constraints
5. Start with an initial guess
6. Invoke a suitable routine
7. Analyze the results (they might not make sense)
Basins of attraction behave like local attractors: all initial conditions close to a local attractor reach that point. To reach a global optimum, we have to ensure that the initial guess we provide lies close to the attractor belonging to the global optimum.
Genetic Algorithm
Rather than starting from a single point (or guess) within the search space, GAs are initialized with a population of guesses. These are usually random and will be spread throughout the search space. A typical algorithm then uses three operators, selection, crossover and mutation (chosen in part by analogy with the natural world), to direct the population (over a series of time steps or generations) towards convergence at the global optimum.
Selection attempts to apply pressure upon the population in a manner similar to that of natural selection found in biological systems. Poorer-performing individuals are weeded out, and better-performing, or fitter, individuals have a greater-than-average chance of promoting the information they contain to the next generation.
You select the loudest-barking dogs out of the lot and mate them. Since the parents bark loudly, it is natural that the pups they produce would bark loudly too!
Operations in GA
Crossover allows solutions to exchange information in a way similar to that used by a natural organism undergoing sexual reproduction. One method (termed single-point crossover) is to choose pairs of individuals promoted by the selection operator at random, choose a single locus (point) within the binary strings, and swap all the information (digits) to the right of this locus between the two individuals.
Mutation is used to randomly change (flip) the value of single bits within individual strings. Mutation is typically used very sparingly.
After selection, crossover and mutation have been applied to the initial population, a new population will have been formed and the generational counter is increased by one. This process of selection, crossover and mutation is continued until a fixed number of generations have elapsed or some form of convergence criterion has been met.
Crossover
Merging 2 chromosomes to create new chromosomes
Single point crossover
[Figure: two parent bit strings (e.g. 0100101010011011) exchanging their tails at a chosen crossover point]
Mutation
Random changing of a gene in a chromosome
[Figure: the bit string 0100101010011011 with a single randomly chosen bit flipped]
Mutation can help in escaping from local optima in our state space.
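A minimal MATLAB sketch of both operators on bit-string chromosomes (the parent strings, crossover point and mutation rate are illustrative choices, not from the slides):

% Chromosomes as 0/1 row vectors
p1 = [0 1 0 0 1 0 1 0 1 0 0 1 1 0 1 1];
p2 = [1 1 0 0 1 1 0 0 1 1 0 0 1 1 0 0];
% Single-point crossover: pick a locus and swap the tails
locus = randi(numel(p1) - 1);
c1 = [p1(1:locus), p2(locus+1:end)];
c2 = [p2(1:locus), p1(locus+1:end)];
% Mutation: flip each bit with a small probability
rate = 1/16;                   % illustrative mutation rate
mask = rand(size(c1)) < rate;  % bits selected for flipping
c1(mask) = 1 - c1(mask);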
An elementary example
A trivial problem might be to maximize a function f(x), where f(x) = x^2, for integer x and 0 <= x <= 4095. There are of course other ways of finding the answer (x = 4095) to this problem than using a GA, but its simplicity makes it ideal as an example.

GA Example
Firstly, the exact form of the algorithm must be decided upon. As mentioned earlier, GAs can take many forms. This allows a wealth of freedom in the details of the algorithm. The following (Algorithm 1) represents just one possibility.
1. Form a population of eight random binary strings of length twelve (e.g. 101001101010, 110011001100, ...).
2. Decode each binary string to an integer x (i.e. 000000000111 implies x = 7, 000000000000 = 0, 111111111111 = 4095).
3. Test these numbers as solutions to the problem f(x) = x^2 and assign a fitness to each individual equal to the value of f(x) (e.g. the solution x = 7 has a fitness of 7^2 = 49).
4. Select the best half (those with highest fitness) of the population to go forward to the next generation.
5. Pick pairs of parent strings at random (with each string being selected exactly once) from these more successful individuals to undergo single-point crossover. Taking each pair in turn, choose a random point between the end points of the string, cut the strings at this point and exchange the tails, creating pairs of child strings.
6. Apply mutation to the children by occasionally (with probability one in six) flipping a 0 to a 1, or vice versa.
7. Allow these new strings, together with their parents, to form the new population, which will still contain only eight members.
8. Return to Step 2, and repeat until fifty generations have elapsed.
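A runnable MATLAB sketch of Algorithm 1. It assumes the one-in-six mutation probability applies per bit (the slides do not say per bit or per string); all variable names are my own:

% Algorithm 1: maximize f(x) = x^2 over 12-bit integers 0..4095
nPop = 8; nBits = 12; nGen = 50; pMut = 1/6;
pop = randi([0 1], nPop, nBits);            % Step 1: random population
weights = 2.^(nBits-1:-1:0)';               % binary-to-integer weights
for gen = 1:nGen
    x = pop * weights;                      % Step 2: decode strings
    fitness = x.^2;                         % Step 3: fitness = f(x)
    [~, idx] = sort(fitness, 'descend');
    parents = pop(idx(1:nPop/2), :);        % Step 4: keep the best half
    paired = parents(randperm(nPop/2), :);  % Step 5: random pairing
    children = zeros(nPop/2, nBits);
    for k = 1:2:nPop/2
        cut = randi(nBits - 1);             % single-point crossover
        children(k,   :) = [paired(k,   1:cut), paired(k+1, cut+1:end)];
        children(k+1, :) = [paired(k+1, 1:cut), paired(k,   cut+1:end)];
    end
    mask = rand(nPop/2, nBits) < pMut;      % Step 6: mutation
    children(mask) = 1 - children(mask);
    pop = [parents; children];              % Step 7: new population of eight
end                                         % Step 8: fifty generations
bestx = max(pop * weights)                  % should approach 4095

With mutation active, the population can recover bits lost during selection and reach x = 4095.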
To further clarify the crossover operator, imagine two strings, 000100011100 and 111001101010. Performing crossover between the third and fourth characters produces two new strings:

parents:  000|100011100   111|001101010
children: 000001101010    111100011100
It is this process of crossover which is responsible for much of the power of genetic algorithms.
Population members 1, 3, 6 and 7 have the highest fitness. Deleting those four with the least fitness provides a temporary reduced population ready to undergo crossover:
[Table: the temporary population of strings and their fitness values]
This temporary population does not contain a 1 as the last digit in any of the strings (whereas the initial population did). This implies that no string from this moment on can contain such a digit, and the maximum value that can evolve will be 111111111110, after which point this string will reproduce so as to dominate the population. This domination of the population by a single sub-optimal string gives a first indication of why mutation might be important. Any further populations will only contain the same, identical string, because the crossover operator can only swap bits between strings, not introduce any new information. Mutation can thus be seen in part as an operator charged with maintaining the genetic diversity of the population by preserving the diversity embodied in the initial generation.
The inclusion of mutation allows the population to leapfrog over this sticking point. It is worth reiterating that the initial population did include a 1 in all positions. Thus the mutation operator is not necessarily inventing new information but simply working as an insurance policy against premature loss of genetic information.
Rerunning the algorithm from the same initial population, but with mutation, allows the string 111111111111 to evolve and the global optimum to be found. Mutation has been included by visiting every bit in each new child string, throwing a random number between 0 and 1 and, if this number is less than 1/12, flipping the value of the bit.
Summary of GA
[Flowchart: the GA loop from start through selection, crossover and mutation, repeated until convergence]
Simulated Annealing
Don't be too ambitious with this course load. You CANNOT do optimization without a properly posed math problem.
Thank You