BSCM-03 Multiobjective Optimization

This document discusses multi-objective optimization and approaches to solving multi-objective problems, covering both classical methods and evolutionary algorithms. Multi-objective optimization involves minimizing or maximizing several objective functions simultaneously; typically no single solution optimizes all objectives at once, so the goal is to approximate the Pareto front. Classical approaches include the weighted sum method, the ε-constraint method, and the goal programming method. Evolutionary algorithms evolve a population of solutions and perform selection based on Pareto domination, promoting non-dominated solutions. The Nondominated Sorting Genetic Algorithm (NSGA) uses Pareto front ranking and fitness sharing to drive the population towards the Pareto front.

Bioinspired and soft-computing methods in data analysis and optimization
Multiobjective Optimization

Krzysztof Michalak
krzysztof.michalak@ue.wroc.pl
https://krzysztof-michalak.pl/

Multiobjective optimization (1/2)

 More than one goal function, formally:

   min_{x ∈ Ω} F(x) = [f_1(x), f_2(x), …, f_m(x)]^T
   F : Ω → ℝ^m

  where:
  Ω – the decision (search) space
  ℝ^m – the objective space
  f_i – the objective (goal) functions

 Usually, no single solution minimizes all the objectives
 We are looking for the Pareto front

Multiobjective optimization (2/2)

 Examples
  Minimize travel time and costs
  Maximize strength of an element, minimize weight
  Maximize investment return, minimize risk

[Figure: objective space with f_1 = travel time and f_2 = travel cost (both minimized); non-dominated solutions, dominated solutions, and the area of the objective space dominated by a solution x]

Approaches to MOO – classical

 The weighted sum method: add up all the objectives, but with different weights w_j, j = 1, …, m

   min_{x} F(x) = w_1 f_1(x) + w_2 f_2(x) + … + w_m f_m(x)

 The ε-constraint method: optimize one criterion (e.g. min f_1) while constraining the others (f_j ≤ ε_j for j ≠ 1)

 The goal-programming method: provide goals G_j and minimize the weighted sum of deviations:

   min_{x} F(x) = Σ_{j=1}^{m} w_j | f_j(x) − G_j |

 By changing the weights, constraints and goals, multiple solutions are obtained
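The classical methods are simple scalarizations. The sketch below (a made-up biobjective toy problem and a randomly sampled candidate set, not part of the slides) applies the weighted sum method a posteriori: each weight vector picks out a different solution, which is how multiple Pareto-optimal points are obtained by varying the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate solutions in a 2-D decision space (illustration only).
X = rng.random((200, 2))

def objectives(x):
    """Toy biobjective problem: both f1 and f2 are to be minimized."""
    f1 = x[:, 0]                           # e.g. "travel time"
    f2 = 1.0 - np.sqrt(x[:, 0]) + x[:, 1]  # e.g. "travel cost"
    return np.column_stack([f1, f2])

F = objectives(X)

# Weighted sum method: for each weight vector, pick the candidate
# minimizing w1*f1 + w2*f2.  Different weights -> different solutions.
for w1 in (0.1, 0.5, 0.9):
    w = np.array([w1, 1.0 - w1])
    best = np.argmin(F @ w)
    print(f"weights {w}: best objectives {F[best]}")
```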

Approaches to MOO – classical

For more examples of both classical and metaheuristic multiobjective optimization methods refer to the book:
Bechikh S., Datta R., Gupta A., "Recent Advances in Evolutionary Multi-objective Optimization", Adaptation, Learning, and Optimization series, vol. 20, Springer, 2017.
https://link.springer.com/book/10.1007/978-3-319-42978-6
It is free to download from the UEW network (http proxy, virtual machines or labs).

Approaches to MOO – evolutionary

 Similar to single-objective EAs, but with different:
  evaluation (multiobjective)
  selection
 Based on Pareto domination
  Evolve a population of solutions
  Promote Pareto-dominating solutions
 Decomposition-based
  Transform the multiobjective problem into a set of single-objective subproblems
  In the population, each solution is assigned to a different subproblem
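Selection based on Pareto domination needs a dominance test. A minimal NumPy sketch (function names are mine; minimization assumed) of the dominance relation and of filtering the non-dominated solutions out of a set of objective vectors:

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    a, b = np.asarray(a), np.asarray(b)
    return np.all(a <= b) and np.any(a < b)

def non_dominated(F):
    """Indices of non-dominated rows of the objective matrix F (shape N x m)."""
    N = F.shape[0]
    keep = []
    for i in range(N):
        if not any(dominates(F[j], F[i]) for j in range(N) if j != i):
            keep.append(i)
    return np.array(keep)

# Example: three solutions, two minimized objectives.
F = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0]])
print(non_dominated(F))   # [0 1] -- the third solution is dominated by the second
```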

NSGA (1/5)
Nondominated Sorting Genetic Algorithm[1]
 Different from the SGA in how the selection is performed
 Pareto front ranking
  FR0 = nondominated solutions
  FR1 = solutions dominated only by those from FR0
  etc.
 Selection procedure
  Solutions are sorted using the Pareto front ranking
  Solutions in FRi are preferred to those in FRj for i < j
  Fitness sharing is used within each FRi front to prioritize solutions in less crowded areas

[1] Srinivas N., Deb K., "Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms", Evolutionary Computation, vol. 2, no. 3, p. 221-248, 1994.

NSGA (2/5)
Pareto front ranking
FR0 = non-dominated specimens from the population P
FR1 = non-dominated specimens from P \ FR0
FR2 = non-dominated specimens from P \ (FR0 ∪ FR1)
FR3 = …

[Figure: fronts FR0, FR1, FR2, … in the objective space (f_1, f_2, both minimized)]

NSGA (3/5)
Fitness function calculation
1. n = 0, f_0 = dummy fitness value used for FR0 (typically f_0 = pop. size)
2. Move nondominated solutions from the population to FRn.
3. For each solution x ∈ FRn set the fitness value to f(x) = f_n.
4. For each solution x ∈ FRn modify the fitness:

   s = Σ_{y ∈ FRn} s_xy

   (niche count; a small value if the neighbours y of x are far away; σ_share is the parameter determining the niche size)

   s_xy = 1 − ( d(x, y) / σ_share )^2   when d(x, y) < σ_share
   s_xy = 0                             otherwise

   f(x) = f(x) / s

   (smaller fitness if neighbours are near, larger fitness if neighbours are far away)
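A sketch of the sharing computation for one front. Whether d(x, y) is measured in the decision space or the objective space, and the value of σ_share, are user choices not fixed by the slides; this sketch uses Euclidean distance between decision vectors.

```python
import numpy as np

def shared_fitness(front_X, dummy_fitness, sigma_share):
    """NSGA-style fitness sharing within one front.

    front_X       : array (k x d) of the decision vectors in the front FRn
    dummy_fitness : the dummy fitness value fn assigned to this front
    sigma_share   : niche radius (sigma_share)
    Returns the degraded fitness f(x) = fn / s(x) for each member.
    """
    # pairwise distances d(x, y) within the front
    diff = front_X[:, None, :] - front_X[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))

    # sharing function: 1 - (d / sigma)^2 for d < sigma, else 0
    sh = np.where(d < sigma_share, 1.0 - (d / sigma_share) ** 2, 0.0)

    niche_count = sh.sum(axis=1)          # s(x); includes sh(0) = 1 for x itself
    return dummy_fitness / niche_count    # crowded solutions get smaller fitness

front = np.array([[0.0, 0.0], [0.05, 0.0], [1.0, 1.0]])
print(shared_fitness(front, dummy_fitness=100.0, sigma_share=0.5))
```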

NSGA (4/5)
Fitness function calculation (continued)
5. If the entire population has been divided into fronts FR0, …, FRn, then STOP.
6. Calculate the maximum fitness value for FRn+1 as:

   f_{n+1} < min_{x ∈ FRn} f(x)

   (i.e. the dummy fitness of the next front is set below the smallest shared fitness in FRn)
7. n = n + 1, go back to step 2.

NSGA (5/5)
NSGA – The whole algorithm
1. Initialization: t = 0, new population P0.
2. Fitness calculation for solutions in Pt.
3. Stochastic remainder selection using the fitness values.
4. Crossover and mutation. New solutions go to Pt+1.
5. t = t + 1, go back to step 2.

NSGA-II (1/16)
Nondominated Sorting Genetic Algorithm II[1]
 Drawbacks of the NSGA algorithm
  Large computational complexity
  No elitism
  Needs setting of the σ_share parameter (maximum distance of solutions in a niche)
 Most important features of NSGA-II
  Uses the information about solutions dominating over and dominated by each solution to split the population into fronts
  Uses the crowding distance to ensure diversification of solutions along the Pareto front

[1] Deb K., Agrawal S., Pratap A., Meyarivan T., "A Fast Elitist Non-Dominated Sorting Genetic Algorithm for Multi-Objective Optimization: NSGA-II", 2000.

NSGA-II (2/16)
NSGA-II – pseudocode
1. Initialization: t = 0, new population P0.
2. Splitting of the population into FRn fronts using the information about solutions dominating over and dominated by each solution.
3. Crowding distance calculation.
4. Mating pool selection using the binary tournament:
  A lower front number FRn wins
  For the same front, a larger crowding distance wins
5. Crossover and mutation. The offspring join the population Pt.
6. Population reduction in order to obtain Npop solutions for Pt+1:
  Fronts FRn are copied for an increasing number n
  If the entire front FRn does not fit in Npop solutions, then those with a larger crowding distance are copied
7. t = t + 1, go back to step 2.
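Step 4 can be sketched as a binary tournament with the crowded-comparison rule (array names are mine; the front index and crowding distance are assumed to be precomputed by the procedures described on the following slides):

```python
import numpy as np

def crowded_binary_tournament(rank, crowding, n_parents, rng):
    """Select n_parents indices by binary tournaments.

    rank[i]     : index of the front FRn that solution i belongs to
    crowding[i] : crowding distance of solution i
    """
    N = len(rank)
    parents = []
    for _ in range(n_parents):
        i, j = rng.integers(0, N, size=2)
        if rank[i] < rank[j]:
            parents.append(i)
        elif rank[j] < rank[i]:
            parents.append(j)
        else:  # same front: prefer the less crowded (larger distance) solution
            parents.append(i if crowding[i] >= crowding[j] else j)
    return np.array(parents)

rng = np.random.default_rng(1)
rank = np.array([0, 0, 1, 2])
crowding = np.array([np.inf, 0.3, np.inf, 0.1])
print(crowded_binary_tournament(rank, crowding, n_parents=4, rng=rng))
```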

NSGA-II (3/16)
Splitting of the population into FRn fronts (1/2)
1. FR0 = ∅
2. For each solution p from the population P:
   Sp = ∅   // the set of solutions dominated by p
   np = 0   // the number of solutions dominating over p
   For each solution q from the population P:
     if q ≺ p then Sp = Sp ∪ { q }   // p dominates q
     if q ≻ p then np = np + 1       // q dominates p
   If np = 0 then   // no solutions dominating over p
     FR0 = FR0 ∪ { p }
3. R1 = P \ FR0   // the set of solutions not assigned to fronts

NSGA-II (4/16)
Splitting of the population into FRn fronts (1/2)

[Figure: example in the objective space (f_1, f_2, minimized) with a solution p and solutions q1, q2; the sets Sp, Sq1, Sq2 and the fronts FR0, FR1, FR2 are marked. np = 2, because p ∈ Sq1 and p ∈ Sq2]

NSGA-II (5/16)
Splitting of the population into FRn fronts (1/2)

[Figure: the same example; no solutions dominate over q1 and q2, so nq1 = 0, nq2 = 0 and q1, q2 ∈ FR0]

NSGA-II (6/16)
Splitting of the population into FRn fronts (2/2)
4. n = 1
5. If Rn = ∅ then STOP
6. For each solution p from the front FRn−1:
     For each solution q from the set Sp:
       nq = nq − 1   // treat the solution p from FRn−1 as "removed";
                     // decrease the counters for solutions dominated by p
       If nq = 0 then   // no solutions dominating over q
         FRn = FRn ∪ { q }
7. Rn+1 = Rn \ FRn   // the set of solutions not assigned to fronts
8. n = n + 1
9. Go to step 5.
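Both phases combine into Deb's fast non-dominated sort. A NumPy sketch (minimization assumed; O(mN²) dominance comparisons):

```python
import numpy as np

def fast_non_dominated_sort(F):
    """Front splitting using S_p (solutions dominated by p) and
    n_p (number of solutions dominating p).  F has shape N x m."""
    N = F.shape[0]
    S = [[] for _ in range(N)]       # S[p]: solutions dominated by p
    n = np.zeros(N, dtype=int)       # n[p]: number of solutions dominating p
    fronts = [[]]

    for p in range(N):
        for q in range(N):
            if p == q:
                continue
            if np.all(F[p] <= F[q]) and np.any(F[p] < F[q]):   # p dominates q
                S[p].append(q)
            elif np.all(F[q] <= F[p]) and np.any(F[q] < F[p]): # q dominates p
                n[p] += 1
        if n[p] == 0:
            fronts[0].append(p)       # p belongs to FR0

    i = 0
    while fronts[i]:
        next_front = []
        for p in fronts[i]:           # "remove" front i
            for q in S[p]:
                n[q] -= 1
                if n[q] == 0:
                    next_front.append(q)
        i += 1
        fronts.append(next_front)
    return fronts[:-1]                # drop the trailing empty front

F = np.array([[1, 4], [2, 2], [3, 3], [4, 4]], dtype=float)
print(fast_non_dominated_sort(F))     # [[0, 1], [2], [3]]
```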

NSGA-II (7/16)
Splitting of the population into FRn fronts (2/2)

[Figure: when "removing" FR0 we decrease np twice, because p ∈ Sq1 and p ∈ Sq2]

NSGA-II (8/16)
Splitting of the population into FRn fronts (2/2)

[Figure: "removing" the remaining solutions from FR0 does not affect np]

NSGA-II (9/16)
Splitting of the population into FRn fronts (2/2)

[Figure: "removing" q1 decreases np by one (np = 1)]

NSGA-II (10/16)
Splitting of the population into FRn fronts (2/2)

[Figure: "removing" q2 also decreases np by one (np = 0)]

NSGA-II (11/16)
Splitting of the population into FRn fronts (2/2)

[Figure: "removing" the remaining solutions from FR0 does not affect np]

NSGA-II (12/16)
Splitting of the population into FRn fronts (2/2)

[Figure: after FR0 is "removed", np = 0, so p belongs to FR1]

NSGA-II (13/16)
Splitting of the population into FRn fronts (2/2)

[Figure: some solutions belong to Sp; when FR1 is "removed", their "dominated by" counters will be decreased]

NSGA-II (14/16)
Calculating the crowding distance
For each front FRi separately:
  n = |FRi|
  For each solution x ∈ FRi: δ_x = 0
  For each goal function number j = 1, …, m:
    I = sort(FRi, j)   // solutions sorted w.r.t. f_j
    δ_I[1] = ∞, δ_I[n] = ∞   // solutions at the edges are "not crowded"
    For k = 2, …, n − 1:

      δ_I[k] = δ_I[k] + ( f_j(I[k+1]) − f_j(I[k−1]) ) / ( max_{x ∈ FRi} f_j(x) − min_{x ∈ FRi} f_j(x) )

      (the normalized distance between the neighbours of I[k])
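A direct NumPy sketch of this procedure for a single front:

```python
import numpy as np

def crowding_distance(F):
    """Crowding distance for the objective vectors of one front (k x m)."""
    k, m = F.shape
    dist = np.zeros(k)
    for j in range(m):
        order = np.argsort(F[:, j])           # I = sort(FR_i, j)
        fj = F[order, j]
        dist[order[0]] = np.inf               # edge solutions are "not crowded"
        dist[order[-1]] = np.inf
        span = fj[-1] - fj[0]
        if span == 0:                         # all values equal: no contribution
            continue
        # normalized distance between the two neighbours of each inner solution
        dist[order[1:-1]] += (fj[2:] - fj[:-2]) / span
    return dist

F = np.array([[1.0, 5.0], [2.0, 3.0], [4.0, 2.0], [6.0, 1.0]])
print(crowding_distance(F))   # edges -> inf, inner points get finite values
```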

NSGA-II (15/16)
Crowding distance calculation example (1/2)

[Figure: for the f_1 goal function (j = 1): the front sorted along f_1; δ_I[1] = ∞ and δ_I[n] = ∞ at the edges; for an inner solution I[k] the increment is ( f_j(I[k+1]) − f_j(I[k−1]) ) / ( max_{x ∈ FRi} f_j(x) − min_{x ∈ FRi} f_j(x) )]

NSGA-II (16/16)
Crowding distance calculation example (2/2)

[Figure: the same construction for the f_2 goal function (j = 2)]

NSGA-II Solving a MOTSP Instance

 Video: MOTSP_kroAB100_nsga2_2-opt.avi
 Problem instance: kroAB100
 Number of cities: n = 100
 Search space size: | Ω | = 100! ≈ 9.33 × 10^157
 Population size: 100
 Generations: 200
 EA: NSGA-II
 Local search: 2-opt

The 2-opt local search is capable of improving solutions to the TSP very fast, so the algorithm finds a well-diversified set of Pareto-optimal solutions quickly (in 200 generations). Note, however, that here we are solving the kroAB100 instance with n = 100 locations, as opposed to the PCB drilling problem with n = 1002 locations which we solved with the SGA:
  kroAB100: | Ω | = 100! ≈ 9.33 × 10^157
  pr1002:   | Ω | = 1002! ≈ 4.04 × 10^2573

MOEA/D (1/10)
Multiobjective Evolutionary Algorithm Based on Decomposition[1]
 Most important features of MOEA/D
  Decomposes a multiobjective optimization problem into Npop single-objective subproblems
  Objective functions for the subproblems are obtained using weight vectors λ^(i) for i = 1, …, Npop
  Uses a neighbourhood structure to transfer information between solutions to subproblems
  Has a relatively low computational complexity

[1] Zhang Q., Li H., "MOEA/D: A Multiobjective Evolutionary Algorithm Based on Decomposition", IEEE Transactions on Evolutionary Computation, vol. 11, no. 6, p. 712–731, 2007.

MOEA/D (2/10)
Neighbourhood structure in MOEA/D

[Figure: objective space (f_1, f_2, minimized) with the weight vectors λ^(i), the neighbourhood B(i) of the i-th subproblem for T = 5, the solutions, and the reference (utopia) point z*]

MOEA/D (3/10)
Decomposition methods (1/3)
Weighted sum decomposition. Minimize:

   g^ws(x | λ) = Σ_{i=1}^{m} λ_i f_i(x),   λ_i ≥ 0,   Σ_{i=1}^{m} λ_i = 1

If the Pareto front is convex then every point can be obtained for some weight vector λ.
Weighted sum decomposition is not suitable for concave Pareto fronts.

MOEA/D (4/10)
Decomposition methods (2/3)
Tchebycheff decomposition. Minimize:

   g^te(x | λ, z*) = max_{1 ≤ i ≤ m} λ_i | f_i(x) − z_i* |

   λ_i ≥ 0,   z* = (z_1*, …, z_m*),   z_i* = min_{x} f_i(x)   (the reference/utopia point; in practice, the best value of f_i found so far)

[Figure: objective space with F(x) = [f_1(x), f_2(x), …, f_m(x)]^T, the reference point z*, and the component distances f_1(x) − z_1* and f_2(x) − z_2*]
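Both scalarizations are one-liners in NumPy; a sketch with f, lam and z_star as plain arrays (variable names are mine):

```python
import numpy as np

def g_ws(f, lam):
    """Weighted sum decomposition: g^ws(x | lambda) = sum_i lambda_i * f_i(x)."""
    return np.dot(lam, f)

def g_te(f, lam, z_star):
    """Tchebycheff decomposition:
    g^te(x | lambda, z*) = max_i lambda_i * |f_i(x) - z_i*|."""
    return np.max(lam * np.abs(f - z_star))

f = np.array([0.6, 0.2])          # objective vector F(x) of some solution
lam = np.array([0.5, 0.5])        # weight vector of one subproblem
z_star = np.array([0.0, 0.0])     # reference (ideal) point

print(g_ws(f, lam), g_te(f, lam, z_star))   # 0.4  0.3
```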

MOEA/D (5/10)
Decomposition methods (3/3)
Penalty Boundary Intersection (PBI) decomposition. Minimize:

   g^pbi(x | λ, z*) = d1 + θ·d2

   d1 = ‖ (z* − F(x))^T λ ‖ / ‖ λ ‖
   d2 = ‖ F(x) − (z* − d1·λ / ‖ λ ‖) ‖

   (d1 – the distance from z* to the projection of F(x) onto the line defined by λ; d2 – the distance from F(x) to that line; θ – the penalty parameter)

[Figure: objective space with F(x) = [f_1(x), f_2(x), …, f_m(x)]^T, the weight vector λ, the reference point z*, and the distances d1 and d2]

MOEA/D (6/10)
 Inputs
  A multiobjective optimization problem:

     min_{x ∈ Ω} F(x) = [f_1(x), f_2(x), …, f_m(x)]^T
     F : Ω → ℝ^m

    where:
    Ω – the decision (search) space
    ℝ^m – the objective space
    f_i – the objective (goal) functions
  A stopping criterion
  Npop – population size
  λ^(1), …, λ^(Npop) – weight vectors
  T – the neighbourhood size
 Output
  EP – the external population which holds the PF approximation
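A sketch of the PBI scalarization; the penalty value θ = 5 used below is a commonly used default, not something fixed by the slides:

```python
import numpy as np

def g_pbi(f, lam, z_star, theta=5.0):
    """Penalty Boundary Intersection: g^pbi = d1 + theta * d2.

    d1: length of the projection of (f - z*) onto the direction lambda
    d2: distance from f to the line through z* with direction lambda
    theta: penalty parameter (5.0 is a commonly used value).
    """
    lam_norm = lam / np.linalg.norm(lam)
    d1 = np.dot(f - z_star, lam_norm)            # projection length
    d2 = np.linalg.norm(f - (z_star + d1 * lam_norm))
    return d1 + theta * d2

f = np.array([0.6, 0.2])
lam = np.array([1.0, 1.0])
z_star = np.array([0.0, 0.0])
print(g_pbi(f, lam, z_star))
```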

MOEA/D (7/10)
The algorithm keeps:
 A population of Npop solutions: x^(1), …, x^(Npop), where x^(i) is a solution to the i-th subproblem
 Values of the objective vectors F(x^(i))
 The reference point z* = (z_1*, …, z_m*), where z_i* is the best value of the objective function f_i found so far
 The best approximation of the Pareto front EP, that is, a set of nondominated solutions found by the algorithm so far

MOEA/D (8/10)
 Step 1 – Initialization
1.1) EP = ∅
1.2) For each weight vector λ^(i) determine the neighbourhood – the set of T indices of the weight vectors λ^(i1), …, λ^(iT) closest (w.r.t. the Euclidean distance in ℝ^m) to λ^(i):

     B(i) = { i1, …, iT }

1.3) Generate the initial population: x^(1), …, x^(Npop)
1.4) Calculate the objectives F(x^(i)), for i = 1, …, Npop
1.5) Initialize the reference point z* = (z_1*, …, z_m*) either using the calculated objectives or in a problem-specific way
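Step 1.2 amounts to a nearest-neighbour query among the weight vectors. A sketch (the evenly spread weight vectors in the example are an illustration, not the slides' construction):

```python
import numpy as np

def init_neighbourhoods(weights, T):
    """Step 1.2: for each weight vector lambda^(i), find the indices of the
    T closest weight vectors (Euclidean distance), forming B(i)."""
    diff = weights[:, None, :] - weights[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # argsort puts each vector itself first (distance 0); keep the first T indices
    return np.argsort(dist, axis=1)[:, :T]

# Example: evenly spread weight vectors for m = 2 objectives, Npop = 6, T = 3.
Npop, T = 6, 3
w1 = np.linspace(0.0, 1.0, Npop)
weights = np.column_stack([w1, 1.0 - w1])
B = init_neighbourhoods(weights, T)
print(B)       # B[i] lists the i-th subproblem and its two nearest neighbours
```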

MOEA/D (9/10)
 Step 2 – Solution update (1/2)
For each subproblem i = 1, …, Npop:
2.1) Reproduction: select two indices k, l ∈ B(i) and generate a new solution y from x^(k) and x^(l) using the genetic operators
2.2) Improvement: use a problem-specific repair or local search on y
2.3) Update of the reference point: for each j = 1, …, m, if f_j(y) < z_j* then set z_j* = f_j(y)

MOEA/D (10/10)
 Step 2 – Solution update (2/2)
2.4) Neighbouring solutions update: for each j ∈ B(i), if g^te(y | λ^(j), z*) ≤ g^te(x^(j) | λ^(j), z*) then set x^(j) = y
2.5) Update of the Pareto front:
  Remove from the EP the vectors dominated by F(y)
  If F(y) is not dominated by any vector from EP then set EP = EP ∪ { y }

 Step 3 – Stopping criterion
  If the stopping criterion is satisfied, return EP and STOP.
  Otherwise go to step 2.
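A compressed sketch of one pass of Step 2 (no EP bookkeeping and no repair/local search; the toy problem, the offspring operator and the names are mine, and the Tchebycheff decomposition is assumed):

```python
import numpy as np

def g_te(f, lam, z_star):
    """Tchebycheff subproblem objective."""
    return np.max(lam * np.abs(f - z_star))

def moead_generation(X, F, weights, B, z_star, evaluate, make_offspring, rng):
    """One pass of Step 2 over all subproblems (simplified sketch)."""
    for i in range(len(X)):
        k, l = rng.choice(B[i], size=2, replace=False)        # 2.1 reproduction
        y = make_offspring(X[k], X[l], rng)
        fy = evaluate(y)                                      # (2.2 improvement skipped)
        z_star = np.minimum(z_star, fy)                       # 2.3 reference point update
        for j in B[i]:                                        # 2.4 neighbour update
            if g_te(fy, weights[j], z_star) <= g_te(F[j], weights[j], z_star):
                X[j], F[j] = y, fy
    return X, F, z_star

# Toy usage: 2-D decision vectors in [0, 1]^2, objectives f(x) = (x1, 1 - x1 + x2).
rng = np.random.default_rng(0)
evaluate = lambda x: np.array([x[0], 1.0 - x[0] + x[1]])
make_offspring = lambda a, b, r: np.clip((a + b) / 2 + r.normal(0.0, 0.05, 2), 0.0, 1.0)

Npop, T = 10, 3
w1 = np.linspace(0.0, 1.0, Npop)
weights = np.column_stack([w1, 1.0 - w1])
B = np.argsort(((weights[:, None] - weights[None]) ** 2).sum(-1), axis=1)[:, :T]
X = rng.random((Npop, 2))
F = np.array([evaluate(x) for x in X])
z_star = F.min(axis=0)
for _ in range(50):
    X, F, z_star = moead_generation(X, F, weights, B, z_star, evaluate, make_offspring, rng)
print(np.round(F, 3))   # objective vectors spread along the trade-off
```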

Many-objective optimization

 When defining a multiobjective optimization problem we imposed no upper limit on the number of objectives m
 It turns out, however, that with an increasing number of objectives previously unseen difficulties appear
 In practical applications we can find problems with as many as 10 to 15 objectives
 Thus, in the MOO field the concept of many-objective optimization is studied

Many-objective optimization

 Often, we assume that many-objective optimization problems have m ≥ 4 objectives
 Thus, we can say that multiobjective optimization problems in practice only have m = 2 or 3 objectives, and with more we have many-objective problems
 Obviously, any optimization problem with m > 1 is, formally, a multiobjective optimization problem
 Yes, you can run NSGA-II or MOEA/D for m ≥ 4
 But the performance can be degraded

Many-objective optimization

 A large fraction of the population is non-dominated
 The Pareto front is a set in ℝ^m
  Approximating such a set requires many solutions
  Similar to the curse of dimensionality in ML
 The recombination operation may be inefficient
  A limited population in a large search space
  Parents far away from each other
  Offspring not likely to be similar to parents
 Costly computations

Many-objective optimization

 An obvious problem: visualizing m ≥ 4 objectives
 We can easily understand 2-d and 3-d plots[1]

[Figure: example Pareto front plots for the ZDT-3 (2 objectives) and DTLZ-7 (3 objectives) test problems]

[1] Kaifeng Yang et al., "Multiobjective Memetic Estimation of Distribution Algorithm Based on an Incremental Tournament Local Searcher", The Scientific World Journal, vol. 2014, Article ID 836272.

Many-objective optimization

 With more objectives we can try to use parallel coordinates

[Figure: parallel-coordinates plot of the objectives, from: Petelin G., Antoniou M., Papa G., "Multi-objective approaches to ground station scheduling for optimization of communication with satellites", Optimization and Engineering, 2021.]

 Objectives (maximized):
  FitAW (Access window) – maximize the duration of communication
  FitCS (Communication clash) – normalized number of windows where no clash was detected
  FitTR (Communication time) – for how many communication cases there is enough transmission time
  FitGU (Ground station usage) – utilization percentage

NSGA-III (1/8)
 A reference-point based many-objective EA[1,2]
 Based on the previous version: NSGA-II[3]
 Emphasizes population members which are non-dominated yet close to a set of supplied reference points

[1] Deb K., Jain H., "An Evolutionary Many-Objective Optimization Algorithm Using Reference-point Based Non-dominated Sorting Approach, Part I: Solving Problems with Box Constraints", IEEE Transactions on Evolutionary Computation, vol. 18(4), 577-601, 2014. https://www.egr.msu.edu/~kdeb/papers/k2012009.pdf
[2] Jain H., Deb K., "An Evolutionary Many-Objective Optimization Algorithm Using Reference-point Based Non-dominated Sorting Approach, Part II: Handling Constraints and Extending to an Adaptive Approach", IEEE Transactions on Evolutionary Computation, vol. 18(4), 602-622, 2014. https://www.egr.msu.edu/~kdeb/papers/k2012010.pdf
[3] Deb K., Agrawal S., Pratap A., Meyarivan T., "A Fast Elitist Non-Dominated Sorting Genetic Algorithm for Multi-Objective Optimization: NSGA-II", 2000.

NSGA-III (2/8)
 Uses a similar procedure as NSGA-II
 Population size Npop
 From the parent population Pt, genetic operators are used to generate the offspring population Qt
 The union Pt ∪ Qt is reduced to Npop solutions
  Non-dominated sorting
  If a front FRn has too many solutions to be included:
   Assign solutions to reference points
   Use a niching mechanism to decide how many solutions to select for each reference point

NSGA-III (3/8)
Pareto front ranking (used also in NSGA and NSGA-II)
FR0 = non-dominated specimens from the population P
FR1 = non-dominated specimens from P \ FR0
FR2 = non-dominated specimens from P \ (FR0 ∪ FR1)
FR3 = …

[Figure: fronts in the objective space (f_1, f_2, both minimized)]

NSGA-III (4/8)
Reference points
Used to ensure even spreading of solutions along the Pareto front

NSGA-III (5/8)
Penalty Boundary Intersection (PBI) decomposition used in MOEA/D
Note that PBI decomposition is not used in NSGA-III! This slide is just to remind the PBI technique, because a similar approach is used in NSGA-III to assign solutions to reference points.
Minimize:

   g^pbi(x | λ, z*) = d1 + θ·d2

   d1 = ‖ (z* − F(x))^T λ ‖ / ‖ λ ‖
   d2 = ‖ F(x) − (z* − d1·λ / ‖ λ ‖) ‖

[Figure: objective space with F(x) = [f_1(x), f_2(x), …, f_m(x)]^T, the weight vector λ, the reference point z*, and the distances d1 and d2]

NSGA-III (6/8)
NSGA-III uses a similar approach to assign solutions to reference points

[Figure: objective space (f_1, f_2) with a reference direction λ drawn from z*]

NSGA-III (7/8)
Assigning solutions to reference points for m = 3
1. Reference lines are drawn from the ideal (utopia) point z* (◆) through the reference points (▲)
2. Solutions are assigned to the reference point whose line is the closest one

[Figure: reference points on a 3-objective simplex, the reference lines, and the solutions assigned to them]
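The assignment step reduces to a point-to-line distance computation in the (normalized) objective space, where the reference lines pass through the origin. A sketch (the reference directions and solutions in the example are made up):

```python
import numpy as np

def perpendicular_distance(f, ref_dir):
    """Distance from a (normalized) objective vector f to the reference line
    that passes through the origin in the direction ref_dir."""
    w = ref_dir / np.linalg.norm(ref_dir)
    proj = np.dot(f, w) * w
    return np.linalg.norm(f - proj)

def assign_to_reference_lines(F_norm, ref_dirs):
    """For each normalized objective vector, return the index of the closest
    reference line and the distance to it."""
    idx, dist = [], []
    for f in F_norm:
        d = [perpendicular_distance(f, r) for r in ref_dirs]
        idx.append(int(np.argmin(d)))
        dist.append(min(d))
    return np.array(idx), np.array(dist)

# Example: 3 reference directions for m = 2, and two normalized solutions.
ref_dirs = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
F_norm = np.array([[0.9, 0.1], [0.4, 0.6]])
print(assign_to_reference_lines(F_norm, ref_dirs))
```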

NSGA-III (8/8)
 Overall, the new population consists of:
  The non-dominated solutions FR0
  Those fronts FR1, …, FRn−1 which fit entirely in the Npop population size
  Solutions from the front FRn (which does not fit entirely in the Npop population size), selected in the following way:
   Select as many solutions as needed to obtain Npop solutions for the next generation
   Use the reference points (▲) to implement a niching mechanism
   Assign solutions to the closest reference lines
   Using solutions from the front FRn, fill the least populated niches first

NSGA-III solving the ZDT-3 problem
 The ZDT-3 is a biobjective minimization problem
 Discontinuous Pareto front

[Figure from: Roudenko O., Schoenauer M., "A Steady Performance Stopping Criterion for Pareto-based Evolutionary Algorithms", 6th International Multi-Objective Programming and Goal Programming Conference, Apr 2004, Hammamet, Tunisia, hal-01909120.]

NSGA-III solving the ZDT-3 problem
 For better visualization, the front is translated vertically and scaled by half

[Figure: the true Pareto front of ZDT-3]

NSGA-III solving the ZDT-3 problem
 Selection: we want to select Npop solutions
 After splitting into non-dominated fronts:

[Figure: the population in the objective space; legend used in this and the following figures: non-dominated solutions (FR0); other solutions selected from the population (FR1, …); the n-th non-dominated front, which did not fit in the next population as a whole]

NSGA-III solving the ZDT-3 problem
 If the non-dominated front and the fronts FR1, …, FRn−1 add up exactly to Npop solutions, we are done

[Figure]

NSGA-III solving the ZDT-3 problem
 If there are too many solutions in the n-th front FRn (i.e. there are more than Npop solutions in ∪_{i=0}^{n} FRi)…

[Figure]

NSGA-III solving the ZDT-3 problem
 … we need to select some solutions from the n-th front FRn so that altogether we have Npop solutions
 We use niching based on the reference points to select those solutions that maximize diversity:
  Draw reference lines through the reference points
  Assign solutions from ∪_{i=0}^{n−1} FRi to the closest reference lines
  Add those solutions from FRn which correspond to the least populated niches

NSGA-III solving the ZDT-3 problem
 The ideal (a.k.a. utopia) point: the best (minimum) value of each objective found so far

[Figure: the ideal point marked in the objective space]

NSGA-III solving the ZDT-3 problem
 Shift (translate) the population so that the ideal point is placed at [0, 0]

[Figure: the translated population with the ideal point at the origin]

NSGA-III solving the ZDT-3 problem
 Extreme points selected from the non-dominated front FR0 (those that are the closest to the axes)

[Figure: the extreme points marked]

NSGA-III solving the ZDT-3 problem
 The nadir point: the worst (maximal) objectives taken from the extreme points

[Figure: the nadir point marked]

NSGA-III solving the ZDT-3 problem
 Scale the points so that the ideal point is [0, 0] and the nadir point is [1, 1]

[Figure: the normalized population]
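The translation and scaling step in code; here the nadir point is taken simply as the per-objective maximum as a stand-in (NSGA-III estimates it from the extreme points of FR0, as described above):

```python
import numpy as np

def normalize_objectives(F, ideal, nadir):
    """Translate so the ideal point becomes [0, ..., 0] and scale so the
    nadir point becomes [1, ..., 1]."""
    span = np.where(nadir - ideal == 0, 1.0, nadir - ideal)   # avoid division by 0
    return (F - ideal) / span

F = np.array([[2.0, 30.0], [4.0, 10.0], [6.0, 20.0]])
ideal = F.min(axis=0)                 # best (minimum) value of each objective
nadir = F.max(axis=0)                 # stand-in for the nadir point estimate
print(normalize_objectives(F, ideal, nadir))
```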

NSGA-III solving the ZDT-3 problem
 Now we can place the reference points (▲) and draw the reference lines

[Figure: the reference points and reference lines]

NSGA-III solving the ZDT-3 problem
 Each reference line is a niche. Assign solutions to niches. Populate the least crowded niches first

[Figure: the solutions assigned to the niche (reference line) marked green]

NSGA-III solving the ZDT-3 problem
 Each reference line is a niche. Assign solutions to niches. Populate the least crowded niches first

[Figure: this niche contains many solutions from fronts FR0 up to FRn−1, so no solutions from FRn are added]

NSGA-III solving the ZDT-3 problem
 Each reference line is a niche. Assign solutions to niches. Populate the least crowded niches first

[Figure: this niche contains no solutions from fronts FR0 up to FRn−1, so a new solution from FRn is added]
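A simplified sketch of the niche-filling step: always take the FRn member closest to the reference line of the least crowded niche (the original algorithm picks a random member when the niche already has selected members; the array names are mine):

```python
import numpy as np

def fill_niches(niche_of_selected, niche_of_last_front, dist_last_front, n_needed):
    """Repeatedly pick the least crowded niche (fewest already-selected members)
    and add the FR_n member assigned to that niche that lies closest to its
    reference line.  Returns positions within the last front."""
    n_niches = int(max(niche_of_selected.max(), niche_of_last_front.max())) + 1
    counts = np.bincount(niche_of_selected, minlength=n_niches)
    available = list(range(len(niche_of_last_front)))
    chosen = []
    while len(chosen) < n_needed and available:
        niche = int(np.argmin(counts))                        # least crowded niche
        cands = [k for k in available if niche_of_last_front[k] == niche]
        if not cands:                                         # nothing left here:
            counts[niche] = counts.max() + 1                  # exclude this niche
            continue
        best = min(cands, key=lambda k: dist_last_front[k])   # closest to the line
        chosen.append(best)
        available.remove(best)
        counts[niche] += 1
    return chosen

niche_of_selected = np.array([0, 0, 1])        # niches of FR_0 .. FR_{n-1} members
niche_of_last_front = np.array([1, 2, 2, 0])   # niches of FR_n members
dist_last_front = np.array([0.2, 0.1, 0.3, 0.05])
print(fill_niches(niche_of_selected, niche_of_last_front, dist_last_front, 2))
```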

NSGA-III solving the DTLZ-7 problem
 The DTLZ-7 is a 3-objective minimization problem
 Discontinuous Pareto front

[Figure from the website of professor Carlos Artemio Coello Coello, PhD: http://delta.cs.cinvestav.mx/~ccoello/]

NSGA-III solving the DTLZ-7 problem
 Splitting into non-dominated fronts

[Figure: the population split into fronts (same legend as in the ZDT-3 example)]

NSGA-III solving the DTLZ-7 problem
 Shifting the ideal point to [0, 0]

[Figure: the ideal point marked]

NSGA-III solving the DTLZ-7 problem
 Shifting the ideal point to [0, 0]

[Figure: the translated population with the ideal point at the origin]

NSGA-III solving the DTLZ-7 problem
 Scaling using the extreme points and the nadir point

[Figure: the extreme points marked]

NSGA-III solving the DTLZ-7 problem
 Scaling using the extreme points and the nadir point

[Figure: the nadir point marked and the scaled population]

NSGA-III solving the DTLZ-7 problem
 The reference points (▲) and reference lines

[Figure: the reference points and reference lines]

NSGA-III solving the DTLZ-7 problem
 Assigning solutions to niches

[Figure: solutions assigned to the niche (reference line) marked green; this niche contains many solutions from fronts FR0 up to FRn−1, so no solutions from FRn are added]

NSGA-III solving the DTLZ-7 problem
 Adding more solutions taken from FRn

[Figure: a solution assigned to the niche (reference line) marked green; this niche contains one solution from fronts FR0 up to FRn−1, and a new solution from FRn is added]

Summary
 NSGA (Nondominated Sorting Genetic Algorithm)
  Solutions are sorted using the Pareto front ranking
  Solutions in FRi are preferred to those in FRj for i < j
  Fitness sharing is used within each FRi front to prioritize solutions in less crowded areas
 Pareto front ranking:
  FR0 = non-dominated specimens from the population P
  FR1 = non-dominated specimens from P \ FR0
  FR2 = non-dominated specimens from P \ (FR0 ∪ FR1)
  FR3 = …

Summary
 NSGA-II (Nondominated Sorting Genetic Algorithm II)
  Uses the crowding distance to ensure diversification of solutions along the Pareto front
  Selection performed using a binary tournament w.r.t. the non-dominated front number and the crowding distance
 NSGA-III (Nondominated Sorting Genetic Algorithm III)
  Dedicated to many-objective optimization (m ≥ 4)
  Based on the previous version: NSGA-II
  Emphasizes population members which are non-dominated yet close to a set of supplied reference points
  Uses niching based on reference lines

Summary
 MOEA/D (Multiobjective Evolutionary Algorithm Based on Decomposition)
  Decomposes a multiobjective optimization problem into Npop single-objective subproblems
  Objective functions for the subproblems are obtained using weight vectors λ^(i) for i = 1, …, Npop
  Uses a neighbourhood structure to transfer information between solutions to subproblems

