A Dynamic Adaptive Particle Swarm Optimization and Genetic Algorithm
Abstract
A dynamic adaptive particle swarm optimization and genetic algorithm is presented to solve constrained engineering
optimization problems. A dynamic adaptive inertia factor is introduced in the basic particle swarm optimization algo-
rithm to balance the convergence rate and global optima search ability by adaptively adjusting searching velocity during
search process. Genetic algorithm–related operators including a selection operator with time-varying selection probabil-
ity, crossover operator, and n-point random mutation operator are incorporated in the particle swarm optimization
algorithm to further exploit optimal solutions generated by the particle swarm optimization algorithm. These operators
are used to diversify the swarm and prevent premature convergence. Tests on nine constrained mechanical engineering design optimization problems with different kinds of objective functions, constraints, and design variables demonstrate the superiority of the dynamic adaptive particle swarm optimization and genetic algorithm over several other meta-heuristic algorithms in terms of solution quality, robustness, and convergence rate in most cases.
Keywords
Constrained engineering design optimization problems, continuous and discrete design variables, meta-heuristic, dynamic
adaptive, particle swarm optimization, genetic algorithm
Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License
(http://www.creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without
further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/
open-access-at-sage).
2 Advances in Mechanical Engineering
stochastic optimization algorithms, such as the particle swarm optimization (PSO) algorithm,2 genetic algorithm (GA),3–5 firefly algorithm,6 ant colony optimization,7 artificial bee colony (ABC),8 mine blast algorithm (MBA),9 simulated annealing (SA) algorithm,10 and biogeography-based optimization (BBO) algorithm,11 have been proposed to overcome these drawbacks. These stochastic optimization algorithms are usually meta-heuristic and inspired by physical and natural phenomena.

Among all these stochastic optimization algorithms, the PSO algorithm is widely applied to solve different engineering optimization problems, as it is efficient in computation, easy to implement, and reliable in searching for global optima.12–16 The PSO algorithm, first proposed by Kennedy and Eberhart,2 is based on social sharing of information between individuals in a group and originated from mimicking the flocking behavior of a swarm of fish and the schooling behavior of birds. The PSO algorithm is made up of a population of particles which move randomly within the parameter space. The position of each individual particle in the parameter space denotes a candidate solution of the design optimization problem. By changing the searching velocities and positions of particles, the optimal solution is found. The optima-searching ability of the PSO algorithm mainly relies on mutual interaction (social learning) and the influence of individual particles (cognitive learning). Particles move toward the current global best position of the swarm in each iteration. A particle can escape from a local optimum with the help of neighboring particles, but if most of its neighboring particles are confined to a local extreme point, it is attracted into the trap of the local optimum; as a result, premature convergence of the algorithm and the stagnation phenomenon17 occur. To overcome these drawbacks of the basic PSO algorithm, different improvements have been proposed. A descending dynamic inertia factor or accelerating factor is widely adopted to balance the convergence rate and space-searching ability of the PSO algorithm during the search process.16,18,19 Eberhart and Shi20 applied a random inertia weight factor to deal with dynamic systems. Clerc21 presented a constriction factor K to control the convergence velocity. Apart from time-varying inertia weights (TVIW), time-varying accelerating coefficients (TVAC) were also proposed and used to control the convergence rate and solution quality.22,23 A co-evolutionary particle swarm optimization (CPSO) was presented by He and Wang24 to solve constrained engineering optimization problems. They used a multiple-swarm technique to evolve decision solutions and adapt penalty factors. Later, Krohling and Coelho25 improved the CPSO by dynamically adjusting the accelerated coefficients, which satisfy a Gaussian probability distribution. Worasucheep26 presented a constrained PSO algorithm with a stagnation detection and dispersion mechanism to tackle real-world nonlinear and constrained engineering optimization problems. Yang and colleagues27,28 proposed an accelerated particle swarm optimization (APSO) algorithm based on the basic PSO algorithm, in which the velocity vector is removed and particle best positions are replaced by randomness. This algorithm greatly improves calculation efficiency and implementation convenience. However, it is easily trapped in premature convergence, particularly for problems with high nonlinearity, due to its deficiency of diversity.1 This disadvantage was remedied by Guedria1 by incorporating memories of individual particles into APSO, forming a new algorithm called improved adaptive particle swarm optimization (IAPSO).

To improve the swarm diversity and increase the convergence rate, many hybrid optimization algorithms with some operators or other algorithms incorporated into PSO have been proposed.29–34 Novitasari et al.29 proposed a hybrid algorithm that combines SA with the PSO algorithm to deal with constrained optimization problems. He and Wang30 proposed a similar hybrid algorithm to optimize a support vector machine. Wang and Yin31 introduced a ranking selection scheme into the basic PSO to automatically control the search performance of the swarm, which results in a new algorithm called ranking selection–based particle swarm optimization (RSPSO). The crossover or mutation operators used in GAs were largely adopted by researchers and combined with PSO to generate new algorithms, such as the modified particle swarm optimization (MPSO),32 quantum-behaved PSO using a mutation operator with Gaussian distribution (G-QPSO),33 straightforward particle swarm optimization (SPSO) with a logistic chaotic mutation operator,34 the self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients (HPSO-TVAC),22 and so on. These operators increase swarm diversity and prevent premature convergence and stagnation of the PSO algorithms. The hybrid optimization algorithms discussed above have been used to solve different specific engineering optimization problems.

In this work, a dynamic adaptive particle swarm optimization and genetic algorithm (DAPSO-GA) previously proposed by us in Zhu et al.35 is used to solve constrained engineering design optimization problems with different kinds of design variables. A dynamic adaptive inertia factor is used in the PSO algorithm to adjust its convergence rate and control the balance between global and local optima exploration. GA-related operators including a selection operator with time-varying selection probability, crossover operator, and n-point random mutation operator are incorporated into the PSO to further exploit the optimal solutions generated by the PSO-related algorithm. These operators are used
to diversify the swarm and prevent premature convergence. The remainder of this work is organized as follows. The DAPSO-GA for both continuous and discrete optimization problems with constraints is specifically introduced in section ''Introduction of the DAPSO-GA.'' In section ''Constrained engineering optimization problems,'' four benchmark constrained engineering optimization problems with continuous design variables and five with discrete or mixed design variables are used to evaluate the performance of the DAPSO-GA on real-world engineering optimization problems. Conclusions are drawn in section ''Conclusion.''

Introduction of the DAPSO-GA

The DAPSO-GA is a hybrid algorithm that combines the GA and PSO algorithm. Specifically, the GA-related operators including selection, crossover, and n-point random mutation operators are carefully incorporated into the PSO algorithm. These GA-related operators are used to diversify the swarm and further explore the possible optima based on the feasible solution provided by the PSO algorithm.

PSO-related algorithm

Basic PSO algorithm. The basic PSO algorithm is made up of a population of particles that are randomly spread within the parameter space. The position of each individual particle in the parameter space denotes a candidate solution of the design optimization problem. Each particle has a velocity and moves in the parameter space. The position and velocity of the particle i are adjusted in each iteration

v_i(l + 1) = ω v_i(l) + c_1 r_1 (P_i(l) - x_i(l)) + c_2 r_2 (P_g(l) - x_i(l))   (1)

x_i(l + 1) = x_i(l) + v_i(l + 1)   (2)

where x_i(l) and v_i(l) are the position and velocity of the particle at time step l, respectively; P_i(l) is the historical best position of the particle i so far, and P_g(l) is the global best position of the whole swarm up to time step l; r_1 and r_2 are random numbers in the range from 0 to 1; ω is an inertia factor; and c_1 and c_2 are two accelerating factors used to scale the influence of the best position of the particle i and the global best position of the swarm, respectively. To ensure convergence of the PSO algorithm, the two accelerating factors are constrained by13,16

0 < (c_1 + c_2) < 4,   (c_1 + c_2)/2 - 1 < ω < 1   (3)

The procedure of the basic PSO algorithm begins with population initialization of particles with random positions and velocities. The positions and velocities of each particle are then updated by equations (1) and (2). After that, the corresponding fitness of each particle is evaluated and ranked, and P_i(l) and P_g(l) are updated. The above procedure is repeated until an ending criterion is met. The ending criterion is usually the maximum number of iterations or a sufficiently low error bound.

PSO-related algorithm in the DAPSO-GA. A dynamic adaptive inertia factor ω_i(l) is introduced into the basic PSO to adaptively adjust its searching velocity during iterations

v_i(l + 1) = ω_i(l) v_i(l) + c_1 r_1 (P_i(l) - x_i(l)) + c_2 r_2 (P_g(l) - x_i(l))   (4)

where

ω_i(l) = ω_min + (ω_max - ω_min) sin(β_i(l)π/2) ∈ [ω_min, ω_max]   (5)

in which

β_i(l) = (f_i(l) - f_g(l)) / (f_w(l) - f_g(l)) ∈ [0, 1], i = 1, 2, ..., N   (6)

with f_i(l) being the fitness value of the ith particle in the lth iteration, and f_g(l) and f_w(l) being the best and worst fitness values of the swarm in the lth iteration, respectively; they satisfy f_g(l) <= f_i(l) <= f_w(l) and thus β_i(l) ∈ [0, 1]. The particles with the best and worst fitness values are called the best particle and worst particle in the swarm, respectively. From equations (5) and (6), the inertia factor is adaptively adjusted in the range [ω_min, ω_max] during iteration. The better the fitness value of a particle, the smaller its inertia factor. A large inertia factor represents a large searching velocity, and thus more of the solution space will be explored. In contrast, a small inertia factor helps the PSO algorithm further exploit the solution space around the best particle. Hence, this dynamic adjustment of the inertia factor can adaptively balance the convergence rate and global optima search ability of the PSO algorithm.

Each particle position x_i is limited to the range [x_min, x_max]. If x_i lies outside this range, it is replaced by

x_i^k = x_max^k if x_i^k > x_max^k, or x_min^k if x_i^k < x_min^k, k = 1, 2, ..., D   (7)

in which D is the particle dimension and x_i^k is the position of the ith particle in the kth dimension. Each particle velocity v_i(l) is limited to [v_min, v_max], in which
Figure 1. Flowchart of the crossover operator of the GA-related algorithm in the DAPSO-GA.
v_min = -(x_max - x_min)/2 and v_max = (x_max - x_min)/2. If the particle velocity violates this limit, it is replaced by

v_i^k = v_max^k if v_i^k > v_max^k, or v_min^k if v_i^k < v_min^k, k = 1, 2, ..., D   (8)

in which v_i^k is the velocity of the ith particle in the kth dimension.

GA-related algorithm in the DAPSO-GA

Selection operator. A particle of the swarm is selected for the crossover and mutation operations if its fitness value satisfies the selection criterion

0 < (f_i(l) - f_g(l)) / f_g(l) < η   (9)

where f_i(l) is the current fitness value of the ith particle at the lth iteration, f_g(l) is the best fitness value of the swarm that corresponds to its global best position, and η = η_max - l(η_max - η_min)/l_tot is the time-varying selection probability, which descends from η_max to η_min during the iteration process.
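The dynamic adaptive inertia factor and the clamped particle update of equations (2) and (4)–(8) can be sketched as follows. This is a minimal NumPy illustration under the parameter values reported later in Step 1 (ω_max = 0.7, ω_min = 0.4, c_1 = c_2 = 2); the function and variable names are ours, not the paper's:

```python
import numpy as np

def dynamic_inertia(f_i, f_g, f_w, w_min=0.4, w_max=0.7):
    # Equations (5)-(6): beta_i = (f_i - f_g)/(f_w - f_g) in [0, 1];
    # the better a particle's fitness, the smaller its inertia factor.
    beta = (f_i - f_g) / (f_w - f_g) if f_w > f_g else 0.0
    return w_min + (w_max - w_min) * np.sin(beta * np.pi / 2)

def update_particle(x, v, p_i, p_g, w_i, x_min, x_max,
                    c1=2.0, c2=2.0, rng=None):
    # Equations (2) and (4): velocity update with the adaptive inertia
    # factor w_i, followed by the position update.
    rng = rng if rng is not None else np.random.default_rng()
    r1, r2 = rng.random(), rng.random()
    v_new = w_i * v + c1 * r1 * (p_i - x) + c2 * r2 * (p_g - x)
    # Equations (7)-(8): clamp the velocity to [-(x_max - x_min)/2,
    # (x_max - x_min)/2] and the position to [x_min, x_max].
    v_lim = (x_max - x_min) / 2.0
    v_new = np.clip(v_new, -v_lim, v_lim)
    x_new = np.clip(x + v_new, x_min, x_max)
    return x_new, v_new
```

With this mapping, the best particle of the swarm (β = 0) moves with ω_min = 0.4 and exploits its neighborhood, while the worst particle (β = 1) moves with ω_max = 0.7 and keeps exploring.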
Crossover operator. Two sub-vectors (s_1 and s_2), bounded by two randomly generated integers e_1 and e_2, are picked out from the position vectors of the parents, where 0 < e_1 <= e_2 <= D. The components of the two selected sub-vectors are interchanged, and a new particle (offspring) is thereby generated.

Mutation operator. An n-point random mutation operator is used, where n is the mutation dimension (i.e. the number of components or genes of the selected particle or chromosome for mutation), which is a random integer in [1, D]. This means that there are in total n points (genes) in the selected particle (chromosome) to be changed via mutation. The procedure of the n-point random mutation operator is shown in Figure 2. First, the mutation dimension n of the selected particle is identified by n = round(rand × D), in which rand is a random number in [0, 1] and round is an operator to round off the product of rand and D. Second, n different integers (i.e. m_1, m_2, ..., m_n), which are limited to the range [1, D], are randomly generated. These integers represent the mutation positions in the position vector of the selected particle. Next, the values of the components of the selected particle position vector are randomly changed via the following equation

x_i^k = x_min^k + sin(rand × π/2)(x_max^k - x_min^k), k = m_1, m_2, ..., m_n   (10)

Implementation procedure of the DAPSO-GA algorithm

The flowchart of the DAPSO-GA is shown in Figure 3, and it is briefly described as follows:

Step 1: Set initial values of the optimization parameters, including the population size M, maximum number of generations (iterations) S, maximum and minimum inertia factors ω_max and ω_min, respectively, accelerating factors c_1 and c_2, maximum and minimum selection probabilities η_max and η_min, respectively, crossover probability p_c, and upper and lower limits of the position of each particle x_i^u and x_i^l, respectively. In this work, ω_max = 0.7, ω_min = 0.4, η_max = 0.7, η_min = 0.15, c_1 = c_2 = 2, and p_c = 0.5 are used.
Step 2: Initialize the swarm: randomly generate a swarm with a size of M; the initial position of each particle is given by

x_i(0) = x_i^l + rand × (x_i^u - x_i^l), i = 1, 2, ..., M   (11)

Step 3: Evaluate the fitness value of each initially generated particle and rank their positions. The initial best particle position P_i(0) and the initial global best and worst positions P_g(0) and P_w(0) of the swarm, respectively, are then identified.
Step 4: Update the current position x_i(l) and velocity v_i(l) of the ith particle according to equations (2) and (4)–(6).
Step 5: Evaluate the current fitness value of each particle, and update the best particle position P_i(l) and the global best and worst positions of the swarm P_g(l) and P_w(l), respectively.
Step 6: Generate new particles (offspring) according to the GA-related algorithm to diversify the swarm. If the GA-selection criterion in equation (9) is met, the crossover operator and n-point random mutation operator are applied to update the position of a selected particle to generate a new particle x_i(l), as presented in section ''Crossover and mutation operator.''
Step 7: Evaluate the fitness value of the new particle f_i(l) and compare it with the best and worst fitness values of the swarm f_g(l) and f_w(l), respectively. If f_i(l) < f_g(l), replace the best particle by it; otherwise, replace the worst particle by it if f_i(l) < f_w(l). Update the best particle position P_i(l) and the global best and worst positions of the swarm P_g(l) and P_w(l) if necessary.
Step 8: Repeat steps 4–7 until the termination criterion, which is a predefined number of iterations, is met, and then output the optimal results.

The DAPSO-GA described above treats all design variables as continuous and therefore cannot be directly applied to optimization problems with discrete variables. For discrete optimization problems, the DAPSO-GA can be modified using the rounding-off approach. In this approach, both the continuous and the discrete variables are treated as continuous variables during the optimization process. Only at the end of the optimization procedure are the discrete variables rounded off to evaluate the fitness value of each particle.
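A compact sketch of the GA-related operators (the selection criterion of equation (9), the sub-vector crossover of Figure 1, and the n-point random mutation of equation (10)). This is our illustrative NumPy code, not the authors' implementation; all names are ours:

```python
import numpy as np

def selection_probability(l, l_tot, eta_max=0.7, eta_min=0.15):
    # Time-varying selection probability, descending from eta_max to eta_min.
    return eta_max - l * (eta_max - eta_min) / l_tot

def selected_for_ga(f_i, f_g, l, l_tot):
    # GA-selection criterion of equation (9): 0 < (f_i - f_g)/f_g < eta.
    return 0.0 < (f_i - f_g) / f_g < selection_probability(l, l_tot)

def crossover(parent1, parent2, rng):
    # Sub-vector crossover: swap the components between two random
    # crossover points e1 <= e2 (with 0 < e1 <= e2 <= D) of the parents.
    D = parent1.size
    e1, e2 = np.sort(rng.integers(1, D + 1, size=2))
    child = parent1.copy()
    child[e1 - 1:e2] = parent2[e1 - 1:e2]
    return child

def n_point_mutation(x, x_min, x_max, rng):
    # Equation (10): redraw n = round(rand * D) randomly chosen components;
    # we clamp n to at least 1, since the paper takes n in [1, D].
    D = x.size
    n = max(1, round(rng.random() * D))
    positions = rng.choice(D, size=n, replace=False)  # m_1, ..., m_n
    x = x.copy()
    x[positions] = (x_min[positions]
                    + np.sin(rng.random(n) * np.pi / 2)
                    * (x_max[positions] - x_min[positions]))
    return x
```

In Step 6 of the procedure above, a particle passing `selected_for_ga` would be passed through `crossover` and then `n_point_mutation` to produce the new candidate.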
Table 1. Comparison of optimal solutions obtained from different optimization algorithms for tension/compression spring design
problem.
IAPSO: improved adaptive particle swarm optimization; APSO: accelerated particle swarm optimization; DV: design variable; G-QPSO: quantum-
behaved PSO using mutation operator with Gaussian distribution; MBA: mine blast algorithm; DAPSO-GA: dynamic adaptive particle swarm
optimization and genetic algorithm.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
Table 2. Comparison of statistical results obtained from different optimization algorithms for tension/compression spring design
problem.
SD: standard deviation; IAPSO: improved adaptive particle swarm optimization; APSO: accelerated particle swarm optimization; MBA: mine blast
algorithm; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm; PSO: particle swarm optimization; CPSO: co-
evolutionary particle swarm optimization; NFE: number of function evaluation; G-QPSO: quantum-behaved PSO using mutation operator with
Gaussian distribution; HPSO: hybrid particle optimization algorithm; PSO-DE: Particle swarm optimization with differential optimization.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
Table 3. Comparison of optimal solutions obtained from different optimization algorithms for the three-bar truss design problem.
DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm; DV: design variable; PSO-TVAC: Particle swarm optimization with
time-varying accelerating coefficients.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
Table 4. Comparison of statistical results obtained from different optimization algorithms for the three-bar truss design problem.
SD: standard deviation; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm; NFE: number of function evaluation.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
Figure 8. Convergence history of GA, standard PSO, and the proposed DAPSO-GA for the three-bar truss design problem.
Figure 9. Schematic diagram of the welded beam.
of other optimization algorithms. DSS-MDE provides the best solution with the largest NFEs (15,000). Thus, the superiority of the proposed DAPSO-GA for the three-bar truss structure design problem in solution quality and convergence rate is justified. Figure 8 shows the convergence history of GA, standard PSO, and the proposed DAPSO-GA for the three-bar truss structure design problem. It is seen that the DAPSO-GA converges faster to the near-optimal solution at early iterations and then gradually improves the solution accuracy, owing to the ability of the proposed algorithm to adaptively balance exploration and exploitation during the searching process.

Welded beam design. The welded beam design problem is a famous constrained optimization problem which is widely used as a benchmark problem to evaluate the performance of newly proposed optimization algorithms.9 Figure 9 shows the schematic diagram of a welded beam structure, which consists of a beam and a weld. The optimization target is the minimum fabrication cost of the beam subject to constraints on the bending and shear stress (σ and τ) on the bar, the buckling load (P_b), and its end deflection (δ). The design variables for this design problem are the weld thickness h, weld length l, beam width t, and beam thickness b, which are respectively denoted by x_1, x_2, x_3, and x_4 in the objective function and constraint equations as presented in Appendix 1
Table 5. Comparison of optimal solutions obtained from different optimization algorithms for the welded beam design optimization
problem.
IAPSO: improved adaptive particle swarm optimization; APSO: accelerated particle swarm optimization; MBA: mine blast algorithm; CPSO:
co-evolutionary particle swarm optimization; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm; DV: design variables.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
(section ‘‘Welded beam design’’). The DAPSO-GA with algorithms. The proposed algorithm can stably find the
a swarm size of 50 and maximum number of iterations best solution with almost the fewest NFEs (13,356)
of 5000 is used to solve this optimization problem. which is only larger than that of the IAPSO (12,500).
The optimization algorithms previously used to In terms of SD, the proposed algorithm has better
solve this design optimization problem include GA3,56 robustness in detecting the best solution than other
GA4,35 APSO, IAPSO, MBA, LCA, WCA, DE, SC, reported optimization algorithms apart from the
NM-PSO, PSO-DE, HPSO,29 CPSO,24 CAEP, GA1, IAPSO, MBA, LCA, hybrid particle swarm optimiza-
hybrid PSO-GA (HPSO),39 ABC2,40 and GA2. Table 5 tion and genetic algorithm (HPSO-GA), ABC2, and
presents the comparison of optimal solutions provided PSO-DE. Figure 10 shows the convergence history of
by the previously reported algorithms and proposed GA, standard PSO, and the proposed DAPSO-GA for
DAPSO-GA. From Table 5, a new optimal solution, the welded beam design problem. It is seen that the
which is better than those provided by previously pro- standard PSO and DAPSO-GA convergence faster
posed algorithms, is found by the proposed DAPSO- than GA, while the DAPSO-GA has better global opti-
GA with the objective function value of 1.6600473. mum searching ability.
Note that the optimal solution provided by CAEP is
infeasible as the constraints g1 (x) and g2 (x) are vio- Belleville disc spring design problem. As shown in Figure 11,
lated. Table 6 presents the comparison of statistical Belleville disc spring is made up of several conical discs
results provided by all previously reported algorithms with uniform rectangular cross-sections. The design
and proposed DAPSO-GA for the welded beam design objective of the Belleville disc spring is to minimize its
optimization problem in terms of the worst, mean, and total weight subject to geometric constraints concerns the
best solutions as well as the SD and NFEs. As seen outer and inner diameter, slope and height to maximum
from Table 6, DAPSO-GA provides better solutions height, and kinematic and strength constraints concerns
than the newly proposed optimization algorithm WCA, the compression deformation and stress and height to
MBA, and IAPSO as well as other optimization deformation. There are four design variables for this
Table 6. Comparison of statistical results obtained from different optimization algorithms for the welded beam design optimization
problem.
SD: standard deviation; APSO: accelerated particle swarm optimization; IAPSO: improved adaptive particle swarm optimization; MBA: mine blast
algorithm; CPSO: co-evolutionary particle swarm optimization; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
Table 7. Comparison of optimal solutions obtained from different optimization algorithms for the Belleville disc spring design
optimization problem.
MBA: mine blast algorithm; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
Table 8. Comparison of statistical results obtained from different optimization algorithms for the Belleville disc spring design
optimization problem.
SD: standard deviation; ABC: artificial bee colony; MBA: mine blast algorithm; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic
algorithm.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
Table 9. Comparison of optimal solutions obtained from different optimization algorithms for the speed reducer design
optimization problem.
MBA: mine blast algorithm; APSO: accelerated particle swarm optimization; IAPSO: improved adaptive particle swarm optimization; DAPSO-GA:
dynamic adaptive particle swarm optimization and genetic algorithm; MDE: modified differential evolution.
such as Gene AS1, Gene AS2, SC, ABC, MBA, the augmented Lagrangian (AL) method,62 the branch and bound (BB) method,63 APSO, IAPSO, and UPSO. Table 11 presents the comparison of optimal solutions provided by the previously reported algorithms and the proposed DAPSO-GA. According to the research of H Barbosa (September 1996, personal communication, San Francisco, CA), who computed all possible gear teeth combinations (49^4, or about 5.76 million), it can be validated that the optimal solutions provided by Gene AS1, ABC, and the proposed DAPSO-GA are globally best solutions, whereas SC, MBA, APSO, and IAPSO find a different best solution, as shown in Table 11.
Statistical results provided by the previously reported algorithms and the proposed DAPSO-GA for this design optimization problem are compared in terms of the worst, mean, and best solutions as well as the SD values and NFEs, as shown in Table 12. It is demonstrated that the proposed DAPSO-GA, MBA, and IAPSO are superior to the other algorithms in terms of both SD and NFEs. The mean, best, and worst solutions provided by these three algorithms are at the same level, and they stably converge to the best solution with similar computing efforts and SD values. Figure 16 shows the convergence history of the proposed DAPSO-GA for the gear train design problem.
Table 10. Comparison of statistical results obtained from different optimization algorithms for the speed reducer design
optimization problem.
SD: standard deviation; ABC: artificial bee colony; MBA: mine blast algorithm; APSO: accelerated particle swarm optimization; IAPSO: improved
adaptive particle swarm optimization; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
Table 11. Comparison of optimal solutions obtained from different optimization algorithms for the gear train design optimization
problem.
x1 49 33 43 49 43 33 45 43 43 49
x2 16 14 16 16 16 15 22 16 16 16
x3 19 17 19 19 19 13 18 19 19 19
x4 43 50 49 43 49 41 60 49 49 43
f (x) 2.7 E–12 1.4E–09 2.7E–12 2.7E–12 2.7E–12 2.1E–08 5.7E–06 2.7E–12 2.7E–12 2.7E–12
ABC: artificial bee colony; MBA: mine blast algorithm; AL: augmented Lagrangian; BB: branch and bound; APSO: accelerated particle swarm
optimization; IAPSO: improved adaptive particle swarm optimization; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic
algorithm.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
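The exhaustive check cited above is easy to reproduce. The gear train objective is not restated in this section, so the formula below is the usual formulation (squared error of the gear ratio against 1/6.931, teeth counts restricted to the 49 integers 12–60), evaluated at the reported optimum from Table 11:

```python
def gear_train_cost(x1, x2, x3, x4):
    # Standard gear train objective: squared error between the target
    # gear ratio 1/6.931 and the achieved ratio (x2 * x3) / (x1 * x4).
    return (1.0 / 6.931 - (x2 * x3) / (x1 * x4)) ** 2

# Reported global optimum (Table 11): x = (49, 16, 19, 43), f(x) = 2.7E-12.
f_best = gear_train_cost(49, 16, 19, 43)

# Swapping x1 and x4 leaves the product x1*x4 unchanged, which is why
# Table 11 lists several teeth combinations with the same objective value.
```

A brute-force scan over all 49^4 ≈ 5.76 million combinations of this function is what Barbosa's enumeration amounts to.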
Table 12. Comparison of statistical results obtained from different optimization algorithms for the gear train design optimization problem.
SD: standard deviation; MBA: mine blast algorithm; APSO: accelerated particle swarm optimization; IAPSO: improved adaptive particle swarm optimization; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.

Multiple disc clutch brake design problem. Figure 17 shows a schematic diagram of a multiple disc clutch brake. The design problem of the multiple disc clutch brake is a minimization problem which aims to minimize its total mass subject to geometrical constraints and constraints concerning the shear stress, temperature, relative speed of the slip–stick, and stopping time.64 The design variables for this design problem are the inner and outer radius (r_i and r_o), disc thickness (A), actuating force (F), and number of contact surfaces (Z), which are denoted by x_1, x_2, x_3, x_4, and x_5, respectively. The variable x_4 appears only in the constraint equations (a side constraint). All design variables are discrete and should be selected from x_1 = 60, 61, ..., 80; x_2 = 90, 91, ..., 110; x_3 = 1, 1.5, ..., 3; x_4 = 600, 610, ..., 1000; and x_5 = 2, 3, ..., 9. The discrete DAPSO-GA with a swarm size of 40 and a maximum number of iterations of 100 is used to solve this optimization problem. All design variables are regarded as continuous variables and are only rounded off at the end of the iterations. Besides, special treatments are applied to the discrete variables x_3 and x_4 in this algorithm: x_3 is regarded as a continuous variable limited to the range [2, 6] and divided by two after being rounded to an integer; x_4 is regarded as a continuous variable limited to the range [60, 100] and multiplied by 10 after being rounded to an integer.

This design optimization problem was previously studied by many researchers using different optimization algorithms such as the non-dominated sorting genetic algorithm (NSGA-II),65 TLBO, WCA, ABC, APSO, and IAPSO. Table 13 presents the comparison of optimal solutions provided by the earlier reported algorithms and the proposed DAPSO-GA. It is shown that the DAPSO-GA, IAPSO, WCA, and TLBO have the same objective function value of 0.31365661, although the values of the variable x_4 in the optimal solutions provided by these four algorithms are different. This is because x_4 only needs to satisfy the constraint conditions and is independent of the objective function. Statistical results provided by the previously reported algorithms and the DAPSO-GA for this design optimization problem are compared in Table 14. The statistical results demonstrate the superiority of the proposed DAPSO-GA over all the other optimization algorithms in both NFEs and SD value. APSO performs the worst among all algorithms in terms of solution quality (mean and best solutions), SD value
Table 13. Comparison of optimal solutions obtained from different optimization algorithms for the multiple disc clutch brake design
optimization problem.
x1 70 70 70 76 70 70
x2 90 90 90 96 90 90
x3 1.5 1.0 1.0 1.0 1.0 1.0
x4 1000 810 910 840 900 1000
x5 3 3 3 3 3 3
g1 (x) 0 0 0 0 0 0
g2 (x) 22.00 24.00 24.00 24.00 24.00 24.00
g3 (x) 0.90052816 0.91942781 0.90948063 0.92227317 0.91047534 0.90052816
g4 (x) 9790.5816 9830.3711 9809.4293 9824.2113 9811.5234 9790.5816
g5 (x) 7894.6966 7894.6966 7894.6966 7738.378 7894.6966 7894.6966
g6 (x) 60,625.0 37,706.25 49,768.75 48,848.372 48,562.5 60,625.0
g7 (x) 11,647.293 14,297.987 12,768.578 12,873.649 12,906.636 11,647.293
g8 (x) 3352.7067 702.0132 2231.4215 2126.3515 2093.3635 3352.7067
f (x) 0.4704 0.313656 0.313656 0.337181 0.31365661 0.31365661
APSO: accelerated particle swarm optimization; IAPSO: improved adaptive particle swarm optimization; DAPSO-GA: dynamic adaptive particle
swarm optimization and genetic algorithm.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
Table 14. Comparison of statistical results obtained from different optimization algorithms for the multiple disc clutch brake design
optimization problem.
SD: standard deviation; ABC: artificial bee colony; APSO: accelerated particle swarm optimization; IAPSO: improved adaptive particle swarm
optimization; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
Table 15. Comparison of optimal solutions obtained from different optimization algorithms for the pressure vessel design
optimization problem.
Algorithm   x1      x2      x3       x4        g1(x)       g2(x)       g3(x)       g4(x)      f(x)
GA1         0.8125  0.4375  42.0974  176.6540  −2.01E−03   −3.58E−02   −24.7593    −63.3460   6059.9463
GA2         0.8125  0.4375  42.0974  176.6540  −0.2E−05    −3.589E−02  −27.8861    −63.3460   6059.9463
CDE         0.8125  0.4375  42.0974  176.6376  −6.67E−07   −3.58E−02   −3.71051    −63.3623   6059.734
APSO        0.8125  0.4375  42.0974  176.6374  −9.54E−07   −3.59E−02   −63.3626    −0.9111    6059.7242
IAPSO       0.8125  0.4375  42.0974  176.6366  −4.09E−13   −3.58E−02   −1.39E−07   −63.3634   6059.7143
CPSO        0.8125  0.4375  42.0913  176.7465  −1.37E−06   −3.59E−04   −118.7687   −63.2535   6061.0777
MBA         0.7802  0.3856  40.4292  198.4694  0           0           −86.3645    −41.5035   5889.3216
NM-PSO      0.8036  0.3972  41.6392  182.412   3.65E−05    3.79E−05    −1.5914     −57.5879   5930.3137
G-QPSO      0.8125  0.4375  42.0984  176.6372  −8.79E−07   −3.58E−02   −0.2179     −63.3628   6059.7208
WCA         0.7781  0.3846  40.3196  200.0000  −2.95E−11   −7.15E−11   −1.35E−6    −40.00     5885.3327
HPSO-GA     0.7782  0.3846  40.3196  200.0000  0           0           −4.656E−10  −40        5885.3328
ABC2        0.7782  0.3847  40.3211  199.9802  −1.40E−06   −2.84E−06   −1.1418     −40.0197   5885.4033
DAPSO-GA    0.8125  0.4375  42.0984  176.6366  −4.09E−13   −3.58E−02   −1.39E−07   −63.3634   6059.7143
APSO: accelerated particle swarm optimization; IAPSO: improved adaptive particle swarm optimization; CPSO: co-evolutionary particle swarm
optimization; MBA: mine blast algorithm; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
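The rows of Table 15 can be spot-checked against the widely used pressure vessel formulation; the exact objective and constraint expressions below are an assumption (they are not restated in this excerpt), with feasibility meaning every g_i(x) ≤ 0.

```python
import math

def pressure_vessel(x1, x2, x3, x4):
    """Objective and constraints of the classical pressure vessel benchmark
    (assumed formulation; a design is feasible when every g <= 0)."""
    f = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
         + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = (
        -x1 + 0.0193 * x3,                 # minimum shell thickness
        -x2 + 0.00954 * x3,                # minimum head thickness
        -math.pi * x3**2 * x4 - (4.0 / 3.0) * math.pi * x3**3 + 1296000.0,  # volume
        x4 - 240.0,                        # length limit
    )
    return f, g

# DAPSO-GA row of Table 15
f, g = pressure_vessel(0.8125, 0.4375, 42.0984, 176.6366)
```

Evaluating the DAPSO-GA row reproduces f(x) ≈ 6059.71; note that g3 is extremely sensitive to the four-decimal truncation of x3 and x4, so its tiny reported value cannot be reproduced from the printed digits alone.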
Table 16. Comparison of statistical results obtained from different optimization algorithms for the pressure vessel design
optimization problem.
SD: standard deviation; PSO: particle swarm optimization; APSO: accelerated particle swarm optimization; IAPSO: improved adaptive particle swarm
optimization; MBA: mine blast algorithm; CPSO: co-evolutionary particle swarm optimization; DAPSO-GA: dynamic adaptive particle swarm
optimization and genetic algorithm.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
Table 17. Comparison of optimal solutions obtained from different optimization algorithms for the rolling element bearing design
optimization problem.
MBA: mine blast algorithm; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic algorithm.
Note: The boldfaced data in each table mean the best one among all the results provided by different algorithms.
different kinds of objective functions, design variables, and constraints in nature. The presented algorithm uses a dynamic adaptive inertia weighting factor, which adaptively adjusts the search velocity during the search process, to balance exploitation (local search) and exploration (global search). In the proposed algorithm, GA-related operators are incorporated into the PSO and used to refine the optimal solution provided by the PSO. A few particles in the swarm that meet the GA selection criterion with time-varying selection probability are adaptively selected to update their positions via a crossover and n-point mutation operator in each iteration. Global best and worst positions of the PSO are updated according to the refined particle positions generated by the GA. With the three GA-related operators, the particle swarm is greatly diversified and, as a result, premature convergence is effectively prevented. The promising prospect of the proposed
Table 18. Comparison of statistical results obtained from different optimization algorithms for the rolling element bearing design
optimization problem.
SD: standard deviation; ABC: artificial bee colony; MBA: mine blast algorithm; DAPSO-GA: dynamic adaptive particle swarm optimization and genetic
algorithm.
Note: The boldfaced data mean optimal results provided by the DAPSO-GA algorithm.
DAPSO-GA for engineering constrained optimization problems is evaluated by solving nine different benchmark mechanical engineering design optimization problems with continuous, discrete, or mixed design variables. For most of the considered mechanical engineering design optimization problems, statistical results show that the proposed DAPSO-GA converges to the best or a similar solution with the smallest SD values and the lowest computational effort (NFEs) compared with other meta-heuristic algorithms.

Figure 22. Convergence history of the proposed DAPSO-GA for the rolling element bearing design problem.

Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The authors gratefully acknowledge the financial

ORCID iDs
Hao Zhu https://orcid.org/0000-0003-3910-2947
Weidong Zhu https://orcid.org/0000-0003-2707-2533
Appendix 1

Tension/compression spring design problem

Minimize f(x) = (x_3 + 2) x_2 x_1^2
Subject to
g_1(x) = 1 − (x_2^3 x_3)/(71785 x_1^4) ≤ 0
g_2(x) = (4x_2^2 − x_1 x_2)/(12566(x_2 x_1^3 − x_1^4)) + 1/(5108 x_1^2) − 1 ≤ 0
g_3(x) = 1 − (140.45 x_1)/(x_2^2 x_3) ≤ 0
g_4(x) = (x_1 + x_2)/1.5 − 1 ≤ 0
where 0.05 ≤ x_1 ≤ 2.00, 0.25 ≤ x_2 ≤ 1.3, and 2.00 ≤ x_3 ≤ 15.00.
where
α = (6/(π ln K))((K − 1)/K)^2
β = (6/(π ln K))((K − 1)/ln K − 1)
γ = (6/(π ln K))((K − 1)/2)
P_max = 5400 lb, δ_max = 0.2 in, S = 200 kpsi, E = 30E06 psi, μ = 0.3, H = 2 in, D_max = 12.01 in, K = D_e/D_i, δ_l = f(a)a, and a = h/t.
Values of f(a) vary as detailed in Table 19.

a      ≤1.4   1.5   1.6   1.7   1.8   1.9   2.0   2.1   2.2   2.3   2.4   2.5   2.6   2.7   ≥2.8
f(a)   1      0.85  0.77  0.71  0.66  0.63  0.60  0.58  0.56  0.55  0.53  0.52  0.51  0.51  0.50
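Assuming the values of Table 19 apply as a piecewise-constant lookup (how values between the tabulated points are treated is not stated, so rounding up to the nearest tabulated threshold is an assumption here), the constants and f(a) can be coded as:

```python
import math

# Table 19 thresholds: f(a) for a = h/t (a <= 1.4 -> 1.00, ..., a >= 2.8 -> 0.50)
F_A = [(1.4, 1.00), (1.5, 0.85), (1.6, 0.77), (1.7, 0.71), (1.8, 0.66),
       (1.9, 0.63), (2.0, 0.60), (2.1, 0.58), (2.2, 0.56), (2.3, 0.55),
       (2.4, 0.53), (2.5, 0.52), (2.6, 0.51), (2.7, 0.51)]

def f_of_a(a):
    """Piecewise-constant f(a) from Table 19 (0.50 for a >= 2.8)."""
    for threshold, value in F_A:
        if a <= threshold:
            return value
    return 0.50

def belleville_constants(K):
    """Constants alpha, beta, gamma of the Belleville spring model,
    with K = De/Di as defined in the appendix."""
    c = 6.0 / (math.pi * math.log(K))
    alpha = c * ((K - 1.0) / K) ** 2
    beta = c * ((K - 1.0) / math.log(K) - 1.0)
    gamma = c * (K - 1.0) / 2.0
    return alpha, beta, gamma
```

For K = 2, for instance, this yields alpha ≈ 0.689, beta ≈ 1.220, and gamma ≈ 1.378.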
Appendix 2

where 2.6 ≤ x_1 ≤ 3.6; 0.7 ≤ x_2 ≤ 0.8; 17 ≤ x_3 ≤ 28; 7.3 ≤ x_4 ≤ 8.3; 7.3 ≤ x_5 ≤ 8.3; 2.9 ≤ x_6 ≤ 3.9; and 5.0 ≤ x_7 ≤ 5.5.
where
M_h = (2/3) μFZ (r_o^3 − r_i^3)/(r_o^2 − r_i^2)
P_rz = F/(π(r_o^2 − r_i^2))
v_rz = 2πn(r_o^3 − r_i^3)/(90(r_o^2 − r_i^2))
T = I_z πn/(30(M_h + M_f))
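The four quantities above can be evaluated directly; the function below mirrors the appendix symbols, and the numerical arguments in the example are arbitrary illustrative values, not the paper's clutch brake data.

```python
import math

def clutch_brake(ri, ro, F, Z, n, mu, Iz, Mf):
    """Friction torque M_h, pressure P_rz, rubbing speed v_rz, and stopping
    time T for the multiple disc clutch brake model in Appendix 2."""
    Mh = (2.0 / 3.0) * mu * F * Z * (ro**3 - ri**3) / (ro**2 - ri**2)
    Prz = F / (math.pi * (ro**2 - ri**2))
    vrz = 2.0 * math.pi * n * (ro**3 - ri**3) / (90.0 * (ro**2 - ri**2))
    T = Iz * math.pi * n / (30.0 * (Mh + Mf))
    return Mh, Prz, vrz, T

# Illustrative arguments chosen so each expression is easy to verify by hand
Mh, Prz, vrz, T = clutch_brake(0.0, 1.0, math.pi, 1, 30.0, 1.5, 1.0, 0.0)
```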