Article history: Received 5 March 2017; Revised 5 March 2018; Accepted 31 March 2018; Available online 9 April 2018.

Keywords: Evolutionary algorithms; Golden Section; Selection methods; Genetic algorithms (GA); Evolutionary strategies (ES); Genetic Programming (GP); Evolutionary computation

Abstract: Over millions of years, nature has developed patterns and processes with interesting characteristics. They have been used as inspiration for a significant number of innovative models that can be extended to solve complex engineering and mathematical problems. One of the most famous patterns present in nature is the Golden Section (GS). It defines a special proportion that allows the adequate formation, selection, partition, and replication of several natural phenomena. On the other hand, Evolutionary Algorithms (EAs) are stochastic optimization methods based on the model of natural evolution. One important process in these schemes is the selection operation, which exerts a strong influence on the performance of the search strategy. Different selection methods have been reported in the literature. However, all of them perform unsatisfactorily as a consequence of the deficient relation between elitism and diversity in their selection procedures. In this paper, a new selection method for evolutionary computation algorithms is introduced. In the proposed approach, the population is segmented into several groups. Each group involves a certain number of individuals and a probability of being selected, both determined according to the GS proportion. The individuals are therefore divided into categories where each group contains individuals of similar quality regarding their fitness values. Since every element inside a group has the same possibility of being chosen, the probability of selecting an individual depends exclusively on the group to which it belongs. Under these conditions, the proposed approach defines a better balance between elitism and diversity in the selection strategy. Numerical simulations show that the proposed method achieves better performance than other selection algorithms in terms of solution quality and convergence speed.

https://doi.org/10.1016/j.eswa.2018.03.064 · ISSN 0957-4174 · © 2018 Elsevier Ltd. All rights reserved.
184 E. Cuevas et al. / Expert Systems With Applications 106 (2018) 183–196
Table 1
Comparative overview of classical selection methods.

Proportional method (Holland, 1975): assigns a selection probability to each individual in terms of its relative fitness value. Drawback: problems with negative objective-function values or minimization tasks.

Tournament selection (Blickle et al., 1995): selects the best solution from a set of q different individuals. Drawback: low selective pressure that extensively promotes exploration but adversely punishes exploitation.

Linear ranking (Darrell, 1989): uses a linear function to map the ranking index of each solution to a selection probability. Drawback: solutions with the same fitness value can obtain very different selection probabilities, affecting the search strategy.
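The three classical operators in Table 1 can be sketched in a few lines. This is an illustrative Python sketch, not code from the paper: the function names and the ranking-pressure parameter s are my own, and fitness maximization is assumed.

```python
import random

def roulette(population, fitness):
    """Proportional (roulette) selection: probability ~ relative fitness.
    Assumes non-negative fitness values, which is exactly its weakness."""
    total = sum(fitness)
    r = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(population, fitness):
        acc += f
        if acc >= r:
            return ind
    return population[-1]

def tournament(population, fitness, q=2):
    """Tournament selection: best of q individuals drawn at random."""
    contenders = random.sample(range(len(population)), q)
    best = max(contenders, key=lambda i: fitness[i])
    return population[best]

def linear_ranking(population, fitness, s=1.5):
    """Linear ranking: selection probability is a linear function of rank.
    s in [1, 2] tunes the pressure; p_r = (2-s)/N + 2r(s-1)/(N(N-1))."""
    n = len(population)
    order = sorted(range(n), key=lambda i: fitness[i])  # worst..best
    probs = [(2 - s) / n + 2 * r * (s - 1) / (n * (n - 1)) for r in range(n)]
    pick = random.choices(order, weights=probs, k=1)[0]
    return population[pick]
```

Note how the sketch mirrors Table 1: roulette breaks if any fitness is negative, tournament with q = 2 exerts only mild pressure, and linear ranking assigns different probabilities to equal-fitness individuals because it ranks them arbitrarily.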
recombination interchanges essential characteristics of the search space among individuals. Selection (Bäck, 1992) conducts the search strategy towards promising regions of the search space by using the information currently available in the population. Mutation and recombination (Bäck, 1992) allow the exploration of the search space, while selection exploits the information already present in the population with the objective of improving it.

The balance between exploration and exploitation (Nandar & Ponnuthurai, 2015) in a search strategy can be interpreted as the conflicting action of increasing solution diversity while simultaneously refining the already-known solutions, which mainly maintain the best fitness values. In EAs, this balance is critical for achieving a good performance of the search strategy. Under such circumstances, the selection operator provides an important mechanism for modifying the exploration-exploitation relation (Bäck & Hoffmeister, 1991). Increasing the intensity with which high-fitness individuals are selected augments the exploitation of the search strategy (Bäck, 1994). On the other hand, decreasing the emphasis on selecting such individuals permits the selection of low-quality solutions. Under these conditions, the exploration of the optimization strategy is promoted (Baker, 1987).

Different selection techniques with different performance levels have been proposed in the literature. The most popular methods are the roulette method (Holland, 1975), tournament selection (Blickle, Thiele, & Eshelman, 1995) and linear ranking (Darrell, 1989). The proportional method assigns a selection probability to each individual according to its relative fitness value. This mechanism presents several flaws in the case of negative objective functions or minimization tasks (Noraini & Geraghty, 2011). In the tournament selection technique, the best solution of a set of q different individuals, randomly obtained from the whole population, is selected. The standard size of the tournament set is q = 2. Under tournament, it has been demonstrated that the selective pressure is very low, favoring exploration extensively and punishing the exploitation of the search strategy (Brad & Goldberg, 1995). Finally, the linear ranking method uses a linear function to map the ranking index of each solution to a selection probability. Although linear ranking maintains a good selective pressure, the method presents a critical difficulty. Since linear ranking assigns a selection probability to each solution depending on its ranking index, two solutions with the same fitness value can obtain very different selection probabilities. This inconsistency adversely affects the performance of the search strategy during the optimization (Blickle & Thiele, 1996). In general, most selection methods perform unsatisfactorily as a consequence of the deficient relation between elitism and diversity in their selection procedures. To provide an overview of all methods, we summarize their comparative features in Table 1.

A selection operator is entirely independent of the Evolutionary Algorithm (EA). In particular, any selection method can be modified or replaced regardless of the global structure or the rest of the operators used in a specific EA (De la Maza & Tidor, 1993). In spite of the importance of the selection operation for the performance of EAs, current research on the design or modification of selection operators has been practically overlooked.

The concept of selective pressure (Li et al., 2015; Pascal et al., 2011) has been extensively used in the literature to characterize the performance of a selection approach. The takeover time index (Goldberg & Deb, 1991) allows the selective pressure of a given selection method to be evaluated adequately. It quantifies the number of generations required by a selection method to fill the complete population with copies of the best initial solution. Initially, the population contains only a single best individual g = (g1, g2, ..., gd) and N - 1 worse elements; the selection method is then operated until the whole population contains N copies of g.

On the other hand, nature is an exciting and inexhaustible source of solutions to several biological problems, which were solved as a result of natural selection during millions of years of evolution (Julian, Olga, Nikolaj, Bowyer, & Pahl, 2006; Sonya & William, 2010). As functional entities, nature has suggested important patterns with interesting characteristics. They have been used as inspiration for a significant number of innovative models that can be extended to solve complex engineering and mathematical problems (Clark, Kok, & Lacroix, 1999). One of the most famous patterns present in nature is the Golden Section (GS) (Benavoli & Chisci, 2009; Newell & Pennybacker, 2013). It defines a special proportion that allows the adequate formation, selection, partition, and replication of several natural phenomena (Walser, 2001). It appears in a variety of schemes, including the geometry of crystals, the spacing of stems in plants, the proportion of body parts in animals, and the proportion of feature sizes in the human face (Dunlap, 1997). The GS, sometimes known as the golden ratio or golden number, has been studied widely and has attracted the interest of many scientific communities. As a result, its use has been extended to several disciplines such as architecture (Shekhawat, 2015), arts (Loai, 2012), engineering and industrial design (Lu, 2003), biology (Ciucurel, Georgescu, & Iconaru, 2018) and quantum mechanics (Coldea et al., 2010).

As an alternative to traditional selection methods, this paper introduces a new, simple selection method for EAs. In the proposed method, the population is segmented into several groups. The number of individuals and the selection probability of each group are determined so that the proportion between two consecutive groups maintains the GS. The proportions are assigned such that the group with the highest proportion of individuals corresponds to the smallest probability of being selected. This group assembles the elements with the worst fitness quality of the population. In contrast, the group with the lowest proportion of elements is associated with the highest probability of being chosen. Such a group contains the best individuals in the population. Since every element inside a group has the same possibility of being chosen, the probability of selecting an individual depends exclusively on the group to which it belongs. Numerical simulations show that the proposed method achieves better performance than other selection algorithms with regard to solution quality and convergence speed. The main contributions of this research are:
Fig. 6. Division process of the individual space of Pk considering (b) two, (c) three and (d) four sections, in terms of the objective function presented in (a).
yellow squares, remain without change. With the distributions presented in Fig. 6(b)-(d), it is clear that the more groups are set, the more elitist the selection method becomes.

On the other hand, Fig. 7 presents the probability distribution of each group configuration. In Fig. 7(a) the selection probabilities of groups g_1^2 and g_2^2 are shown. From the figure, it is clear that the best individuals maintain a high probability of being selected in comparison with g_2^2. Fig. 7(b) considers the probability distribution in the case of three sections. Under this configuration, the best individuals, stored in g_1^3, also have the maximal probability of being selected. However, the probabilities of groups g_2^3 and g_3^3 are obtained by sub-dividing the probability assigned to g_2^2 (g_2^2 = g_2^3 + g_3^3). Therefore, the probability of g_2^2, which includes the worst elements of Pk, is sub-divided into two new probability sections. In the case of segmenting Pk into four sections, the probability proportions are distributed according to Fig. 7(c). Similar to the cases of two and three groups, the first group g_1^4 maintains the best selection frequency. On the other hand, the selection probability of g_1^4 is further reduced in order to assign higher probabilities to the groups that contain individuals of better quality.

Fig. 7. Division process of the probability space considering (a) two, (b) three and (c) four sections.

Different advantages and disadvantages of the proposed approach can be identified through the comparison of similarities and differences with other selection strategies. As a summary, we can enumerate the strengths (S) and weaknesses (W) of the proposed method as follows:

1. (S) The proposed method presents a better balance between elitism and diversity of the selection strategy.
2. (S) The structure of the algorithm is simple and fast. Once the number of groups is defined, the number of individuals and their selection probabilities are fixed, without requiring any other operation during the optimization process.
3. (W) The selection of an individual involves two different probabilistic decisions: one for selecting the group and another for extracting the individual inside the group. This fact can be understood as a disadvantage, considering that all other selection methods use only one probabilistic determination.
4. (W) The proposed approach incorporates a configuration parameter m which specifies the number of groups. Its definition can be considered a disadvantage, since all the other selection methods operate automatically without specifying an extra parameter. (S) On the other hand, its use allows modifying the association between elitism and diversity of the selection strategy. Under such conditions, it can be employed to fulfill the exploration and exploitation requirements of specific optimization problems.

5. Experimental results

In this paper, the proposed method based on the GS is used to select individuals from a population. The method is also evaluated in comparison with other selection approaches popularly employed in evolutionary algorithms. In the experiments, we have applied the GS method to different selection schemes, whereas its results are also compared to those produced by the roulette method (Holland, 1975) and the tournament selection approach (Blickle et al., 1995; Clark et al., 1999). In the case of the roulette method, a positive constant is added to each fitness value in order to avoid negative probabilities. On the other hand, the tournament selection approach is set with q = 2. This configuration is considered the most popular for selection purposes.

The experimental results are divided into four sub-sections. In the first (Section 5.1), the performance of the selection methods is compared regarding solution quality and convergence time. In the second (Section 5.2), the takeover time index is used to characterize the selective pressure of each method. Then, in the third (Section 5.3), the methods are analyzed using the concepts of differential selection and response to selection. Finally, in the fourth (Section 5.4), an analysis of the experimental results is presented.

5.1. Solution quality and convergence

In this section, the performance of the selection methods regarding solution quality and convergence time is compared. In the comparisons, a generic Genetic Algorithm (GA) is employed to solve several optimization problems represented by a set of benchmark functions. In the experiments, each selection method is used as the selection operator in a genetic algorithm, whereas its crossover and mutation operations remain unchanged.

The employed GA is set according to the following parameters: the population size (N) is 100, the crossover probability is 0.60, and the mutation probability is 0.10. According to (Hamzaçebi, 2008), such values present an acceptable performance. As termination criterion, the maximum number of iterations is considered, which has been set to 500 (gen = 500). This stop criterion has been selected to maintain compatibility with similar works reported in the literature (Shilane, Martikainen, Dudoit, & Ovaska, 2008).

In the comparison, a representative set of 7 functions, collected from Refs. (Ali et al., 1995; Chelouah & Siarry, 2000), has been considered to test the performance of the proposed selection method. Table 3 presents the test functions employed in the simulations. In Table 3, d represents the dimension of the function, f(x*) characterizes the minimum value of the function at location x*, and S is the search space. In the tests, all functions are operated in 30 dimensions (d = 30).

In the simulations, two versions of the proposed GS method are considered: with 3 sections (GS3) and with 4 sections (GS4). Such techniques are compared with the roulette method (Holland, 1975) and the tournament selection approach (Blickle et al., 1995) considering the functions from Table 3. To minimize the stochastic effect of the results, each benchmark function is executed independently 30 times. The results, presented in Table 4, report the averaged best fitness values f̄ and the standard deviations (σf) obtained through the 30 executions.

The best results in Table 4 are highlighted. According to Table 4, the approach GS3 provides better performance than GS4, tournament, and roulette for all functions. These performance differences are directly related to a better trade-off between elitism and diversity produced by the formulated selection method. Fig. 8 graphically presents the evolution of the average results of each method for functions (a) f1, (b) f3, (c) f4 and (d) f7. With the visualization of these graphs, the objective is to evaluate the velocity with which a compared method reaches the optimum. An analysis of Fig. 8 shows that the proposed methods with 3 or 4 sections are faster in reaching their optimal fitness values than the tournament and roulette techniques.

The comparison of the final fitness values cannot completely describe the performance of a selection algorithm. Therefore, in this part, a convergence test on the four compared algorithms has been conducted. In the experiments, the convergence time t is assessed as follows:

t = min(tb, gen),   (6)
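The grouped selection mechanism evaluated in these experiments can be sketched as follows. This is an illustrative Python sketch, not the paper's implementation: the exact GS partition formulas are given in the paper's earlier sections (not reproduced in this excerpt), so the group sizes here are an assumption built from powers of the golden ratio φ, with selection probabilities inverted relative to size as the text describes.

```python
import random

PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.618

def gs_partition(n, m):
    """Split n individuals into m groups whose sizes keep the GS proportion:
    each group is ~PHI times larger than the next (better) group.
    Illustrative only; the paper's exact partition rule may differ."""
    weights = [PHI ** (m - 1 - i) for i in range(m)]  # worst group largest
    total = sum(weights)
    sizes = [max(1, round(n * w / total)) for w in weights]
    sizes[0] += n - sum(sizes)  # absorb rounding error in the largest group
    return sizes  # sizes[0]: worst/largest group ... sizes[-1]: best/smallest

def gs_select(population, fitness, m=3):
    """Two-stage selection: first pick a group (the smallest, fittest group is
    the most likely), then pick uniformly inside the chosen group."""
    order = sorted(range(len(population)), key=lambda i: fitness[i])  # worst..best
    sizes = gs_partition(len(population), m)
    groups, start = [], 0
    for s in sizes:
        groups.append(order[start:start + s])
        start += s
    group_weights = [PHI ** i for i in range(m)]  # best group: highest weight
    g = random.choices(groups, weights=group_weights, k=1)[0]
    return population[random.choice(g)]
```

The uniform draw inside the chosen group reflects the paper's statement that every element of a group has the same possibility of being chosen, so the selection probability of an individual depends only on its group.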
Table 3
Benchmark functions used in the experimental study.
Table 4
Comparison of solution quality among the selection methods.
where gen represents the maximum number of allowed generations and tb is the generation number in which the best fitness value has been found.

In the simulations, the algorithms are executed 30 times over seven functions. Under such conditions, 210 different executions are conducted for each selection method. To evaluate the convergence of the selection methods, a study of the produced convergence times is conducted. In the study, the frequency of the resulting 210 convergence times is analyzed by using histograms. Fig. 9 presents the resulting histograms for (a) GS4, (b) GS3, (c) Tournament and (d) Roulette. In each histogram, the complete number of generations gen is divided into intervals of approximately 50 iterations. For the sake of clarity, the intervals that do not contain more than ten events have been deleted from the histograms. Fig. 9(a) exhibits the convergence time distribution for the case of the GS with four sections. According to this figure, the frequency peak is reached in the interval from 225 to 275 generations, whereas the average convergence time is located at approximately 290 iterations. Fig. 9(b) presents the convergence time distribution for the case of the GS with three sections. An analysis of this histogram shows that the maximal frequency is attained in the interval from 170 to 210 generations, whereas the average convergence time is located at 160 iterations. On the other hand, Fig. 9(c) and (d) show the convergence time distributions for the Tournament and Roulette methods. From the figures, it is clear that both methods present average convergence times comparatively higher than those produced by the GS approaches.

5.2. Takeover time

The concept of takeover time, introduced by Goldberg and Deb (1991), has been used to evaluate the selective pressure and convergence properties of a selection method in the absence of any other evolutionary operation. Assuming a population Pk ({p_1^k, p_2^k, ..., p_N^k}) of N candidate solutions with a single best element pbest, the takeover time is defined as the number of generations GT spent by a selection method until the rest of the N - 1 individuals consists of copies of pbest.

Therefore, considering that E(k) represents the number of copies of pbest in Pk at iteration k:

E(k) = |{Ne | p_Ne^k = pbest, Ne = 1, ..., N}|   (7)

The takeover time GT is calculated as follows:

GT = min {k | E(k) = N}   (8)

Small GT values define a high selective pressure, while large values characterize a low selective pressure of the search strategy. To evaluate the takeover time of each selection method, an experiment has been conducted. In the test, a population Pk of 400 (N = 400) individuals is randomly initialized, containing a single best individual pbest. Then, the selection method is applied to the population, and the empirical takeover times GT are registered. It is important to remark that the process does not consider the effect of any other evolutionary operation such as crossover or mutation.

In the simulations, two versions of the proposed GS method are considered: with 3 sections (GS3) and with 4 sections (GS4). Such techniques are compared with the roulette method (Holland, 1975) and the tournament selection approach (Blickle et al., 1995). Table 5 presents the takeover times produced by each selection method. The table also reports the execution time at which GT takes place. To minimize the stochastic effect, the results consider the average values obtained from 30 executions of each method. An analysis of
Fig. 8. Fitness average results obtained during the evolution process for functions (a) f1, (b) f3, (c) f4 and (d) f7.

Fig. 9. Resulting histograms for (a) GS4, (b) GS3, (c) Tournament and (d) Roulette.
Table 5
Takeover times produced by each selection method.

Algorithm 2. Pseudo code of the tournament selection method.
Table 5 suggests that the GS3 method maintains the best performance in comparison with the other selection methods. From the results, it is clear that the Tournament and Roulette methods present the worst indexes regarding their takeover and execution times.

Fig. 10 shows the comparative dynamics of E(k) for each method through the takeover time test. Each curve represents the averaged values obtained from a set of 30 independent executions. Since the takeover times of all approaches are located within the first 50 generations, Fig. 10 shows only 50 of the 500 generations used in the evolution process. From Fig. 10, it is clear that the GS3 method is the fastest in filling Pk with pbest (around 5 generations). On the other hand, the Roulette and Tournament approaches obtain the worst performance.
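The takeover-time experiment of Section 5.2 can be reproduced in miniature. The sketch below is illustrative (names and the smaller population size are my own): binary tournament stands in for the selection method under test, and the GS operator or any other selection function would plug in the same way. Following the definition in Eqs. (7)-(8), the loop applies selection alone, with no crossover or mutation, and returns the generation at which E(k) = N.

```python
import random

def takeover_time(select, n=100, max_gen=10_000):
    """Empirical takeover time (Goldberg & Deb, 1991): generations needed for a
    selection operator alone to fill the population with copies of the single
    best initial individual. If selection stochastically loses the best
    element, E(k) can never reach N and max_gen is returned."""
    population = [0.0] * (n - 1) + [1.0]  # one best element, N-1 worse ones
    for k in range(1, max_gen + 1):
        population = [select(population) for _ in range(n)]
        if population.count(1.0) == n:    # E(k) = N  ->  G_T = k
            return k
    return max_gen

def binary_tournament(population):
    """q = 2 tournament; here each individual's value is its own fitness."""
    a, b = random.sample(population, 2)
    return max(a, b)

random.seed(0)
gt = takeover_time(binary_tournament, n=50)  # small G_T => strong pressure
```

A stronger selection scheme fills the population faster and yields a smaller G_T, which is exactly the comparison reported in Table 5 and Fig. 10.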
To show the performance of the proposed approach, several experiments have been conducted. In the study, it has been compared to other selection methods such as the tournament and roulette schemes. The results demonstrate that the proposed GS method outperforms the other algorithms in most of the experiments in terms of solution quality and convergence properties.

In general, the main contributions of this paper can be summarized as follows: the design of a new selection technique based on the GS that defines a better balance between elitism and diversity in comparison to other selection methods currently in use; its effective implementation in order to maintain a simple structure and fast performance; the integration of an experimental scheme for the evaluation of selection methods that includes benchmark functions and consistent performance indexes; and the application of the proposed selection method to improve the search strategy of Genetic Algorithms (GA).

This paper presents an attempt to enhance the performance of evolutionary algorithms through the modification of their selection strategy, and much work remains as future research to improve the performance of the proposed selection method. The main directions that deserve further research include:

• To characterize the performance of the proposed method in other evolutionary approaches. In this paper, extensive work has been conducted to evaluate the GS selection strategy in Genetic Algorithms (GA). Further work is necessary to prove the capacities of the proposed technique in other evolutionary schemes such as Evolutionary Strategies and Genetic Programming. With the inclusion of the GS selection method in such algorithms, they could substantially improve their performance as search strategies.

• To examine the abilities of the GS selection technique in dynamic optimization problems. It is well known that dynamic optimization problems require an appropriate diversity of solutions due to the constant changes in the objective function. Under such conditions, the proposed method could provide good results through the definition of an adequate number of groups m.

• To design a mechanism for the adaptation of the proposed GS method during the evolution. In this paper, the selection strategy maintains its operation without changes during the optimization process. However, a further step is for the selection strategy to experience changes in the association between elitism and diversity as required by the optimization strategy.

• To define new performance indexes that allow the capacities of a selection method to be evaluated correctly. The takeover time and the response index R(k) do not directly reflect the effect of the selection method on the search strategy.

References

Ali, M., Khompatraporn, C., & Zabinsky, Z. (1995). A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems. Journal of Global Optimization, 31(4), 635–672.
Bäck, T. (1992). The interaction of mutation rate, selection, and self-adaptation within a genetic algorithm. PPSN, 87–96.
Bäck, T. (1994). Selective pressure in evolutionary algorithms: A characterization of selection mechanisms. In International conference on evolutionary computation (pp. 57–62).
Bäck, T., & Hoffmeister, F. (1991). Extended selection mechanisms in genetic algorithms. ICGA, 92–99.
Baker, J. (1987). Reducing bias and inefficiency in the selection algorithm. In J. J. Grefenstette (Ed.), Proceedings of the second international conference on genetic algorithms and their application (pp. 14–21). Hillsdale, NJ: L. Erlbaum Associates Inc.
Benavoli, A., & Chisci, L. F. (2009). Fibonacci sequence, Golden Section, Kalman filter and optimal control. Signal Processing, 89, 1483–1488.
Blickle, T., & Thiele, L. (1995). A mathematical analysis of tournament selection. In L. J. Eshelman (Ed.), Proceedings of the 6th international conference on genetic algorithms (pp. 9–16). San Francisco, CA: Morgan Kaufmann Publishers Inc.
Blickle, T., & Thiele, L. (1996). A comparison of selection schemes used in evolutionary algorithms. Evolutionary Computation, 4(4), 361–394. doi:10.1162/evco.1996.4.4.361.
Bliss, P. M., & Brown, D. A. (2009). Geometric properties of three-dimensional fractal trees. Chaos, Solitons and Fractals, 42, 119–124.
Brad, L. Miller, & Goldberg, D. (1995). Genetic algorithms, tournament selection and the effects of noise. Complex Systems, 9, 193–212.
Chelouah, R., & Siarry, P. (2000). A continuous genetic algorithm designed for the global optimization of multimodal functions. Journal of Heuristics, 6(2), 191–213.
Ciucurel, C., Georgescu, L., & Iconaru, E. (2018). ECG response to submaximal exercise from the perspective of Golden Ratio harmonic rhythm. Biomedical Signal Processing and Control, 40, 156–162.
Clark, G., Kok, R., & Lacroix, R. (1999). Mind and autonomy in engineered biosystems. Engineering Applications of Artificial Intelligence, 12, 389–399.
Coldea, R., Tennant, D., Wheeler, E., Wawrzynska, E., Prabhakaran, D., Telling, D., et al. (2010). Quantum criticality in an Ising chain: Experimental evidence for emergent E8 symmetry. Science, 327(5962), 177–180.
Darrell, L. W. (1989). The GENITOR algorithm and selection pressure: Why rank-based allocation of reproductive trials is best. In J. D. Schaffer (Ed.), Proceedings of the 3rd international conference on genetic algorithms (pp. 116–123). San Francisco, CA: Morgan Kaufmann Publishers Inc.
De Jong, K. (2006). Evolutionary computation: A unified approach. Cambridge, MA: MIT Press.
De la Maza, M., & Tidor, B. (1993). An analysis of selection procedures with particular attention paid to proportional and Boltzmann selection. In Proceedings of the 5th international conference on genetic algorithms (pp. 124–131).
Dunlap, R. A. (1997). The golden ratio and Fibonacci numbers. Singapore: World Scientific Publishing Co. Pte. Ltd.
Esmaeili, M., Gulliver, T. A., & Kakhbod, A. (2009). The Golden mean, Fibonacci matrices and partial weakly super-increasing sources. Chaos, Solitons and Fractals, 42, 435–440.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization and machine learning. Addison-Wesley.
Goldberg, D. E., & Deb, K. (1991). A comparative analysis of selection schemes used in genetic algorithms. In G. J. E. Rawlins (Ed.), Foundations of genetic algorithms (pp. 69–93).
Grefenstette, J., & Baker, J. (1989). How genetic algorithms work: A critical look at implicit parallelism. In D. Schaffer (Ed.), Proceedings of the third international conference on genetic algorithms (pp. 20–27). Morgan Kaufmann Publishers.
Hamzaçebi, C. (2008). Improving genetic algorithms' performance by local search for continuous function optimization. Applied Mathematics and Computation, 196(1), 309–317.
Hancock, P. (1994). An empirical comparison of selection methods in evolutionary algorithms. In T. C. Fogarty (Ed.), Evolutionary computing: AISB workshop, Leeds, U.K., April 11–13, 1994, selected papers (pp. 80–94). Berlin Heidelberg: Springer.
Holland, J. (1975). Adaptation in natural and artificial systems. Ann Arbor, MI: The University of Michigan Press.
Iosa, M., Morone, G., & Paolucci, S. (2018). Phi in physiology, psychology and biomechanics: The golden ratio between myth and science. BioSystems, 165, 31–39.
Jackson, P. (1998). Introduction to expert systems (3rd ed.). Addison Wesley. ISBN 978-0-201-87686-4.
Julian, F. V. V., Olga, A. B., Nikolaj, R. B., Bowyer, A., & Pahl, A. K. (2006). Biomimetics: Its practice and theory. Journal of The Royal Society Interface, 3, 471–482.
Kenneth, F. (2003). Fractal geometry: Mathematical foundations and applications. John Wiley & Sons. ISBN 0-470-84862-6.
Li, F. Z., Zhou, C. X., He, R., Xu, Y., & Yan, M. L. (2015). A novel fitness allocation algorithm for maintaining a constant selective pressure during GA procedure. Neurocomputing, 148, 3–16.
Loai, M. D. (2012). Geometric proportions: The underlying structure of design process for Islamic geometric patterns. Frontiers of Architectural Research, 1, 380–391.
Lu, Y. (2003). A Golden Section approach to optimization of automotive friction materials. Journal of Materials Science, 38(5), 1081–1085.
Mandelbrot, B. B. (1982). The fractal geometry of nature. New York: W. H. Freeman and Company.
Mühlenbein, H., & Schlierkamp-Voosen, D. (1993). Predictive models for the breeder genetic algorithm: Continuous parameter optimization. Evolutionary Computation, 1(1), 25–49.
Nandar, L., & Ponnuthurai, N. S. (2015). Heterogeneous comprehensive learning particle swarm optimization with enhanced exploration and exploitation. Swarm and Evolutionary Computation, 24, 11–24.
Newell, A. C., & Pennybacker, M. (2013). Fibonacci patterns: Common or rare? Procedia IUTAM, 9, 86–109.
Noraini, M. R., & Geraghty, J. (2011). Genetic algorithm performance with different selection strategies in solving TSP. In Proceedings of the World Congress on Engineering 2011, Vol. II, WCE 2011, July 6–8.
Pascal, C., Vassiliev, S., Houghten, S., & Bruce, D. (2011). Genetic algorithm with alternating selection pressure for protein side-chain packing and pK(a) prediction. Biosystems, 105(3), 263–270.
Ramírez, J. L., Rubiano, G. N., & De Castro, R. (2014). A generalization of the Fibonacci word fractal and the Fibonacci snowflake. Theoretical Computer Science, 528, 40–56.
Shekhawat, K. (2015). Why golden rectangle is used so often by architects: A mathematical approach. Alexandria Engineering Journal, 54, 213–222.
Shilane, D., Martikainen, J., Dudoit, S., & Ovaska, S. (2008). A general framework for statistical performance comparison of evolutionary computation algorithms. Information Sciences, 178, 2870–2879.
Sigalotti, L., & Mejias, A. (2006). The golden ratio in special relativity. Chaos, Solitons and Fractals, 30, 521–524.
Sonya, Q., & William, G. (2010). Bionics—An inspiration for intelligent manufacturing and engineering. Robotics and Computer-Integrated Manufacturing, 26, 616–621.
Walser, H. (2001). The Golden Section. The Mathematical Association of America.