Received: 13 April 2024
Accepted: 05 August 2024
DOI: https://doi.org/10.19053/01211129.v33.n69.2024.17895
ABSTRACT: Two distinctive behaviors of the cuckoo bird have inspired several metaheuristic algorithms for solving continuous optimization problems. In addition to the well-known parasitic breeding behavior that gave rise to several cuckoo search (CS) algorithms, another behavior, related to their grouping and the way they locate food sources, has given rise to the COA algorithm. As a result, there are several variants for solving continuous optimization problems; however, it is necessary to define which one is the most suitable under specific requirements. This paper compares six of these algorithms, including CS+LEM (proposed in this paper), a hybridization of the CS algorithm with Learnable Evolution Models (LEM) using an approach known as "metaheuristics enhanced by artificial intelligence". Three assessments were performed using a set of 61 continuous test functions: 1) the optimal value achieved with a fixed execution time; 2) the number of objective function evaluations required to reach the global optimum; and 3) the optimal value achieved with a fixed number of objective function evaluations. CS+LEM presents the best results in assessment 1, while COA presents the best results in assessments 2 and 3. The results were analyzed using the Friedman and Wilcoxon nonparametric statistical tests.
Keywords: Artificial intelligence, Cuckoo search algorithm, large-scale continuous problems, metaheuristics, optimization.
1. INTRODUCTION
Metaheuristic algorithms are general-purpose algorithms used to find solutions to specific problems. Metaheuristics are considered high-level strategies that combine low-level techniques and tactics to explore and exploit the search space [1]. Among the best-known metaheuristics are the Genetic Algorithm (GA) [2], the Memetic Algorithm [3], Tabu Search [4], Ant Colony Optimization [5], Particle Swarm Optimization (PSO) [6], Grey Wolf Optimization [7], the Firefly Algorithm [8]-[9], Differential Evolution (DE) [10], and Harmony Search algorithms [11]-[12].
In recent years, optimization algorithms have become one of the most essential tools in science and engineering, especially when minimizing time, costs, materials, and space; therefore, this area is currently the object of scientific research and development aimed at improving productivity, cost, and processing time [13].
Cuckoo Search (CS) is a metaheuristic algorithm based on the intriguing breeding behavior (brood parasitism) of certain species of cuckoo birds, combined with the Lévy flight behavior of some birds and fruit flies [14]. The success of this algorithm over Genetic Algorithms (GA) and Particle Swarm Optimization (PSO) has made it useful for engineering optimization problems such as the design of springs, the design of beam structures, and data fusion in wireless sensor networks [14]-[15]. The advantage of CS over GA and PSO lies in its balance of randomization and intensification, and in the smaller number of parameters to be controlled [14]. The Cuckoo Optimization Algorithm (COA) is another metaheuristic inspired by cuckoo birds, but in this case based on their lifestyle (the immigration of societies or groups); it has shown promising results in continuous optimization problems. This paper compares six cuckoo-inspired algorithms to help solution designers select the appropriate algorithm for a specific optimization problem.
Three evaluation scenarios were used: 1) the optimal value achieved with a fixed execution time, which applies to applications that need the best possible solution in a very short execution time (online problems); 2) the number of objective function evaluations (OFE) required to reach the global optimum, designed to identify the cuckoo-inspired algorithm that can solve (find optimal solutions for) the greatest number of problems without restrictions on execution time (offline optimization problems); and 3) the optimal value achieved with a fixed number of OFEs, where the behavior between online and offline applications is analyzed as a function of the number of OFEs. This last test is currently the least used by the research community because algorithms can spend a variable amount of time between objective function evaluations (some algorithms spend excessive time, even greater than that required to evaluate the objective function, while others need much less).
The paper is organized as follows: Section 2 summarizes five cuckoo-inspired algorithms; Section 3 presents a new algorithm proposed by the authors, in which CS is hybridized with Learnable Evolution Models (LEM); Section 4 presents the analysis of the experimental results of the algorithms against a broad set of test functions; finally, some concluding remarks and suggestions for future work are presented.
2. CUCKOO-INSPIRED ALGORITHMS
Cuckoo Search has been applied to solve many engineering optimization problems and underwent several modifications that led to various improvements in accuracy and convergence time over general or specific optimization problems, e.g., decreasing the noise sensitivity of CS. The CS algorithm is described below, followed by three modifications that will be considered in the comparative study, namely: Improved Cuckoo Search Algorithm (ICS) [16], Modified Cuckoo Search (MCS) [17], and Modified Cuckoo Search Algorithm (MCSA) [18]. Finally, in this section, another algorithm inspired by the lifestyle of the cuckoo is presented, the Cuckoo Optimization Algorithm (COA) [19].
The Cuckoo Search (CS) algorithm provides a new way to balance intensification (searching for better solutions in the neighborhood of the current solution) and diversification (ensuring that the algorithm explores the search space efficiently). By simplifying the breeding behavior of the cuckoo, a set of three idealized rules can be established [14]: 1) Each cuckoo lays one egg at a time and deposits it in a randomly chosen nest; 2) The best nests with high-quality eggs are carried over to the next generations; 3) The number of available host nests is fixed, and the host bird discovers the egg laid by a cuckoo with a probability ρα ∈ [0,1] (ρα, the probability of abandonment). In this case, the host bird can either get rid of the egg or abandon the nest and build an entirely new one. Likewise, in nature, many animals and insects search for food through a random or quasi-random walk (since the next step is always based on the current location and the probability of moving to the following location) that can be modeled with a Lévy distribution (a continuous probability distribution for a non-negative random variable); such walks are known as Lévy flights [20], and many studies have shown that the flight behavior of some animals and insects follows their typical characteristics. This kind of search is included in rule 1 of the CS algorithm, when the cuckoo randomly chooses a nest.
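The three idealized rules above can be sketched in a minimal implementation. The following Python sketch is illustrative only: the Mantegna procedure for drawing Lévy-distributed steps and all parameter values are common choices from the literature, not prescribed by this paper.

```python
import numpy as np
from math import gamma, sin, pi


def levy_step(dim, beta=1.5, rng=None):
    """Draw a Levy-distributed step vector using Mantegna's algorithm."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)


def cuckoo_search(f, bounds, n_nests=15, pa=0.25, alpha=0.01, iters=200, seed=0):
    """Minimize f over a box; bounds is a (low, high) pair of arrays."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = len(lo)
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fit = np.apply_along_axis(f, 1, nests)
    best = nests[fit.argmin()].copy()
    for _ in range(iters):
        # Rule 1: each cuckoo lays an egg via a Levy flight and drops it
        # into a randomly chosen nest.
        for i in range(n_nests):
            new = nests[i] + alpha * levy_step(dim, rng=rng) * (nests[i] - best)
            new = np.clip(new, lo, hi)
            j = rng.integers(n_nests)
            if f(new) < fit[j]:  # Rule 2: keep only the better egg
                nests[j], fit[j] = new, f(new)
        # Rule 3: a fraction pa of the worst nests is abandoned and rebuilt.
        n_abandon = int(pa * n_nests)
        worst = np.argsort(fit)[-n_abandon:]
        nests[worst] = rng.uniform(lo, hi, (n_abandon, dim))
        fit[worst] = np.apply_along_axis(f, 1, nests[worst])
        best = nests[fit.argmin()].copy()
    return best, fit.min()
```

Note that the best nest is never abandoned (it cannot be among the worst), so the sketch keeps the elitism implied by rule 2.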
The Improved Cuckoo Search Algorithm (ICS) was proposed by Valian, Mohanna, and Tavakoli [16]; the main difference with the CS algorithm lies in the manner of adjusting the ρα and α parameters (the rate of abandonment and the Lévy flight step size, respectively). In ICS, the ρα and α values change dynamically with the number of generations. In the first generations, the values of ρα and α must be large enough to ensure that the algorithm increases the diversity of the solution vectors (diversification). These values are then decreased in later generations to enable a finer adjustment of the solution vectors (intensification).
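A common way to realize this schedule, sketched below, is a linear decay for ρα and an exponential decay for α; the bound values here are illustrative assumptions, not parameters taken from this paper.

```python
import math


def ics_parameters(gen, total_gens, pa_max=0.5, pa_min=0.05,
                   alpha_max=0.5, alpha_min=0.01):
    """Generation-dependent pa and alpha: pa decays linearly and alpha
    exponentially from their maxima toward their minima."""
    pa = pa_max - (gen / total_gens) * (pa_max - pa_min)
    c = math.log(alpha_min / alpha_max) / total_gens
    alpha = alpha_max * math.exp(c * gen)
    return pa, alpha
```

Early generations thus get large ρα and α (diversification), while the final generations get the small values needed for intensification.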
The Modified Cuckoo Search (MCS) was proposed by Walton, Hassan, Morgan, and Brown [17]. It presents two modifications to the CS algorithm. The first change was made to the step size of Lévy flights (the α parameter): the value of α decreases as the number of generations increases, thus increasing the intensification. The second modification was to add an information exchange between eggs to accelerate convergence to the optimal solution. In the CS algorithm, there is no exchange of information between individuals, and searches are performed independently. In this version, a fraction of the eggs with the best fitness forms a "top" group of eggs. For each of these, an egg is randomly selected from the top eggs, and a new egg is generated on the line connecting the two eggs, at a distance defined by the Golden Ratio (an irrational mathematical constant frequently present in distance ratios of simple geometric figures such as the pentagon, pentagram, decagon, and dodecahedron, defined by φ=(1+√5)⁄2). A significantly better performance was achieved using the Golden Ratio than with a random fraction.
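The golden-ratio information exchange can be sketched in a few lines; placing the new egg a fraction 1/φ of the way along the connecting line is the essential idea (the function name is illustrative):

```python
import numpy as np

PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.618


def golden_ratio_egg(egg, top_egg):
    """New egg on the line connecting an egg to a randomly chosen top egg,
    placed a fraction 1/phi of the way along (not a random fraction)."""
    egg = np.asarray(egg, dtype=float)
    top_egg = np.asarray(top_egg, dtype=float)
    return egg + (top_egg - egg) / PHI
```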
The Modified Cuckoo Search Algorithm (MCSA), proposed by Tuba, Subotic, and Stanarevic [18], changes how the size of the random steps is calculated. The change includes a function that sorts the matrix of candidate solutions (nests) by the fitness value of the solutions it contains. Thus, the solutions with higher fitness have a slight advantage over those with lower fitness. This method maintains the selection pressure (the degree to which high-fitness solutions are selected) toward the best solutions, thus facilitating better results.
The Cuckoo Optimization Algorithm (COA) is inspired by the lifestyle of cuckoo birds and was proposed in 2011 by Rajabioun [19]. In COA, a habitat (a matrix of size Npop × Mdim) is generated with random points, and the utility function is calculated for each. A specific number of eggs (5-20) and an Egg Laying Radius (ELR) that defines the maximum distance at which cuckoos can host their eggs are assigned to each cuckoo. In each generation, zones (groups) are defined for the habitat of each cuckoo. Each area is evaluated and assigned a utility value; this value represents the survival rate for an individual and its eggs, and is defined by the number of individuals that have proliferated in the area, the availability of food, and the similarity between the characteristics of the cuckoo eggs and the host bird eggs. Accordingly, the area with the highest utility value is set as the target point, that is, the best migration habitat for mature cuckoos, which establish a flight path defined by a percentage λ of the distance and a deflection angle ϕ ∈ (-π⁄6, π⁄6) radians. Generation after generation, the cuckoos become unevenly distributed over the search space, making it difficult to identify the cuckoos that belong to each group; therefore, the k-means clustering algorithm is used (with k between 3 and 5) to adequately assign the cuckoos to groups.
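Two of COA's building blocks can be sketched as follows. The ELR formula follows Rajabioun [19] (the radius grows with the cuckoo's share of all eggs and with the width of the search interval); the migration sketch moves a cuckoo a random fraction λ of the distance toward the goal habitat and, for brevity, omits the deflection angle ϕ.

```python
import numpy as np


def egg_laying_radius(own_eggs, total_eggs, var_low, var_high, alpha=1.0):
    """ELR proportional to this cuckoo's share of all eggs and to the
    width of the variable's search interval."""
    return alpha * (own_eggs / total_eggs) * (var_high - var_low)


def migrate(position, goal, rng=None):
    """Move a cuckoo a random fraction lambda of the way toward the goal
    habitat (the deflection angle phi is omitted in this sketch)."""
    rng = rng or np.random.default_rng()
    lam = rng.uniform(0.0, 1.0)
    position = np.asarray(position, dtype=float)
    goal = np.asarray(goal, dtype=float)
    return position + lam * (goal - position)
```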
3. PROPOSED CS+LEM ALGORITHM
Inspired by the concept of Learnable Evolution Models (LEM) proposed by Michalski [21][22], a new version of the Cuckoo Search (CS) algorithm is proposed in this section. In LEM, machine learning techniques are used to generate new populations along with the Darwinian method, applied in evolutionary computation and based on mutation and natural selection. This method can determine which individuals in a population (or set of individuals from previous populations) are better than others in performing specific tasks. This reasoning, expressed as an inductive hypothesis, is used to generate new populations. Then, when the algorithm is running in Darwinian evolution mode, it uses random or semi-random operations to generate new individuals (employing traditional mutation or recombination techniques).
In this research, the machine learning process is carried out by the algorithm proposed in [23], which is responsible for the rule inference process. The latter defines a set of conjunctive rules (P ← R1 ∧ R2 ∧ … ∧ Rn) that delineate the regions where there is a greater chance of finding a better value for each dimension xi (for example, LVxi ≤ xi ≤ HVxi, where LVxi and HVxi are the lower and upper limits of the rule for dimension xi). Given the combination of rules (R) for each dimension, the search space is limited to the regions most likely to contain the global optimum. The rule inference process is run for the first time immediately after creating the initial population of nests. The steps of the rule inference process are summarized in Figure 1.
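A minimal sketch of interval-rule inference in the spirit described above is given below. The best-fraction threshold is an assumption for illustration; the actual process in [23] is more elaborate.

```python
import numpy as np


def infer_rules(population, fitness, best_frac=0.3):
    """Derive per-dimension interval rules LV_i <= x_i <= HV_i from the
    best fraction of the population (minimization assumed)."""
    n_best = max(1, int(best_frac * len(population)))
    best = population[np.argsort(fitness)[:n_best]]
    return best.min(axis=0), best.max(axis=0)  # LV and HV per dimension
```

Sampling new values inside these intervals then concentrates the search in the regions where the best individuals were found.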

The new proposal is called Cuckoo Search using Learnable Evolution Models (CS+LEM), and the steps of the algorithm are presented in Figure 2. When the LEM process is activated (the LEM variable is set to true), the rule inference process is executed every time the nest population changes, updating the rules for each dimension. While the LEM process is active, each new cuckoo randomly generated via Lévy flights is mutated in some dimensions. The process mutates a dimension with a probability defined by the rule consideration rate (RCR) parameter. By default, the RCR parameter is set to 0.5, meaning half of the dimensions are created via Lévy flights, and the other half are created based on rules. When the algorithm creates a specific number of cuckoos without managing to improve the fitness function (the MNIWI parameter), the LEM process is deactivated. If the algorithm again fails to improve fitness for MNIWI cuckoos, the LEM process is reactivated, and so on.
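The RCR-driven mutation step can be sketched as follows, assuming per-dimension rule intervals [LV, HV] are already available (a simplified stand-in for the full CS+LEM loop):

```python
import numpy as np


def rcr_mutate(levy_candidate, lv, hv, rcr=0.5, rng=None):
    """With probability RCR per dimension, replace the Levy-flight value
    by a value sampled uniformly inside that dimension's rule interval;
    otherwise keep the Levy-flight value."""
    rng = rng or np.random.default_rng()
    mask = rng.random(len(levy_candidate)) < rcr
    ruled = rng.uniform(lv, hv)  # one rule-based value per dimension
    return np.where(mask, ruled, levy_candidate)
```

With RCR=0.5, on average half of the dimensions come from Lévy flights and half from the rules, as described above.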

4. EXPERIMENTATION
This section shows the performance of the cuckoo-inspired algorithms in three different assessments, namely: 1) the optimal value achieved with a fixed execution time; 2) the number of objective function evaluations required to reach the global optimum; and 3) the optimal value achieved with a fixed number of Objective Function Evaluations (OFE). The parameters used to execute all the algorithms in each experiment are presented in Table 1. The parameter values were those recommended by the original authors of each algorithm. Since the goal was to assess general-purpose behavior across different optimization problems, the parameter values were the same for all problems, i.e., no per-problem parameter tuning was conducted for any algorithm.

Table 2 shows 6 unimodal separable functions, 22 unimodal non-separable functions, 7 multimodal separable functions, and 26 multimodal non-separable functions. They are based on those proposed in the report "Benchmark Functions for the CEC'2010 Special Session and Competition on Large-Scale Global Optimization" [24][25][26] and the paper "A comparative study of Artificial Bee Colony Algorithm" [9], which provides an adequate range of complexity levels. For each function, the global minimum is searched.

For each assessment, the average and standard deviation of 30 independent executions per function were reported. The initial population is generated randomly within the ranges specified for each function. The results were obtained using the same computer configuration (hardware and software). All algorithms were implemented in the C# programming language.
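The statistical procedure used throughout the assessments can be reproduced with standard tools. The sketch below uses SciPy (an assumption about tooling, not the authors' original code): a Friedman test across all algorithms, followed by pairwise Wilcoxon signed-rank tests over the per-function results.

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon


def compare_algorithms(results):
    """results: dict mapping algorithm name -> array of per-function scores
    (e.g., the mean best value over 30 runs for each test function).
    Returns the Friedman p-value and pairwise Wilcoxon p-values."""
    names = list(results)
    _, friedman_p = friedmanchisquare(*(results[n] for n in names))
    pairwise = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            pairwise[(a, b)] = wilcoxon(results[a], results[b]).pvalue
    return friedman_p, pairwise
```

The Friedman test checks whether any algorithm ranks differently across functions; the Wilcoxon tests then identify which specific pairs differ.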
A. Assessment 1: Best Optimal Value Reached at Different Times
This assessment aims to identify which cuckoo-inspired algorithm provides the best results (results nearest to the global minimum of the objective function) for the following periods: 5, 10, 20, 40, and 80 seconds. This section shows results in general (all functions as a whole). Additional analyses over four groups of functions (unimodal separable, unimodal non-separable, multimodal separable, and multimodal non-separable) were performed; however, the results are not presented in this paper due to space limitations.
The Friedman test shows that CS+LEM is the best option for solving problems when the designer does not know anything about the landscape of the fitness function, the problem has high dimensionality, and the execution time is short (lower than 80 seconds) (see Table 3). General results based on the Wilcoxon non-parametric test with a 0.95 significance level also show that: 1) At 5s, CS+LEM outperforms COA, MCS, and MCSA, and ICS and MCSA outperform COA and MCS; 2) At 10s and 20s, CS+LEM outperforms all other algorithms, and MCSA outperforms CS, COA, ICS, and MCS; 3) At 40s and 80s, CS+LEM outperforms all other algorithms and COA outperforms CS, ICS, MCS and MCSA.

B. Assessment 2: Number of Objective Function Evaluations (OFE) Required to Reach the Global Optimum
This assessment aims to identify which cuckoo-inspired algorithm provides the best results (results nearest to the global minimum of the objective function with the lowest number of evaluations). All algorithms were executed until the global minimum was found or a maximum of 50000 OFEs was exceeded.
On average, COA requires fewer OFEs to reach the global optimum than the other five algorithms, a result supported by the Friedman non-parametric test (see Table 4). The main problem with COA is the additional execution time required in each iteration to cluster the cuckoos (solutions) using the k-means algorithm, since optimal clustering in a continuous space is an NP-hard problem. The k-means algorithm requires O(n·k·t) operations to converge to a (possibly non-optimal) clustering, where n is the number of cuckoos, k is the number of clusters, and t is the average number of cycles (iterations) required by the algorithm to converge. Therefore, COA is more than six times slower than MCS (643.91%), which reports the lowest average time per generation.
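For reference, a plain Lloyd's-algorithm sketch shows where the O(n·k·t) cost arises; this is illustrative only, and COA's actual clustering implementation may differ.

```python
import numpy as np


def kmeans(points, k, iters=10, seed=0):
    """Lloyd's algorithm: each iteration computes n*k distances, so the
    total cost is O(n*k*t) as noted in the text."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # assignment step: n*k distance computations
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # update step: recompute each non-empty cluster's centroid
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels
```

Running this once per COA generation on the whole population is what makes each COA iteration noticeably more expensive than a plain CS iteration.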

Additionally, the Wilcoxon test results show that COA outperforms all the other algorithms, CS outperforms ICS and MCS with a significance level of 0.95, and CS+LEM outperforms MCS with a significance level of 0.95. It is essential to highlight that COA is the best cuckoo-inspired algorithm in this test because it reaches the optimum value in 55 of the 61 test functions within the maximum specified number of OFEs; this is a high number of resolved test functions compared with the other cuckoo-inspired algorithms, which can solve only 7 to 14 test functions using 50000 OFEs. It can be further noted that CS+LEM ranks second with 14 resolved test functions. Based on the above, COA is the best alternative for environments where ample time is available to find the optimal solution.
C. Assessment 3: Best Optimal Value Reached at Different Number of Objective Function Evaluations
This assessment aims to identify which cuckoo-inspired algorithm provides the best results (the results nearest to a global minimum for the objective function) at 5000, 10000, 20000, and 50000 OFEs. Friedman analysis shows that COA reports the best results for the highest number of OFEs, while MCS is best for a small number (Table 5).

Results using the Wilcoxon test (by default with a significance level of 0.95) show that at 5000 OFEs, MCS outperforms CS, COA, ICS, and CS+LEM, and ICS outperforms COA and CS+LEM; at 10000 OFEs, MCS outperforms CS, COA, ICS, and CS+LEM, and ICS outperforms COA and CS+LEM; at 20000 OFEs, COA and ICS outperform CS and CS+LEM, and COA outperforms ICS, MCSA, and CS+LEM (at a Wilcoxon significance level of 0.90); at 50000 OFEs, COA outperforms CS, CSA, and CS+LEM, and ICS outperforms CS and CS+LEM. It is essential to consider that this test is the least used by the research community, since algorithms that spend much time computing the changes from current individuals to new ones have a clear advantage over those that spend less time.
5. CONCLUSIONS AND FUTURE WORK
This paper presents a new CS algorithm called Cuckoo Search using Learnable Evolution Models (CS+LEM). The proposed algorithm uses LEM techniques to create rules that enable inferring new candidates in the population beyond those produced by random search alone. The algorithm was evaluated on 61 classic optimization test functions and obtained the best results in most of the functions when the designer does not know anything about the landscape of the fitness function, the problem has high dimensionality, and the execution time is short (Assessment 1).
The hybridization of cuckoo behavior with k-means (COA) reports better results in Assessment 2 (number of objective function evaluations required to reach the global optimum) than the other five algorithms. It is essential to highlight that COA reaches the optimum value in 55 of the 61 test functions; this is a high number of resolved test functions compared with the other cuckoo-inspired algorithms, which can solve only 7 to 14 test functions using 50000 OFEs. Unfortunately, COA requires a much longer execution time than the other CS algorithms. Therefore, COA is the best option for offline scenarios where users can wait much longer for the globally optimal solution of the problem. Additionally, the CS+LEM algorithm ranks second in this assessment (Assessment 2).
COA and MCS report the best results in Assessment 3 (Best optimal value reached at different number of OFEs); for a high number of OFEs, COA reports the best results, while the MCS Algorithm reports the best results for a small number of evaluations.
The different methods for diversification and intensification employed by the cuckoo-inspired algorithms compared in this paper mean that each algorithm can provide a better solution for a specific problem in a specific scenario. This observation is supported by the "no free lunch" theorem for optimization [28]. The paper identifies the best algorithms for three scenarios and allows designers to select the most appropriate algorithm for a specific optimization problem. Regarding future work, the research group proposes to compare the previously studied cuckoo-inspired algorithms exhaustively and in detail with other metaheuristics such as Particle Swarm Optimization (PSO), Differential Evolution (DE), Harmony Search (HS), and Artificial Bee Colony (ABC).
ACKNOWLEDGMENTS
The University of Cauca partially supported this work.
REFERENCES
[1] L. Velasco, H. Guerrero, and A. Hospitaler, "A Literature Review and Critical Analysis of Metaheuristics Recently Developed," Arch. Comput. Methods Eng., vol. 31, 2023. https://doi.org/10.1007/s11831-023-09975-0
[2] R. R. Abo-Alsabeh and A. Salhi, "The Genetic Algorithm: A study survey," Iraqi J. Sci., vol. 63, no. 3, pp. 1215-1231, 2022. https://doi.org/10.24996/ijs.2022.63.3.27
[3] F. Neri and C. Cotta, "Memetic algorithms and memetic computing optimization: A literature review," Swarm Evol. Comput., vol. 2, pp. 1-14, 2012. https://doi.org/10.1016/j.swevo.2011.11.003
[4] V. K. Prajapati, M. Jain, and L. Chouhan, "Tabu Search Algorithm (TSA): A Comprehensive Survey," in Proc. 3rd Int. Conf. on Emerging Technologies in Computer Engineering: Machine Learning and Internet of Things (ICETCE 2020), 2020, pp. 222-229. https://doi.org/10.1109/ICETCE48199.2020.9091743
[5] N. Nayar, S. Gautam, P. Singh, and G. Mehta, "Ant Colony Optimization: A Review of Literature and Application in Feature Selection," in Lecture Notes in Networks and Systems, 2021, pp. 285-297. https://doi.org/10.1007/978-981-33-4305-4_22
[6] A. G. Gad, "Particle Swarm Optimization Algorithm and Its Applications: A Systematic Review," Arch. Comput. Methods Eng., vol. 29, no. 5, pp. 2531-2561, 2022. https://doi.org/10.1007/s11831-021-09694-4
[7] I. Sharma, V. Kumar, and S. Sharma, "A Comprehensive Survey on Grey Wolf Optimization," Recent Adv. Comput. Sci. Commun., vol. 15, no. 3, pp. 323-333, 2022. https://doi.org/10.2174/2666255813999201007165454
[8] J. Li, X. Wei, B. Li, and Z. Zeng, "A survey on firefly algorithms," Neurocomputing, vol. 500, pp. 662-678, 2022. https://doi.org/10.1016/j.neucom.2022.05.100
[9] D. Karaboga and B. Akay, "A comparative study of Artificial Bee Colony algorithm," Appl. Math. Comput., vol. 214, no. 1, pp. 108-132, 2009. https://doi.org/10.1016/j.amc.2009.03.090
[10] M. F. Ahmad, N. A. M. Isa, W. H. Lim, and K. M. Ang, "Differential evolution: A recent review based on state-of-the-art works," Alexandria Eng. J., vol. 61, no. 5, pp. 3831-3872, 2022. https://doi.org/10.1016/j.aej.2021.09.013
[11] F. Qin, A. M. Zain, and K.-Q. Zhou, "Harmony search algorithm and related variants: A systematic review," Swarm Evol. Comput., vol. 74, 2022. https://doi.org/10.1016/j.swevo.2022.101126
[12] E. Ruano-Daza, C. Cobos, J. Torres-Jimenez, M. Mendoza, and A. Paz, "A multiobjective bilevel approach based on global-best harmony search for defining optimal routes and frequencies for bus rapid transit systems," Appl. Soft Comput. J., vol. 67, pp. 567-583, 2018. https://doi.org/10.1016/j.asoc.2018.03.026
[13] S. Salhi and J. Thompson, "An overview of heuristics and metaheuristics," in The Palgrave Handbook of Operations Research, 2022, pp. 353-403. https://doi.org/10.1007/978-3-030-96935-6_11
[14] X.-S. Yang and S. Deb, "Cuckoo Search via Lévy flights," in 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), 2009, pp. 210-214. https://doi.org/10.1109/NABIC.2009.5393690
[15] K. Safdar, K. N. Abdul Rani, H. A. Rahim, S. J. Rosli, and M. A. Jamlos, "A Review on Research Trends in using Cuckoo Search Algorithm: Applications and Open Research Challenges," Prz. Elektrotechniczny, vol. 1, no. 5, pp. 18-24, 2023. https://doi.org/10.15199/48.2023.05.04
[16] E. Valian, S. Mohanna, and S. Tavakoli, "Improved Cuckoo Search Algorithm for Feedforward Neural Network Training," Int. J. Artif. Intell. Appl., vol. 2, no. 3, pp. 36-43, 2011. https://doi.org/10.5121/ijaia.2011.2304
[17] S. Walton, O. Hassan, K. Morgan, and M. R. Brown, "Modified cuckoo search: A new gradient free optimisation algorithm," Chaos, Solitons & Fractals, vol. 44, no. 9, pp. 710-718, 2011. https://doi.org/10.1016/j.chaos.2011.06.004
[18] M. Tuba, M. Subotic, and N. Stanarevic, "Modified Cuckoo Search Algorithm for Unconstrained Optimization Problems," in Proc. 5th European Computing Conference (ECC '11), 2011, pp. 263-268. [Online]. Available: http://www.wseas.us/e-library/conferences/2011/Paris/ECC/ECC-43.pdf
[19] R. Rajabioun, "Cuckoo Optimization Algorithm," Appl. Soft Comput., vol. 11, no. 8, pp. 5508-5518, 2011. https://doi.org/10.1016/j.asoc.2011.05.008
[20] J. Li, Q. An, H. Lei, Q. Deng, and G.-G. Wang, "Survey of Lévy Flight-Based Metaheuristics for Optimization," Mathematics, vol. 10, no. 15, 2022. https://doi.org/10.3390/math10152785
[21] R. S. Michalski, "Learnable evolution model: evolutionary processes guided by machine learning," Mach. Learn., vol. 38, no. 1, pp. 9-40, 2000. https://doi.org/10.1023/a:1007677805582
[22] A. L. da Costa Oliveira, A. Britto, and R. Gusmão, "Machine learning enhancing metaheuristics: a systematic review," Soft Comput., 2023. https://doi.org/10.1007/s00500-023-08886-3
[23] C. Cobos, D. Estupiñán, and J. Pérez, "GHS + LEM: Global-best Harmony Search using learnable evolution models," Appl. Math. Comput., vol. 218, no. 6, pp. 2558-2578, 2011. https://doi.org/10.1016/j.amc.2011.07.073
[24] P. N. Suganthan et al., "Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization," Singapore, 2005. [Online]. Available: http://www.cmap.polytechnique.fr/~nikolaus.hansen/Tech-Report-May-30-05.pdf
[25] K. Tang et al., "Benchmark functions for the CEC'2008 special session and competition on large scale global optimization," in IEEE World Congress on Computational Intelligence, Rio de Janeiro, Brazil: IEEE, 2008, pp. 1-18. [Online]. Available: http://sci2s.ugr.es/programacion/workshop/Tech.Report.CEC2008.LSGO.pdf
[26] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, "Benchmark Functions for the CEC'2010 Special Session and Competition on Large-Scale Global Optimization," Shanghai, China, 2010. [Online]. Available: http://goanna.cs.rmit.edu.au/~xiaodong/cec13-lsgo/competition/cec2013-lsgo-benchmark-tech-report.pdf
[27] M. Molga and C. Smutnicki, "Test functions for optimization needs," pp. 1-43, 2005. [Online]. Available: https://robertmarks.org/Classes/ENGR5358/Papers/functions.pdf
[28] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Trans. Evol. Comput., vol. 1, no. 1, pp. 67-82, 1997. https://doi.org/10.1109/4235.585893