Swarm intelligence is a very active research field in which scholars have made outstanding achievements. As a representative method, the slime mould algorithm (SMA) is widely used because of its superior baseline performance. This paper therefore focuses on improving SMA and mitigating its stagnation problems. To this end, the structure of SMA is adjusted to raise the efficiency of the original method. As a stochastic optimizer, SMA mainly simulates the foraging behavior of slime mould in nature. To balance the exploration and exploitation of SMA, this paper proposes an enhanced variant of SMA called ECSMA, in which two mechanisms are embedded into the structure: an elite strategy and a chaotic stochastic strategy. The details of the original SMA and the two introduced strategies are given in this paper. The advantages of the improved SMA are then validated through mechanism comparison, balance-diversity analysis, and contrasts with other counterparts. The experimental results demonstrate that both mechanisms have a significant enhancing effect on SMA. Finally, ECSMA is applied to four structural design issues, namely the welded beam design problem, the PV design problem, the I-beam design problem, and the cantilever beam design problem, with excellent results.
Slime mould algorithm; metaheuristic algorithm; continuous optimization; chaos random strategy; engineering design
Introduction
Swarm intelligence (SI) algorithms are currently a very active topic and are receiving increasing attention. They have been employed in many fields, such as medical prediction, tourism path planning, urban construction planning, and engineering design problems. Among the SI algorithms widely adopted in real-world application scenarios, the recently proposed slime mould algorithm (SMA) [1] shows superior performance and continues to reveal new aspects of new problems. SI algorithms are mostly inspired by optimization phenomena in nature, such as the Harris hawks optimizer (HHO) [2,3], the multi-verse optimizer (MVO) [4], and particle swarm optimization (PSO) [5]. Other algorithms simulate physical phenomena, such as the sine cosine algorithm (SCA) [6,7] and the gravitational search algorithm (GSA) [8].
In addition, novel algorithms have been proposed one after another, including not only new basic algorithms, such as the weighted mean of vectors (INFO) [9], hunger games search (HGS) [10], colony predation algorithm (CPA) [11], and Runge Kutta optimizer (RUN) [12], but also newly proposed variants of basic algorithms, such as evolutionary biogeography-based whale optimization (EWOA) [13], Harris hawks optimization with Gaussian mutation (GCHHO) [14], opposition-based ant colony optimization (ADNOLACO) [15], Harris hawks optimization with elite evolutionary strategy (EESHHO) [16], improved whale optimization algorithm (LCWOA) [17], ant colony optimization with Cauchy and greedy Lévy mutations (CLACO) [18], chaotic, random spare ant colony optimization (RCACO) [19], moth-flame optimizer with sine cosine mechanism (SMFO) [20], adaptive chaotic sine cosine algorithm (ASCA) [7], and sine cosine algorithm with linear population size reduction mechanism (LSCA) [21]. These algorithms have also been successfully applied to many other fields, such as gate resource allocation [22,23], feature selection [24,25], bankruptcy prediction [26,27], expensive optimization problems [28,29], image segmentation [30,31], robust optimization [32,33], solar cell parameter identification [34], train scheduling [35], multi-objective problems [36,37], resource allocation [38], scheduling problems [39–41], optimization of machine learning models [42], medical diagnosis [43,44], and complex optimization problems [45]. These excellent SI algorithms, including SMA, have shown clear strengths, but they still share some common drawbacks: slow convergence speed, a large number of consumed iterations, and a tendency to stagnate at premature solutions on functions with harsh or flat feature spaces.
SMA, proposed in 2020, imitates the behavioral and morphological transformations of slime mould during food-seeking and solves optimization problems by weighting the positive and negative feedback obtained during foraging. Compared with its peers, SMA has the advantages of a clear logical principle, few parameters, and an energetic, dynamic explorative capability. However, the local search capability of SMA is still deficient on some functions, and, as a newly proposed meta-heuristic algorithm, it has received relatively few improvements so far. Existing improved algorithms clearly show that adding effective mechanisms or combining specific procedures contributes to upgrading an algorithm's performance. For example, Ebadinezhad et al. [46] developed an adaptive ant colony optimization (ACO) called DEACO that adopts a dynamic evaporation strategy; the experimental results showed that, compared with conventional ACO, DEACO converges faster and searches more accurately. Chen et al. [47] presented an augmented SCA with multiple strategies: the proposed memory-driven algorithm, called MSCA, combines a reverse learning strategy, a chaotic local search mechanism, a Cauchy mutation operation, and two operators from differential evolution. The overall outcomes demonstrate the superior solution quality and convergence speed of MSCA over its competitors. Guo et al. [48] presented a WOA with a wavelet mutation strategy and a social learning mechanism, and applied it to three water resource prediction models.
Jiang et al. [49] designed a chaotic gravitational search algorithm based on balance tuning (BA-CGSA) with sinusoidal stochastic functions and equilibrium mechanisms, and the overall outcomes revealed its efficiency on continuous optimization problems. Javidi et al. [50] introduced an enhanced crow search algorithm (ECSA) that combines a free-flight mechanism with an individual cap strategy that replaces each offending decision variable with the corresponding variable of the global optimal solution; ECSA thus obtained better or very competitive results. Tawhid et al. [51] presented a new hybrid binary bat enhanced PSO (HBBEPSO), and the outcomes indicated its capability to search for optimal feature combinations in the feature space. Luo et al. [52] proposed a boosted MSA named elite opposition-based MSA (EOMSA), which employs an elite opposition-based strategy to increase population variation and exploration capability; the results showed that EOMSA probes more accurate solutions with faster convergence and higher stability than other population-based algorithms. To handle a complex power system problem, economic environmental dispatch (EED), Sulaiman et al. [53] presented a hybrid optimization algorithm, EGSJAABC3, which combines the evolutionary gradient search (EGS) with a recently proposed artificial bee colony variant (JA-ABC3); the obtained benchmark function and EED application results revealed its optimization efficacy. Consequently, it can be observed that new mechanisms and hybrids built on a base algorithm greatly improve its capability.
SMA already has excellent convergence and accuracy, so improving it further is challenging. Here, an idea for improving SMA is provided: using an elite strategy and chaotic randomness to improve the SMA coefficients A and B. The elite strategy is utilized to ensure convergence, while chaotic randomness is utilized to enhance exploration tendencies. To demonstrate the effectiveness of ECSMA, it was compared with several advanced algorithms. Besides, this paper applies ECSMA to several engineering design problems.
The main contributions of this paper are listed as follows:
In this paper, a new SMA-based swarm intelligence optimization algorithm, called ECSMA, is proposed.
The ECSMA skillfully combines the elite strategy and the chaotic stochastic strategy with the original SMA to enhance its performance effectively.
ECSMA is compared with some state-of-the-art similar algorithms on 31 benchmark functions and its performance is well demonstrated.
ECSMA is applied to four engineering design problems and achieves excellent results.
This paper is structured as follows. The principle and description of SMA are given in Section 2. Section 3 then describes the details of the improved ECSMA. Section 4 presents the experimental results on the test functions and their analysis. The application of ECSMA to fundamental engineering problems is given in Section 5. Section 6 summarizes the primary contributions of this paper and presents future work.
Background Principle of SMA
There are many types of slime mould, each with a different morphological structure and behavior. The type of slime mould studied by the original authors is mainly Physarum polycephalum. The slime mould covers the search space as much as possible by forming a large-scale diffusion net. When spreading, the organic matter at the front of the slime mould diffuses into a fan-shaped structure to expand the covered area. Organic substances containing enzymes flow in the vein structure of the slime mould and digest the covered edible matter. Furthermore, the spread network structure also ensures that the slime mould can cover multiple food sources at the same time, thereby forming a node network based on food concentration.
In 2020, Li et al. established a mathematical model for slime mould based on their foraging behavior in nature and applied SMA to solve a series of optimization problems. The major steps of SMA are shown below:
Approach food:
By assessing the concentration of food in the air through receptors, slime mould spread in the general direction of food. The authors used the following formula to roughly simulate this expansion and contraction behavior.
X(t+1)→ = {Xb(t)→ + vb→⋅(W→⋅XA(t)→ − XB(t)→), r < p; vc→⋅X(t)→, r ≥ p}

where W→ represents the weight of the slime mould; XA(t)→ and XB(t)→ denote two individuals randomly selected from the population; X(t)→ is the position of the current individual; Xb(t)→ denotes the individual location with the highest odor concentration found so far; t represents the current iteration; vb→ is a parameter within the range [−a, a]; and vc→ oscillates between [−b, b], where b linearly decreases from 1 to 0 with the number of iterations.
The adaptive parameter p is calculated as below:
p = tanh|S(i) − DF|

where DF denotes the best fitness acquired over all iterations, i ∈ 1, 2, …, n, and S(i) denotes the fitness of X→.
The weight W→ is calculated as below:

W(SmellIndex(i))→ = {1 + r⋅log((bF − S(i))/(bF − wF) + 1), condition; 1 − r⋅log((bF − S(i))/(bF − wF) + 1), others}

SmellIndex = sort(S)

where SmellIndex presents the sequence of sorted fitness values (ascending in minimization problems), wF is the worst fitness value acquired in the current iteration, bF presents the optimal fitness value obtained in the current iteration, r denotes a random number in the range of [0, 1], condition indicates that S(i) ranks in the first half of the population, and others covers the remaining individuals.
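The weighting scheme built on the sorted fitness values (SmellIndex, bF, wF) can be sketched in Python as follows. This is an illustrative sketch, not the authors' code; the exact logarithmic form of W→ follows the standard SMA formulation and is an assumption here, and the small guard against a flat population is added for numerical safety.

```python
import numpy as np

def smell_weight(fitness, rng=None):
    """Sketch of the SMA weight (assumed standard form): individuals ranked in
    the better half get W = 1 + r*log(...); the rest get W = 1 - r*log(...)."""
    rng = rng or np.random.default_rng(0)
    fitness = np.asarray(fitness, dtype=float)
    n = fitness.size
    order = np.argsort(fitness)                # SmellIndex: ascending for minimization
    bF, wF = fitness[order[0]], fitness[order[-1]]
    denom = (bF - wF) if bF != wF else -1e-12  # guard against a flat population
    W = np.empty(n)
    for rank, i in enumerate(order):
        log_term = np.log((bF - fitness[i]) / denom + 1.0)
        r = rng.random()
        # "condition": this individual ranks in the first half of the population
        W[i] = 1 + r * log_term if rank < n // 2 else 1 - r * log_term
    return W
```

Note that the best individual always receives W = 1 exactly, since its log term vanishes; better-ranked individuals are weighted at or above 1 and worse-ranked ones at or below 1.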
Wrap food:
The following equation is utilized to update the values for slime mould in each iteration:
X(t+1)→ = {rand⋅(UB − LB) + LB, rand < z; Xb(t)→ + vb→⋅(W→⋅XA(t)→ − XB(t)→), r < p; vc→⋅X(t)→, r ≥ p}

where UB and LB denote the upper and lower bounds of the search space, and rand and r denote random numbers in [0, 1]. The value of z is set to 0.03 as in the original paper.
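The piecewise update above, including the random re-initialization branch controlled by z, can be sketched as below. This is a minimal sketch under stated assumptions: vb and vc are drawn uniformly from [−a, a] and [−b, b] as simplifications of the original oscillation parameters, and all argument names are illustrative rather than taken from the authors' code.

```python
import numpy as np

def sma_step(X, fitness, W, x_best, DF, lb, ub, a, b, z=0.03, rng=None):
    """One SMA position update following the piecewise equation above.
    X: (n, d) population; W: per-individual weights; x_best: best position;
    DF: best fitness found so far; z = 0.03 follows the original paper."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    X_new = np.empty_like(X)
    for i in range(n):
        if rng.random() < z:                    # random re-initialization branch
            X_new[i] = lb + rng.random(d) * (ub - lb)
            continue
        p = np.tanh(abs(fitness[i] - DF))       # adaptive parameter p
        vb = rng.uniform(-a, a, d)              # simplified oscillation in [-a, a]
        vc = rng.uniform(-b, b, d)              # simplified oscillation in [-b, b]
        A, B = rng.integers(0, n, size=2)       # two randomly selected individuals
        if rng.random() < p:
            X_new[i] = x_best + vb * (W[i] * X[A] - X[B])
        else:
            X_new[i] = vc * X[i]
    return np.clip(X_new, lb, ub)               # keep agents inside the bounds
```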
Oscillation:
The tendency of slime mould towards high-quality food arises from propagation waves generated by biological oscillators, which alter the cytoplasmic flow in the veins. W→, vb→, and vc→ are used to simulate the variation of the vein width of slime mould.
Among them, W→ mainly adjusts the expansion speed of the slime mould under different food concentration conditions, realizing different vibration intensities of the vein structure at different concentrations: the vibration frequency is lower when the concentration is lower, and vice versa. At the same time, a certain degree of fault tolerance is considered in the weight evaluation, so the balance between diffusion and convergence is well coordinated.
The pseudocode of the original SMA is shown in Algorithm 1.
Enhanced SMA Method (ECSMA)
The improved ECSMA is equipped with two valid strategies. First, the elite strategy is introduced to enhance the exploitation of SMA and reduce the adverse effects of false solutions on the optimal solution. Second, a chaotic strategy is added to improve the ergodicity of SMA and prevent SMA from falling into local optimum (LO) prematurely.
Elite Strategy (ES)
The MGABC algorithm [54] improves the neighbor search formula by randomly selecting two neighbors and using the optimum individual in the population as the initialization state for the search. Convergence speed is accelerated thanks to the guidance of the global optimal individual. On this basis, the paper selects two elite individuals as neighbors to further enhance the rate of convergence, as shown in Eq. (8).
X(t+1)→ = Xb(t)→ + vb→⋅(W→⋅XA(t)→ − XB(t)→)

where XA and XB are two individuals selected from the top half of the ranking.
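Under the elite strategy, the selection of XA and XB can be sketched as below. The uniform choice within the better-ranked half is an assumption, since the text only states that the two individuals come from the top half of the ranking.

```python
import numpy as np

def elite_pair(fitness, rng=None):
    """Elite strategy (ES) sketch: instead of drawing XA and XB from the whole
    population, draw them from the better-ranked half (uniform choice assumed)."""
    rng = rng or np.random.default_rng(0)
    order = np.argsort(fitness)                  # best first, for minimization
    elite = order[: max(2, len(fitness) // 2)]   # top half, at least two members
    return rng.choice(elite, size=2, replace=False)
```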
Chaotic Stochastic Strategy (CSS)
A chaotic stochastic strategy is used to randomly select XA and XB, increasing the ergodicity of the search and thus the exploration capability of the algorithm, and avoiding falling into LO too early. The specific process is shown in Eqs. (9)–(11).
ch(1) = x
ch(i+1) = 4⋅ch(i)⋅(1 − ch(i))
A, B = (N/2)⋅ch(i) + 1, rand ≥ z

where x is a random number in [0, 1] that is not equal to 0.25, 0.5, 0.75, or 1, and A and B are two integers generated from two different chaos factors.
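A minimal sketch of the logistic-map index generation of Eqs. (9)–(11) follows. The floor-based rounding and the 1-based indexing are assumptions, as the text only states that A and B are integers obtained from two different chaos factors.

```python
def chaotic_indices(n, x=0.7, count=2):
    """Chaotic stochastic strategy (CSS) sketch: iterate the logistic map
    ch(i+1) = 4*ch(i)*(1 - ch(i)) and map each chaos value to an index in the
    elite half via floor(N/2 * ch) + 1 (rounding is an assumption)."""
    assert 0 < x < 1 and x not in (0.25, 0.5, 0.75)  # avoid fixed/degenerate seeds
    ch, idx = x, []
    for _ in range(count):
        ch = 4 * ch * (1 - ch)          # logistic map, fully chaotic at r = 4
        idx.append(int(n / 2 * ch) + 1)  # map chaos value into the top half
    return idx
```

With a population of N = 30 and seed x = 0.7, the first two chaos values 0.84 and 0.5376 map to the indices 13 and 9.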
The Proposed ECSMA
Although SMA already has good convergence and accuracy, it still has some room for improvement in these two aspects. Therefore, the elite strategy and the chaotic stochastic strategy are utilized to improve SMA coefficients A and B. The use of elite strategy is to ensure the convergence of SMA, while the use of chaotic randomness is to enhance the exploration tendency of SMA. Algorithm 2 shows the procedure of ECSMA, while Fig. 1 displays the flowchart of ECSMA.
Flowchart of ECSMA
The time complexity of the proposed ECSMA depends on the number of iterations (T), the number of search agents (N), and the dimension of the optimization problem (D). The complexity of calculating the fitness and of sorting the fitness is O(N) each, while the complexity of calculating the weights and of updating the individuals is O(N × D) each. The overall time complexity is O(ECSMA) = O(initialization) + T × (O(fitness calculation) + O(fitness sorting) + O(weight calculation) + O(position update of the slime mould)). Finally, the time complexity of the proposed ECSMA is O(ECSMA) = O(D) + T × (2O(N) + 2O(N × D)).
Discussions on Experimental Results
This section reports the benchmark experiments. First, the diversity and balance of ECSMA and SMA are analyzed. Then, the performance of ECSMA is demonstrated through a mechanism comparison and an experimental comparison with other algorithms.
Validation Using Benchmark Problems and Parameter Settings
In this experimental section, the optimizer's efficiency is benchmarked on a suite of well-known test functions, whose equations are shown in Table A1. In this paper, 23 benchmark functions and 8 composite functions from CEC2014 are selected to evaluate the efficacy of ECSMA. A unimodal function has only one optimum, so it tests the exploitation ability of a method well. Compared with unimodal functions, multimodal functions are more likely to lead to LO cases, and increasing the function dimensionality increases the complexity of such cases. Therefore, multimodal functions are suitable for testing the exploration capability of an algorithm and its ability to jump out of LO. To eliminate randomness in the experiment, all the algorithms involved are compared under the same conditions: the population size is set to 30, the maximum number of evaluations MaxFEs is uniformly set to 250,000, and all algorithms are independently run 30 times on the benchmark functions. Also, to better present the comparative results, the outcomes were analyzed with the Wilcoxon signed-rank test.
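The per-function significance testing described above can be reproduced with a paired Wilcoxon signed-rank test, for example via SciPy. The data below are synthetic stand-ins for two optimizers' 30 independent run results, not the paper's values.

```python
import numpy as np
from scipy.stats import wilcoxon  # paired, two-sided signed-rank test

# Hypothetical per-run best fitness values over 30 independent runs
# (illustrative data only; a lower fitness is better).
rng = np.random.default_rng(42)
ecsma_runs = rng.normal(loc=0.10, scale=0.02, size=30)
sma_runs = ecsma_runs + rng.normal(loc=0.05, scale=0.03, size=30)

stat, p = wilcoxon(ecsma_runs, sma_runs)
print(f"Wilcoxon p-value: {p:.4g}")  # p < 0.05 -> significant difference at the 5% level
```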
To ensure fairness, all experiments were conducted on a desktop computer with an Intel(R) Xeon(R) CPU E5-2660 v3 (2.60 GHz) and 16 GB RAM, and all methods mentioned above were coded on the MATLAB R2020b.
Performance Analysis of ECSMA and SMA
In this part, the optimization behavior of ECSMA and SMA, including diversity and balance, is analyzed. The related experiment was carried out on the 31 benchmark functions; to ensure fairness, parameters such as dimensionality, population size, and the number of evaluations were kept the same. Fig. 2 depicts the balance and diversity of ECSMA and SMA on a subset of the functions. The first column shows the diversity curves: the x-axis denotes the number of iterations, while the y-axis denotes the diversity measure. The initial population is randomly generated, so the population is highly diverse at first; however, diversity decreases as the iterations progress.
Diversity and balance analysis of algorithms
From the diversity curves, it can be seen that ECSMA always reaches the bottom of the plot earlier than SMA, which shows that ECSMA converges faster and has stronger exploitation ability than SMA. The second and third columns are balance plots with three curves: the exploration curve, the incremental-decline curve, and the exploitation curve. A high value of the exploration curve indicates stable exploration during optimization, while the change in the exploitation curve reflects the change in the exploitation ability of the algorithm. The incremental-decline curve is the result of balancing exploitation and exploration: it increases while the exploration effect is greater than or equal to the exploitation effect, decreases otherwise, and reaches its maximum when the two capabilities are equal. The balance analysis also shows that ECSMA enters the exploitation stage faster than SMA, spending less time in exploration. Therefore, ECSMA performs better than SMA.
Impact of ES and CSS
Two strategies, called ES and CSS in Section 3, are incorporated into the original SMA. Table 1 lists four different variants of SMA used to investigate the impact of the introduced mechanisms: "1" indicates that the mechanism is introduced into SMA, and "0" indicates that it is not. For example, ESMA denotes SMA with the ES mechanism introduced.
Various SMAs with the two strategies

Algorithm   ES   CSS
SMA         0    0
ESMA        1    0
CSMA        0    1
ECSMA       1    1
The three SMA variants were tested on the benchmark function set. Table 2 reports the p-values of the Wilcoxon signed-rank test among the various SMAs, with a significance threshold of 5%. The symbol "+" indicates that ECSMA performs better than the compared algorithm, "−" indicates that ECSMA performs worse, and "=" indicates that ECSMA behaves similarly. In terms of "+/−/=", ECSMA is inferior to CSMA, ESMA, and SMA on only 3, 0, and 0 of the 31 problems, respectively. Although the advantages of ECSMA on many functions are not distinct compared with ESMA, CSMA, and SMA, it is rarely worse and often better than these variants, demonstrating its superiority. Moreover, ECSMA ranks first overall, showing better performance than its peers on the benchmark functions. In light of the above analysis, ECSMA is chosen as the best improvement of SMA.
<italic>p</italic>-values of various SMAs in this experiment
Function   ECSMA    CSMA           ESMA           SMA
F1         –        1.000000E+00   1.000000E+00   1.000000E+00
F2         –        1.000000E+00   1.000000E+00   1.000000E+00
F3         –        1.000000E+00   1.000000E+00   1.000000E+00
F4         –        1.000000E+00   1.000000E+00   1.000000E+00
F5         –        5.577400E−01   2.369400E−01   8.307100E−04
F6         –        1.734400E−06   2.353400E−06   1.734400E−06
F7         –        5.716500E−01   6.435200E−01   8.220600E−02
F8         –        4.285700E−06   1.044400E−02   1.734400E−06
F9         –        1.000000E+00   1.000000E+00   1.000000E+00
F10        –        1.000000E+00   1.000000E+00   1.000000E+00
F11        –        1.000000E+00   1.000000E+00   1.000000E+00
F12        –        1.204400E−01   1.107900E−02   1.126500E−05
F13        –        1.956900E−02   1.915200E−01   1.734400E−06
F14        –        1.250000E−01   6.875000E−01   6.103500E−05
F15        –        3.001000E−02   4.165300E−01   6.319800E−05
F16        –        1.000000E+00   1.000000E+00   1.179300E−04
F17        –        4.640700E−01   2.865600E−02   1.733300E−06
F18        –        5.467200E−04   2.944000E−02   6.309400E−05
F19        –        1.734400E−06   1.956900E−02   2.894800E−01
F20        –        3.820300E−01   9.271000E−03   5.307000E−05
F21        –        4.652800E−01   4.681800E−03   1.734400E−06
F22        –        3.326900E−02   4.405200E−01   1.734400E−06
F23        –        3.160300E−02   1.204400E−01   1.734400E−06
F24        –        1.000000E+00   1.000000E+00   1.000000E+00
F25        –        1.000000E+00   1.000000E+00   1.000000E+00
F26        –        1.000000E+00   1.000000E+00   1.000000E+00
F27        –        8.612100E−01   5.857100E−01   1.528600E−01
F28        –        1.000000E+00   1.000000E+00   1.000000E+00
F29        –        1.000000E+00   1.000000E+00   1.000000E+00
F30        –        1.318300E−04   1.000000E+00   3.125000E−02
F31        –        1.821500E−05   1.000000E+00   2.701600E−05
+/−/=      –        7/3/21         8/0/23         16/0/15
ARV        2.1656   2.3957         2.3591         3.0796
Comparison with Excellent Peers
In Table A2, the improved SMA is compared on the benchmark functions with the original SMA and 12 efficient metaheuristic algorithms and improved variants, including the improved GWO algorithm (IGWO) [55], opposition-based learning GWO (OBLGWO) [56], chaotic whale optimization algorithm (CWOA) [57], improved WOA (IWOA) [58], chaotic map bat algorithm with random black hole model (RCBA) [59], chaos-enhanced moth-flame optimizer (CMFO) [60], adaptive differential evolution (JADE) [61], particle swarm optimization with an aging leader and challengers (ALC-PSO) [62], the bat algorithm (BA) [63], differential evolution (DE) [64], the whale optimization algorithm (WOA) [65], and the grey wolf optimizer (GWO) [66].
In this experiment, the dimension of the benchmark functions is set to 30; the remaining parameters and the function set are unchanged from the previous settings. Table A2 records the standard deviation (STD) and the mean value (AVG) obtained by each algorithm on each function.
In Table A2, the AVG and STD reflect the stability of an algorithm. It can be observed that the stability of ECSMA is slightly weaker than that of other algorithms on F6, F12, F13, F20, F23, and F27, but it still ranks among the best. On many functions, such as F1, F2, F9, F11, F14, F16, F17, F18, and F19, the AVG and STD of several algorithms, including ECSMA, reach the lowest values simultaneously. Therefore, the proposed ECSMA can locate the best solution more stably. The proposed method shows advantages on multiple types of functions, including unimodal, multimodal, and fixed-dimension multimodal functions. According to the ranking results, the proposed ECSMA obtains the first average rank, which verifies its performance improvement over the original SMA.
The Wilcoxon signed-rank test was used to evaluate the significance of the differences between the proposed ECSMA and the other optimizers on the 31 benchmark functions. Its result is also recorded at the end of Table A2 and shows that the presented method performs well on most problems. A p-value less than 0.05 in Table 3 means that ECSMA is significantly superior to the corresponding competitor. Table 3 shows that ECSMA has no discernible difference from some algorithms on F1, F2, F9, F11, F15, F16, and F20, but on the other functions ECSMA is significantly superior to most of the comparison algorithms. This proves the superiority of ECSMA on the test functions.
<italic>p</italic>-values of ECSMA <italic>vs</italic>. other peers
           F1             F2             F3             F4
IGWO       1.000000E+00   1.730000E−06   1.730000E−06   1.730000E−06
OBLGWO     5.000000E−01   1.730000E−06   5.000000E−01   3.790000E−06
CWOA       1.000000E+00   1.000000E+00   1.730000E−06   1.730000E−06
IWOA       1.000000E+00   1.000000E+00   1.730000E−06   1.730000E−06
RCBA       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
CMFO       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
JADE       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
ALCPSO     1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
DE         1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
BA         1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
GWO        1.000000E+00   1.730000E−06   1.730000E−06   1.730000E−06
WOA        1.000000E+00   1.000000E+00   1.730000E−06   1.730000E−06

           F5             F6             F7             F8
IGWO       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
OBLGWO     1.730000E−06   1.730000E−06   3.520000E−06   1.730000E−06
CWOA       1.730000E−06   1.730000E−06   1.920000E−06   1.730000E−06
IWOA       1.730000E−06   1.730000E−06   2.350000E−06   1.730000E−06
RCBA       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
CMFO       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
JADE       3.820000E−01   1.730000E−06   1.730000E−06   1.730000E−06
ALCPSO     1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
DE         1.730000E−06   1.730000E−06   1.730000E−06   2.610000E−04
BA         1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
GWO        1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
WOA        1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06

           F9             F10            F11            F12
IGWO       1.000000E+00   4.770000E−07   1.000000E+00   7.710000E−04
OBLGWO     1.000000E+00   1.000000E+00   1.000000E+00   1.730000E−06
CWOA       1.000000E+00   6.110000E−06   1.000000E+00   1.730000E−06
IWOA       1.000000E+00   1.220000E−04   5.000000E−01   3.870000E−02
RCBA       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
CMFO       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
JADE       1.000000E+00   1.070000E−06   1.950000E−03   3.590000E−04
ALCPSO     1.730000E−06   1.560000E−06   8.580000E−05   6.730000E−01
DE         1.000000E+00   1.960000E−07   1.000000E+00   1.730000E−06
BA         1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
GWO        1.000000E+00   6.800000E−08   1.000000E+00   2.350000E−06
WOA        1.000000E+00   4.780000E−05   1.250000E−01   6.890000E−05

           F13            F14            F15            F16
IGWO       1.730000E−06   8.140000E−07   1.850000E−01   2.540000E−06
OBLGWO     1.730000E−06   1.730000E−06   7.190000E−02   1.730000E−06
CWOA       1.730000E−06   5.590000E−06   9.710000E−05   6.250000E−02
IWOA       1.730000E−06   1.920000E−05   6.040000E−03   1.220000E−04
RCBA       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
CMFO       1.730000E−06   1.020000E−04   1.640000E−05   1.000000E+00
JADE       1.730000E−06   2.730000E−06   3.110000E−05   1.000000E+00
ALCPSO     2.070000E−02   6.330000E−05   1.480000E−02   1.000000E+00
DE         1.730000E−06   1.450000E−04   9.920000E−01   1.000000E+00
BA         1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
GWO        1.730000E−06   1.730000E−06   6.440000E−01   1.730000E−06
WOA        1.730000E−06   8.900000E−06   1.970000E−05   2.290000E−04

           F17            F18            F19            F20
IGWO       1.730000E−06   1.400000E−05   2.410000E−04   4.780000E−01
OBLGWO     1.730000E−06   1.730000E−06   1.730000E−06   1.150000E−04
CWOA       1.730000E−06   1.730000E−06   1.730000E−06   6.290000E−01
IWOA       2.880000E−06   1.730000E−06   4.860000E−05   7.040000E−01
RCBA       1.730000E−06   1.730000E−06   1.730000E−06   9.750000E−01
CMFO       5.650000E−05   3.150000E−06   3.590000E−04   2.840000E−05
JADE       5.650000E−05   2.290000E−06   1.730000E−06   2.370000E−05
ALCPSO     5.650000E−05   1.220000E−05   1.730000E−06   9.320000E−06
DE         5.650000E−05   2.140000E−06   1.730000E−06   3.880000E−06
BA         1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
GWO        1.730000E−06   1.730000E−06   1.730000E−06   7.810000E−01
WOA        1.730000E−06   1.730000E−06   1.730000E−06   1.590000E−01

           F21            F22            F23            F24
IGWO       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
OBLGWO     1.730000E−06   1.730000E−06   1.730000E−06   1.000000E+00
CWOA       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
IWOA       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
RCBA       1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
CMFO       9.750000E−01   1.650000E−01   6.440000E−01   1.730000E−06
JADE       6.730000E−01   2.770000E−03   1.480000E−02   4.320000E−08
ALCPSO     2.060000E−01   1.480000E−02   2.770000E−03   1.730000E−06
DE         3.110000E−05   1.730000E−06   3.110000E−05   6.800000E−08
BA         1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
GWO        1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06
WOA        1.730000E−06   1.730000E−06   1.730000E−06   1.730000E−06

           F25            F26            F27            F28
IGWO       1.730000E−06   9.320000E−06   1.730000E−06   1.730000E−06
OBLGWO     1.000000E+00   9.260000E−01   2.440000E−04   1.220000E−04
CWOA       3.910000E−03   4.900000E−04   1.730000E−06   3.790000E−06
IWOA       5.960000E−05   4.490000E−02   1.730000E−06   2.560000E−06
RCBA       1.730000E−06   6.580000E−01   1.730000E−06   1.730000E−06
CMFO       1.730000E−06   4.450000E−05   1.730000E−06   1.730000E−06
JADE       1.730000E−06   1.730000E−06   1.650000E−01   1.730000E−06
ALCPSO     1.730000E−06   1.730000E−06   2.840000E−05   1.730000E−06
DE         1.730000E−06   1.730000E−06   2.600000E−06   1.730000E−06
BA         1.730000E−06   1.730000E−06   2.060000E−01   1.730000E−06
GWO        1.730000E−06   1.820000E−05   1.040000E−03   1.730000E−06
WOA        1.730000E−06   9.770000E−04   4.720000E−02   1.730000E−06

           F29            F30            F31
IGWO       1.730000E−06   1.730000E−06   1.730000E−06
OBLGWO     1.730000E−06   1.730000E−06   1.730000E−06
CWOA       1.730000E−06   1.730000E−06   1.730000E−06
IWOA       1.720000E−06   1.730000E−06   1.730000E−06
RCBA       1.730000E−06   1.730000E−06   1.730000E−06
CMFO       1.730000E−06   1.730000E−06   1.730000E−06
JADE       1.730000E−06   1.730000E−06   1.730000E−06
ALCPSO     1.730000E−06   1.730000E−06   1.730000E−06
DE         1.730000E−06   1.730000E−06   1.730000E−06
BA         1.730000E−06   1.730000E−06   1.730000E−06
GWO        1.730000E−06   1.730000E−06   1.730000E−06
WOA        2.560000E−06   1.710000E−06   1.730000E−06
Fig. 3 shows 12 convergence curves of ECSMA and other competitors on the 30-dimensional benchmark functions. ECSMA shows the best convergence when tackling problems F15, F22, F24, F25, F28, F29, while other optimizers stagnate in the local optimum. When considering F30 and F31, although JADE and DE converge rapidly in the early stage, ECSMA reaches the relative optimum at the later stage of the whole process. Compared with other algorithms, ECSMA converges to the right solution with the fastest speed when dealing with F1 and F2 problems.
Convergence curves of ECSMA and other peers
Comparison with State-of-the-Art Peers
In this subsection, ECSMA is compared with five state-of-the-art algorithms, namely SADE [64], MPEDE [67], EPSDE [68], LSHADE [69], and LSHADE_cnEpSi [70], which include several champion algorithms. The mean and variance obtained by ECSMA and these advanced algorithms on each benchmark function are given in Table A3. It is easy to see that ECSMA achieves very good results against these advanced algorithms on the 31 benchmark functions selected in this paper. First, the excellent mean values fully illustrate that ECSMA has strong optimization ability on these function problems and can outperform the advanced algorithms on most of the benchmark functions. Second, the outstanding variance results illustrate that ECSMA is highly stable during the optimization process.
Further, the significance of the differences between the proposed ECSMA and the other advanced algorithms on the 31 benchmark functions was evaluated using the Wilcoxon signed-rank test. Observing the results in Table 4, most of the p-values are below 0.05, which demonstrates the validity of the experiments and is sufficient to establish the competitiveness of ECSMA. Finally, the convergence curves of ECSMA and the other advanced algorithms on F1, F5, F7, F10, F12, F13, F25, F28, and F29 are given in Fig. 4. The curves show that ECSMA has an excellent convergence effect and the ability to jump out of local optima. These results further reveal the core advantages of ECSMA, indicating that it is an excellent swarm intelligence optimization algorithm that can be applied to most optimization problems.
<italic>p</italic>-values of ECSMA <italic>vs</italic>. state-of-the-art peers
| Function | SADE | MPEDE | EPSDE | LSHADE | LSHADE_cnEpSi |
|----------|------|-------|-------|--------|---------------|
| F1 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F2 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F3 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F4 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F5 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F6 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F7 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F8 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F9 | 1.733310E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F10 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.594240E−06 | 1.734400E−06 |
| F11 | 1.819740E−05 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F12 | 1.020110E−01 | 1.360110E−05 | 1.650270E−01 | 3.181680E−06 | 4.729200E−06 |
| F13 | 6.732800E−01 | 1.920920E−06 | 6.732800E−01 | 2.596710E−05 | 1.734400E−06 |
| F14 | 6.334250E−05 | 6.334250E−05 | 6.334250E−05 | 6.334250E−05 | 6.334250E−05 |
| F15 | 1.734400E−06 | 5.709650E−02 | 4.729200E−06 | 1.734400E−06 | 3.112320E−05 |
| F16 | 1.000000E+00 | 1.000000E+00 | 1.000000E+00 | 1.000000E+00 | 1.000000E+00 |
| F17 | 4.882810E−04 | 4.882810E−04 | 4.882810E−04 | 4.882810E−04 | 4.882810E−04 |
| F18 | 1.263170E−06 | 1.150490E−06 | 1.109130E−06 | 9.446640E−07 | 2.042150E−06 |
| F19 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F20 | 1.126540E−05 | 3.881110E−04 | 1.149920E−04 | 2.105260E−03 | 1.477280E−04 |
| F21 | 3.112320E−05 | 9.753870E−01 | 3.588840E−04 | 5.709650E−02 | 3.709350E−01 |
| F22 | 1.734400E−06 | 2.765270E−03 | 1.734400E−06 | 2.765270E−03 | 5.709650E−02 |
| F23 | 3.112320E−05 | 3.112320E−05 | 3.112320E−05 | 3.588840E−04 | 3.588840E−04 |
| F24 | 1.734400E−06 | 1.726770E−06 | 1.720260E−06 | 1.540100E−06 | 1.734400E−06 |
| F25 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F26 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F27 | 1.734400E−06 | 1.734400E−06 | 8.290130E−01 | 1.734400E−06 | 1.734400E−06 |
| F28 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F29 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F30 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
| F31 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 | 1.734400E−06 |
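As an aside on the ubiquitous value 1.734400E−06 in the table: this is the floor of the normal-approximation Wilcoxon signed-rank p-value when one algorithm wins on every one of 30 paired runs, which is consistent with 30 independent runs per algorithm. The following stdlib-only sketch reproduces that behaviour on illustrative data (the samples, seed-free construction, and helper name `wilcoxon_signed_rank_p` are assumptions for illustration, not the paper's raw results):

```python
import math

def wilcoxon_signed_rank_p(a, b):
    """Two-sided Wilcoxon signed-rank test (normal approximation)."""
    d = [x - y for x, y in zip(a, b) if x != y]   # drop zero differences
    n = len(d)
    # Rank absolute differences, averaging ranks over tie blocks
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_pos = sum(r for r, di in zip(ranks, d) if di > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_pos - mean) / sd
    return math.erfc(abs(z) / math.sqrt(2))       # = 2 * (1 - Phi(|z|))

# Hypothetical best-fitness values: "peer" is worse on all 30 runs,
# with distinct differences so no tie correction is needed
a = [1.0 + 0.01 * i for i in range(30)]           # ECSMA-like results
b = [v + 0.5 + 0.01 * i for i, v in enumerate(a)] # uniformly worse peer
p = wilcoxon_signed_rank_p(a, b)
print(f"p = {p:.6e}")   # ~1.73e-06, the n = 30 floor seen in Table 4
```

In practice one would call `scipy.stats.wilcoxon` rather than hand-roll the test; the manual version is shown only to make the recurring p-value explicable.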
Convergence curves of ECSMA and other state-of-the-art peers
In summary, this series of benchmark comparisons demonstrates the superiority of ECSMA in both convergence speed and convergence accuracy, as well as its ability to avoid falling into local optima.
ECSMA for Structural Design Problems
To demonstrate the practical performance of the proposed method, the ECSMA-based model is applied to several engineering optimization problems. These problems differ from the benchmarks in that the optimal solution must be obtained while satisfying constraints, so each design variable must lie within a prescribed range. The four engineering problems are as follows.
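A common way to let an unconstrained optimizer such as ECSMA handle such constraints is a static penalty: infeasible candidates are charged a cost proportional to the squared violation. The sketch below illustrates this idea; the wrapper name, penalty weight, and toy problem are assumptions for illustration, not the paper's exact penalty formulation.

```python
def penalized(f, constraints, lam=1e6):
    """Turn a constrained objective into an unconstrained one:
    f(x) plus a quadratic penalty for each violated g_i(x) <= 0."""
    def wrapped(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + lam * violation
    return wrapped

# Toy example: minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0
obj = penalized(lambda x: x * x, [lambda x: 1.0 - x])
print(obj(2.0))   # feasible point: plain objective value 4.0
print(obj(0.0))   # infeasible point: dominated by the penalty term
```

Any swarm optimizer can then minimize `obj` directly, since the penalty steers the population back into the feasible region.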
Structure Design of Welded Beam (WB)
The goal of this structural design problem is to optimize the structure of the WB so that its material cost is minimized. The design variables are the length of the bar (l), the thickness of the bar (b), the thickness of the weld (h), and the height of the bar (t). The primary constraints are the deflection rate (δ), the bending stress in the beam (θ), the shear stress (τ), and the buckling load (Pc). The specific formulae and constraints are as follows:
This structural design problem has been studied extensively as a constrained optimization problem. Mirjalili et al. [71] used SSA to optimize it, and Rashedi et al. [8] solved it with GSA, which obtained an optimum cost of 1.879950.
Table 5 shows the results of ECSMA and other algorithms on the WB problem. ECSMA performs best, with a best cost of 1.715213 at h = 0.195446, l = 3.419576, t = 9.132268, and b = 0.205258. ECSMA therefore satisfies the constraints and yields the minimum manufacturing cost for the WB.
Comparison between other widely used methods for the WB case
| Technique | h | l | t | b | Best cost |
|-----------|---|---|---|---|-----------|
| ECSMA | 0.195446 | 3.419576 | 9.132268 | 0.205258 | 1.715213 |
| Simple method | 0.279200 | 5.625600 | 7.751200 | 0.279600 | 2.530700 |
| WOA | 0.205396 | 3.484293 | 9.037426 | 0.206276 | 1.730499 |
| SSA | 0.205700 | 3.471400 | 9.036600 | 0.205700 | 1.724910 |
| GSA | 0.182129 | 3.856979 | 10.00000 | 0.202376 | 1.879950 |
| CAEP | 0.205700 | 3.470500 | 9.036600 | 0.205700 | 1.724852 |
| WCA | 0.205728 | 3.470522 | 9.036620 | 0.205729 | 1.724856 |
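As a quick sanity check on the tabulated optimum, the widely used welded-beam fabrication-cost function f = 1.10471·h²·l + 0.04811·t·b·(14 + l) can be evaluated at the reported ECSMA solution (this assumes the standard formulation of the problem; constraints are not re-checked here):

```python
def wb_cost(h, l, t, b):
    # Standard welded-beam cost model: weld material plus bar material
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

# ECSMA solution reported in Table 5
cost = wb_cost(h=0.195446, l=3.419576, t=9.132268, b=0.205258)
print(f"{cost:.6f}")   # ~1.715213, matching the tabulated best cost
```

The agreement with the reported 1.715213 indicates the solution is internally consistent with the standard cost model.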
Structure Design of Pressure Vessel (PV)
The cylindrical pressure vessel model requires optimizing the design variables to reduce the cost: the length of the cylindrical section without the head (L), the head thickness (Th), the shell thickness (Ts), and the inner radius (R). The model can be described as follows:
ECSMA was used to optimize this problem, and its experimental results were compared with those of CDE, ACO [72,73], EWOA, MFO [74], and MDDE. Table 6 shows the comparison in detail. ECSMA outperforms the other algorithms, which demonstrates that it can effectively handle this problem.
Comparison with other widely used methods of the PV design problem
| Algorithm | Ts | Th | R | L | Optimum cost |
|-----------|----|----|---|---|--------------|
| ECSMA | 1.414263 | 0.656058 | 65.15476 | 10.48867 | 5709.646 |
| CDE | 0.812500 | 0.437500 | 42.098400 | 176.637600 | 6059.7340 |
| ACO | 0.8125 | 0.4375 | 42.1036 | 176.5727 | 6059.0888 |
| EWOA | 0.901034 | 0.452897 | 46.67809 | 127.0967 | 6160.209 |
| MFO | 0.8125 | 0.4375 | 42.0984 | 176.6366 | 6059.7143 |
| MDDE | 0.8125 | 0.4375 | 42.0984 | 176.6360 | 6059.7017 |
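The classic pressure-vessel cost model is f = 0.6224·Ts·R·L + 1.7781·Th·R² + 3.1661·Ts²·L + 19.84·Ts²·R. The sketch below evaluates it at the MFO row of Table 6, which recovers the well-known ~6059.71 optimum; the ECSMA row's lower cost does not follow from this classic model, so it presumably stems from a variant formulation (this check is an assumption-laden illustration, not a reproduction of the paper's equations):

```python
def pv_cost(Ts, Th, R, L):
    # Classic pressure-vessel cost: shell, heads, and welding terms
    return (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

# MFO solution from Table 6
cost = pv_cost(Ts=0.8125, Th=0.4375, R=42.0984, L=176.6366)
print(f"{cost:.4f}")   # ~6059.71, the classic PV optimum
```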
Structure Design of I-Beam
This design problem minimizes the vertical deflection of the I-beam, and the model solves for structural parameters such as length, height, and thicknesses. The model is as follows:
Variable range 10≤x1≤50, 10≤x2≤80, 0.9≤x3≤5, 0.9≤x4≤5
Meta-heuristic methods can be combined with mathematical models to solve the I-beam design (IBD) problem; examples include RCBA [59], WEMFO, SCA [6], CS [75], HBO [76], and CLSGMFO [26]. A constraint-correction penalty on the loss function is adopted to handle the IBD constraints, and the same penalty function is used for all optimizers to ensure a fair comparison. The experimental results of ECSMA and the other optimizers are shown in Table 7.
Table 7 indicates that ECSMA is superior to the other optimizers on the IBD problem and ultimately yields the most efficient design.
Comparison with other widely used methods for the I-beam problem
| Algorithm | b | h | tw | tf | Optimum deflection |
|-----------|---|---|----|----|--------------------|
| ECSMA | 50.00000 | 80.00000 | 1.764406 | 5.000000 | 0.006626 |
| RCBA | 50.0000 | 80.0000 | 4.8149 | 5.0000 | 0.0066270 |
| WEMFO | 50.0000 | 80.0000 | 1.761606 | 5.0000 | 0.006626 |
| SCA | 50.0000 | 80.0000 | 1.760880 | 5.0000 | 0.006627 |
| HBO | 50.0000 | 80.0000 | 1.760220 | 5.0000 | 0.006627 |
| CLSGMFO | 38.0000 | 44.0000 | 3.775681 | 4.0000 | 0.006626 |
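The tabulated ECSMA deflection can be checked against the usual I-beam objective f = 5000 / I, where I is the section's moment of inertia (this assumes the standard IBD formulation; the variable names b, h, tw, tf follow Table 7):

```python
def ibeam_deflection(b, h, tw, tf):
    # Moment of inertia of the I-section: web plus two flanges
    inertia = (tw * (h - 2 * tf) ** 3 / 12
               + b * tf ** 3 / 6
               + 2 * b * tf * ((h - tf) / 2) ** 2)
    return 5000.0 / inertia

# ECSMA solution from Table 7
f = ibeam_deflection(b=50.0, h=80.0, tw=1.764406, tf=5.0)
print(f"{f:.6f}")   # ~0.006626, matching the tabulated value
```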
Cantilever Beam Design Problem
In this engineering structural design problem, we use ECSMA to minimize the amount of material in the cantilever beam. The beam consists of five hollow square blocks stacked vertically, with inner dimensions arranged in increasing order. The mathematical model of the problem is as follows:
Consider $\vec{x} = [x_1\ x_2\ x_3\ x_4\ x_5]$

Minimize $f(\vec{x}) = 0.0624\,(x_1 + x_2 + x_3 + x_4 + x_5)$

Subject to $g(\vec{x}) = \dfrac{61}{x_1^3} + \dfrac{27}{x_2^3} + \dfrac{19}{x_3^3} + \dfrac{7}{x_4^3} + \dfrac{1}{x_5^3} - 1 \le 0$

Variable range $0.01 \le x_i \le 100,\ i = 1, \dots, 5$
ECSMA is used to deal with this optimization problem, and its results are compared with those of CS, GCA_II [77], GCA_I [77], MMA [77], SOS, and SSA [71] in Table 8.
Comparison with other widely used methods for the cantilever beam problem
| Algorithm | x1 | x2 | x3 | x4 | x5 | Optimum weight |
|-----------|----|----|----|----|----|----------------|
| ECSMA | 6.09864 | 5.300626 | 4.412439 | 3.518830 | 2.148634 | 1.34030 |
| SSA | 6.015135 | 5.3093047 | 4.4950067 | 3.5014263 | 2.1527879 | 1.339956 |
| SOS | 6.01878 | 5.30344 | 4.49587 | 3.49896 | 2.15564 | 1.33996 |
| MMA | 6.0100 | 5.3000 | 4.4900 | 3.4900 | 2.1500 | 1.3400 |
| GCA_I | 6.0100 | 5.3000 | 4.4900 | 3.4900 | 2.1500 | 1.3400 |
| GCA_II | 6.0100 | 5.3000 | 4.4900 | 3.4900 | 2.1500 | 1.3400 |
| CS | 6.0089 | 5.3049 | 4.5023 | 3.5077 | 2.1504 | 1.33999 |
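Since the cantilever model is fully specified above, the reported ECSMA row can be verified directly: the weight follows from f = 0.0624·Σxᵢ and feasibility from g(x) ≤ 0 (a minimal check script; the function name is illustrative):

```python
def cantilever(x):
    # Objective: beam weight; constraint: stacked-block stiffness limit
    f = 0.0624 * sum(x)
    numerators = (61, 27, 19, 7, 1)
    g = sum(c / xi**3 for c, xi in zip(numerators, x)) - 1.0
    return f, g

# ECSMA solution from Table 8
x = [6.09864, 5.300626, 4.412439, 3.518830, 2.148634]
f, g = cantilever(x)
print(f"f = {f:.5f}, g = {g:+.4f}")   # f ~ 1.34030, g <= 0 (feasible)
```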
As outlined in Table 8, ECSMA is more stable and effective than the compared counterparts. Because it yields more economical designs, the method could also be applied in other fields in the future, such as power flow optimization [78], road network planning [79], information retrieval services [80,81], human activity recognition [82], structured sparsity optimization [83], dynamic module detection [84,85], recommender systems [86,87], tensor completion [88], colorectal polyp region extraction [89], image-to-image translation [90], smart contract vulnerability detection [91], and medical data processing [92].
Finally, the experimental results on these four classical structural design problems demonstrate the feasibility and practicality of ECSMA: it retains SMA's ability to solve constrained problems while improving on its solution quality.
Conclusions and Future Works
In this study, ECSMA is designed to address the limited exploration and exploitation ability of the original SMA. In ECSMA, the elite strategy strengthens the exploitation capability of SMA, while the chaotic stochastic mechanism increases randomness to improve exploration in the early phase. Together, the two strategies give SMA a better balance between exploration and exploitation. Experiments on the benchmark function set (including unimodal, multimodal, and fixed-dimension multimodal functions) show that the two strategies effectively improve function optimization, alleviate the premature convergence of SMA by helping it escape local optima, and improve both accuracy and population diversity. On the four structural design problems above, the simulation results also show that ECSMA achieves better accuracy, which gives it practical value in real-world applications. However, the two introduced strategies inevitably increase the complexity of the algorithm, which limits ECSMA in some scenarios.
In the future, GPU-based and multi-threaded parallel implementations will be considered to solve more complex problems. In addition, since SMA is a relatively new algorithm, its in-depth study and application across disciplines remain to be fully explored.
Funding Statement
This work was supported in part by the National Natural Science Foundation of China (J2124006, 62076185).
Availability of Data and Materials
The data involved in this study are all public data, which can be downloaded through public channels.
Conflicts of Interest
The authors declare that they have no conflicts of interest to report regarding the present study.
References
[1] Li, S., Chen, H., Wang, M., Heidari, A. A., Mirjalili, S. (2020). Slime mould algorithm: A new method for stochastic optimization.
[2] Heidari, A. A., Mirjalili, S., Faris, H., Aljarah, I., Mafarja, M. et al. (2019). Harris hawks optimization: Algorithm and applications.
[3] Chen, H., Jiao, S., Wang, M., Heidari, A. A., Zhao, X. (2019). Parameters identification of photovoltaic cells and modules using diversification-enriched Harris hawks optimization with chaotic drifts.
[4] Mirjalili, S., Mirjalili, S. M., Hatamlou, A. (2016). Multi-verse optimizer: A nature-inspired algorithm for global optimization.
[5] Del Valle, Y., Venayagamoorthy, G. K., Mohagheghi, S., Hernandez, J. C., Harley, R. G. (2008). Particle swarm optimization: Basic concepts, variants and applications in power systems.
[6] Mirjalili, S. (2016). SCA: A sine cosine algorithm for solving optimization problems.
[7] Ji, Y., Tu, J., Zhou, H., Gui, W., Liang, G. et al. (2020). An adaptive chaotic sine cosine algorithm for constrained and unconstrained optimization.
[8] Rashedi, E., Nezamabadi-Pour, H., Saryazdi, S. (2009). GSA: A gravitational search algorithm.
[9] Ahmadianfar, I., Asghar Heidari, A., Noshadian, S., Chen, H., Gandomi, A. H. (2022). INFO: An efficient optimization algorithm based on weighted mean of vectors.
[10] Yang, Y., Chen, H., Heidari, A. A., Gandomi, A. H. (2021). Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts.
[11] Tu, J., Chen, H., Wang, M., Gandomi, A. H. (2021). The colony predation algorithm.
[12] Ahmadianfar, I., Asghar Heidari, A., Gandomi, A. H., Chu, X., Chen, H. (2021). RUN beyond the metaphor: An efficient optimization algorithm based on Runge Kutta method.
[13] Tu, J., Chen, H., Liu, J., Heidari, A. A., Zhang, X. et al. (2021). Evolutionary biogeography-based whale optimization methods with communication structure: Towards measuring the balance.
[14] Song, S., Wang, P., Heidari, A. A., Wang, M., Zhao, X. et al. (2021). Dimension decided Harris hawks optimization with Gaussian mutation: Balance analysis and diversity patterns.
[15] Zhao, D., Liu, L., Yu, F., Heidari, A. A., Wang, M. et al. (2022). Opposition-based ant colony optimization with all-dimension neighborhood search for engineering design.
[16] Li, C., Li, J., Chen, H., Heidari, A. A. (2021). Memetic Harris hawks optimization: Developments and perspectives on project scheduling and QoS-aware web service composition.
[17] Wei, Z., Liu, L., Kuang, F., Li, L., Xu, S. et al. (2022). An efficient multi-threshold image segmentation for skin cancer using boosting whale optimizer.
[18] Liu, L., Zhao, D., Yu, F., Heidari, A. A., Li, C. et al. (2021). Ant colony optimization with Cauchy and greedy Levy mutations for multilevel COVID 19 X-ray image segmentation.
[19] Zhao, D., Liu, L., Yu, F., Heidari, A. A., Wang, M. et al. (2021). Chaotic random spare ant colony optimization for multi-threshold image segmentation of 2D Kapur entropy.
[20] Chen, C., Wang, X., Yu, H., Wang, M., Chen, H. (2021). Dealing with multi-modality using synthesis of Moth-flame optimizer with sine cosine mechanisms.
[21] Wu, S., Mao, P., Li, R., Cai, Z., Heidari, A. A. et al. (2021). Evolving fuzzy k-nearest neighbors using an enhanced sine cosine algorithm: Case study of lupus nephritis.
[22] Deng, W., Xu, J., Zhao, H., Song, Y. (2020). A novel gate resource allocation method using improved PSO-based QEA.
[23] Deng, W., Xu, J., Song, Y., Zhao, H. (2020). An effective improved co-evolution ant colony optimisation algorithm with multi-strategies and its application.
[24] Hu, J., Gui, W., Heidari, A. A., Cai, Z., Liang, G. et al. (2022). Dispersed foraging slime mould algorithm: Continuous and binary variants for global optimization and wrapper-based feature selection.
[25] Liu, Y., Heidari, A. A., Cai, Z., Liang, G., Chen, H. et al. (2022). Simulated annealing-based dynamic step shuffled frog leaping algorithm: Optimal performance design and feature selection.
[26] Xu, Y., Chen, H., Heidari, A. A., Luo, J., Zhang, Q. et al. (2019). An efficient chaotic mutative moth-flame-inspired optimizer for global optimization tasks.
[27] Zhang, Y., Liu, R., Heidari, A. A., Wang, X., Chen, Y. et al. (2021). Towards augmented kernel extreme learning models for bankruptcy prediction: Algorithmic behavior and comprehensive analysis.
[28] Wu, S. H., Zhan, Z. H., Zhang, J. (2021). SAFE: Scale-adaptive fitness evaluation method for expensive optimization problems.
[29] Li, J. Y., Zhan, Z. H., Wang, C., Jin, H., Zhang, J. (2020). Boosting data-driven evolutionary algorithm with localized data generation.
[30] Hussien, A. G., Heidari, A. A., Ye, X., Liang, G. et al. (2022). Boosting whale optimization with evolution strategy and Gaussian random walks: An image segmentation method.
[31] Yu, H., Song, J., Chen, C., Heidari, A. A., Liu, J. et al. (2022). Image segmentation of leaf spot diseases on maize using multi-stage Cauchy-enabled grey wolf algorithm.
[32] He, Z., Yen, G. G., Ding, J. (2020). Knee-based decision making and visualization in many-objective optimization.
[33] He, Z., Yen, G. G., Lv, J. (2019). Evolutionary multiobjective optimization with robustness enhancement.
[34] Ye, X., Liu, W., Li, H., Wang, M., Chi, C. et al. (2021). Modified whale optimization algorithm for solar cell and PV module parameter identification.
[35] Song, Y., Cai, X., Zhou, X., Zhang, B., Chen, H. et al. (2023). Dynamic hybrid mechanism-based differential evolution algorithm and its application.
[36] Deng, W., Zhang, X., Zhou, Y., Liu, Y., Zhou, X. et al. (2022). An enhanced fast non-dominated solution sorting genetic algorithm for multi-objective problems.
[37] Hua, Y., Liu, Q., Hao, K., Jin, Y. (2021). A survey of evolutionary algorithms for multi-objective optimization problems with irregular pareto fronts.
[38] Deng, W., Ni, H., Liu, Y., Chen, H., Zhao, H. (2022). An adaptive differential evolution algorithm based on belief space and generalized opposition-based learning for resource allocation.
[39] Han, X., Han, Y., Chen, Q., Li, J., Sang, H. et al. (2021). Distributed flow shop scheduling with sequence-dependent setup times using an improved iterated greedy algorithm.
[40] Gao, D., Wang, G. G., Pedrycz, W. (2020). Solving fuzzy job-shop scheduling problem using DE algorithm improved by a selection mechanism.
[41] Wang, G. G., Gao, D., Pedrycz, W. (2022). Solving multiobjective fuzzy job-shop scheduling problem by a hybrid adaptive differential evolution algorithm.
[42] Chen, L. H., Yang, B., Wang, S. J., Wang, G., Li, H. Z. et al. (2014). Towards an optimal support vector machine classifier using a parallel particle swarm optimization strategy.
[43] Wang, M., Chen, H., Yang, B., Zhao, X., Hu, L. et al. (2017). Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses.
[44] Chen, H. L., Wang, G., Ma, C., Cai, Z. N., Liu, W. B. et al. (2016). An efficient hybrid kernel extreme learning machine approach for early diagnosis of Parkinson's disease.
[45] Deng, W., Xu, J., Gao, X. Z., Zhao, H. (2022). An enhanced MSIQDE algorithm with novel multiple strategies for global optimization problems.
[46] Ebadinezhad, S. (2020). DEACO: Adopting dynamic evaporation strategy to enhance ACO algorithm for the traveling salesman problem.
[47] Chen, H., Wang, M., Zhao, X. (2020). A multi-strategy enhanced sine cosine algorithm for global optimization and constrained practical engineering problems.
[48] Guo, W., Liu, T., Dai, F., Xu, P. (2020). An improved whale optimization algorithm for forecasting water resources demand.
[49] Jiang, J., Yang, X., Meng, X., Li, K. (2020). Enhance chaotic gravitational search algorithm (CGSA) by balance adjustment mechanism and sine randomness function for continuous optimization problems.
[50] Javidi, A., Salajegheh, E., Salajegheh, J. (2019). Enhanced crow search algorithm for optimum design of structures.
[51] Tawhid, M. A., Dsouza, K. B. (2018). Hybrid binary dragonfly enhanced particle swarm optimization algorithm for solving feature selection problems.
[52] Luo, Q., Yang, X., Zhou, Y. (2019). Nature-inspired approach: An enhanced moth swarm algorithm for global optimization.
[53] Sulaiman, N., Mohamad-Saleh, J., Abro, A. G. (2018). A hybrid algorithm of ABC variant and enhanced EGS local search technique for enhanced optimization performance.
[54] Zhou, X., Lu, J., Huang, J., Zhong, M., Wang, M. (2021). Enhancing artificial bee colony algorithm with multi-elite guidance.
[55] Cai, Z., Gu, J., Luo, J., Zhang, Q., Chen, H. et al. (2019). Evolving an optimal kernel extreme learning machine by using an enhanced grey wolf optimization strategy.
[56] Heidari, A. A., Ali Abbaspour, R., Chen, H. (2019). Efficient boosted grey wolf optimizers for global search and kernel extreme learning machine training.
[57] Yousri, D., Allam, D., Eteiba, M. B. (2019). Chaotic whale optimizer variants for parameters estimation of the chaotic behavior in permanent magnet synchronous motor.
[58] Tubishat, M., Abushariah, M. A. M., Idris, N., Aljarah, I. (2019). Improved whale optimization algorithm for feature selection in Arabic sentiment analysis.
[59] Liang, H., Liu, Y., Shen, Y., Li, F., Man, Y. (2018). A hybrid bat algorithm for economic dispatch with random wind power.
[60] Li, H. W., Liu, J. Y., Chen, L., Bai, J. B., Sun, Y. Y. et al. (2019). Chaos-enhanced moth-flame optimization algorithm for global optimization.
[61] Zhang, J., Sanderson, A. C. (2009). JADE: Adaptive differential evolution with optional external archive.
[62] Chen, W., Zhang, J., Lin, Y., Chen, N., Zhan, Z. et al. (2013). Particle swarm optimization with an aging leader and challengers.
[63] Yang, X. S. (2010). A new metaheuristic bat-inspired algorithm.
[64] Qin, A. K., Huang, V. L., Suganthan, P. N. (2009). Differential evolution algorithm with strategy adaptation for global numerical optimization.
[65] Mirjalili, S., Lewis, A. (2016). The whale optimization algorithm.
[66] Mirjalili, S., Mirjalili, S. M., Lewis, A. (2014). Grey wolf optimizer.
[67] Wu, G., Mallipeddi, R., Suganthan, P. N., Wang, R., Chen, H. (2016). Differential evolution with multi-population based ensemble of mutation strategies.
[68] Mallipeddi, R., Suganthan, P. N., Pan, Q. K., Tasgetiren, M. F. (2011). Differential evolution algorithm with ensemble of parameters and mutation strategies.
[69] Tanabe, R., Fukunaga, A. S. (2014). Improving the search performance of SHADE using linear population size reduction. 2014 IEEE Congress on Evolutionary Computation (CEC), pp. 1658–1665. Beijing, China.
[70] Awad, N. H., Ali, M. Z., Suganthan, P. N. (2017). Ensemble sinusoidal differential covariance matrix adaptation with Euclidean neighborhood for solving CEC2017 benchmark problems. 2017 IEEE Congress on Evolutionary Computation (CEC), pp. 372–379. Donostia, Spain.
[71] Mirjalili, S., Gandomi, A. H., Mirjalili, S. Z., Saremi, S., Faris, H. et al. (2017). Salp swarm algorithm: A bio-inspired optimizer for engineering design problems.
[72] Dorigo, M. (1992).
[73] Dorigo, M., Caro, G. D. (1999). The ant colony optimization meta-heuristic.
[74] Mirjalili, S. (2015). Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm.
[75] Yang, X., Suash, D. (2009). Cuckoo search via Lévy flights. 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), pp. 210–214. Coimbatore, India.
[76] Cheng, M. Y., Prayogo, D. (2014). Symbiotic organisms search: A new metaheuristic optimization algorithm.
[77] Chickermane, H., Gea, H. C. (1996). Structural optimization using a new local approximation method.
[78] Cao, X., Wang, J., Zeng, B. (2022). A study on the strong duality of second-order conic relaxation of AC optimal power flow in radial networks.
[79] Huang, L., Yang, Y., Chen, H., Zhang, Y., Wang, Z. et al. (2022). Context-aware road travel time estimation by coupled tensor decomposition based on trajectory data.
[80] Wu, Z., Li, R., Xie, J., Zhou, Z., Guo, J. et al. (2020). A user sensitive subject protection approach for book search service.
[81] Wu, Z., Shen, S., Zhou, H., Li, H., Lu, C. et al. (2021). An effective approach for the protection of user commodity viewing privacy in e-commerce website.
[82] Qiu, S., Zhao, H., Jiang, N., Wang, Z., Liu, L. et al. (2022). Multi-sensor information fusion based on machine learning for real applications in human activity recognition: State-of-the-art and research challenges.
[83] Zhang, X., Zheng, J., Wang, D., Tang, G., Zhou, Z. et al. (2022). Structured sparsity optimization with non-convex surrogates of l_{2,0}-norm: A unified algorithmic framework.
[84] Li, D., Zhang, S., Ma, X. (2021). Dynamic module detection in temporal attributed networks of cancers.
[85] Ma, X., Sun, P. G., Gong, M. (2020). An integrative framework of heterogeneous genomic data for cancer dynamic modules based on matrix decomposition.
[86] Li, J., Chen, C., Chen, H., Tong, C. (2017). Towards context-aware social recommendation via individual trust.
[87] Li, J., Zheng, X. L., Chen, S. T., Song, W. W., Chen, D. R. (2014). An efficient and reliable approach for quality-of-service-aware service composition.
[88] Wang, W., Zheng, J., Zhao, L., Chen, H., Zhang, X. (2022). A non-local tensor completion algorithm based on weighted tensor nuclear norm.
[89] Hu, K., Zhao, L., Feng, S., Zhang, S., Zhou, Q. et al. (2022). Colorectal polyp region extraction using saliency detection network with neutrosophic enhancement.
[90] Zhang, X., Fan, C., Xiao, Z., Zhao, L., Chen, H. et al. (2022). Random reconstructed unpaired image-to-image translation.
[91] Zhang, L., Wang, J., Wang, W., Jin, Z., Su, Y. et al. (2022). Smart contract vulnerability detection combined with multi-objective detection.
[92] Guo, K., Chen, T., Ren, S., Li, N., Hu, M. et al. (2022). Federated learning empowered real-time medical data processing method for smart healthcare.
Appendix
<table-wrap id="table-9">
<label>Table A1</label>
<caption>
<title>Description of the 23 benchmark functions and CEC2014