The bat algorithm (BA) is an eminent meta-heuristic that has been widely used to solve diverse kinds of optimization problems. BA imitates bats' searching behavior, leveraging their echolocation capability. However, BA suffers from premature convergence due to its limited local search capability. Instead of the standard uniform walk, the torus walk is viewed as a promising alternative for improving the local search capability. In this work, we propose an improved variant of BA that applies the torus walk to improve diversity and convergence. The proposed Modern Computerized Bat Algorithm (MCBA) has been examined on fifteen well-known benchmark test problems, where it shows promising performance compared with standard PSO and standard BA. The proposed MCBA, BPA, standard PSO, and standard BA have also been examined on well-known benchmark test problems and on the training of artificial neural networks (ANNs). We performed experiments on eight benchmark datasets taken from the worldwide famous machine-learning (ML) repository of UCI. Simulation results show that training an ANN with the MCBA-NN algorithm tops the list in terms of exactness, outperforming the traditional methodologies. The MCBA-NN algorithm may be used effectively for data classification and statistical problems in the future.
Keywords: Bat algorithm; MCBA; ANN; ML; Torus walk

Introduction
Optimization is an attempt to generate an optimal solution to a problem from a given set of possible solutions. Every optimization system has an objective function and various decision variables that affect that function. The central optimization challenge is to reduce wasted time or to exploit the preferred advantage of a given engineering system [1]. Optimization techniques may be described as strategies for achieving superior solutions that satisfy the given objective functions [2]. These objective functions refer to optimization problems wherein one seeks to minimize or maximize a quantity by scientifically selecting the values of variables within their pool of solutions [3]. All possible solutions are recognized as acceptable solutions, while the best solution is considered the optimum solution. Swarm intelligence (SI), a core branch of artificial intelligence (AI), deals with multi-agent systems whose structural design is influenced by the mutual actions of social insects, such as ants and bees, as well as other social animal colonies. The term swarm intelligence was first reported by Gerardo Beni and Jing Wang in 1989 in their work on cellular robotic systems. SI techniques have been actively utilized in the field of optimization, which is of significant importance for science, engineering, robotics, and the computer and telecommunication network industries. Scientists and researchers are solving future-driven network security problems in Software-Defined Networking (SDN) [4–10], Named Data Networking (NDN) [11,12], and cloud computing networks [13], with future-driven applications such as voice over IP (VoIP) [14–16] and worldwide interoperability for microwave access (WiMAX) [17–19], with the support of SI, AI, and ML [20]. Many algorithms based on SI have been developed and effectively applied to tackle many existing complex optimization problems [21].
Some major algorithms are: Ant Colony Optimization (ACO) [22], Particle Swarm Optimization (PSO) [23], the Bee Colony Algorithm (BCA) [24], and the Bat Algorithm (BA) [25].
The Bat Algorithm (BA), introduced by Yang in 2010 [25], is viewed by many researchers as one of the most promising population-based stochastic algorithms for solving global optimization problems. Due to its robustness and simplicity, the algorithm has become a popular choice for solving complex optimization problems reported across diverse fields of engineering and science. In BA, bats search paths in complete darkness by applying refined echolocation to map their surrounding environment. Bats can distinguish their target from harmful predators and obstacles by emitting short pulses of sound waves and then listening to the echoes [26]. BA has been used to solve numerous engineering problems [27]. BA has also been applied in areas such as structural damage detection [28], power system stabilizers [29], battery sizing for energy storage and power dispatch [30], image processing [31], and fault diagnosis [32].
The searching process of an algorithm has two fundamental components: exploration, which probes unknown or new areas in search of successful solutions, and exploitation, which refines the solutions provided by exploration. As reported in the literature [33], various researchers have described that traditional BA can efficiently optimize real-world, low-dimensional optimization problems. Yet drawbacks such as premature convergence and entrapment in local minima persist for high-dimensional optimization problems. To address the high-dimensionality problem and enhance the diversity capability of BA, its performance needs to be improved [34]. To reduce these drawbacks of BA in solving different optimization problems, this paper introduces a modern computerized version of BA. Moreover, a torus walk is added to ensure diversity and effectiveness in the population.
The article is organized as follows. Section 2 contains the literature review. In Section 3, the working of standard BA is elaborated. The proposed variant is discussed in Section 4. In Section 5, the experimental setup and its characteristics are discussed. Simulation results and discussion are presented in Sections 6 and 7. Section 8 describes the conclusion.
Related Work
To avoid premature convergence during the local search [35], a novel enhanced local search technique was proposed, termed BA with local search (IBA). In [36], a novel meta-heuristic (HBH) was introduced, combining traditional BA with a mutation operator to enhance its diversity. To address low global exploration capability, an improved BA based on Iterative Local Search and Stochastic Inertia Weight (ILSSIWBA) was proposed [37]; in this technique, an Iterative Local Search (ILS) algorithm is embedded to escape local optima. A novel Hybrid BA (HBA) was introduced [38] to enhance the performance of traditional BA; to increase local search capability and escape local optima, three updated methodologies were added to the standard BA. A new hybrid BA termed Shuffled Multi-Population BA (SMPBat) was introduced [39], embedding two recent BA strategies: BA with a Ring Master-Slave technique and Enhanced Shuffled BA. To dynamically allocate user resources, a Bat algorithm for SDN network scheduling was introduced [40] with respect to a network-request model; the statistical results show that the introduced algorithm performs better than dynamic programming and greedy algorithms.
An enhanced BA was proposed [41] that dynamically embeds a new inertia weight and introduces a self-adaptive technique for the algorithm's parameters; a chaotic sequence is utilized to improve the local search capabilities and the diversity of the population. A multi-objective Discrete BA (MDBA) was proposed [42] for detecting community architecture in dynamic structures, where a discrete methodology is used to update bat locations. In [43], a modified BA was introduced for solving nonlinear, nonconvex, and high-dimensional problems; its outstanding success was shown in comparison with 17 competitive techniques. By combining a K-means clustering algorithm with a modified binary BA, a wrapper-based feature selection for unsupervised learning was introduced [44]. To reduce the drawbacks of evolutionary algorithms previously introduced for community detection in dynamic social structures, a new multi-objective Bat algorithm (MS-DYE) was proposed [45]. The literature shows that each parameter of BA can enhance performance only at a particular time; to overcome this drawback, a new BA with Multiple Strategies Coupling (mixBA) was introduced [46], in which eight different strategies are incorporated. Two extended versions of BA, termed PCA_BA and PCA_LBA, were proposed [47], which enhance and test global search capability on large-scale problems. For the job-shop scheduling problem (JSP), a Bi-population-based Discrete BA (BDBA) was introduced that reduces the sum of completion-time cost and energy-consumption cost [48]. An improved searching method was introduced [49] through a new version of BA called Extended BA (EBA). With the help of mutation and an adaptive step-control approach, an extended Self-Adaptive BA (SABA) was proposed [50]; the basic idea behind its proposed parameters was to ensure convergence.
A traveling salesman formulation was used to solve the optimization problem of cable connection at an offshore wind farm [51], and an improved BA was introduced to resolve it. To overcome the pipe-network planning problem, a novel extended model was introduced [52] that guarantees water-supply connectivity by using high-level water nodes. None of the above papers considered a modern computerized Bat algorithm using artificial neural networks and machine learning.
Bat Algorithm
The standard BA, introduced by Yang in 2010, is a meta-heuristic SI-based optimization approach. BA was inspired by the behavior of microbats, which operate socially and use their echolocation capability to measure distance. Three major attributes of microbats' echolocation strategies are standardized as the following rules.
Each bat uses echolocation to measure distance, and bats can clearly distinguish their target, whether food or prey, from other background objects.
Bats fly randomly with velocity V_i at position X_i with a fixed frequency f_min, varying wavelength and loudness A_0 while searching for a target. Bats automatically adjust the wavelength of their emitted pulses, and the pulse emission rate r ∈ [0, 1] is adjusted depending on the proximity of the target.
The loudness varies from a large positive value A_0 down to a minimum constant value A_min.
The primary purpose of the algorithm is to find the best-quality food among the various food sources available in the search space. Since the food sources' locations are unidentified, the preliminary bat locations are produced randomly as N vectors of dimension d according to Eq. (1). After that, the worth of the food sources placed in the bat population is tested.
x_ij = x_min^d + μ (x_max^d − x_min^d)  (1)
Here, i = 1, 2, …, N and j = 1, 2, …, d. The lower and upper boundaries of the d-dimensional search space are x_min^d and x_max^d, and μ ∈ [0, 1] is a random value. Within the d-dimensional space, every bat i has a velocity V_i and position X_i, which are determined and then modified throughout the iterations. The velocities and positions of the artificial bats at time step t are updated by:
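As a concrete illustration, the random initialization of Eq. (1) can be sketched in a few lines of Python; the population size, dimension, and bounds below are arbitrary example values, not settings prescribed by this paper.

```python
import random

def init_population(n, d, x_min, x_max, seed=0):
    """Random initialization per Eq. (1): x_ij = x_min^d + mu * (x_max^d - x_min^d)."""
    rng = random.Random(seed)
    return [[x_min + rng.random() * (x_max - x_min) for _ in range(d)]
            for _ in range(n)]

# 40 bats in a 10-dimensional search space bounded by [-5.12, 5.12]
bats = init_population(n=40, d=10, x_min=-5.12, x_max=5.12)
```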
f_i = f_min + (f_max − f_min) β  (2)
V_i^t = V_i^{t−1} + (X_i^{t−1} − X*) f_i  (3)
X_i^t = X_i^{t−1} + V_i^t  (4)
The frequency of the i-th bat is denoted by f_i, where f_max and f_min are the maximum and minimum frequencies. β is a random number drawn uniformly from [0, 1]. The current global best solution X* is determined by comparing the solutions of all N bats. In the local search phase, a modified solution for each bat is generated locally using a random walk, as given in Eq. (5).
x_new = x_old + ε A^t  (5)
Here ε is a random number drawn uniformly from [−1, 1], and A^t = ⟨A_i^t⟩ is the average loudness of all bats at the current time step.
Furthermore, the pulse emission rate r_i and the loudness A_i are updated as the iterations progress. As a bat gets closer to its target, the loudness decreases while the pulse emission rate increases. They are updated by the following equations:
A_i^{t+1} = α A_i^t  (6)
r_i^{t+1} = r_i^0 [1 − exp(−γ t)]  (7)
Here γ and α are constants. For any γ > 0 and 0 < α < 1, we have r_i^t → r_i^0 and A_i^t → 0 as t → ∞. In general, the initial pulse emission rate is r_i^0 ∈ [0, 1] and the initial loudness is A_i^0 ∈ [1, 2].
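Putting Eqs. (1)–(7) together, a minimal sketch of the standard BA loop might look as follows. The parameter values (f_min = 0, f_max = 2, α = γ = 0.9) and the greedy acceptance rule are common defaults from the BA literature, not values prescribed by this paper.

```python
import math
import random

def bat_algorithm(objective, d, x_min, x_max, n=20, iters=200,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9, seed=0):
    """Minimal standard BA sketch implementing Eqs. (1)-(7)."""
    rng = random.Random(seed)
    # Eq. (1): random initialization of n bats in d dimensions
    X = [[x_min + rng.random() * (x_max - x_min) for _ in range(d)] for _ in range(n)]
    V = [[0.0] * d for _ in range(n)]
    A = [rng.uniform(1.0, 2.0) for _ in range(n)]   # initial loudness A_i^0 in [1, 2]
    r0 = [rng.random() for _ in range(n)]           # initial pulse rate r_i^0 in [0, 1]
    r = list(r0)
    fit = [objective(x) for x in X]
    b = min(range(n), key=lambda i: fit[i])
    x_star, f_star = list(X[b]), fit[b]             # global best X*

    for t in range(1, iters + 1):
        A_mean = sum(A) / n
        for i in range(n):
            f_i = f_min + (f_max - f_min) * rng.random()                       # Eq. (2)
            V[i] = [v + (x - g) * f_i for v, x, g in zip(V[i], X[i], x_star)]  # Eq. (3)
            x_new = [min(x_max, max(x_min, x + v)) for x, v in zip(X[i], V[i])]  # Eq. (4)
            if rng.random() > r[i]:                 # local random walk around X*, Eq. (5)
                eps = rng.uniform(-1.0, 1.0)
                x_new = [min(x_max, max(x_min, g + eps * A_mean)) for g in x_star]
            f_new = objective(x_new)
            if f_new <= fit[i] and rng.random() < A[i]:   # accept improving solution
                X[i], fit[i] = x_new, f_new
                A[i] *= alpha                             # Eq. (6)
                r[i] = r0[i] * (1.0 - math.exp(-gamma * t))  # Eq. (7)
            if fit[i] < f_star:
                x_star, f_star = list(X[i]), fit[i]
    return x_star, f_star

# Example run on the 5-D sphere function
xs, fs = bat_algorithm(lambda x: sum(v * v for v in x), d=5, x_min=-5.12, x_max=5.12)
```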
Modern Computerized Bat Algorithm (MCBA)
To avoid premature convergence during the local search, a novel modified local search technique is proposed, named the Modern Computerized Bat Algorithm (MCBA). MCBA enhances the performance of traditional BA, increases its local search capability, and helps it escape local optima. Traditional BA is capable of exploiting the search space, but in many situations it gets stuck in local optima, which degrades its global search efficiency. To escape local optima, it is essential to enhance diversity in the search space. The basic concept of the algorithm proposed in this study is to enhance BA by using a torus walk rather than a uniform random walk, and by introducing a quasi-sequence-based initialization approach. The torus walk is introduced to enhance the diversity of BA so that it can escape local optima; at the same time, BA retains its capability to explore new regions of the search space while still manipulating solutions in the local neighborhood. The difference between traditional BA and MCBA is the incorporation of the torus walk into traditional BA, which produces a new solution for each bat. The two contributions proposed in MCBA are discussed in detail below.
Torus Walk with Control Parameter
The proposed algorithm is broadly similar to traditional BA with respect to the local search, except that a new solution is obtained with a torus walk, rather than a local random walk, from the available best solutions as given in Eq. (8); in addition, a control parameter on the step size produces better results. Based on the preceding analysis, the pseudo-code of the MCBA algorithm is displayed in Algorithm 1.
x_new = gBest + ε A^t (0.1 · Torus(0, 1) · (X_i^t − t · gBest))  (8)
where Torus(0, 1) is drawn from the torus distribution and gBest is the current best solution over the entire epoch. The proposed torus operator preserves the attractive features of traditional BA, especially its fast convergence, while permitting the algorithm to apply stronger mutation for better diversity. The introduced torus walk provides a large diversity rate when determining the search intensity of BA for better exploitation.
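Assuming the Torus(0, 1) draw is one point of the low-discrepancy torus sequence (frac(k·√p_j) per dimension, a stand-in for the package-generated sequence), the local search step of Eq. (8) could be sketched as:

```python
import math
import random

def torus_point(k, dim, primes=(2, 3, 5, 7, 11)):
    """k-th point of the torus low-discrepancy sequence: frac(k * sqrt(p_j)) per dimension."""
    return [(k * math.sqrt(primes[j])) % 1.0 for j in range(dim)]

def mcba_local_step(x_i, g_best, a_mean, t, k, rng=None):
    """Torus-walk local search of Eq. (8), replacing the uniform walk of Eq. (5)."""
    rng = rng or random.Random(0)
    eps = rng.uniform(-1.0, 1.0)
    tor = torus_point(k, len(x_i))
    return [g + eps * a_mean * (0.1 * tor[j] * (x_i[j] - t * g))
            for j, g in enumerate(g_best)]
```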
Initialization
Population initialization is a vital factor in SI-based algorithms, as it considerably influences diversity and convergence. To improve diversity and convergence, quasi-random sequences are more useful for initializing the population than a random distribution. In this paper, the capability of BA is extended to make it suitable for large-dimensional search spaces by introducing a new initialization technique based on a low-discrepancy sequence, the torus. Torus is a geometric term first used by the authors to generate a torus mesh for a geometric coordinate system; in computer game development, a torus mesh is commonly generated using either a left-handed or right-handed coordinate system. In this study, we used R version 3.4.3 with the package "randtoolbox" to generate the torus sequence. Eq. (9) shows the mathematical notation for the torus sequence, and Fig. 1 shows sample data generated using the torus distribution.
α_k = (f(k s_1), …, f(k s_d))  (9)
where s_i denotes the square root of the i-th prime number and f is the fractional-part function, f(a) = a − floor(a). Due to the prime constraint, the dimension of the torus sequence is limited to 100,000 when the prime parameter of the torus function is used; for more than 100,000 dimensions, the primes must be provided manually.
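A minimal Python equivalent of the randtoolbox-style torus generator of Eq. (9), with the first primes hard-coded, and its use for population initialization might look like this:

```python
import math

# First ten primes; s_i = sqrt(p_i) as in the torus low-discrepancy generator
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def torus_sequence(n, d):
    """n points of the d-dim torus sequence, Eq. (9): alpha_k = (frac(k*s_1), ..., frac(k*s_d))."""
    return [[(k * math.sqrt(PRIMES[j])) % 1.0 for j in range(d)]
            for k in range(1, n + 1)]

def torus_init(n, d, x_min, x_max):
    """Scale each quasi-random point into the search bounds to initialize the bat population."""
    return [[x_min + u * (x_max - x_min) for u in point] for point in torus_sequence(n, d)]
```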
Fig. 1: Sample data generated using the torus distribution
Benchmark Function and Experiment Materials
For global optimization, a considerable variety of benchmark problems is available. Every benchmark problem has its own characteristics, and the variety of detailed properties of such functions defines the level of complexity of the benchmark. Tab. 1 displays the benchmark problems utilized for the efficiency analysis of the above-mentioned optimization algorithms, and Tab. 2 lists their names, ranges, domains, and formulas. This study uses benchmark problems that have been extensively utilized in the literature to convey deep knowledge of the performance of the above-mentioned optimization algorithms.
Benchmark functions are applied to measure the effectiveness and robustness of optimization algorithms. In this study, fifteen computationally expensive black-box functions with various characteristics and traits are applied to examine the effectiveness of the above-mentioned optimization algorithms. For this purpose, three experiments are conducted. In the first experiment, the efficiency of the proposed MCBA is compared with traditional BA and PSO. In the second experiment, the effect of problem complexity on algorithm efficiency is evaluated against the seven discussed procedures (BA, dBA, PSO, CS, HS, DE, GA) with a confined number of iterations. The third experiment performs a statistical test (t-test) comparing MCBA with BA and PSO.
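The exact formulations of F1–F15 are given in Tab. 2; as representative stand-ins (which of F1–F15 they correspond to is not specified here), two classic benchmarks of the kind listed there, the unimodal sphere and the multimodal Rastrigin function, can be written as:

```python
import math

def sphere(x):
    """Unimodal: f(x) = sum(x_i^2), global minimum f* = 0 at x = (0, ..., 0)."""
    return sum(v * v for v in x)

def rastrigin(x):
    """Highly multimodal: f(x) = 10d + sum(x_i^2 - 10 cos(2 pi x_i)), f* = 0 at the origin."""
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)
```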
Standard benchmark functions

| Functions | Domain | f* | X* |
|---|---|---|---|
| F1 | −4 ≤ xi ≤ 5 | 0.00 | (0, 0, 0, …, 0) |
| F2 | −10 ≤ xi ≤ 10 | 0.00 | (0, 0, 0, …, 0) |
| F3 | −65.536 ≤ xi ≤ 65.536 | 0.00 | (0, 0, 0, …, 0) |
| F4 | −100 ≤ xi ≤ 100 | 0.00 | (0, 0, 0, …, 0) |
| F5 | −5.12 ≤ xi ≤ 5.12 | 0.00 | (0, 0, 0, …, 0) |
| F6 | −100 ≤ xi ≤ 100 | 0.00 | (0, 0, 0, …, 0) |
| F7 | −5.12 ≤ xi ≤ 5.12 | 0.00 | (0, 0, 0, …, 0) |
| F8 | −5.12 ≤ xi ≤ 5.12 | 0.00 | (0, 0, 0, …, 0) |
| F9 | −5.12 ≤ xi ≤ 5.12 | 0.00 | (0, 0, 0, …, 0) |
| F10 | −1 ≤ xi ≤ 1 | 0.00 | (0, 0, 0, …, 0) |
| F11 | −100 ≤ xi ≤ 100 | 0.00 | (0, 0, 0, …, 0) |
| F12 | −1.28 ≤ xi ≤ 1.28 | 0.00 | (0, 0, 0, …, 0) |
| F13 | −1 ≤ xi ≤ 1 | 0.00 | (0, 0, 0, …, 0) |
| F14 | −100 ≤ xi ≤ 100 | 0.00 | (0, 0, 0, …, 0) |
| F15 | −5.12 ≤ xi ≤ 5.12 | 0.00 | (5 × i) |
Parameters of Algorithms
To achieve effective operation of the algorithms, it is necessary to tune the parameters of each approach to their most suitable values. Largely, these parameters are set before the execution of the algorithm and kept uniform throughout the run. Various studies suggest that the most appropriate methodology for selecting the parameters of an algorithm is exhaustive experimentation to obtain the optimal parameters. In this study, the parameter settings given in Tab. 3 are employed, based on the literature stated in this section.
In this section, the optimization methods are compared with each other in terms of capabilities and efficiency using fifteen high-dimensional benchmark functions. Benchmark problems demonstrate the performance of an algorithm at various complexity levels. Tab. 4 contains the experimental simulation results on the benchmark functions, and Tab. 1 presents the comparative results with other state-of-the-art approaches. Exhaustive statistical results are given in Tab. 5. The experimental results of the constrained benchmark test functions are exhibited only for surfaces with D = 10, 20, and 30. The experimental results of this work may not reflect the entire competency of all examined algorithms under all possible conditions. All experiments were run on a computer with a 2.60 GHz processor and 16 GB RAM, using MATLAB R2015b on Windows 8.1. The exhaustive experimental results are obtained from 30 independent runs for functions f1 to f15, with a population size of 40 and the maximum number of iterations fixed at 1000, 2000, 3000, and 5000. In this study, the original versions of the selected algorithms are considered without any modification.
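The per-function statistics reported in Tab. 4 (Best, Worst, Mean, Median, Std. Dev. over 30 independent runs) can be collected with a small harness like the following; the optimizer below is a placeholder standing in for one MCBA, BA, or PSO run.

```python
import statistics

def summarize(run_optimizer, n_runs=30):
    """Run an optimizer n_runs times and report the statistics used in Tab. 4."""
    results = [run_optimizer(seed) for seed in range(n_runs)]
    return {
        "Best": min(results),
        "Worst": max(results),
        "Mean": statistics.mean(results),
        "Median": statistics.median(results),
        "Std. Dev.": statistics.stdev(results),
    }

# Placeholder optimizer: returns a dummy fitness per seed, standing in for one real run
stats = summarize(lambda seed: float(seed % 5))
```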
Comparative results for MCBA, BA and PSO on 15 standard benchmark functions

| F# | IT | DIM | BA: Best | Worst | Mean | Median | Std. Dev. | PSO: Best | Worst | Mean | Median | Std. Dev. | MCBA: Best | Worst | Mean | Median | Std. Dev. |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | 1000 | 10 | 3.57E-08 | 3.37E-07 | 1.70E-07 | 1.50E-07 | 7.48E-08 | 2.38E-284 | 6.62E-10 | 2.21E-11 | 1.45E-174 | 1.19E-10 | 1.12E-54 | 1.76E-34 | 5.88E-36 | 1.19E-42 | 3.15E-35 |
| F1 | 2000 | 20 | 8.45E-08 | 2.78E-07 | 1.53E-07 | 1.45E-07 | 3.87E-08 | 2.00E-90 | 1.33E+00 | 4.46E-02 | 1.25E-42 | 2.40E-01 | 1.31E-38 | 2.74E-28 | 9.60E-30 | 7.37E-33 | 4.92E-29 |
| F1 | 3000 | 30 | 8.27E-08 | 2.05E-07 | 1.49E-07 | 1.51E-07 | 2.90E-08 | 9.24E-58 | 8.20E+00 | 2.76E-01 | 1.40E-26 | 1.47E+00 | 2.14E-37 | 2.65E-27 | 2.54E-28 | 3.19E-29 | 5.95E-28 |
| F1 | 5000 | 50 | 8.63E-08 | 1.85E-07 | 1.28E-07 | 1.25E-07 | 2.19E-08 | 4.51E-42 | 2.67E+01 | 9.47E-01 | 2.61E-18 | 4.79E+00 | 8.83E-34 | 3.19E-23 | 2.10E-24 | 3.75E-25 | 6.17E-24 |
| F2 | 1000 | 10 | 1.98E-07 | 1.07E-06 | 6.45E-07 | 5.63E-07 | 2.28E-07 | -9.00E+03 | 0.00E+00 | -8.99E+03 | -9.00E+03 | 5.36E+01 | 6.04E-60 | 7.59E-36 | 2.54E-37 | 2.08E-46 | 1.36E-36 |
| F2 | 2000 | 20 | 6.42E-07 | 3.02E-06 | 1.51E-06 | 1.46E-06 | 5.35E-07 | -3.80E+04 | 0.00E+00 | -3.76E+04 | -3.80E+04 | 2.12E+03 | 1.35E-43 | 5.20E-28 | 1.74E-29 | 1.21E-34 | 9.33E-29 |
| F2 | 3000 | 30 | 2.02E-06 | 5.89E-06 | 3.42E-06 | 3.10E-06 | 1.18E-06 | -8.70E+04 | 0.00E+00 | -8.52E+04 | -8.70E+04 | 9.24E+03 | 1.92E-38 | 2.96E-26 | 1.47E-27 | 4.29E-29 | 5.39E-27 |
| F2 | 5000 | 50 | 1.24E-05 | 2.55E-05 | 1.77E-05 | 1.63E-05 | 3.50E-06 | -2.45E+05 | 0.00E+00 | -2.42E+05 | -2.45E+05 | 1.68E+04 | 2.05E-29 | 6.07E-21 | 2.48E-22 | 8.69E-25 | 1.09E-21 |
| F3 | 1000 | 10 | 7.18E-17 | 1.43E-14 | 5.22E-15 | 4.19E-15 | 3.50E-15 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 8.88E-16 | 8.83E-114 | 1.09E-63 | 5.48E-65 | 8.70E-77 | 2.14E-64 |
| F3 | 2000 | 20 | 8.13E-16 | 8.76E-15 | 3.20E-15 | 2.55E-15 | 1.76E-15 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 8.88E-16 | 5.20E-74 | 1.05E-52 | 3.51E-54 | 2.30E-59 | 1.88E-53 |
| F3 | 3000 | 30 | 9.24E-16 | 4.14E-15 | 2.22E-15 | 2.18E-15 | 7.39E-16 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 8.88E-16 | 6.01E-67 | 1.59E-49 | 1.01E-50 | 9.84E-55 | 3.10E-50 |
| F3 | 5000 | 50 | 6.88E-16 | 2.82E-15 | 1.51E-15 | 1.37E-15 | 5.55E-16 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 2.72E+00 | 8.88E-16 | 1.74E-50 | 3.53E-42 | 1.37E-43 | 1.97E-46 | 6.33E-43 |
| F4 | 1000 | 10 | 3.60E-41 | 1.98E-35 | 2.01E-36 | 4.83E-37 | 3.77E-36 | -2.87E+03 | 0.00E+00 | -2.87E+03 | -2.87E+03 | 0.00E+00 | 4.00E-189 | 1.13E-128 | 3.76E-130 | 9.94E-147 | 2.03E-129 |
| F4 | 2000 | 20 | 5.52E-39 | 1.98E-36 | 1.79E-37 | 4.77E-38 | 3.57E-37 | -2.97E+03 | 0.00E+00 | -2.97E+03 | -2.97E+03 | 9.09E-13 | 9.89E-156 | 3.31E-103 | 1.10E-104 | 1.87E-120 | 5.94E-104 |
| F4 | 3000 | 30 | 4.26E-40 | 4.51E-37 | 9.54E-38 | 3.69E-38 | 1.29E-37 | -4.09E+03 | 0.00E+00 | -4.09E+03 | -4.09E+03 | 2.73E-12 | 1.93E-110 | 3.98E-90 | 1.33E-91 | 3.63E-102 | 7.14E-91 |
| F4 | 5000 | 50 | 5.19E-39 | 8.58E+03 | 3.89E+02 | 1.16E-05 | 1.61E+03 | -5.01E+03 | 0.00E+00 | -5.01E+03 | -5.01E+03 | 0.00E+00 | 5.72E-92 | 2.07E-75 | 6.90E-77 | 1.06E-83 | 3.71E-76 |
| F5 | 1000 | 10 | 5.69E+00 | 4.55E+03 | 7.03E+02 | 2.41E+02 | 1.02E+03 | 5.86E-289 | 2.24E-14 | 7.47E-16 | 5.03E-244 | 4.02E-15 | 1.09E-19 | 6.85E-04 | 2.29E-05 | 7.76E-12 | 1.23E-04 |
| F5 | 2000 | 20 | 1.05E+02 | 1.12E+04 | 2.64E+03 | 1.64E+03 | 2.65E+03 | 9.75E-100 | 8.58E-01 | 2.88E-02 | 1.22E-50 | 1.54E-01 | 7.04E-26 | 1.37E-03 | 7.59E-05 | 3.49E-10 | 2.66E-04 |
| F5 | 3000 | 30 | 3.29E+02 | 3.36E+04 | 5.55E+03 | 3.20E+03 | 6.70E+03 | 6.54E-63 | 6.12E+01 | 2.09E+00 | 7.29E-28 | 1.10E+01 | 1.50E-17 | 2.21E-04 | 7.39E-06 | 1.79E-10 | 3.97E-05 |
| F5 | 5000 | 50 | 2.66E+03 | 3.82E+04 | 1.20E+04 | 9.46E+03 | 8.30E+03 | 2.01E-43 | 1.17E+03 | 4.17E+01 | 9.63E-19 | 2.11E+02 | 3.77E-12 | 7.16E-07 | 6.23E-08 | 5.01E-09 | 1.53E-07 |
| F6 | 1000 | 10 | 1.40E+01 | 4.79E+01 | 3.15E+01 | 3.20E+01 | 9.01E+00 | 0.00E+00 | 7.00E-19 | 2.33E-20 | 0.00E+00 | 1.26E-19 | 2.75E-07 | 3.68E-04 | 8.24E-05 | 3.57E-05 | 1.06E-04 |
| F6 | 2000 | 20 | 3.31E+01 | 5.38E+01 | 4.58E+01 | 4.63E+01 | 4.44E+00 | 8.56E-170 | 2.85E+01 | 9.50E-01 | 8.52E-80 | 5.12E+00 | 6.06E-03 | 1.42E-01 | 5.83E-02 | 6.25E-02 | 3.55E-02 |
| F6 | 3000 | 30 | 3.12E+01 | 6.47E+01 | 5.22E+01 | 5.39E+01 | 7.59E+00 | 1.17E-122 | 1.87E+02 | 6.24E+00 | 3.24E-55 | 3.35E+01 | 2.70E-01 | 5.20E+00 | 1.37E+00 | 1.02E+00 | 1.01E+00 |
| F6 | 5000 | 50 | 5.00E+01 | 7.06E+01 | 6.08E+01 | 6.19E+01 | 5.39E+00 | 1.65E-90 | 1.58E+03 | 5.31E+01 | 1.03E-42 | 2.84E+02 | 6.24E+00 | 2.25E+01 | 1.25E+01 | 1.18E+01 | 4.16E+00 |
| F7 | 1000 | 10 | 4.60E+03 | 7.34E+04 | 3.04E+04 | 2.65E+04 | 1.84E+04 | 2.93E-288 | 1.12E-13 | 3.73E-15 | 2.51E-243 | 2.01E-14 | 4.95E-01 | 1.00E+00 | 6.75E-01 | 6.67E-01 | 6.85E-02 |
| F7 | 2000 | 20 | 4.15E+04 | 4.83E+05 | 2.67E+05 | 2.49E+05 | 1.18E+05 | 4.88E-99 | 4.29E+00 | 1.44E-01 | 6.08E-50 | 7.70E-01 | 6.67E-01 | 7.05E-01 | 6.75E-01 | 6.67E-01 | 1.45E-02 |
| F7 | 3000 | 30 | 2.96E+05 | 1.87E+06 | 8.43E+05 | 7.46E+05 | 3.43E+05 | 3.46E-62 | 3.06E+02 | 1.05E+01 | 3.65E-27 | 5.49E+01 | 6.67E-01 | 7.00E-01 | 6.80E-01 | 6.67E-01 | 1.62E-02 |
| F7 | 5000 | 50 | 1.29E+06 | 4.95E+06 | 3.25E+06 | 3.33E+06 | 8.40E+05 | 1.00E-42 | 5.86E+03 | 2.09E+02 | 4.81E-18 | 1.05E+03 | 6.67E-01 | 1.03E+00 | 7.21E-01 | 6.67E-01 | 1.13E-01 |
| F8 | 1000 | 10 | 1.84E-15 | 9.64E+05 | 1.74E+05 | 3.74E+04 | 2.59E+05 | 4.68E-171 | 5.06E-06 | 1.69E-07 | 3.51E-167 | 9.09E-07 | 7.60E-93 | 8.40E-64 | 3.05E-65 | 4.78E-73 | 1.51E-64 |
| F8 | 2000 | 20 | 3.66E+05 | 9.41E+07 | 1.98E+07 | 1.41E+07 | 1.68E+07 | 0.00E+00 | 1.86E-04 | 6.19E-06 | 0.00E+00 | 3.33E-05 | 7.68E-76 | 1.05E-50 | 4.42E-52 | 1.88E-59 | 1.92E-51 |
| F8 | 3000 | 30 | 1.43E+07 | 6.74E+08 | 2.16E+08 | 1.73E+08 | 1.55E+08 | 0.00E+00 | 1.08E-04 | 3.61E-06 | 0.00E+00 | 1.94E-05 | 1.25E-57 | 8.53E-40 | 2.85E-41 | 1.65E-49 | 1.53E-40 |
| F8 | 5000 | 50 | 2.93E+08 | 3.57E+09 | 1.47E+09 | 1.30E+09 | 6.73E+08 | 0.00E+00 | 1.92E-04 | 6.40E-06 | 0.00E+00 | 3.44E-05 | 4.50E-46 | 3.20E-34 | 1.15E-35 | 5.16E-40 | 5.73E-35 |
| F9 | 1000 | 10 | 9.89E-07 | 5.35E-06 | 3.23E-06 | 2.82E-06 | 1.14E-06 | 6.67E-01 | 1.40E+00 | 6.91E-01 | 6.67E-01 | 1.32E-01 | 3.02E-59 | 3.80E-35 | 1.27E-36 | 1.04E-45 | 6.81E-36 |
| F9 | 2000 | 20 | 3.21E-06 | 1.51E-05 | 7.55E-06 | 7.30E-06 | 2.68E-06 | 6.67E-01 | 1.04E+01 | 1.01E+00 | 6.95E-01 | 1.75E+00 | 6.77E-43 | 2.60E-27 | 8.72E-29 | 6.07E-34 | 4.67E-28 |
| F9 | 3000 | 30 | 1.01E-05 | 2.94E-05 | 1.71E-05 | 1.55E-05 | 5.89E-06 | 6.76E-01 | 2.94E+02 | 1.06E+01 | 6.76E-01 | 5.27E+01 | 9.60E-38 | 1.48E-25 | 7.34E-27 | 2.14E-28 | 2.69E-26 |
| F9 | 5000 | 50 | 6.19E-05 | 1.27E-04 | 8.85E-05 | 8.17E-05 | 1.75E-05 | 6.76E-01 | 7.28E+02 | 2.54E+01 | 6.76E-01 | 1.31E+02 | 1.03E-28 | 3.04E-20 | 1.24E-21 | 4.34E-24 | 5.45E-21 |
| F10 | 1000 | 10 | 1.09E-04 | 1.10E-03 | 5.04E-04 | 4.05E-04 | 2.39E-04 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 2.91E-11 | 3.38E-74 | 2.06E-55 | 1.16E-56 | 1.98E-62 | 4.25E-56 |
| F10 | 2000 | 20 | 2.86E-04 | 1.86E-03 | 8.59E-04 | 8.23E-04 | 3.20E-04 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 2.91E-11 | 6.21E-83 | 1.50E-67 | 5.04E-69 | 3.23E-73 | 2.69E-68 |
| F10 | 3000 | 30 | 5.51E-04 | 2.00E-03 | 1.11E-03 | 1.02E-03 | 3.50E-04 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 2.91E-11 | 2.71E-84 | 6.81E-70 | 2.36E-71 | 1.30E-79 | 1.22E-70 |
| F10 | 5000 | 50 | 6.28E-04 | 1.89E-03 | 1.16E-03 | 1.13E-03 | 3.35E-04 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 1.00E+05 | 2.91E-11 | 8.99E-97 | 4.03E-76 | 1.61E-77 | 9.40E-84 | 7.25E-77 |
| F11 | 1000 | 10 | 3.24E-03 | 7.46E-02 | 2.84E-02 | 2.48E-02 | 1.62E-02 | 9.84E-134 | 6.00E-01 | 3.00E-02 | 3.05E-91 | 1.13E-01 | 5.07E-04 | 4.16E-02 | 9.39E-03 | 6.84E-03 | 7.62E-03 |
| F11 | 2000 | 20 | 2.86E-02 | 1.69E-01 | 7.70E-02 | 6.59E-02 | 3.50E-02 | 1.37E-101 | 1.30E+00 | 5.33E-02 | 1.78E-49 | 2.35E-01 | 5.41E-03 | 7.49E-02 | 2.06E-02 | 1.69E-02 | 1.33E-02 |
| F11 | 3000 | 30 | 5.14E-02 | 2.62E-01 | 1.70E-01 | 1.70E-01 | 4.55E-02 | 1.20E-72 | 3.60E+00 | 1.77E-01 | 5.19E-34 | 6.64E-01 | 8.72E-03 | 8.97E-02 | 3.11E-02 | 2.79E-02 | 1.80E-02 |
| F11 | 5000 | 50 | 2.02E-01 | 7.53E-01 | 4.33E-01 | 4.20E-01 | 1.24E-01 | 1.83E-50 | 5.30E+00 | 3.40E-01 | 4.43E-23 | 1.09E+00 | 1.26E-02 | 1.31E-01 | 4.22E-02 | 3.66E-02 | 2.27E-02 |
| F12 | 1000 | 10 | 2.11E+02 | 2.27E+06 | 2.54E+05 | 5.89E+03 | 4.94E+05 | 0.00E+00 | 4.98E-12 | 1.66E-13 | 2.08E-269 | 8.94E-13 | 2.97E-24 | 1.52E-16 | 1.22E-17 | 1.46E-20 | 3.75E-17 |
| F12 | 2000 | 20 | 5.43E+06 | 1.10E+22 | 6.30E+20 | 1.15E+18 | 2.20E+21 | 6.43E-172 | 3.41E+00 | 1.14E-01 | 9.49E-82 | 6.12E-01 | 3.86E-20 | 1.19E-11 | 1.34E-12 | 1.15E-13 | 2.69E-12 |
| F12 | 3000 | 30 | 1.44E+03 | 3.46E+36 | 1.51E+35 | 1.61E+29 | 6.42E+35 | 9.83E-120 | 6.86E+00 | 2.31E-01 | 5.38E-53 | 1.23E+00 | 4.94E-15 | 5.88E-09 | 4.99E-10 | 2.01E-11 | 1.20E-09 |
| F12 | 5000 | 50 | 2.44E+03 | 3.98E+64 | 1.44E+63 | 3.93E+56 | 7.15E+63 | 7.91E-85 | 9.48E+01 | 3.18E+00 | 1.46E-33 | 1.70E+01 | 6.00E-17 | 1.11E-06 | 9.12E-08 | 6.06E-09 | 2.39E-07 |
| F13 | 1000 | 10 | 2.00E-07 | 9.62E-07 | 6.65E-07 | 6.85E-07 | 1.83E-07 | 2.14E-55 | 7.54E-01 | 2.52E-02 | 1.68E-31 | 1.35E-01 | 1.07E-60 | 4.29E-38 | 7.69E-39 | 1.86E-44 | 1.44E-39 |
| F13 | 2000 | 20 | 8.07E-07 | 2.16E-06 | 1.45E-06 | 1.47E-06 | 4.14E-07 | 1.67E-24 | 4.50E+00 | 1.79E-01 | 5.47E-13 | 8.14E-01 | 1.58E-42 | 3.24E-29 | 1.71E-30 | 2.07E-34 | 6.21E-30 |
| F13 | 3000 | 30 | 1.93E-06 | 7.65E-06 | 3.46E-06 | 3.00E-06 | 1.35E-06 | 3.55E-21 | 1.18E+01 | 4.58E-01 | 2.77E-10 | 2.12E+00 | 6.26E-39 | 6.28E-26 | 6.11E-27 | 2.65E-29 | 1.32E-26 |
| F13 | 5000 | 50 | 9.19E-06 | 1.10E+02 | 9.84E+00 | 1.33E-01 | 2.27E+01 | 7.07E-19 | 1.34E+01 | 6.16E-01 | 1.43E-08 | 2.48E+00 | 2.59E-30 | 7.41E-22 | 7.10E-23 | 4.26E-24 | 1.74E-22 |
| F14 | 1000 | 10 | 1.46E-07 | 5.68E-07 | 3.11E-07 | 2.88E-07 | 1.02E-07 | 5.25E-143 | 4.16E-05 | 1.39E-06 | 3.54E-96 | 7.46E-06 | 1.41E-34 | 6.80E-25 | 3.07E-26 | 1.49E-29 | 1.26E-25 |
| F14 | 2000 | 20 | 1.79E-07 | 1.84E+00 | 6.15E-02 | 2.63E-07 | 3.31E-01 | 2.61E-45 | 3.71E+00 | 1.33E-01 | 2.76E-22 | 6.67E-01 | 3.08E-21 | 2.66E-17 | 4.68E-18 | 1.06E-18 | 6.80E-18 |
| F14 | 3000 | 30 | 1.02E+00 | 6.23E+03 | 3.31E+02 | 2.21E+01 | 1.21E+03 | 2.43E-29 | 6.11E+00 | 2.35E-01 | 7.56E-14 | 1.10E+00 | 6.60E-20 | 7.14E-11 | 9.76E-12 | 1.97E-12 | 1.71E-11 |
| F14 | 5000 | 50 | 1.79E+02 | 1.10E+06 | 7.82E+04 | 3.13E+02 | 2.70E+05 | 4.46E-20 | 3.36E+01 | 1.43E+00 | 2.32E-09 | 6.13E+00 | 9.34E-10 | 9.25E-05 | 2.32E-05 | 1.36E-05 | 2.21E-05 |
| F15 | 1000 | 10 | 2.11E+02 | 2.27E+06 | 2.54E+05 | 5.89E+03 | 4.94E+05 | 0.00E+00 | 8.79E-51 | 2.93E-52 | 0.00E+00 | 1.58E-51 | 2.97E-24 | 1.52E-16 | 1.22E-17 | 1.46E-20 | 3.75E-17 |
| F15 | 2000 | 20 | 5.43E+06 | 1.10E+22 | 6.30E+20 | 1.15E+18 | 2.20E+21 | 4.26E-321 | 2.71E-04 | 9.02E-06 | 5.35E-150 | 4.86E-05 | 3.86E-20 | 1.19E-11 | 1.34E-12 | 1.15E-13 | 2.69E-12 |
| F15 | 3000 | 30 | 1.44E+03 | 3.46E+36 | 1.51E+35 | 1.61E+29 | 6.42E+35 | 9.49E-257 | 9.56E-04 | 3.19E-05 | 6.11E-118 | 1.72E-04 | 4.94E-15 | 5.88E-09 | 4.99E-10 | 2.01E-11 | 1.20E-09 |
| F15 | 5000 | 50 | 2.44E+03 | 3.98E+64 | 1.44E+63 | 3.93E+56 | 7.15E+63 | 4.24E-215 | 6.87E+00 | 2.29E-01 | 2.75E-107 | 1.23E+00 | 6.00E-17 | 1.11E-06 | 9.12E-08 | 6.06E-09 | 2.39E-07 |
Result of Student's t-test for all approaches

| F# | IT | PSO vs MCBA: T-value | Sig. | BA vs MCBA: T-value | Sig. |
|---|---|---|---|---|---|
| F1 | 1000 | +0.73 | MCBA | +73.81 | MCBA |
| F1 | 2000 | +0.77 | MCBA | +158.43 | MCBA |
| F1 | 3000 | +0.84 | MCBA | +180.63 | MCBA |
| F1 | 5000 | +0.86 | MCBA | +6132.85 | MCBA |
| F2 | 1000 | +0.6 | MCBA | +33.03 | MCBA |
| F2 | 2000 | +0.71 | MCBA | +33.14 | MCBA |
| F2 | 3000 | +0.7 | MCBA | +33.13 | MCBA |
| F2 | 5000 | +0.54 | MCBA | +32.97 | MCBA |
| F3 | 1000 | +181.56 | MCBA | +213.99 | MCBA |
| F3 | 2000 | +681.64 | MCBA | +714.07 | MCBA |
| F3 | 3000 | +349.41 | MCBA | +381.84 | MCBA |
| F3 | 5000 | +2370.81 | MCBA | +2403.24 | MCBA |
| F4 | 1000 | +151.22 | MCBA | +183.65 | MCBA |
| F4 | 2000 | +343.77 | MCBA | +376.2 | MCBA |
| F4 | 3000 | +99.55 | MCBA | +131.98 | MCBA |
| F4 | 5000 | +53.22 | MCBA | +85.65 | MCBA |
| F5 | 1000 | +40.69 | MCBA | +73.12 | MCBA |
| F5 | 2000 | −7.31 | PSO | +39.74 | MCBA |
| F5 | 3000 | +85.14 | MCBA | +117.57 | MCBA |
| F5 | 5000 | +2.89 | MCBA | +35.32 | MCBA |
| F6 | 1000 | −7.95 | PSO | +40.38 | MCBA |
| F6 | 2000 | +34.26 | MCBA | +66.69 | MCBA |
| F6 | 3000 | +56.8 | MCBA | +89.23 | MCBA |
| F6 | 5000 | +111.06 | MCBA | +143.49 | MCBA |
| F7 | 1000 | −0.6 | MCBA | +33.03 | MCBA |
| F7 | 2000 | +0.71 | PSO | +33.14 | MCBA |
| F7 | 3000 | +0.7 | MCBA | +33.13 | MCBA |
| F7 | 5000 | +0.54 | MCBA | +32.97 | MCBA |
| F8 | 1000 | +4.44 | MCBA | +36.87 | MCBA |
| F8 | 2000 | +10.48 | MCBA | +42.91 | MCBA |
| F8 | 3000 | +16.05 | MCBA | +48.48 | MCBA |
| F8 | 5000 | +28.66 | MCBA | +61.09 | MCBA |
| F9 | 1000 | +1.06 | MCBA | +33.49 | MCBA |
| F9 | 2000 | +2.86 | MCBA | +35.29 | MCBA |
| F9 | 3000 | +10.15 | MCBA | +42.58 | MCBA |
| F9 | 5000 | +21.47 | MCBA | +53.9 | MCBA |
| F10 | 1000 | +0.68 | MCBA | +33.11 | MCBA |
| F10 | 2000 | +1.41 | MCBA | +33.84 | MCBA |
| F10 | 3000 | +0.42 | MCBA | +32.01 | MCBA |
| F10 | 5000 | +2590.75 | MCBA | +38.49 | MCBA |
| F11 | 1000 | +9.26 | MCBA | +40.2 | MCBA |
| F11 | 2000 | +9.3 | MCBA | +73.07 | MCBA |
| F11 | 3000 | +9.37 | MCBA | +41.8 | MCBA |
| F11 | 5000 | +9.39 | MCBA | +138.99 | MCBA |
| F12 | 1000 | +9.13 | MCBA | +173.36 | MCBA |
| F12 | 2000 | +9.24 | MCBA | +160.4 | MCBA |
| F12 | 3000 | +9.23 | MCBA | +65.93 | MCBA |
| F12 | 5000 | +9.07 | MCBA | +41.5 | MCBA |
| F13 | 1000 | +190.09 | MCBA | +222.52 | MCBA |
| F13 | 2000 | +690.17 | MCBA | +722.6 | MCBA |
| F13 | 3000 | +16.24 | MCBA | +48.67 | MCBA |
| F13 | 5000 | +16.23 | MCBA | +48.66 | MCBA |
| F14 | 1000 | +16.07 | MCBA | +48.5 | MCBA |
| F14 | 2000 | +19.97 | MCBA | +52.42 | MCBA |
| F14 | 3000 | +26.01 | MCBA | +5078.81 | MCBA |
| F14 | 5000 | +31.58 | MCBA | +64.01 | MCBA |
| F15 | 1000 | −44.19 | PSO | +76.62 | MCBA |
| F15 | 2000 | +16.59 | MCBA | +22080.69 | MCBA |
| F15 | 3000 | +18.39 | MCBA | +6824.01 | MCBA |
| F15 | 5000 | +25.68 | MCBA | +134.02428 | MCBA |
Discussion
In this section, the achieved results are compared and discussed thoroughly against the targeted criteria elaborated in Section 5.
The Impact of Increasing the Number of Dimensions on the Performance
The core objective of this section is to review the behavior of the tested optimization approaches in high dimensionality, regarding the accuracy and reliability of the achieved solutions when solving complex and computationally expensive optimization problems. The value achieved in each iteration is used as the performance measure. The exploitation ability of traditional BA turns out to be moderately low, particularly for high-dimensional problems. The results also disclose that BA and PSO perform effectively only when tackling expensive design problems of low dimensionality. Besides this, MCBA handles high-dimensionality problems better than the other methods, in spite of the complexity and topology of the examined problems, and the results demonstrate that MCBA outperforms them on higher-dimensional problems. In summary, dimensionality strongly influences the behavior of most algorithms; however, MCBA is observed to be more consistent as the dimension of the problem increases. This consistency shows that the MCBA algorithm has greater exploration capability.
Performance Evaluation of the Proposed Algorithm Against the Standard Bat Algorithm and Particle Swarm Optimization
To evaluate the effectiveness of the proposed MCBA algorithm, it is applied to minimize 15 benchmark functions with different characteristics, including formulations, bounds of the search space, and global minimum values, as seen in Tab. 1. These benchmark functions are widely employed to test the performance of global optimization approaches. To verify the efficiency of MCBA exhaustively, its effectiveness is compared with traditional BA and PSO. The stopping criterion for all algorithms is reaching the maximum number of iterations. For each benchmark function, every algorithm is run 30 times independently.
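The experimental protocol above can be sketched in a few lines of Python. The sphere function stands in for one entry of Tab. 1 (the paper's actual function set is defined earlier), and `summarise` produces the Min/Mean/SD statistics reported per 30 runs; both names are illustrative, not from the paper.

```python
def sphere(x):
    """Sphere benchmark: f(x) = sum(x_i^2), global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def summarise(best_values):
    """Min / Mean / sample SD of the best objective values from repeated runs."""
    n = len(best_values)
    m = sum(best_values) / n
    sd = (sum((v - m) ** 2 for v in best_values) / (n - 1)) ** 0.5
    return min(best_values), m, sd

# Each of the 30 independent runs contributes its best value; the summary
# then corresponds to one (Min, Mean, SD) row of Tab. 4.
best = [sphere([0.1 * r] * 10) for r in range(1, 31)]  # placeholder "runs"
print(summarise(best))
```

The same summary is computed per algorithm and per dimension before the statistical comparison is carried out.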
MCBA is compared with standard BA and PSO in terms of the minimum value (Min), the mean value (Mean), and the standard deviation (SD) of the solutions obtained in 30 independent executions of each algorithm. The comparative results of BA, PSO, and the proposed MCBA are demonstrated in Tab. 4. Note that the mean and the standard deviation of the solutions indicate the solution quality and the stability of the algorithms, respectively. For statistical comparison, a widely known statistical hypothesis test, the t-test, is applied to compare MCBA with the other algorithms at the α = 0.05 significance level in Tab. 5.
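The paper does not state which t-test variant was used; a common choice for comparing two sets of 30 run results without assuming equal variances is Welch's two-sample t statistic, sketched here on synthetic data (the run values are invented for illustration).

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic; positive when sample_b has the lower
    mean, i.e. its minimization results are better."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)   # sample variances
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(va / na + vb / nb)

# Hypothetical best-fitness values from 30 independent runs of two minimizers.
ba_runs   = [1.0 + 0.01 * i for i in range(30)]   # competitor
mcba_runs = [0.5 + 0.01 * i for i in range(30)]   # uniformly 0.5 lower
t = welch_t(ba_runs, mcba_runs)
print(round(t, 2))  # → 22.0, a large positive t: the second sample is better
```

A positive t at the chosen significance level corresponds to the "+" entries in Tab. 5.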
The sign “+” indicates that MCBA is better than its competitor; the results also confirm the superior convergence speed of MCBA over its contenders. From Tab. 5, it can be seen that MCBA is considerably better than BA and PSO on all the test functions except function f11. The solution accuracy obtained by MCBA for function f2 is worse than that of PSO but better than that of BA. In addition, the performance of BA and PSO decreases as the dimensionality increases, whereas MCBA does not suffer from this defect. In concluding remarks, MCBA shows strong convergence on nearly all the benchmark test functions, and the comprehensive experimental results show that its performance is remarkably better than that of BA and PSO.
Data Classification
For further verification of the performance of the proposed algorithm, BPA, PSONN, BANN, and MCBA-NN are compared on real-world benchmark data sets for the training of an artificial neural network. We have performed experiments using eight benchmark data sets (Iris, Cancer, Diabetes, Mammography, Balance Scale, Heart, Horse, Seed) taken from the well-known UCI machine-learning repository. Training weights are initialized within the interval [−50, 50]. The accuracy of the feed-forward artificial neural networks is measured in terms of the Root Mean Squared Error (RMSE). Tab. 6 shows the characteristics of the data sets used. A multi-layer feed-forward artificial neural network is trained with the backpropagation algorithm, standard PSO, standard BA, and MCBA-NN, and these training approaches are compared on real classification data sets taken from the UCI repository. The k-fold cross-validation method, with k = 10, is used to compare the classification performance of the different training techniques. The data set is divided into 10 chunks, where each chunk contains the same proportion of each class; one chunk is used for testing while the remaining nine are used for training. The experimental results of backpropagation, standard PSO, standard BA, and the proposed MCBA-NN are compared with each other on the eight data sets, and their performances are evaluated in Tab. 7. The simulation results show that training an artificial neural network with the MCBA algorithm outperforms the other traditional approaches and provides better classification accuracy.
The MCBA algorithm may also be used effectively for data classification and statistical problems in the future. Tab. 7 presents the accuracies for the eight data sets.
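The stratified 10-fold split described above (each chunk preserving the class proportions) can be sketched as follows; the function name and the Iris-like label list are illustrative, not from the paper.

```python
import random
from collections import defaultdict

def stratified_k_folds(labels, k=10, seed=0):
    """Split sample indices into k chunks, each containing the same
    proportion of every class as the full data set."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)           # randomize within each class
        for j, i in enumerate(idxs):
            folds[j % k].append(i)  # deal indices round-robin over the folds
    return folds

# 10-fold CV: one chunk tests, the other nine train (rotated over all folds).
labels = ["setosa"] * 50 + ["versicolor"] * 50 + ["virginica"] * 50  # Iris-like
folds = stratified_k_folds(labels, k=10)
test_idx = folds[0]
train_idx = [i for fold in folds[1:] for i in fold]
```

Rotating the held-out chunk over all ten folds and averaging the accuracies yields the figures of the kind reported in Tab. 7.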
Characteristics of the UCI benchmark data sets

S. No  Data set       Instances  Nature  No. of inputs  No. of classes
1      Iris           150        Real    4              3
2      Cancer         699        Real    19             2
3      Diabetes       768        Real    8              2
4      Mammography    961        Real    9              2
5      Balance scale  625        Real    4              3
6      Heart          270        Real    13             2
7      Horse          368        Real    27             2
8      Seed           210        Real    7              3
To establish the statistical significance of the classification accuracy, the software RStudio version 1.2.5001 is used. Since there are more than two independent sample groups, the one-way Analysis of Variance (ANOVA) test is adopted. The one-way ANOVA test only indicates whether or not there is a mean difference among the distinct groups; hence, to elaborate on the mean differences among all combinations of groups, post hoc Tukey's multiple comparison test is applied after the one-way ANOVA test at the α = 0.05 level of significance, which gives a 95% confidence interval. Tab. 8 presents the outcomes of the one-way ANOVA test conducted on the testing accuracy of the classification data sets. The p-value in Tab. 8 is 0.02398 < 0.05, which gives evidence that there is a significant mean difference among the approaches. Fig. 2 displays the graph of the ANOVA test, in which it can be observed that MCBA-NN performs best among the approaches. For the group-wise comparison, the graph of Tukey's test is displayed in Fig. 3, where all combinations of groups are examined and plotted. According to the generated graph, there is a significant mean difference between MCBA-NN and BPA, and the combination of BPA and BANN is close to statistical significance at the 95% confidence interval.
10-fold classification accuracy (%) of the ANN training methods on the 8 data sets

                      BPA            PSONN          BANN           MCBA-NN
S. No  Data set       Tr.    Ts.     Tr.    Ts.     Tr.    Ts.     Tr.    Ts.
1      Iris           98     80      99     97      98.9   97      99     98.5
2      Cancer         70     54      76     67      90     85      91     88
3      Diabetes       86     65      88.7   69.1    88     73      90     90
4      Mammography    77     71      90     82      88     86      92     88
5      Balance scale  70     68      74     73      85     74      80     78
6      Heart          79     68      99     68      99     72      90     76
7      Horse          64     58      74     62      72     62      72     62
8      Seed           84     71      87     84      98     86      94     89
One-way ANOVA results of MCBA-NN

Parameter         Relation      Sum of squares  df  Mean square  F       Significance
Testing accuracy  Among groups  1231.5          3   410.50       3.6681  0.02398
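The ANOVA figures in Tab. 8 can be reproduced directly from the testing-accuracy (Ts. Ac) columns of Tab. 7; the following plain-Python sketch (variable names are illustrative) recovers the reported F statistic and between-group sum of squares.

```python
def one_way_anova_f(groups):
    """One-way ANOVA: returns (F statistic, SS between groups, df between)."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g)
    df_b = len(groups) - 1
    df_w = len(all_vals) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w), ss_between, df_b

# Testing accuracies of the four training methods over the eight data sets (Tab. 7).
bpa     = [80, 54, 65, 71, 68, 68, 58, 71]
psonn   = [97, 67, 69.1, 82, 73, 68, 62, 84]
bann    = [97, 85, 73, 86, 74, 72, 62, 86]
mcba_nn = [98.5, 88, 90, 88, 78, 76, 62, 89]

f_stat, ssb, dfb = one_way_anova_f([bpa, psonn, bann, mcba_nn])
print(round(f_stat, 4), round(ssb, 1), dfb)  # → 3.6681 1231.5 3
```

The computed F of about 3.6681 with 3 between-group degrees of freedom matches the entries in Tab. 8.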
One-way ANOVA test graph of MCBA-NN
Multi-comparison post hoc Tukey test graph on all versions of ANN and BPA
Conclusion
The use of meta-heuristic algorithmic techniques in ANN and ML is gaining popularity and is expected to play an important part in the improvement of current and future systems. This paper presents a new local search technique based on the Torus walk. For experimental validation, a comprehensive set of benchmark test functions is utilized. The simulation results reveal that the proposed MCBA technique maintains the diversity of the swarm, improves the convergence speed, and locates better regions of the search space. The proposed MCBA approach sustains higher diversity and enhances the local search ability. The experimental results depict that MCBA has superior convergence accuracy and avoids local optima more effectively. The proposed MCBA technique performs much better when compared with traditional PSO and standard BA, and also against seven state-of-the-art algorithms (BA, dBA, PSO, CS, HS, DE, GA). Applying the core idea of this research to other stochastic meta-heuristic algorithms constitutes the future direction of our work.
The authors would like to thank the editors of CMC and the anonymous reviewers for their time spent reviewing this manuscript. The authors also sincerely appreciate Professor Dr. Yong-Jin Park (IEEE Life Member and former Director, IEEE Region 10) for his valuable comments and suggestions.
Funding Statement: The APC was funded by PPPI, University Malaysia Sabah, KK, Sabah, Malaysia, https://www.ums.edu.my.
Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.