2017 8th International Conference on Information Technology (ICIT)
Hybridizing cuckoo search algorithm with hill climbing for numerical optimization problems
Mohammad Shehab1, Ahamad Tajudin Khader2, Laith Mohammad Abualigah4
School of Computer Sciences, Universiti Sains Malaysia, Penang, Malaysia
moh.shehab12@gmail.com

Mohammed Azmi Al-Betar3
Department of Information Technology, Al-Huson University College, Al-Balqa Applied University, P.O. Box 50, Al-Huson, Irbid, Jordan
Abstract— The Cuckoo Search Algorithm (CSA) is a promising metaheuristic algorithm that has been applied to solve many problems in different fields. This paper proposes a new cuckoo search algorithm that combines the cuckoo search algorithm with the Hill Climbing method for solving integer and minimax optimization problems. The proposed algorithm is named hybrid cuckoo search and hill climbing (CSAHC). CSAHC starts the search by applying the standard cuckoo search for a number of iterations; the best-obtained solution is then passed to the hill climbing algorithm as an intensification process to accelerate the search and overcome the slow convergence of the standard cuckoo search algorithm. The proposed algorithm balances the global exploration of the cuckoo search algorithm against the deep exploitation of the hill climbing method. The performance is validated on 13 benchmark functions. The experimental results indicate that CSAHC performs better than the standard CSA.
Keywords: Metaheuristic algorithms, Cuckoo search algorithm, Hill climbing, exploration and exploitation.
I. INTRODUCTION
Optimization resides in many domains, such as engineering, energy, economics, medicine, and computer science [1]. It is mainly concerned with finding the optimal values of several decision variables that form a solution to an optimization problem. This solution is considered optimal when the decision maker is satisfied with it [2]. An optimization problem is the minimization or maximization of a suitable objective function, normally tackled with approximation methods. The principle of decision making entails choosing between several alternatives; the result of this choice is the selection of the best decision from all choices.
Optimization algorithms developed from nature-inspired ideas deal with selecting the best alternative in the sense of a given objective function. An optimization algorithm can be either a heuristic or a metaheuristic approach. Heuristic approaches are problem-specific: each optimization problem has its own heuristic methods, which are not applicable to other kinds of optimization problems. A metaheuristic-based algorithm, by contrast, is a general solver template that can be adapted to various kinds of optimization problems by properly tweaking its operators and configuring its parameters. Optimization algorithms can be categorized into three classes: evolutionary algorithms (EAs), swarm-based algorithms, and trajectory-based algorithms. Examples of EAs include genetic algorithms (GAs) [3], genetic programming (GP) [4], and differential evolution (DE) [5]. Examples of swarm-based algorithms include artificial bee colony (ABC) [6], particle swarm optimization (PSO) [7], and the cuckoo search algorithm (CSA) [8]. Examples of trajectory-based algorithms include tabu search (TS) [9], simulated annealing (SA) [10], and β-hill climbing [11].
The performance of population-based algorithms is measured by their ability to establish a proper trade-off between exploration and exploitation: an algorithm with a weak balance between exploration and exploitation is more likely to suffer from trapping in local optima, premature convergence, and stagnation [12]. A population-based search algorithm is normally very powerful in exploring several regions of the problem search space. However, it has difficulty in determining the local optima in
each region. By contrast, the deep search of a local search-based algorithm is very efficient in a single region of the search space
978-1-5090-6332-1/17/$31.00 ©2017 IEEE
but not for several search space regions [13]. Thus, sometimes,
it is very beneficial to hybridize a local and a population
search-based method to complement their advantages in a
single optimization framework. Based on the above suggestion
and through hybridization, the search can strike a balance
between the wide range of exploration and nearby exploitation
of the problem search space. In this context, CSA has been
hybridized with other population-based algorithms to improve
its performance in tackling complex optimization problems
[14].
Quadratic assignment problems (QAPs) are considered to be
NP-hard problems, which cannot be easily solved by exact
methods. Therefore, Dejam et al. [15] proposed a hybrid algorithm combining the CSA with TS (i.e., CSA-TS) to solve QAPs. In their research, the QAPs were initially tackled using CSA; TS was then applied, which
focused on the local search to increase the optimization
precision. The experimental results indicated that the proposed
algorithm performs better than ABC and GA.
Ali and Tawhid [16] proposed a hybrid algorithm, which
includes the CSA and Nelder–Mead method (HCSANM), to
solve the integer and minimax optimization problems.
HCSANM starts with the basic CSA for a number of iterations, chooses the best solution achieved, and then passes it to the Nelder–Mead method as an intensification step to accelerate the search and eliminate the slow convergence of the basic CSA. HCSANM thus balances the basic CSA, used for global exploration, against the NM method, utilized for deep exploitation [17]. The validation of the performance was
determined by applying the HCSANM on 7 integer
programming and 10 minimax problems. The results showed
the efficiency of HCSANM and its ability to deal with
minimax optimization problems in a reasonable time.
A hybrid algorithm between the Newton method (NM) and CSA, called CSANM, was applied to the linear least squares problem [18]. The authors benefited from CSA for fast
convergence and global search as well as from NM for the
ability of strong local search. The experimental results showed
the convergence efficiency and computational accuracy of the
CSANM in comparison with the basic CSA and HS based on
NM (HSNM).
In this research, a new hybrid optimization approach is
developed by hybridizing the cuckoo search algorithm with hill
climbing to solve optimization problems. The proposed
approach is evaluated on thirteen benchmark functions.
Experimental results demonstrate that the proposed algorithm
performs better than the basic cuckoo search algorithm.
The structure of this paper is organized as follows. Sections II and III describe the CSA and HC in brief, respectively. The CSAHC approach is presented in Section IV. Subsequently, our method is evaluated on thirteen benchmarks in Section V by comparison with the original CSA. Finally, the conclusion and future work are given in Section VI.
II. CUCKOO SEARCH ALGORITHM
The use of CSA in the optimization context was proposed by
Yang and Deb in 2009 [8]. To date, work on this algorithm has
significantly increased, and the CSA has succeeded in having
its rightful place among other optimization methodologies [19].
This algorithm is based on the obligate brood parasitic
behaviour found in some cuckoo species, in combination with
the Levy flight behaviour discovered in some birds and fruit
flies. The CSA is an efficient swarm-based metaheuristic algorithm that strikes a balance between local nearby exploitation and global-wide exploration in the problem search space [20].
The cuckoo has a specific way of laying its eggs to distinguish
it from the rest of the birds [21]. The following three idealized
rules clarify and describe the standard cuckoo search:
- Each cuckoo lays one egg at a time and dumps it in a randomly chosen nest.
- The best nests with high-quality eggs will be carried over to the next generations.
- The number of available host nests is fixed, and the egg laid by a cuckoo is discovered by the host bird with a probability pa ∈ [0, 1]. In this case, the host bird can either get rid of the egg or simply abandon the nest and build a completely new one. Equivalently, a fraction pa of the n host nests is replaced by new nests.
Figure 1 shows the representation of the CSA search process. Similar to other swarm-based algorithms, the CSA starts with an initial population of n host nests. These initial host nests will be randomly attracted by the cuckoos with eggs and also by random Levy flights to lay the eggs. Thereafter, nest quality will be evaluated and compared with that of another random host nest. If the new nest is better, it replaces the old host nest. This new solution has the egg laid by a cuckoo. If the host bird discovers the egg with a probability pa ∈ [0, 1], the host either throws out the egg or abandons the nest and builds a new one. This step is done by replacing the abandoned solutions with new random solutions.
1: Objective function f(X), X = (x1, x2, ..., xd)^T
2: Generate initial population of n host nests Xi (i = 1, 2, ..., n)
3: While t < Max_iterations do
4:   Get a cuckoo randomly by Levy flights
5:   Evaluate its quality/fitness Fi
6:   Choose a nest among n (say, j) randomly
7:   If Fi > Fj then
8:     Replace j by the new solution
9:   End If
10:  A fraction (pa) of worse nests are abandoned and new ones are built
11:  Keep the best solutions
12:  Rank the solutions and find the current best
13: End While
14: Postprocess results and visualization
Fig. 1. Algorithm 1, Pseudo code of the cuckoo search algorithm
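As an illustration, the loop of Algorithm 1 can be sketched in Python. This is a minimal sketch, not the authors' MATLAB implementation: the Levy-step generator (Mantegna's algorithm with λ = 1.5), the search bounds, the step-size factor alpha, and the concrete form of the abandonment step are illustrative assumptions.

```python
import math

import numpy as np


def cuckoo_search(f, dim, n=25, pa=0.25, max_iter=500,
                  lower=-5.0, upper=5.0, alpha=0.01, seed=0):
    """Minimize f over [lower, upper]^dim, following Algorithm 1 (a sketch)."""
    rng = np.random.default_rng(seed)
    lam = 1.5  # Levy exponent; sigma is Mantegna's scale for the step generator
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2) /
             (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)

    def levy(size):
        # Mantegna's algorithm: heavy-tailed steps approximating a Levy flight
        u = rng.normal(0.0, sigma, size)
        v = rng.normal(0.0, 1.0, size)
        return u / np.abs(v) ** (1 / lam)

    # Line 2: generate the initial population of n host nests
    nests = rng.uniform(lower, upper, (n, dim))
    fit = np.array([f(x) for x in nests])
    best = nests[fit.argmin()].copy()
    best_val = fit.min()

    for _ in range(max_iter):
        # Lines 4-5: get a cuckoo randomly by Levy flights and evaluate it
        i = rng.integers(n)
        new = np.clip(nests[i] + alpha * levy(dim) * (nests[i] - best),
                      lower, upper)
        # Lines 6-9: choose a random nest j; replace it if the cuckoo is better
        j = rng.integers(n)
        if f(new) < fit[j]:
            nests[j], fit[j] = new, f(new)
        # Line 10: a fraction pa of worse nests are abandoned and rebuilt
        mask = rng.random((n, dim)) < pa
        step = rng.random((n, 1)) * (nests[rng.permutation(n)]
                                     - nests[rng.permutation(n)])
        nests = np.clip(nests + mask * step, lower, upper)
        fit = np.array([f(x) for x in nests])
        # Lines 11-12: keep and rank the best solution found so far
        if fit.min() < best_val:
            best, best_val = nests[fit.argmin()].copy(), fit.min()
    return best, float(best_val)
```

On a convex test function such as the sphere function, this sketch steadily drives the best objective value toward zero as the iteration budget grows.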
Yang and Deb used a simple representation for the implementation, in which each egg represents a solution. As each cuckoo lays only one egg, a cuckoo also represents one solution. The purpose is to increase the diversity of new, and probably better, cuckoos (solutions) and to have them replace the worst solutions. By contrast, the CSA can be made more complicated by using multiple eggs in each nest to represent a set of solutions.
Like the bat algorithm [22] and the firefly algorithm (FA) [23], the CSA strikes a balance between exploration and exploitation. The CSA can be viewed as the combination of a local random walk and a global explorative random walk. The local random walk can be written as

x_i^(t+1) = x_i^t + α s ⊗ H(pa − ε) ⊗ (x_j^t − x_k^t),    (1)

where x_j^t and x_k^t are two different solutions selected randomly by random permutation, ⊗ denotes entry-wise multiplication, H(u) is a Heaviside function, ε is a random number drawn from a uniform distribution between 0 and 1, and s is the step size. Furthermore, the global explorative random walk using Levy flights can be expressed as follows:

x_i^(t+1) = x_i^t + α L(s, λ),    (2)

L(s, λ) = (λ Γ(λ) sin(πλ/2) / π) · (1 / s^(1+λ)),  s ≫ s0 > 0,    (3)

where Γ is the gamma function, L is the characteristic scale of the problem of interest, and α > 0 is a step-size scaling factor. The value can be calculated by α = O(L/10); for more effectiveness, α = O(L/100) can be used. In addition, x_i^t in the above equations represents the current location, which is the only basis for determining the next location x_i^(t+1); this makes the walk a Markov chain. The second term, α L(s, λ), represents the transition probability. However, to avoid premature convergence and increase diversity (so that the search is not confined to a local optimum), the new solutions should be generated at a sufficiently remote distance from the current best solution, and some random elements should be included.
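The local random walk of Eq. (1) can be sketched for a whole population at once. This is a sketch under stated assumptions: the step size s and the random number ε are drawn uniformly per coordinate, the scaling α is absorbed into s, and the array shapes are illustrative.

```python
import numpy as np


def biased_random_walk(nests, pa=0.25, rng=None):
    """One application of Eq. (1) to a population of nests (rows):
    x_i^(t+1) = x_i^t + s * H(pa - eps) * (x_j^t - x_k^t)."""
    if rng is None:
        rng = np.random.default_rng()
    n, dim = nests.shape
    xj = nests[rng.permutation(n)]            # x_j^t: nests under a random permutation
    xk = nests[rng.permutation(n)]            # x_k^t: a second, independent permutation
    eps = rng.random((n, dim))                # uniform random numbers in [0, 1)
    heaviside = (pa - eps > 0).astype(float)  # H(pa - eps): keep a coordinate's move w.p. pa
    s = rng.random((n, dim))                  # step sizes
    return nests + s * heaviside * (xj - xk)
```

With pa = 0 the Heaviside mask is all zeros and the population is unchanged, which matches the role of pa as the fraction of nests that are perturbed.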
III. HILL CLIMBING
Hill Climbing (HC) is a mathematical optimization technique which belongs to the family of local search methods [24]. It searches for a better solution in the neighborhood by evaluating the current state. If the current state is the goal state, it is returned and the search stops. Otherwise, the current state keeps being updated while this is possible: the algorithm loops until a solution is found or until there are no new operators left to be applied to the current state. Inside the loop there are two steps. The first step is to select an operator that has not yet been applied to the current state and apply it to produce a new state. The second step is to evaluate the new state. Algorithm 2 shows the pseudo-code of the HC algorithm, which illustrates the simplicity of hill climbing. In short, the basic idea of HC is to always head towards a state which is better than the current one, so it monotonically improves the quality of the solution [25].

1: i = initial solution
2: While fitness(s) > fitness(i) for some s ∈ Neighbours(i) do
3:   Generate an s ∈ Neighbours(i)
4:   If fitness(s) > fitness(i) then
5:     Replace i with s
6:   End If
7: End While

Fig. 2. Algorithm 2, Pseudo code of the hill climbing algorithm

IV. THE PROPOSED METHODOLOGY: CSA-HILL CLIMBING
Based on the introduction of CSA and HC in the previous sections, this section provides a detailed description of the proposed cuckoo search algorithm with hill climbing (CSAHC).
CSA is inspired by the "brood parasitic" behaviour of some cuckoo species, in combination with the Levy flight behaviour of certain birds and fruit flies [8]. The Levy flight has been applied for global exploration and has proved its efficiency by achieving good results [26][27]. As a result, the CSA is balanced between exploration and exploitation [28][29]. However, it sometimes exploits solutions poorly and converges slowly. For that reason, the proposed algorithm balances the global exploration of the CSA against the deep exploitation of the HC.
CSAHC starts the search by applying the standard cuckoo search for a number of iterations. The best-obtained solution is then passed to the HC to accelerate the search and overcome the slow convergence of the standard cuckoo search algorithm. As shown in Figure 3, the HC is an iterative algorithm that starts with an arbitrary solution to a problem and subsequently attempts to find a better solution by incrementally changing a single element of the solution. When the change produces a better solution, an incremental change is performed on the new solution, and this is repeated until no further improvements can be found. The solution is then returned to the CSA to be checked through the fraction probability pa.
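The two phases of CSAHC can be sketched as follows. This is a sketch, not the authors' implementation: the global phase below is a plain random-sampling stand-in for the CSA exploration loop (any implementation of Algorithm 1 could be plugged in there), and the single-element Gaussian perturbation in hill_climb, its step size, and the iteration budgets are illustrative assumptions.

```python
import numpy as np


def hill_climb(f, x0, step=0.1, max_iter=500, rng=None):
    """Refine x0 by incrementally changing a single element of the solution
    and keeping only changes that improve the objective (minimization)."""
    if rng is None:
        rng = np.random.default_rng()
    x, fx = x0.copy(), f(x0)
    for _ in range(max_iter):
        cand = x.copy()
        i = rng.integers(len(x))
        cand[i] += rng.normal(0.0, step)   # incremental change of one element
        fc = f(cand)
        if fc < fx:                        # only head towards a better state
            x, fx = cand, fc
    return x, fx


def csahc(f, dim, n=25, cs_iters=200, hc_iters=500,
          lower=-5.0, upper=5.0, seed=0):
    """CSAHC sketch: a global exploration phase, then HC intensification."""
    rng = np.random.default_rng(seed)
    # Global phase: stand-in for the CSA exploration loop (Algorithm 1);
    # here it simply keeps the best of repeatedly sampled host nests.
    best, fbest = None, np.inf
    for _ in range(cs_iters):
        nests = rng.uniform(lower, upper, (n, dim))
        fits = np.array([f(x) for x in nests])
        if fits.min() < fbest:
            best, fbest = nests[fits.argmin()].copy(), float(fits.min())
    # Intensification phase: pass the best-obtained solution to hill climbing.
    return hill_climb(f, best, max_iter=hc_iters, rng=rng)
```

The design point is the hand-off: the global phase only has to land in a promising region, after which the cheap single-coordinate hill climber finishes the convergence that the global phase would reach slowly on its own.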
V. THE EXPERIMENTAL RESULTS ANALYSIS
In this section, the proposed CSAHC was tested through an
array of experiments. For testing purposes, we implemented
the original version of CSA. We compared results of CSAHC
with the original results. This comparison is shown in the
tables within this section.
A. Benchmark Functions
To test the performance of the CSAHC, 13 well-known
benchmark functions are used for comparison. Table I
describes these benchmark functions in terms of the optimum
solution after a predefined number of iterations and the rate of
convergence to the optimum solution. Further information
about all the benchmark functions can be found in
[30][31][32].
B. Experimental results and algorithm settings
The two parameters of the CSA are the number of host nests (population size n) and the probability of discovery pa. In this section, different settings have been tried: n ∈ {5, 10, 15, 20, 50, 100, 150, 250, 500} and pa ∈ {0, 0.01, 0.05, 0.1, 0.15, 0.2, 0.25, 0.4, 0.5}.
Fig. 3. Flowchart of the CSAHC Algorithm
The simulations showed that n = 25 and pa = 0.25 are the best values for most optimisation problems, leaving no need to perform parameter tuning. The tests have been run on 5, 10, 50, and 100 dimensions for a maximum of 100000 function evaluations. All tests have been run 100 times. Table II lists the parameter settings. The mean, standard deviation, and best results for the basic CSA and the hybrid CSAHC are shown in Table III.
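The reporting protocol above (independent runs, then mean, standard deviation, and best of the final objective values) can be sketched as follows; the optimizer(f, dim, seed) call signature is an assumption for illustration.

```python
import numpy as np


def evaluate(optimizer, f, dim, runs=100, seed0=0):
    """Run an optimizer `runs` times with different seeds and report the
    mean, standard deviation, and best (smallest) final objective value,
    as reported per cell in Table III (runs=100 in the paper's setting)."""
    finals = np.array([optimizer(f, dim, seed=seed0 + r)[1]
                       for r in range(runs)])
    return finals.mean(), finals.std(), finals.min()
```

Any optimizer returning a (solution, objective value) pair can be plugged in, so CSA and CSAHC can be compared cell by cell under identical seeds.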
TABLE I. BENCHMARK FUNCTIONS
TABLE II. PARAMETER SETTINGS

Parameter   Value
Runtime     100
Max Cycle   100000
n           25
D           5/10/50/100
pa          0.25
All the experiments are conducted on a computer with an Intel(R) Core(TM) i7-6700K CPU at 4.00 GHz, 16 GB of RAM, and 64-bit Microsoft Windows 10 Pro. The source code is implemented in MATLAB (R2015a).
Experimental results with 5, 10, 50 and 100 dimensions are shown in Table III. The table is divided into four parts based on dimension. Each part has two columns with results for side-by-side comparison: the first column shows the results of the basic CSA, and the second the results of the hybrid CSAHC. As shown in Table III, the CSAHC achieved better solutions than the basic CSA on all 13 benchmark functions. The results illustrate the advantage of CSAHC especially in the tests with five dimensions; for the other dimensions, the differences between the results remain consistent with those observed at five dimensions.
Fig. 4. Performance comparison for the F1 Ackley func.
Fig. 5. Performance comparison for the F2 Griewank func.
The most representative convergence curves are illustrated in Figs. 4 to 13. The plotted values are the mean function optima.
Figure 4 shows that CSAHC is more capable of finding better solutions than the CSA. Figures 5 and 7 illustrate that CSAHC and CSA are very close in the initial phase, but after that the CSAHC shows clear superiority until the end.
Fig. 6. Performance comparison for the F3 Penalty 1 func.
The results in Figures 6, 10 and 11 show that the CSA initially obtains better solutions than CSAHC. However, in Figure 6, CSAHC achieves better solutions after the seventh generation; thereafter the results of CSAHC and CSA are very close until the end, with a preference for CSAHC. Figures 10 and 11 show that the CSAHC starts to find better solutions after the fourth generation.
Figures 8 and 12 record similar results, with the curves of CSAHC and CSA being very close from the beginning through to the end. Figure 9 is divided into three parts. The initial and final parts show that CSAHC has better results than CSA, while the middle part (from the twelfth generation until the twenty-seventh generation) shows that the CSA achieved better solutions than CSAHC.
Fig. 7. Performance comparison for the F4 Penalty 2 func.
Fig. 8. Performance comparison for the F6 Rastrigin func.
Fig. 11. Performance comparison for the F10 Schwefel 2.22 func.
Fig. 12. Performance comparison for the F12 Sphere func.
Fig. 9. Performance comparison for the F7 Rosenbrock func.
Fig. 10. Performance comparison for the F9 Schwefel 1.2 func.

VI. CONCLUSION AND FUTURE WORK
In the present work, a novel metaheuristic CSAHC method is
proposed for solving global optimisation tasks. We improved the CSA by combining it with HC and evaluated the performance of both CSAHC and CSA on 13 benchmark functions. The hybridization enhances the exploitation ability of the basic CSA by using the HC, which is well suited to local search. The experimental results show that the CSAHC achieved better results than the basic CSA. Future work will investigate the performance of the hybrid CSA on other benchmarks and real-life problems.
TABLE III. MEAN, STANDARD DEVIATION AND BEST OPTIMA FOR CSA AND CSAHC WITH 5, 10, 50 AND 100 DIMENSIONS

Function            Stat   D=5 CSA    D=5 CSAHC   D=10 CSA   D=10 CSAHC  D=50 CSA   D=50 CSAHC  D=100 CSA  D=100 CSAHC
Ackley              Mean   6.63E-13   5.08E-21    4.61E-14   5.08E-23    1.25E-06   4.75E-10    0.01       2.35E-03
                    Std.   1.48E-12   2.14E-19    1.02E-12   2.14E-23    5.68E-06   1.06E-07    0.63       5.24E-02
                    Best   8.88E-16   1.18E-27    2.84E-18   2.51E-29    5.18E-08   2.98E-13    1.91E-02   4.54E-05
Griewank            Mean   2.52E-14   5.20E-18    2.34E-14   2.48E-19    2.22E-04   5.90E-09    2.14       7.42E-02
                    Std.   5.64E-14   7.37E-18    5.22E-14   9.50E-18    4.97E-04   8.25E-09    2.96       9.59E-02
                    Best   4.27E-15   6.57E-21    5.25E-16   7.25E-23    8.47E-06   2.91E-14    7.39E-02   9.99E-07
Penalty #1          Mean   3.88       5.65        8.65E-04   1.45E-04    7.21E-03   7.14E-08    1.03       9.05
                    Std.   3.56E-04   2.17E-05    4.37E-05   1.02E-08    3.38       7.56        4.85E-02   2.70E-02
                    Best   0.28       7.42E-05    6.87       7.50        2.99       5.01E-02    1.14       2.05E-02
Penalty #2          Mean   1.65E-30   4.01E-38    1.61E-30   2.21E-39    1.60E-18   3.57E-22    3.26E-08   3.46E-11
                    Std.   3.70E-30   8.25E-36    3.60E-30   7.87E-39    3.58E-18   9.60E-22    7.28E-08   7.74E-10
                    Best   1.35E-32   2.19E-42    1.05E-32   3.25E-44    1.35E-20   5.62E-27    2.09E-09   1.34E-13
Quartic with noise  Mean   2.761      8.67E-03    1.14E-02   5.86E-03    16.58      9.93E-03    414.98     33.68
                    Std.   7.479      1.85E-02    8.55E-02   7.58E-03    37.11      2.69E-02    927.70     42.30
                    Best   8.14E-02   9.22E-05    7.86E-04   2.27E-05    14.81      3.05E-03    19.28      5.61E-02
Rastrigin           Mean   3.44E-12   3.68E-16    2.88E-14   4.29E-18    7.85E-03   8.24E-08    3.21       1.68E-02
                    Std.   6.07E-12   5.99E-16    6.43E-12   2.27E-16    1.75E-02   3.51E-06    4.08       7.02E-02
                    Best   1.65E-14   4.41E-19    8.21E-17   1.87E-20    3.97E-05   1.02E-10    0.01       2.23E-08
Rosenbrock          Mean   1.13E-28   8.29E-33    1.90E-28   1.51E-34    1.82E-14   2.22E-15    2.78E-04   4.62E-09
                    Std.   2.52E-28   5.36E-31    4.26E-28   7.94E-34    4.08E-14   5.64E-15    6.10E-02   7.12E-09
                    Best   8.17E-30   2.01E-40    2.87E-32   6.79E-41    5.87E-15   3.92E-19    6.42E-05   3.93E-12
Schwefel 2.26       Mean   1.038      7.25E-03    2.40E-02   7.19E-04    0.14       1.49E-02    1.02       8.56E-02
                    Std.   5.84       2.22E-02    0.49       1.60E-03    5.84       3.27E-02    3.03       0.35
                    Best   9.75E-07   6.39E-13    2.76E-09   1.15E-12    1.15E-02   8.91E-07    0.74       1.96E-04
Schwefel 1.2        Mean   8.84E-310  8.65E-335   4.13E-316  4.77E-339   8.80E-36   9.88E-58    3.61E-02   1.15E-22
                    Std.   5.63E-309  7.82E-333   2.58E-314  9.07E-339   1.78E-35   2.78E-56    8.75E-02   4.55E-22
                    Best   8.37E-321  3.55E-346   9.38E-322  1.20E-347   2.46E-41   7.35E-73    2.91E-07   1.83E-28
Schwefel 2.22       Mean   1.08E-315  2.58E-330   4.11E-318  2.65E-331   1.77E-28   9.93E-55    5.24E-08   6.43E-25
                    Std.   2.01E-313  9.15E-330   3.21E-315  4.99E-330   5.90E-27   7.59E-54    1.07E-06   9.29E-23
                    Best   1.36E-321  3.66E-339   6.91E-322  4.76E-344   1.84E-33   5.79E-76    3.86E-10   4.21E-28
Schwefel 2.21       Mean   6.28E-315  5.28E-322   1.19E-318  1.28E-328   1.33E-33   2.30E-51    2.28E-10   9.88E-24
                    Std.   3.47E-312  6.77E-319   6.39E-316  7.94E-328   1.38E-30   9.84E-51    8.86E-08   3.74E-22
                    Best   1.54E-321  8.62E-329   7.36E-322  1.49E-430   7.11E-38   9.24E-71    8.44E-13   8.39E-25
Sphere              Mean   7.52E-308  1.08E-325   3.89E-314  9.88E-226   7.46E-20   1.03E-49    3.01E-05   1.09E-20
                    Std.   9.01E-307  8.36E-324   5.24E-309  7.09E-223   1.83E-19   8.27E-45    9.44E-05   7.55E-20
                    Best   1.41E-320  9.08E-331   3.06E-321  2.87E-240   5.08E-22   7.89E-74    2.05E-08   4.59E-22
Step                Mean   2.84E-11   7.53E-11    4.16E-19   2.18E-12    1.13E-19   1.15E-09    1.33       1.20E-04
                    Std.   5.87E-17   7.25E-12    2.52E-18   3.29E-03    7.02E-03   2.58E-09    2.98       2.68E-03
                    Best   5.11E-15   9.11E-28    3.28E-15   1.28E-28    2.72E-05   4.28E-12    0.25       5.17E-07

REFERENCES
[1] S. M. Z. Mohammed, A. T. Khader, and M. A. Al-Betar, "3-SAT Using Island-based Genetic Algorithm," IEEJ Trans. Electron. Inf. Syst., vol. 136, no. 12, pp. 1694–1698, 2016.
[2] A. L. Bolaji, A. T. Khader, M. A. Al-Betar, and M. A. Awadallah, "Artificial bee colony algorithm, its variants and applications: A survey," J. Theor. Appl. Inf. Technol., vol. 47, no. 2, 2013.
[3] J. H. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. U Michigan Press, 1975.
[4] J. R. Koza, "Genetic programming II: Automatic discovery of reusable subprograms," Cambridge, MA, USA, 1994.
[5] R. Storn and K. V. Price, "Minimizing the real functions of the ICEC'96 contest by differential evolution," in International Conference on Evolutionary Computation, 1996, pp. 842–844.
[6] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," 2005.
[7] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of 1995 IEEE International Conference on Neural Networks, 1995, pp. 1942–1948.
[8] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Nature & Biologically Inspired Computing (NaBIC 2009), World Congress on, 2009, pp. 210–214.
[9] F. Glover, "Heuristics for integer programming using surrogate constraints," Decis. Sci., vol. 8, no. 1, pp. 156–166, 1977.
[10] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[11] M. A. Al-Betar, "β-hill climbing: an exploratory local search," Neural Comput. Appl., pp. 1–16, 2016.
[12] M. M. Shehab, A. T. Khader, and M. A. Al-Betar, "New Selection Schemes for Particle Swarm Optimization," IEEJ Trans. Electron. Inf. Syst., 2016.
[13] M. A. Al-Betar, A. T. Khader, and I. A. Doush, "Memetic techniques for examination timetabling," Ann. Oper. Res., vol. 218, no. 1, pp. 23–50, 2014.
[14] M. Shehab, A. T. Khader, and M. A. Al-Betar, "A survey on applications and variants of the cuckoo search algorithm," Appl. Soft Comput., 2017.
[15] S. Dejam, M. Sadeghzadeh, and S. J. Mirabedini, "Combining cuckoo and tabu algorithms for solving quadratic assignment problems," J. Acad. Appl. Stud., vol. 2, no. 12, pp. 1–8, 2012.
[16] A. F. Ali and M. A. Tawhid, "A hybrid cuckoo search algorithm with Nelder–Mead method for solving global optimization problems," SpringerPlus, vol. 5, no. 1, p. 1, 2016.
[17] M. A. Al-Betar, Z. A. A. Alyasseri, A. T. Khader, A. L. Bolaji, and M. A. Awadallah, "Gray image enhancement using harmony search," Int. J. Comput. Intell. Syst., vol. 9, no. 5, pp. 932–944, 2016.
[18] M. Abdel-Baset and I. M. Hezam, "Solving linear least squares problems based on improved cuckoo search algorithm," 2016.
[19] I. Fister Jr, X.-S. Yang, D. Fister, and I. Fister, "Cuckoo search: a brief literature review," in Cuckoo Search and Firefly Algorithm, Springer, 2014, pp. 49–62.
[20] S. Roy and S. S. Chaudhuri, "Cuckoo search algorithm using Lévy flight: a review," Int. J. Mod. Educ. Comput. Sci., vol. 5, no. 12, p. 10, 2013.
[21] X.-S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Comput. Appl., vol. 24, no. 1, pp. 169–174, 2014.
[22] X.-S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), Springer, 2010, pp. 65–74.
[23] X.-S. Yang, "Firefly algorithm," Eng. Optim., pp. 221–230, 2010.
[24] A. Schaerf and A. Meisels, "Solving employee timetabling problems by generalized local search," in Congress of the Italian Association for Artificial Intelligence, 1999, pp. 380–389.
[25] E. K. Burke and J. P. Newall, "Enhancing timetable solutions with local search methods," in International Conference on the Practice and Theory of Automated Timetabling, 2002, pp. 195–206.
[26] I. Pavlyukevich, "Lévy flights, non-local search and simulated annealing," J. Comput. Phys., vol. 226, no. 2, pp. 1830–1844, 2007.
[27] X.-S. Yang and S. Deb, "Multiobjective cuckoo search for design optimization," Comput. Oper. Res., vol. 40, no. 6, pp. 1616–1624, 2013.
[28] A. Layeb, "A novel quantum inspired cuckoo search for knapsack problems," Int. J. Bio-Inspired Comput., vol. 3, no. 5, pp. 297–305, 2011.
[29] X. Li, J. Wang, and M. Yin, "Enhancing the performance of cuckoo search algorithm using orthogonal learning method," Neural Comput. Appl., vol. 24, no. 6, pp. 1233–1247, 2014.
[30] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Trans. Evol. Comput., vol. 3, no. 2, pp. 82–102, 1999.
[31] D. Simon, "Biogeography-based optimization," IEEE Trans. Evol. Comput., vol. 12, no. 6, pp. 702–713, 2008.
[32] M. Jamil and X.-S. Yang, "A literature survey of benchmark functions for global optimisation problems," Int. J. Math. Model. Numer. Optim., vol. 4, no. 2, pp. 150–194, 2013.