
Engineering with Computers (2022) 38:2771–2790

https://doi.org/10.1007/s00366-020-01240-3

ORIGINAL ARTICLE

An efficient population-based simulated annealing algorithm for 0–1 knapsack problem

Nima Moradi1 · Vahid Kayvanfar1 · Majid Rafiee1

Received: 12 October 2020 / Accepted: 8 December 2020 / Published online: 5 January 2021
© Springer-Verlag London Ltd., part of Springer Nature 2021

Abstract
0–1 knapsack problem (KP01) is one of the classic variants of knapsack problems, in which the aim is to select the items with the maximum total profit to be placed in the knapsack while the constraint on the knapsack's maximum capacity is satisfied. KP01
has many applications in real-world problems such as resource distribution, portfolio optimization, etc. The purpose of this
work is to gather the latest SA-based solvers for KP01 together and compare their performance with the state-of-the-art meta-
heuristics in the literature to find the most efficient one(s). This paper not only studies the introduced and non-introduced
single-solution SA-based algorithms for KP01 but also proposes a new population-based SA (PSA) for KP01 and compares
it with the existing methods. Computational results show that the proposed PSA is the most efficient optimization algorithm
for KP01 among all SA-based solvers. Also, PSA’s exploration and exploitation are stronger than the other SA-based algo-
rithms since it generates several initial solutions instead of one. Moreover, it finds the neighbor solutions based on the greedy
repair and improvement mechanism and uses both mutation and crossover operators to explore and exploit the solution space.
Suffice it to say that the next generation of SA algorithms for KP01 can be enhanced by designing population-based versions of them and by choosing greedy-based approaches for the initial solution phase and the local search policy.

Keywords  0–1 knapsack problem · Meta-heuristics · Simulated annealing · Population-based

1 Introduction

Discrete or combinatorial optimization is one of the branches of operations research (OR), which deals with discrete solution spaces. One of the critical and challenging problems in discrete optimization is the knapsack problem (KP), in which the aim is to select the best items according to their weights and profits. In KP, each item has a specific weight and profit, and the problem is to choose the items with the best total profit while satisfying the capacity limitation on the total weight.

KP arises in many real-world problems; examples include interactive multimedia systems [1], e-training services delivery [2], the mining industry [3], self-sufficient marine [4], interdiction games and monotonicity [5], information technology professionals [6], project selection, resource distribution, portfolio optimization, and decision-making problems [7]. Moreover, KP has several variants; one of them is the 0–1 knapsack problem (KP01). In KP01, there is only one knapsack, and each item has two options: being in the knapsack or not. The other variants of KP are multi-objective KP [8, 9], multi-dimensional KP [10–12], multiple KP [13], multiple-choice KP [14], quadratic KP [15], the subset-sum problem [16], and max–min KP [17]; nevertheless, in this paper, the focus is on KP01.

KP01 can be formulated mathematically [18], as shown below. In this model, Xi is the binary decision variable, which equals one if item i is selected to be in the knapsack and zero otherwise. Also, pi and wi denote the profit and weight of item i, and C is the maximum capacity of the knapsack. It has been proven that KP01 is NP-hard [19], which means no algorithm with polynomial time complexity is known for it to date.
* Vahid Kayvanfar
kayvanfar@sharif.edu

1 Department of Industrial Engineering, Sharif University of Technology, Tehran, Iran

Max ∑_{i=1}^{n} p_i × X_i

s.t.

∑_{i=1}^{n} w_i × X_i ≤ C

X_i ∈ {0, 1}    (1)

In this paper, KP01 is chosen because of its important applications in real-world problems and its theoretical importance in OR. KP01 is an NP-hard optimization problem; solving it efficiently therefore requires designing meta-heuristic optimization algorithms, since exact solvers are not efficient when the problem size gets larger. The main purpose of this paper is to gather the different simulated annealing (SA)-based optimization algorithms for KP01 and compare them with a population-based version of SA (PSA) to find the most efficient solver for KP01 instances. Thus, the contributions of the present work can be listed as follows:

• Proposing an efficient SA, analyzing the other variants of SA proposed for KP01 to evaluate their performance, and comparing them with the state-of-the-art optimization methods in more detail.
• Proposing an efficient PSA to solve KP01 instances for the first time in the KP01 literature, which is more efficient than the other SA-based solvers.
• Employing an efficient repair and improvement algorithm to handle infeasible solutions and convert them into feasible ones with better quality.
• Employing mutation and crossover operators to generate neighbor solutions in the PSA, finding better solutions by using useful information among the population.

In the second section, the literature on KP01 is reviewed, with particular attention to SA-based algorithms. In the third section, a new PSA algorithm for KP01 is proposed. In the fourth section, comprehensive experiments are conducted to analyze the proposed algorithms' performance, and a comparative study is presented. Finally, in the last section, conclusions and suggestions for future studies are presented.

2 Literature review

There are two main optimization techniques in the literature for solving KP01: exact methods and approximate algorithms. Exact methods consist of dynamic programming (DP) and branch and bound (B&B). In recent years, several works have studied KP01 by proposing exact methods, such as [20–24]. Although these exact methods can obtain the global solution of the problem, their inefficiency on large instances has led researchers to propose heuristic and meta-heuristic solution techniques for KP01. Various SA-based heuristics and meta-heuristics have been designed and developed to solve different problems [25], including KP01, efficiently, even when the problem size is large and challenging.

2.1 Meta-heuristics for KP01

In recent years, researchers have been working on various heuristics and meta-heuristics for KP01 due to their simplicity and efficiency. For instance, Sapre et al. [26] proposed the basic version of the cohort intelligence (CI) algorithm along with feasibility-based rules for KP01; with an educated approach for selecting the candidates, their methodology achieved the optimal solution in most of the problems. Abdel-Basset et al. [27] solved KP01 by proposing an improved whale optimization algorithm (IWOA); they applied a penalty function for the fitness function and a two-stage repair operator for repairing infeasible solutions. Gómez-Herrera et al. [28] proposed a quartile-based hyper-heuristic for different sizes of KP01. Their algorithm is designed based on information about the distributions of the weights and profits of the items in the knapsack instances; using the proposed hyper-heuristic, a solver can select heuristics based on the problem features. Hu [29] solved KP01 instances by a probabilistic solution discovery algorithm consisting of three steps: strategy development, strategy analysis, and solution discovery; the method's efficiency was shown on two numerical examples.

Gao et al. [30] proposed a quantum-inspired wolf pack algorithm (QWPA) for KP01. Quantum rotation and quantum collapse are used as its two main operations: quantum rotation enables the population to move toward the global optima, and quantum collapse helps individuals avoid being trapped in local optima. Zhan et al. [31] proposed noising methods (NMs) for KP01 instances; in total, six variants of NMs, including two threshold accepting (TA) variants, are designed to solve the 0–1 KP, and a hybrid greedy repair operator is used in the variants to establish a balance between intensification and diversification. Truong et al. [32] used artificial chemical reaction optimization (ACROA) as an optimization method for KP01; to improve the method, a new repair operator combining a greedy strategy and random selection is used to repair infeasible solutions. Feng et al. [33] presented a novel binary monarch butterfly optimization (BMBO) method for KP01 examples, in which three individual allocation schemes are tested to achieve better performance, and a novel repair operator based on a greedy strategy is used to improve and revise infeasible solutions.
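As a concrete illustration of the DP family of exact methods mentioned above, model (1) can be solved by the classic O(n·C) dynamic program sketched below. This is a minimal, generic sketch for the stated model, not code from the paper; the function name and table layout are ours.

```cpp
#include <algorithm>
#include <vector>

// Dynamic-programming sketch for model (1):
// maximize sum p_i * X_i subject to sum w_i * X_i <= C, X_i in {0, 1}.
int knapsack01(const std::vector<int>& p, const std::vector<int>& w, int C) {
    std::vector<int> best(C + 1, 0);          // best[c] = max profit within capacity c
    for (std::size_t i = 0; i < p.size(); ++i)
        for (int c = C; c >= w[i]; --c)       // reverse scan keeps each X_i binary
            best[c] = std::max(best[c], best[c - w[i]] + p[i]);
    return best[C];
}
```

The exponential dependence on C (pseudo-polynomial time) is exactly why, for the large instances discussed above, meta-heuristics are preferred.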

In recent years, KP01 has been a popular problem for evaluating the performance of meta-heuristics. Abdel-Basset et al. [34] proposed a binary version of equilibrium optimization (BEO) for KP01, in which eight transfer functions, including V-shaped and S-shaped ones, are used to convert the continuous EO to a binary version; they claim that the performance of any binary algorithm depends on a good choice of the transfer function. In their method, a penalty function is used to handle infeasible solutions, and a repair algorithm is employed to convert infeasible solutions into feasible ones. Their BEO is executed on various small-, medium-, and large-scale instances and compared with other meta-heuristics to demonstrate its superiority. Abdel-Basset et al. [35] converted the continuous marine predators optimization algorithm (MPO) to a binary version (BMP) that can optimize KP01 instances; for this purpose, they used V-shaped and S-shaped transfer functions to generate a binary solution space despite the continuous nature of MPO. Their method is compared with other solution techniques, and the computational results and statistical analysis show the merits of BMP. Zhang et al. [36] solved KP01 by proposing an amoeboid organism algorithm consisting of three main steps: first, a network converting algorithm transforms KP01 into a directed graph; second, the longest path problem is transformed into the shortest path problem so that the amoeboid organism model can be applied; finally, the amoeboid organism algorithm solves the shortest path problem. Numerical examples show the efficiency of the proposed algorithm.

For other works on KP01, readers are referred to the binary cuckoo search algorithm (BCSA) [37], the novel binary bat algorithm (NBBA) [38], the binary bat algorithm (BBA) [38], the binary artificial bee colony algorithm with differential evolution (BABC) [39], complex-valued encoding wind-driven optimization (CWDO) [40], B&B [38], and CI [38]. Table 1 summarizes the recent works on KP01. In the next section, the SA-based solvers proposed for KP01 in the literature are reviewed in more detail to show their properties and weak points and to compare them with the SA-based algorithm proposed in this paper.

2.2 SA-based meta-heuristics for KP01

The SA algorithm is one of the stochastic nature-inspired meta-heuristics whose performance has been verified in the literature for combinatorial optimization problems [59–61]. SA simulates the annealing process of materials, in which the crystalline structure is shaped as the heated material cools. A full description of SA can be found in Ref. [16]. The pseudo-code of the general SA algorithm is shown in Fig. 1.

In the following, the steps of single solution-based SA (SSA)¹ algorithms are described in detail, especially for the case of KP01. These steps are taken from the SA-based solvers for KP01 in the literature, and this information will help in designing an efficient SA-based algorithm for KP01 instances.

• Inputs: As shown in Algorithm 1, SSA starts with inputs such as the maximum or initial temperature (Tmax), the minimum or final temperature (Tmin), the cooling rate (α), and the maximum number of iterations at each temperature (max_iteration).
• Initial solution: This phase is important because of its role in determining the starting point in the solution search space. When a greedy generation of the initial solution is used, the final solution is close to the optimal solution with high probability. A random initial solution mechanism, however, will decrease the quality of the final solution, and the computational time may even be higher in some cases [62]. According to the literature, there are two main approaches to construct the initial solution: the random initial solution phase (RISP) and the greedy initial solution phase (GISP). In RISP, the initial solution is generated purely by random numbers. GISP, in contrast, starts by sorting the knapsack items according to the density metric vi = pi/wi; then, the items with higher vi are selected as long as the knapsack's maximum capacity is not exceeded. RISP and GISP are shown in Figs. 2 and 3, respectively.
• The stopping criteria: There are usually two stopping criteria in SSA, although some papers have used only one. The first criterion controls the temperature cooling by a function such as linear cooling (Ti = T0 − i × β), in which i is the counter and β is a specified constant, or a geometric function (T = α × T), in which α is the cooling rate. The second criterion determines the number of iterations at a fixed temperature. The different stopping conditions are discussed on page 133 of Ref. [16].
• Local search: This phase is one of the most critical steps in meta-heuristics. Moving toward the neighborhood is implemented by local search operators such as swap operators (two bits of the solution exchange their bits) and the hamming operator (flipping one bit of the solu-

¹ Single-solution based SA (SSA) works with one solution during the algorithm; however, population-based SA (PSA) starts with several initial solutions and improves them instead of working with only one solution.


Table 1  Articles on KP01 after 2015 (α for exact, β for heuristic, and µ for meta-heuristic)
References Year Solution approach Solution technique
α β µ

[41] 2015 – ✓ ✓ Complex-valued encoding bat algorithm (CPBA)
[42] 2015 – ✓ ✓ Simplified binary harmony search (SBHS) algorithm
[32] 2015 – ✓ ✓ Artificial chemical reaction optimization (ACROA)
[43] 2016 – ✓ ✓ Sequential and parallel SA
[44] 2016 – ✓ ✓ The binary version of the monkey algorithm
[21] 2016 ✓ ✓ ✓ The greedy, dynamic programming (DP), B&B, and genetic algorithms
[37] 2017 – ✓ ✓ Cuckoo search algorithm (CSA) and firefly algorithm (FA)
[29] 2017 – ✓ – The probabilistic solution discovery algorithm
[20] 2017 ✓ – – Exact solution approach based on an effective exploration
[28] 2017 – ✓ – A quartile-based hyper-heuristic
[45] 2017 – ✓ ✓ Gravitational search algorithm (GSA)
[33] 2017 – ✓ ✓ Binary monarch butterfly optimization (BMBO)
[38] 2017 – ✓ ✓ Novel binary bat algorithm (NBBA)
[40] 2017 – ✓ ✓ Complex-valued encoding wind-driven optimization (CWDO)
[46] 2017 – ✓ ✓ A meme evolution paradigm for KP01
[30] 2018 – ✓ ✓ Quantum-inspired wolf pack algorithm (QWPA)
[47] 2018 – ✓ – A heuristic algorithm based on expectation efficiency
[48] 2018 – ✓ ✓ List-based simulated annealing (LBSA) algorithm
[7] 2018 – ✓ ✓ Flower pollination algorithm (BFPA)
[49] 2018 – ✓ ✓ Elite opposition-flower pollination algorithm (EFPA)
[39] 2018 – ✓ ✓ Binary artificial bee colony algorithm with differential evolution (BABC-DE)
[50] 2018 – ✓ ✓ Chaotic monarch butterfly optimization algorithm (CMBO)
[51] 2018 – ✓ ✓ An improved quantum evolutionary algorithm (IQEA)
[52] 2018 – ✓ ✓ Opposition-based learning (OBL) monarch butterfly optimization with Gaussian perturbation (OMBO)
[53] 2018 – ✓ ✓ New binary artificial bee colony NB-ABC
[54] 2018 – ✓ – Parallel time–space reduction
[55] 2019 – ✓ ✓ A binary multi-scale quantum harmonic oscillator algorithm (BMQHOA)
[56] 2019 – ✓ ✓ Binary fireworks algorithm (BFWA)
[26] 2019 – ✓ – Cohort intelligence (CI) with an educated approach
[57] 2019 – ✓ ✓ Tissue P system with cell division
[58] 2019 ✓ ✓ ✓ B&B, DP, SA-GA-hybrid SA, and GA
[33] 2019 – ✓ ✓ A novel MBO with a global position updating operator (GMBO)
[27] 2019 – ✓ ✓ Whale optimization algorithm (WOA)
[31] 2020 – ✓ ✓ Noising methods (NMs)
[34] 2020 – ✓ ✓ Binary equilibrium optimization algorithm (BEOA)
[35] 2020 – ✓ ✓ Binary marine predators optimization algorithm (BMPO)
Present research 2020 – ✓ ✓ A new population-based SA

tion), which is recommended for binary representations [16]. When a problem has a constraint, like KP01, the local search is more complicated: after moving toward the neighbor solution, the new solution must be checked against the constraints or be repaired into a feasible solution. To overcome the infeasibility of solutions in KP01, there are two main methods, i.e., the penalty function method (PFM) and the individual optimization method (IOM) [33]. PFM tries to solve a constrained problem by removing the constraints and adding a penalty function to the objective function [63]. On the other hand, IOM deals with the infeasible solution directly to transform it into a feasible one. Based on the literature, IOM is more efficient than PFM for KP01 [48]. For IOM, there are two policies: (1) generation until satisfaction (GS), and (2) the repair and improvement mechanism (RI). In the first approach, a new solution is generated and then its feasibility is checked by a feasi-


Fig. 1  Algorithm 1: general SA algorithm

Fig. 2  Algorithm 2: RISP algorithm

Fig. 3  Algorithm 3: GISP algorithm
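The two initial-solution policies of Algorithms 2 and 3 (Figs. 2, 3) can be sketched as follows. The function names and the RNG handling are our assumptions; the paper's pseudo-code may differ in detail.

```cpp
#include <algorithm>
#include <numeric>
#include <random>
#include <vector>

// RISP sketch: a purely random 0/1 vector (may be infeasible; repaired later).
std::vector<int> risp(int n, std::mt19937& rng) {
    std::vector<int> x(n);
    std::bernoulli_distribution coin(0.5);
    for (int& b : x) b = coin(rng) ? 1 : 0;
    return x;
}

// GISP sketch: sort items by density v_i = p_i / w_i (descending) and
// greedily pack items while the maximum capacity C is not exceeded.
std::vector<int> gisp(const std::vector<double>& p,
                      const std::vector<double>& w, double C) {
    int n = static_cast<int>(p.size());
    std::vector<int> order(n);
    std::iota(order.begin(), order.end(), 0);
    std::sort(order.begin(), order.end(),
              [&](int a, int b) { return p[a] / w[a] > p[b] / w[b]; });
    std::vector<int> x(n, 0);
    double load = 0.0;
    for (int i : order)
        if (load + w[i] <= C) { x[i] = 1; load += w[i]; }
    return x;
}
```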


Fig. 4  Algorithm 4: feasibility evaluation algorithm

Fig. 5  Algorithm 5: repair operator (removing items until maximum capacity satisfied)

Fig. 6  Algorithm 6: improvement operator (adding items until maximum capacity not exceeded)
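The greedy repair-and-improvement idea of Algorithms 5 and 6 (Figs. 5, 6) can be sketched as below: repair removes the lowest-density packed items until the capacity holds, and improvement re-adds the highest-density unpacked items that still fit. This follows only the textual description; the exact Zhan_RI pseudo-code may differ.

```cpp
#include <algorithm>
#include <numeric>
#include <vector>

// Repair-and-improvement sketch for a 0/1 solution x against capacity C.
void repair_improve(std::vector<int>& x, const std::vector<double>& p,
                    const std::vector<double>& w, double C) {
    int n = static_cast<int>(x.size());
    std::vector<int> order(n);
    std::iota(order.begin(), order.end(), 0);
    std::sort(order.begin(), order.end(),          // density v_i = p_i / w_i, descending
              [&](int a, int b) { return p[a] / w[a] > p[b] / w[b]; });
    double load = 0.0;
    for (int i = 0; i < n; ++i) load += x[i] * w[i];
    // Repair: drop lowest-density packed items while the solution is infeasible.
    for (auto it = order.rbegin(); it != order.rend() && load > C; ++it)
        if (x[*it]) { x[*it] = 0; load -= w[*it]; }
    // Improvement: add highest-density unpacked items that still fit.
    for (int i : order)
        if (!x[i] && load + w[i] <= C) { x[i] = 1; load += w[i]; }
}
```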

bility evaluation algorithm (Fig. 4). If the new solution is feasible, the new movement is accepted; otherwise, the infeasible solution is rejected, the local search operator generates another new solution, and this process is repeated until a feasible one is obtained.

One alternative that avoids being trapped in a generation-until-satisfaction loop is to repair the infeasible solution and then improve it, i.e., RI. In the literature, there are two versions of RI for KP01: the RI implemented by Zhan et al. [48] (Zhan_RI) and that of Sonuc et al. [43] (Sonuc_RI). Zhan_RI corresponds to Algorithm 5 and Algorithm 6 (Figs. 5, 6), and details of Sonuc_RI can be found in Ref. [43]. The Zhan_RI approach consists of two steps: a repair operator and an improvement operator. By the repair operator, some knapsack items are selected to be removed to satisfy the capacity threshold (Fig. 5).

• The improvement operator is then needed to choose new items without exceeding the maximum capacity


Fig. 7  Algorithm 7: m-flipping operator algorithm
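An m-flipping move of the kind shown in Algorithm 7 (Fig. 7) might look like the following sketch: m randomly chosen bits of the current solution are flipped. Drawing indices uniformly with possible repeats is our assumption.

```cpp
#include <random>
#include <vector>

// m-flipping neighbourhood sketch: flip m randomly chosen bits of x.
std::vector<int> m_flip(std::vector<int> x, int m, std::mt19937& rng) {
    std::uniform_int_distribution<int> pick(0, static_cast<int>(x.size()) - 1);
    for (int k = 0; k < m; ++k) {
        int i = pick(rng);
        x[i] = 1 - x[i];   // one hamming (bit-flip) move
    }
    return x;
}
```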

Table 2  Comparison of SA-based solvers for KP01 with the present work

References     Year  Initial solution generation  Local search policy  Sequential/parallel^a  Homogenous/list-based^b  Single solution-based/population-based
                     RISP   GISP                  GS    RI
SA_So [43]     2016  –      ✓                     –     ✓               Both                   Homogenous               Single-solution based
SA_Zh [48]     2018  –      ✓                     –     ✓               Sequential             List-based               Single-solution based
SA_Ez [58]     2019  ✓      –                     ✓     –               Sequential             Homogenous               Single-solution based
Present work   2020  ✓      ✓                     ✓     ✓               Sequential             Homogenous               Both

^a Sequential SA is executed sequentially; on the other hand, parallel SA is executed concurrently on several processors
^b In homogenous SA (HSA), the temperature is fixed at each iteration; however, in list-based SA (LSA), the average of temperatures is used at each iteration

(Fig. 6). One efficient repair policy is to remove the items with the lowest pi/wi; likewise, adding the items with the highest pi/wi to the knapsack is a logical strategy in the improvement phase. (Other strategies are provided in Ref. [40].) It is noteworthy that an infeasible solution is always detected by Algorithm 2, whether in GS or RI. Also, an m-flipping operator is used to generate the neighbor solution from the current solution, as shown in Fig. 7. The newly generated solution is then transformed into a feasible one by either the GS or RI approach, and finally, the new feasible solution is compared with the current solution according to the acceptance or rejection conditions.
• Acceptance/rejection condition: Unlike many other meta-heuristics, SSA does not accept only better solutions. One of SSA's useful features is accepting a worse solution with a probability given by the Boltzmann distribution, as shown in line 11 of Algorithm 1. By this probability function, SSA avoids being trapped in local optima.

In the literature on KP01, only three articles have been found that propose SA as an optimization algorithm; this shows the low number of implementations of SA for KP01, although SA is one of the most widely used meta-heuristics for combinatorial optimization problems. Sonuc et al. [43] proposed a sequential and a parallel SA for KP01. Their parallel SA ran on a GPU with a multi-start technique. To generate the initial solution, they employed the density metric, in which the items of the knapsack are sorted by the value of vi = pi/wi in descending order; to generate feasible neighbor solutions, they likewise used a greedy method that produces feasible and better solutions. Finally, the results showed the efficiency of the parallel SA over the sequential SA in terms of solution quality. Zhan et al. [48] proposed a list-based SA with a hybrid greedy repair and optimization operator for KP01. In the list-based cooling schedule, all temperatures are saved in a priority queue. Like the previous paper, the authors of [48] used the value of vi = pi/wi as the density metric to generate the initial solution and feasible neighbor solutions, along with the greedy repair and optimization operators. Ezugwu et al. [58] studied KP01 by proposing various exact methods and heuristics, including SA. They proposed a simple version of SA with a random approach to generate the initial solution and a non-greedy method to move toward the neighbor solutions. The results showed the superiority of B&B, DP, and hybrid GA-SA over greedy search, solo GA, and solo SA in solving different KP01 instances.

To summarize, the SA-based solvers for KP01 and their differences from the present work are provided in Table 2. Table 2 makes clear that the present paper not only proposes an efficient SA but also studies other variants of SA for KP01 to evaluate their performance and compare them with the state-of-the-art optimization methods in more detail. This is also our main contribution, since there is no comparable work on SA-based algorithms for KP01, and there is no PSA for KP01 to date. In other words, this paper


Table 3  Six SSA-based algorithms for KP01 in the present work (SAKP01_1 to SAKP01_3 are new; the last three were proposed previously)

Features of algorithm        SAKP01_1    SAKP01_2    SAKP01_3    SA_So [43]  Homogenous version of SA_Zh [48]  SA_Ez [58]
Initial solution generation  RISP        RISP        GISP        GISP        GISP                              RISP
Local search                 Sonuc_RI    Zhan_RI     GS          Sonuc_RI    Zhan_RI                           GS
Sequential/parallel          Sequential  Sequential  Sequential  Both        Sequential                        Sequential
Homogenous/list-based        Homogenous  Homogenous  Homogenous  Homogenous  List-based                        Homogenous

Fig. 8  Algorithm 8: mutation operator in the proposed PSA
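Algorithm 8 itself is not reproduced in this excerpt; as a hedged illustration, a generic bitwise mutation for 0/1 strings could look like the following. The per-bit flip rate and the independence of the flips are our assumptions, not the paper's specification.

```cpp
#include <random>
#include <vector>

// Generic bitwise mutation sketch: each bit flips independently
// with probability bit_rate.
std::vector<int> mutate(std::vector<int> x, double bit_rate, std::mt19937& rng) {
    std::bernoulli_distribution flip(bit_rate);
    for (int& b : x)
        if (flip(rng)) b = 1 - b;
    return x;
}
```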

not only studies the previously introduced SA-based algorithms but also proposes a new PSA for KP01.

As a result, according to the literature, there are two approaches to generate the initial solution (RISP and GISP) and three local search methods (GS, Zhan_RI, and Sonuc_RI) for SSA-based algorithms in solving KP01 instances; therefore, there are 2 × 3 = 6 SSA-based algorithms for KP01. These six algorithms share the same inputs, stopping criteria, and acceptance conditions; their main differences lie in the initial solution phase and the local search approach. The six algorithms are summarized in Table 3, which shows that three of them have been studied before by other researchers, while the other three have not been investigated in the literature. Thus, in this paper, a total of six SSA algorithms are presented beside a new PSA algorithm. The three SSA algorithms already introduced in the literature are referred to by their authors in Table 3 (SA_So, SA_Zh, and SA_Ez); the three new SSA algorithms are named SAKP01_1 to SAKP01_3.

3 Population-based simulated annealing (PSA) for 0–1 knapsack problem

In the following, the steps of the proposed PSA algorithm for KP01 are described:

• Inputs: In addition to the parameters of the SSA-based algorithms, four new parameters are used: Npop (population size), Rr (the rate of random initial solutions), Mup (mutation probability), and Pcr (crossover point). Npop is the number of solutions with which the algorithm works and which it improves. In the proposed PSA, initial solutions are generated by either the RISP or GISP approach; Rr is the fraction of the initial solutions generated by RISP, and the remaining initial solutions are generated by GISP. Mup is like the mutation probability parameter in the genetic algorithm (GA) and determines the number of neighbor solutions (offspring) generated by the mutation operator. Pcr indicates the point at which the crossover operator operates; the bits to the right of the crossover point are swapped between the two parent solutions.
• Initial solution: Npop and Rr are the two main parameters for this step. As mentioned above, in the proposed PSA, initial solutions are generated by either RISP (Algorithm 2) or GISP (Algorithm 3); Rr is the share of the initial solutions generated by RISP, with the remaining solutions generated by GISP.
• The stopping criteria: As in the SSA-based algorithms, there are two stopping criteria in PSA. The first criterion controls the temperature cooling by the function f(T) = α × T, and the second criterion determines the number of iterations at a fixed temperature (max_iteration).
• Local search: Mup and Pcr are the two main parameters in this step. As mentioned before, neighbor solutions are generated by either the mutation (Fig. 8) or crossover (Fig. 9) operator. First, the mutation operator operates on the current solutions considering the Mup parameter, and then Npop − Npop × Mup solutions are randomly combined


Fig. 9  Algorithm 9: crossover operator in the proposed PSA

Fig. 10  Representation of the initial and neighbor solutions generation in the proposed PSA
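Following the description of Pcr in Sect. 3 (bits to the right of the crossover point are swapped between the two parents), a single-point crossover can be sketched as below; the function name and signature are ours.

```cpp
#include <utility>
#include <vector>

// Single-point crossover sketch: swap the tails of two parent
// solutions after position pcr.
std::pair<std::vector<int>, std::vector<int>>
crossover(std::vector<int> a, std::vector<int> b, std::size_t pcr) {
    for (std::size_t i = pcr; i < a.size(); ++i)
        std::swap(a[i], b[i]);   // exchange bits right of the crossover point
    return {a, b};
}
```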

considering a single-point crossover with the Pcr parameter. After applying mutation and crossover, each offspring must be repaired and improved by Zhan_RI (Algorithms 5, 6). Finally, feasible solutions are sent to the acceptance/rejection step.
• Acceptance/rejection condition: As in the SSA-based algorithms, the Boltzmann distribution is used as the acceptance probability function. The only difference is that in the proposed PSA, each new solution (offspring) is compared with its previous solution (parent) according to this probability function; for example, the i-th solution in the previous population is compared exactly with the i-th solution in the new population.

To summarize, these steps are represented in Fig. 10, in which RISP or GISP generates the initial solutions, and then the neighbor solutions are found by the mutation and crossover operators. These processes are controlled by the maximum temperature and the maximum number of iterations at each temperature, as discussed in Sect. 2.

As a result, in this paper, two main SA-based algorithms, namely the SSA and PSA algorithms, have been proposed for KP01. The flowchart of the proposed PSA algorithm for solving KP01 is shown in Fig. 11. With the proposed PSA algorithm, a user can choose the policies for the initial solution generation by integrating RISP and GISP, and for the local search by using the mutation and crossover operators and the greedy repair and improvement approach. Table 4 summarizes the differences and similarities among the SSA-based solvers and PSA.

4 Experiments and computational results

In this section, two comprehensive experiments, including parameter tuning for the SSA and PSA algorithms and comparative studies, are performed. All SSA and PSA algorithms have been written in the C++ programming language (Dev-C++ compiler) using a 2.30 GHz Intel® Core i3 processor with 2 GB RAM.

4.1 Parameter tuning for SAKP01_1, SAKP01_2, SAKP01_3, and PSA (first experiment)

In the first experiment, the optimal values of the SSA and PSA parameters are determined by design of experiments (DOE) using the Design Expert-10 software; response surface methodology (RSM) is used as the DOE method. These experiments have been executed on a problem of size 1000. Finally, with the Design Expert-10 software, the SSA and


Fig. 11  Proposed PSA algorithm for KP01
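The pairwise acceptance rule described for the proposed PSA (offspring i versus parent i, with Boltzmann acceptance of worse moves) can be sketched as follows. The maximization sign convention and the function name are our reading of the text, not code from the paper.

```cpp
#include <cmath>
#include <random>

// Pairwise Boltzmann acceptance sketch: an offspring replaces its parent
// if it is at least as good, or with probability exp(-delta / T) if worse.
bool accept(double parent_profit, double offspring_profit,
            double T, std::mt19937& rng) {
    if (offspring_profit >= parent_profit) return true;  // always accept improvement
    double delta = parent_profit - offspring_profit;     // loss of the worse move
    std::uniform_real_distribution<double> u(0.0, 1.0);
    return u(rng) < std::exp(-delta / T);                // Boltzmann acceptance
}
```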

Table 4  Similarities and differences among six SSA-based algorithms with PSA

Features of algorithm  Proposed PSA in this paper            SAKP01_1    SAKP01_2    SAKP01_3    SA_So [43]  Homogenous version of SA_Zh [48]  SA_Ez [58]
Initial solution       RISP and GISP                         RISP        RISP        GISP        GISP        GISP                              RISP
Local search           Mutation and crossover-based Zhan_RI  Sonuc_RI    Zhan_RI     GS          Sonuc_RI    Zhan_RI                           GS
Structure              Sequential                            Sequential  Sequential  Sequential  Both        Sequential                        Sequential
Homogenous/list-based  Homogenous                            Homogenous  Homogenous  Homogenous  Homogenous  Homogenous                        Homogenous
Number of solutions    Population-based                      Single solution-based (all six SSA variants)
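The homogenous SSA structure that the tuned parameters (Tmax, Tmin, α, max_iteration) control can be sketched as the skeleton below: a geometric cooling outer loop with a fixed number of inner iterations per temperature, a 1-bit flip neighbour move, and Boltzmann acceptance. This is an illustrative sketch with simplified instance handling, not the authors' implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <random>
#include <vector>

// Homogenous-SA skeleton for KP01 with geometric cooling (T = alpha * T).
double ssa_skeleton(std::vector<int> x,
                    const std::vector<double>& p, const std::vector<double>& w,
                    double C, double Tmax, double Tmin, double alpha,
                    int max_iteration, std::mt19937& rng) {
    auto value = [&](const std::vector<int>& s) {
        double prof = 0.0, load = 0.0;
        for (std::size_t i = 0; i < s.size(); ++i) {
            prof += s[i] * p[i];
            load += s[i] * w[i];
        }
        return load <= C ? prof : -1.0;                 // infeasible -> rejected below
    };
    std::uniform_int_distribution<int> pick(0, static_cast<int>(x.size()) - 1);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    double best = value(x);
    for (double T = Tmax; T > Tmin; T *= alpha)         // geometric cooling schedule
        for (int it = 0; it < max_iteration; ++it) {    // fixed-temperature inner loop
            std::vector<int> y = x;
            y[pick(rng)] ^= 1;                          // 1-bit hamming neighbour move
            double fy = value(y), fx = value(x);
            // accept improvements; accept feasible worse moves with Boltzmann prob.
            if (fy >= fx || (fy >= 0.0 && u(rng) < std::exp((fy - fx) / T))) {
                x = y;
                best = std::max(best, fy);
            }
        }
    return best;
}
```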


Table 5  Optimal values of SSA parameters for SAKP01_1, SAKP01_2, and SAKP01_3

| Alg | Tmax | Tmin | α | max_iteration | m |
| SAKP01_1 | 1000 | 0.0001 | 0.99 | – | – |
| SAKP01_2 | 10 | 0.0001 | 0.98 | 10 | 6 |
| SAKP01_3 | 10 | 0.0001 | 0.98 | 10 | 3 |

PSA parameters' optimal values are obtained (Tables 5, 6, respectively). Thus, by the optimal values of the SA parameters in each algorithm, which have been obtained by the first and second experiments, the algorithm's performance can be enhanced efficiently because, with proper parameter tuning, the algorithm will be able to be executed according to its best performance [64].

4.2 A comparative study (second experiment)

The second experiment aims to compare the performance of the proposed SSAs and PSA algorithms with state-of-the-art meta-heuristics in the literature. Five data sets, including low-dimensional, medium-dimensional, uncorrelated high-dimensional, weakly correlated high-dimensional, and strongly correlated high-dimensional KP01 instances, have been chosen for the comparative studies. All these data sets except the medium-dimensional instances have been selected from http://www.artemisa.unicauca.edu.co/~johnyortega/instances_01_KP/. The medium-dimensional KP01 data set can be found in Ref. [38]. The differences among these data sets are shown in Table 7. Also, in the comparative study, the homogenous version of SA in Ref. [48] has been considered. Also, the optimal values of the proposed SSAs and PSA are taken from Tables 5 and 6, and the optimal values of the parameters of the other optimization algorithms are considered from their original articles.

4.2.1 Low-dimensional KP01

To evaluate the performance of the six SSAs and PSA algorithms, their results have been compared with the state-of-the-art meta-heuristics on the low-dimensional KP01 data set. In this section, the proposed SSAs and PSA are compared with BCSA [37], BMBO [33], NBBA [38], BABC [39], CWDO [40], NM1 [41], SA_Ez [58], SA_So [43], and SA_Zh [48]. The results of the comparative study on this data set are shown in Table 8. In Table 8, BKS is the best-known solution; "B", "M", and "W" indicate the best objective function, mean of the objective functions, and worst objective function obtained by the algorithm, respectively. Also, each algorithm was executed independently ten times for each instance. From Table 8, some facts can be elicited: (a) the proposed PSA is as efficient as the state-of-the-art non-SA-based and SA-based algorithms on low-dimensional instances, (b) all solvers are stable on low-dimensional instances except BMBO, SA_So, SA_Ez, and SAKP01_1, (c) among SA-based solvers, PSA, SA_Zh, SAKP01_2, and SAKP01_3 are the most efficient since they have reached the optimal solution in all instances.

4.2.2 Medium-dimensional KP01

In this section, the proposed SSAs and PSA algorithms are compared with NBBA [38], BBA [38], NM1 [31], B&B [38], CI [38], SA_Ez [58], SA_So [43], and SA_Zh [48]. The results of the comparison are shown in Table 9. In

Table 6  Optimal values of PSA parameters

| Alg | Tmax | Tmin | α | max_iteration | m | Npop | Rr | Mup | Pcr |
| PSA | 1000 | 0.001 | 0.98 | 30 | 2 | 30 | 0.3 | 0.8 | 0.5 |

Table 7  Five data sets of KP01 and their features

| Feature | Low-dimensional (p1 to p10) | Medium-dimensional (p11 to p20) | Uncorrelated high-dimensional (p21 to p27) | Weakly correlated high-dimensional (p28 to p34) | Strongly correlated high-dimensional (p35 to p41) |
| Size | 4–20 | 30–75 | 100–10,000 | 100–10,000 | 100–10,000 |
| Profit of each item | – | – | Rand(10, 100) | Rand(wi − 10, wi + 10) | wi + 10 |
| Weight of each item | – | – | Rand(10, 100) | Rand(10, 100) | Rand(10, 100) |
| Capacity of knapsack | – | – | 0.75 × sum of weights | 0.75 × sum of weights | 0.75 × sum of weights |
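The three high-dimensional families in Table 7 are fully specified by their profit, weight, and capacity recipes, so they can be regenerated (up to the random seed) with a short script. The generator below follows the table's formulas exactly; the function name and signature are mine:

```python
import random

def generate_kp01(n, kind="uncorrelated", seed=None):
    """Generate a KP01 instance following the recipes in Table 7."""
    rng = random.Random(seed)
    weights = [rng.randint(10, 100) for _ in range(n)]
    if kind == "uncorrelated":
        profits = [rng.randint(10, 100) for _ in range(n)]
    elif kind == "weakly":    # profit drawn near its own weight
        profits = [rng.randint(w - 10, w + 10) for w in weights]
    elif kind == "strongly":  # profit is a fixed offset of the weight
        profits = [w + 10 for w in weights]
    else:
        raise ValueError(kind)
    capacity = int(0.75 * sum(weights))
    return profits, weights, capacity
```

The strong correlation (profit = weight + 10) is what makes the last family hardest for greedy-ratio heuristics: all items have nearly the same profit/weight ratio, so the ordering carries little information.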


Table 8  Comparative study of the proposed SSAs and PSA algorithms with the state-of-the-art meta-heuristics on low-dimensional KP01
instances
Optimization algorithms for KP01 Low-dimensional KP01 instances
Ins p1 p2 p3 p4 p5 p6 p7 p8 p9 p10
Size 10 20 4 4 15 10 7 23 5 20
BKS 295 1024 35 23 481.07 52 107 9767 130 1025

Non-SA-based solvers BCSA B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
BMBO B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9766.12 130 1025
W 295 1024 35 23 481.07 52 107 9765 130 1025
NBBA B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
BABC B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
CWDO B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
NM1 B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
SA-based solvers Proposed SA_So B 295 1024 35 23 481.07 52 105 9767 130 1025
by the M 295 1024 35 23 472.44 52 105 9755.4 130 1025
literature
W 295 1024 35 23 437.94 52 105 9754 130 1025
SA_Zh B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
SA_Ez B 294 972 35 22 481.07 52 102 9767 130 1025
M 294 972 35 22 481.07 52 102 9767 130 1025
W 294 1024 35 22 481.07 52 102 9767 130 1025
Proposed SAKP01_1 B 295 1024 35 23 481.07 52 107 9767 130 1025
by the M 294.6 1024 35 23 463.81 50.7 99.8 9765.5 130 1025
present
W 293 1024 35 23 437.94 47 87 9762 130 1025
work
SAKP01_2 B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
SAKP01_3 B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025
PSA B 295 1024 35 23 481.07 52 107 9767 130 1025
M 295 1024 35 23 481.07 52 107 9767 130 1025
W 295 1024 35 23 481.07 52 107 9767 130 1025

The bold numbers are the best obtained values
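Because the low-dimensional instances have at most 23 items, the BKS row can be checked exactly with the standard O(n·C) knapsack dynamic program (valid for integer weights; profits may be fractional, as in p5). A compact version, written independently of the paper's code:

```python
def kp01_optimum(profits, weights, capacity):
    """Exact 0-1 knapsack optimum via dynamic programming over capacities."""
    dp = [0] * (capacity + 1)
    for p, w in zip(profits, weights):
        # iterate capacities downwards so each item is used at most once
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + p)
    return dp[capacity]
```

For the classic toy instance with profits [60, 100, 120], weights [10, 20, 30], and capacity 50 (not one of the paper's instances), the optimum is 220, so an SA run reporting a higher "B" value would indicate a feasibility bug rather than a better solution.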

Table 9, some facts can be elicited: (a) PSA, NM1, SA_Zh, and SAKP01_2 have reached the best-known solution in all instances, (b) in all ten runs for each algorithm, PSA and NM1 are the only algorithms which reach the best-known solution in each run since their "B" and "M" are equal to each other in each instance, (c) among SA-based solvers, the


Table 9  Comparative study of the proposed SSAs and PSA algorithms with the state-of-the-art meta-heuristics on medium-dimensional KP01
instances
Optimization algorithms for KP01 Medium-dimensional KP01 instances
Ins p11 p12 p13 p14 p15 p16 p17 p18 p19 p20
Size 30 35 40 45 50 55 60 65 70 75

Non-SA-based solvers NBBA B 1437 1689 1821 2033 2448 2643 2917 2818 3223 3614
M 1437 1689 1821 2033 2448 2642.6 2917 2817.6 2322.6 3613.2
W 1437 1689 1821 2033 2448 2632 2917 2814 3219 3605
BBA B 1437 1689 1821 2033 2440 2642 2917 2809 3213 3602
M 1437 1689 1821 2030.3 2439.6 2640.4 2915 2808.3 3212.9 3600.4
W 1437 1689 1821 2016 2435 2614 2893 2802 3209 3588
CI B 1437 1689 1816 2020 2440 2643 2917 2814 3221 3614
M 1418 1686.5 1807.5 2017 2436.1 2605 2915 2773.6 3216 3603.8
W 1398 1679 1791 2007 2421 2581 2905 2716 3211 3591
B&B B 1437 1689 1821 2033 2440 2440 2917 2818 3223 3614
M NA NA NA NA NA NA NA NA NA NA
W NA NA NA NA NA NA NA NA NA NA
NM1 B 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614
M 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614
W 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614
SA-based solvers Proposed SA_So B 1437 1689 1817 2020 2448 2643 2917 2817 3215 3595
by the M 1424.6 1689 1815.1 2013.8 2442 2635.6 2914.4 2812.2 3212.7 3590.9
litera-
W 1417 1689 1806 1982 2417 2609 2891 2806 3203 3586
ture
SA_Zh B 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614
M 1437 1689 1821 2033 2449 2651 2917 2817.5 3223 3613.5
W 1437 1689 1821 2033 2449 2651 2917 2817 3223 3609
SA_Ez B 1390 1657 1797 1999 2349 2590 2873 2770 3208 3584
M 1289.9 1569.5 1684.7 1886.2 2281 2454.7 2734.4 2613.5 3063.3 3333
W 1171 1381 1566 1778 2167 2300 2560 2408 2850 3174
Proposed SAKP01_1 B 1424 1689 1821 2020 2448 2632 2917 2817 3221 3600
by the M 1415 1685.2 1804.2 2003.4 2432.7 2591.5 2902.9 2810.9 3215.4 3587.8
present
W 1379 1661 1744 1970 2417 2552 2878 2806 3214 3565
work
SAKP01_2 B 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614
M 1437 1689 1821 2033 2449 2651 2917 2818 3222.4 3614
W 1437 1689 1821 2033 2449 2651 2917 2818 3221 3614
SAKP01_3 B 1437 1689 1821 2006 2438 2643 2917 2809 3196 3583
M 1414.6 1668.6 1794.2 1941.3 2392.4 2590.6 2882 2728.5 3139.8 3529.5
W 1389 1624 1766 1878 2334 2537 2841 2651 2973 3488
PSA B 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614
M 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614
W 1437 1689 1821 2033 2449 2651 2917 2818 3223 3614

The bold numbers are the best obtained values

performance of SA_Zh and SAKP01_2 is close to PSA, but they are not as stable as PSA. Finally, by these results, the competitiveness of the proposed SSAs and PSA for this data set can be shown by the logical phrase below, in which < indicates the better solver in terms of the number of best solutions found; if two algorithms tie on the number of best solutions found, they are ranked by stability.

SA_Ez < SA_So < SAKP01_1 < SAKP01_3 < CI < BBA < B&B < NBBA < SA_Zh < SAKP01_2 < NM1 = PSA
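The ranking rule behind these "<" chains (count of best-known solutions found, ties broken by stability) is easy to make precise. The helper below encodes it; the toy data is invented for illustration and is not taken from the paper's tables:

```python
def rank_solvers(results, bks):
    """results: {name: [(best, std_dev) per instance]}.
    Rank by the number of instances where the solver's best equals the BKS,
    breaking ties by lower total standard deviation (more stable first)."""
    def key(name):
        runs = results[name]
        hits = sum(1 for (b, _), opt in zip(runs, bks) if b == opt)
        total_std = sum(s for _, s in runs)
        return (hits, -total_std)  # more hits better, lower std better
    return sorted(results, key=key)  # ascending: weakest solver first

toy = {
    "A": [(295, 0.0), (1024, 0.0)],
    "B": [(295, 1.2), (1020, 3.4)],
    "C": [(295, 0.5), (1024, 0.8)],
}
print(rank_solvers(toy, [295, 1024]))  # → ['B', 'C', 'A']
```

Here B misses one best-known solution, while A and C tie on hits and A wins on stability, reproducing the "B < C < A" style of chain used in the text.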


Table 10  Comparative study of the SA-based solvers for KP01 on uncorrelated high-dimensional KP01 instances
SA-based solvers for KP01 Uncorrelated high-dimensional KP01 instances
Ins p21 p22 p23 p24 p25 p26 p27
Size 100 200 500 1000 2000 5000 10,000
BKS 9147 11,238 28,857 54,503 110,625 276,457 563,647

Proposed by the literature SA_So B 9147 11,238 28,719 54,397 110,363 275,218 561,848
M 9147 11,126.1 28,614.8 54,139.1 110,176.6 275,010.3 561,709.6
W 9147 10,987 28,188 53,937 110,069 274,833 561,280
Std 0 106.5 146.7 131.5 75.5 105.6 150.7
SA_Zh B 9147 11,238 28,857 54,442 110,315 275,533 562,268
M 9147 11,233.2 28,772.2 54,303 110,278.4 275,471.2 562,107.5
W 9147 11,223 28,715 54,397.2 110,240 275,393 562,006
Std 0 5.9 51.6 35.7 18.6 39.5 77.4
SA_Ez B 8781 10,426 25,328 46,210 90,612 231,667 424,800
M 7585 9402.4 23,138.2 42,038.7 85,616.7 207,244.4 412,822.4
W 5575 8378 19,570 38,312 81,245 198,641 396,619
Std 595.9 613.0 1591.7 2970.4 3037.7 4724.2 7511.7
Proposed by the present work SAKP01_1 B 9147 10,520 23,865 38,902 63,895 111,191 167,760
M 8521.1 9998 20,920.2 35,256.1 57,742 103,276 158,206
W 7846 8888 17,752 30,742 52,640 96,051 149,416
Std 522.2 546.0 1731.7 2685.1 3101.7 4672.9 5882.1
SAKP01_2 B 9147 11,238 28,834 54,396 110,383 275,556 560,180
M 9147 11,231.6 28,720.1 54,223.1 110,117.4 275,044.3 558,674.1
W 9147 11,223 28,594 53,982 109,514 274,306 557,877
Std 0 5.7 93.0 148.5 272.8 368.0 666.8
SAKP01_3 B 9147 11,227 28,642 54,303 110,223 275,282 561,876
M 9075.5 11,066 28,513.8 54,018.3 109,923.3 275,019.1 561,716.9
W 8897 11,005 28,208 53,674 109,656 274,732 561,491
Std 109.5 62.9 141.6 182.9 179.1 144.9 124.1
PSA B 9147 11,238 28,857 54,503 110,510 275,675 562,282
M 9147 11,238 28,850.9 54,461.9 110,429 275,601.7 562,149.5
W 9147 11,238 28,819 54,442 110,394 275,538 562,056
Std 0 0 12.7 19.4 31.4 48.1 71.5

The bold numbers are the best obtained values

4.2.3 Uncorrelated high-dimensional KP01

In this section, the proposed SSAs and PSA algorithms are compared with the three SA-based algorithms from the literature, including SA_Ez [58], SA_So [43], and SA_Zh [48], to find the most efficient SA-based solver(s) for high-dimensional KP01 instances. The comparative study results of the SA-based solvers on uncorrelated high-dimensional KP01 instances are given in Table 10, and each instance is solved ten times by each algorithm. In Table 10, one fact can be elicited: PSA is the most efficient solver in terms of best-found solution and stability (low standard deviation) among all SA-based solvers on uncorrelated high-dimensional instances. Also, as for the previous data set, the competitiveness of the proposed SSAs and PSA for this data set can be shown by the logical phrase below:

SA_Ez < SAKP01_1 < SAKP01_3 < SA_So < SAKP01_2 < SA_Zh < PSA.

4.2.4 Weakly correlated high-dimensional KP01

In this section, the comparative study of the proposed SSAs and PSA algorithms with SA_Ez [58], SA_So [43], and SA_Zh [48] is conducted on the weakly correlated high-dimensional KP01 data set. The results of the comparative study on this data set are given in Table 11. In Table 11, two facts can be elicited: (a) PSA is the most efficient solver in terms of best-found solution and stability (low standard


Table 11  Comparative study of the SA-based solvers for KP01 on weakly correlated high-dimensional KP01 instances
SA-based solvers for KP01 Weakly correlated high-dimensional KP01 instances
Ins p28 p29 p30 p31 p32 p33 p34
Size 100 200 500 1000 2000 5000 10,000
BKS 1514 1634 4566 9052 18,051 44,356 90,204

Proposed by the literature SA_So B 1254 1445 3824 7384 14,407 34,626 70,515
M 1183.4 1386.2 3607.5 7264.2 14,212.6 34,439.1 70,356.4
W 1136 1327 3498 7110 14,054 34,246 70,097
Std 36.8 54.3 87.3 84.2 96.1 114.8 111.3
SA_Zh B 1514 1623 4200 8009 15,591 36,070 72,790
M 1496.6 1562.6 4091.4 7789 15,025.6 35,706.2 71,753.3
W 1471 1492 3899 7603 14,358 35,108 70,996
Std 13.7 44.9 100.9 145.6 367.0 270.6 470.4
SA_Ez B 1108 1280 3307 7106 13,510 32,757 38,744
M 976.3 1229.3 3102.7 6827.6 13,022.1 31,874.1 36,757.2
W 530 1049 2670 6471 12,362 30,484 34,668
Std 152.7 86.4 212.8 231.6 300.2 661.6 1334.7
Proposed by the present work SAKP01_1 B 1308 1386 3347 6672 12,134 28,361 54,519
M 1174.3 1230.4 3132.1 6247.7 11,441.7 28,030.2 53,469.6
W 1041 1084 2874 5724 10,400 27,376 50,330
Std 85.1 109.2 123.8 255.5 597.8 281.2 1142.0
SAKP01_2 B 1514 1620 4276 8189 15,401 36,104 72,377
M 1479 1564.4 4051.9 7948.5 15,020.1 35,623.2 71,896.9
W 1445 1492 3846 7588 14,490 35,195 70,885
Std 23.8 49.0 120.2 155.7 301.3 293.6 461.3
SAKP01_3 B 1320 1470 3739 7312 14,355 34,538 70,476
M 1259.3 1388.9 3462.6 7191 14,133.5 34,368.4 70,284.5
W 1079 1294 3325 7058 13,976 34,200 70,131
Std 64.6 55.2 105.5 64.3 104.8 104.7 93.6
PSA B 1514 1633 4381 8539 16,622 38,313 76,108
M 1513 1620.9 4329.8 8454.8 16,351.4 37,952 75,250.1
W 1513.1 1608 4292 8388 16,154 37,755 74,511
Std 0.3 9.5 30.7 53.4 135.1 163.4 422.0

The bold numbers are the best obtained values

deviation) among all SA-based solvers on weakly correlated high-dimensional instances, (b) the stability of the proposed PSA got lower when it was executed on weakly correlated instances. Also, the competitiveness of the proposed SSAs and PSA for this data set can be shown by the logical phrase below:

SA_Ez < SAKP01_1 < SAKP01_3 < SA_So < SA_Zh < SAKP01_2 < PSA.

4.2.5 Strongly correlated high-dimensional KP01

Finally, in this section, the performance of the proposed SSAs and PSA algorithms is compared with SA_Ez [58], SA_So [43], and SA_Zh [48] on the strongly correlated high-dimensional KP01 data set. The results of the comparative study on this data set are given in Table 12. From Table 12, two facts can be elicited: (a) PSA is the most efficient solver in terms of best-found solution and stability (low standard deviation) among all SA-based solvers on strongly correlated high-dimensional instances, (b) the stability of the proposed PSA is acceptable, unlike in the previous section. Also, the competitiveness of the proposed SSAs and PSA for this data set can be shown by the logical phrase below:

SA_Ez < SAKP01_1 < SA_So < SAKP01_3 < SAKP01_2 < SA_Zh < PSA.


Table 12  Comparative study of the SA-based solvers for KP01 on strongly correlated high-dimensional KP01 instances
SA-based solvers for KP01 Strongly correlated high-dimensional KP01 instances
Ins p35 p36 p37 p38 p39 p40 p41
Size 100 200 500 1000 2000 5000 10,000
BKS 2397 2697 7117 14,390 28,919 72,505 146,919

Proposed by the literature SA_So B 2297 2697 7017 14,290 28,918 72,405 146,318
M 2254 2693.7 7013.6 14,287.3 28,914.2 72,367.2 146,263
W 2192 2685 7008 14,281 28,913 72,297 146,207
Std 48.9 3.5 3.0 2.6 1.9 54.9 50.1
SA_Zh B 2397 2697 7117 14,290 28,919 72,405 146,519
M 2397 2696.6 7096.2 14,289.6 28,915.8 72,404.1 146,355.2
W 2397 2695 7016 14,287 28,913 72,398 146,304
Std 0 0.8 39.9 0.9 2.6 2.1 68.1
SA_Ez B 1952 2688 6483 12,686 23,890 57,090 109,214
M 1952 2567.4 6041.6 11,586.2 22,744.7 53,235.9 106,604.7
W 1952 2360 5413 10,387 21,516 51,198 102,119
Std 0 108.8 360.6 818.2 662.5 1740.5 1948.6
Proposed by the present work SAKP01_1 B 1494 1597 3484 6390 13,019 30,605 60,719
M 1314 1355.9 3232.6 6107.9 12,206.4 30,214.2 60,157.9
W 1095 1097 2988 5989 11,919 29,903 59,428
Std 153.0 156.0 146.9 119.6 304.3 198.2 400.6
SAKP01_2 B 2397 2697 7117 14,290 28,919 72,405 146,516
M 2397 2696.3 7065.2 14,289.4 28,916.6 72,382.6 146,346
W 2397 2694 7013 14,286 28,913 72,187 146,217
Std 0 1.1 49.3 1.3 2.5 65.2 97.3
SAKP01_3 B 2396 2697 7016 14,290 28,919 72,405 146,314
M 2290.1 2666.3 6995.3 14,282.4 28,909.7 72,364 146,267
W 2181 2567 6851 14,258 28,901 72,267 146,200
Std 63.1 48.1 48.2 9.7 5.9 49.0 49.5
PSA B 2397 2697 7117 14,390 28,919 72,505 146,818
M 2397 2697 7117 14,380 28,919 72,505 146,593.7
W 2397 2697 7117 14,290 28,919 72,505 146,515
Std 0 0 0 30.0 0 0 52.2

The bold numbers are the best obtained values

Fig. 12  Convergence history of SA-based solvers for p27 (vertical axis: fitness function, horizontal axis: computational time)


Fig. 13  Convergence history of SA-based solvers for p34 (vertical axis: fitness function, horizontal axis: computational time)

Fig. 14  Convergence history of SA-based solvers for p41 (vertical axis: fitness function, horizontal axis: computational time)

Fig. 15  Convergence history of SA-based solvers for p27 with more details (vertical axis: fitness function, horizontal axis: generation)

5 Discussion

By the computational results, some facts can be seen: (a) the proposed PSA is the most efficient optimization algorithm for KP01 among all the SA-based solvers from the literature, (b) the performance of each algorithm depends directly on the type of the high-dimensional KP01 instances (uncorrelated, weakly or strongly correlated), although PSA was the best solver in all data sets and all instances, (c) PSA is more efficient than its single-solution-based version (SAKP01_2) since PSA starts with various initial solutions generated by either random or greedy approaches and explores the solution space by both mutation and crossover operators; as a result, its exploration and exploitation are stronger than those of the other SA-based algorithms. Finally, the convergence history of all the SA-based solvers for p27, p34 and p41 is shown in Figs. 12, 13 and 14, respectively. Also, Fig. 15 shows the detail of the convergence history of the algorithms in the last generations. In Figs. 12, 13, 14 and 15, some facts can be observed: (a) the fitness function values of PSA and SA_Zh are very close after the fourth second. Their initial solution is better than that of the other algorithms since their approach for initial solution generation is GISP, (b) SA_Ez and SAKP01_1 are not as strong as the other SA-based algorithms. Also, SA_Ez became trapped in a local optimum during the execution, (c) PSA not only starts the optimization process with a high-quality initial solution but also explores the solution space purposefully.
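The greedy repair-and-improvement (RI) step credited for this purposeful exploration can be sketched as follows. This is an assumed form of the hybrid greedy repair described in [31, 48] (drop the worst profit/weight items until feasible, then refill with the best-ratio items that still fit), not the authors' exact operator:

```python
def greedy_repair_improve(x, profits, weights, capacity):
    """Repair an infeasible KP01 solution, then greedily improve it."""
    ratio_order = sorted(range(len(x)), key=lambda i: profits[i] / weights[i])
    load = sum(w for w, xi in zip(weights, x) if xi)
    # Repair: remove the worst profit/weight items until feasible
    for i in ratio_order:
        if load <= capacity:
            break
        if x[i]:
            x[i], load = 0, load - weights[i]
    # Improve: insert the best-ratio items that still fit
    for i in reversed(ratio_order):
        if not x[i] and load + weights[i] <= capacity:
            x[i], load = 1, load + weights[i]
    return x
```

Starting from the infeasible [1, 1, 1] with profits [10, 5, 15], weights [2, 3, 5], and capacity 5, this routine returns the feasible [1, 1, 0]; applying it after every mutation or crossover keeps the whole population feasible and of high quality, which is exactly the behavior the convergence plots attribute to PSA.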


6 Conclusion and future studies

In this paper, the efficient SSAs and PSA optimization algorithms were proposed for KP01, which can solve small to large problems of KP01 by reaching solutions with high quality. First, six variants of SSA were designed for KP01, according to the literature. After parameter tuning, these six algorithms and a new PSA solver for KP01 were compared with state-of-the-art meta-heuristics from the literature. All these meta-heuristics were among the latest efficient algorithms designed for KP01 and include both non-SA-based and SA-based solvers. By conducting miscellaneous experiments, four facts can be concluded: (a) the proposed PSA is the most efficient optimization algorithm for KP01 among all SA-based solvers, (b) the performance of each algorithm depends directly on the type of the high-dimensional KP01 instances; for example, PSA was more stable on the strongly correlated high-dimensional data set than on weakly correlated instances, although it was the best solver in all data sets and all instances, (c) in addition to PSA, SAKP01_2 and SA_Zh were the most efficient among the SSA algorithms, where both used the greedy RI mechanism to find the neighbor solutions, (d) the exploration and exploitation of PSA are stronger than those of the other SA-based algorithms since it generates several initial solutions instead of one and finds the neighbor solutions by RI-based mutation and crossover operators to generate feasible and high-quality neighborhoods. Suffice it to say that RI is an efficient approach to generate the neighbor solutions, strengthens the algorithm to find high-quality solutions quickly, and is an efficient mechanism to overcome the infeasibility of the generated solutions. Hence, the next versions of SA algorithms for KP01 can enhance their performance by designing a population-based version of them and choosing the proper approaches for the initial solution phase and local search policy.

For future studies, the implementation of SA for other variants of KP, like MKP, quadratic KP, and so on, would be interesting. Also, developing new initial solution generation strategies and policies for overcoming the infeasible solutions of KP01 can be useful. Moreover, employing other powerful meta-heuristic algorithms, including population-based ones, such as the intelligent water drops algorithm [65, 66] and the cuckoo optimization algorithm [67], and comparing their results with the currently obtained outcomes could be helpful and provide a better vision.

Compliance with ethical standards

Conflict of interest  The authors reported no potential conflict of interest.

References

1. Abdel-Basset M, El-Shahat D, Faris H, Mirjalili S (2019) A binary multi-verse optimizer for 0–1 multidimensional knapsack problems with application in interactive multimedia systems. Comput Ind Eng 132:187–206
2. Khemaja M, Khalfallah S (2019) Towards a knapsack model for optimizing e-training services delivery: application to hybrid intelligent tutoring systems. Procedia Comput Sci 164:257–264
3. Samavati M, Essam D, Nehring M, Sarker R (2017) A methodology for the large-scale multi-period precedence-constrained knapsack problem: an application in the mining industry. Int J Prod Econ 193:12–20
4. Simon J, Apte A, Regnier E (2017) An application of the multiple knapsack problem: the self-sufficient marine. Eur J Oper Res 256:868–876
5. Fischetti M, Ljubić I, Monaci M, Sinnl M (2019) Interdiction games and monotonicity, with application to knapsack problems. INFORMS J Comput 31:390–410
6. Khemaja M (2016) Using a knapsack model to optimize continuous building of a hybrid intelligent tutoring system: application to information technology professionals. Int J Hum Cap Inf Technol Prof (IJHCITP) 7:1–18
7. Abdel-Basset M, El-Shahat D, El-Henawy I (2019) Solving 0–1 knapsack problem by binary flower pollination algorithm. Neural Comput Appl 31:5477–5495
8. Bazgan C, Hugot H, Vanderpooten D (2009) Solving efficiently the 0–1 multi-objective knapsack problem. Comput Oper Res 36:260–279
9. Gandibleux X, Freville A (2000) Tabu search based procedure for solving the 0–1 multiobjective knapsack problem: the two objectives case. J Heuristics 6:361–383
10. Fleszar K, Hindi KS (2009) Fast, effective heuristics for the 0–1 multi-dimensional knapsack problem. Comput Oper Res 36:1602–1607
11. Chu PC, Beasley JE (1998) A genetic algorithm for the multidimensional knapsack problem. J Heuristics 4:63–86
12. Ünal AN, Kayakutlu G (2016) A Partheno-genetic algorithm for dynamic 0–1 multidimensional knapsack problem. RAIRO Oper Res 50:47–66
13. Dell'Amico M, Delorme M, Iori M, Martello S (2019) Mathematical models and decomposition methods for the multiple knapsack problem. Eur J Oper Res 274:886–899
14. He C, Leung JY, Lee K, Pinedo ML (2016) An improved binary search algorithm for the multiple-choice knapsack problem. RAIRO Oper Res 50:995–1001
15. Chen Y, Hao J-K (2017) An iterated "hyperplane exploration" approach for the quadratic knapsack problem. Comput Oper Res 77:226–239
16. König D, Lohrey M, Zetzsche G (2016) Knapsack and subset sum problems in nilpotent, polycyclic, and co-context-free groups. Algebra Comput Sci 677:138–153
17. Bortfeldt A, Winter T (2009) A genetic algorithm for the two-dimensional knapsack problem with rectangular pieces. Int Trans Oper Res 16:685–713
18. Dantzig GB (1957) Discrete-variable extremum problems. Oper Res 5:266–288
19. Pisinger D (1995) An expanding-core algorithm for the exact 0–1 knapsack problem. Eur J Oper Res 87:175–187
20. Della Croce F, Salassa F, Scatamacchia R (2017) An exact approach for the 0–1 knapsack problem with setups. Comput Oper Res 80:61–67
21. Shaheen A, Sleit A (2016) Comparing between different approaches to solve the 0/1 knapsack problem. Int J Comput Sci Netw Secur (IJCSNS) 16:1


22. Awasthi Y (2020) Contrasting of various algorithmic techniques to solve knapsack 0–1 problem. J Syst Integr 10:1–9
23. Shen J, Shigeoka K, Ino F, Hagihara K (2019) GPU-based branch-and-bound method to solve large 0–1 knapsack problems with data-centric strategies. Concurr Comput Pract Exp 31:e4954
24. Shen J, Shigeoka K, Ino F, Hagihara K (2017) An out-of-core branch and bound method for solving the 0–1 knapsack problem on a GPU. In: International conference on algorithms and architectures for parallel processing. Springer, pp 254–267
25. Kolahan F, Kayvanfar V (2009) A heuristic algorithm approach for scheduling of multi-criteria unrelated parallel machines. In: Proceeding of international conference on industrial and mechanical engineering—ICIME09 (World Academy of Science, Engineering and Technology, WASET), vol 59, pp 102–105
26. Sapre S, Patel H, Vaishnani K, Thaker R, Shastri AS (2019) Solution to small size 0–1 knapsack problem using cohort intelligence with educated approach, socio-cultural inspired metaheuristics. Springer, Berlin, pp 137–149
27. Abdel-Basset M, El-Shahat D, Sangaiah AK (2019) A modified nature inspired meta-heuristic whale optimization algorithm for solving 0–1 knapsack problem. Int J Mach Learn Cybern 10:495–514
28. Gómez-Herrera F, Ramirez-Valenzuela RA, Ortiz-Bayliss JC, Amaya I, Terashima-Marín H (2017) A quartile-based hyper-heuristic for solving the 0/1 knapsack problem. In: Mexican international conference on artificial intelligence. Springer, pp 118–128
29. Hu F (2018) A probabilistic solution discovery algorithm for solving 0–1 knapsack problem. Int J Parallel Emerg Distrib Syst 33:618–626
30. Gao Y, Zhang F, Zhao Y, Li C (2018) Quantum-inspired wolf pack algorithm to solve the 0–1 knapsack problem. In: Mathematical problems in engineering, 2018
31. Zhan S, Wang L, Zhang Z, Zhong Y (2020) Noising methods with hybrid greedy repair operator for 0–1 knapsack problem. Memet Comput 12:37–50
32. Truong TK, Li K, Xu Y, Ouyang A, Nguyen TT (2015) Solving 0–1 knapsack problem by artificial chemical reaction optimization algorithm with a greedy strategy. J Intell Fuzzy Syst 28:2179–2186
33. Feng Y, Wang G-G, Deb S, Lu M, Zhao X-J (2017) Solving 0–1 knapsack problem by a novel binary monarch butterfly optimization. Neural Comput Appl 28:1619–1634
34. Abdel-Basset M, Mohamed R, Mirjalili S (2020) A binary equilibrium optimization algorithm for 0–1 knapsack problems. Comput Ind Eng 106946. https://doi.org/10.1016/j.cie.2020.106946
35. Abdel-Basset M, Mohamed R, Chakrabortty RK, Ryan M, Mirjalili S (2020) New binary marine predators optimization algorithm for 0–1 knapsack problems. Comput Ind Eng 106949. https://doi.org/10.1016/j.cie.2020.106949
36. Zhang X, Huang S, Hu Y, Zhang Y, Mahadevan S, Deng Y (2013) Solving 0–1 knapsack problems based on amoeboid organism algorithm. Appl Math Comput 219:9959–9970
37. Bhattacharjee KK, Sarmah SP (2017) Modified swarm intelligence based techniques for the knapsack problem. Appl Intell 46:158–179
38. Rizk-Allah RM, Hassanien AE (2018) New binary bat algorithm for solving 0–1 knapsack problem. Complex Intell Syst 4:31–53
39. Cao J, Yin B, Lu X, Kang Y, Chen X (2018) A modified artificial bee colony approach for the 0–1 knapsack problem. Appl Intell 48:1582–1595
40. Zhou Y, Bao Z, Luo Q, Zhang S (2017) A complex-valued encoding wind driven optimization for the 0–1 knapsack problem. Appl Intell 46:684–702
41. Zhou Y, Li L, Ma M (2016) A complex-valued encoding bat algorithm for solving 0–1 knapsack problem. Neural Process Lett 44:407–430
42. Kong X, Gao L, Ouyang H, Li S (2015) A simplified binary harmony search algorithm for large scale 0–1 knapsack problems. Expert Syst Appl 42:5337–5355
43. Sonuc E, Sen B, Bayir S (2016) A parallel approach for solving 0/1 knapsack problem using simulated annealing algorithm on CUDA platform. Int J Comput Sci Inf Secur 14:1096
44. Zhou Y, Chen X, Zhou G (2016) An improved monkey algorithm for a 0–1 knapsack problem. Appl Soft Comput 38:817–830
45. Sajedi H, Razavi SF (2017) DGSA: discrete gravitational search algorithm for solving knapsack problem. Oper Res Int J 17:563–591
46. Feng L, Gupta A, Ong Y-S (2019) Compressed representation for higher-level meme space evolution: a case study on big knapsack problems. Memet Comput 11:3–17
47. Zhang L, Lv J (2018) A heuristic algorithm based on expectation efficiency for 0–1 knapsack problem. Int J Innov Comput Inf Control 14:1833–1854
48. Zhan S-H, Zhang Z-J, Wang L-J, Zhong Y-W (2018) List-based simulated annealing algorithm with hybrid greedy repair and optimization operator for 0–1 knapsack problem. IEEE Access 6:54447–54458
49. Abdel-Basset M, Zhou Y (2018) An elite opposition-flower pollination algorithm for a 0–1 knapsack problem. Int J Bio-Inspired Comput 11:46–53
50. Feng Y, Yang J, Wu C, Lu M, Zhao X-J (2018) Solving 0–1 knapsack problems by chaotic monarch butterfly optimization algorithm with Gaussian mutation. Memet Comput 10:135–150
51. Li J, Li W (2018) A new quantum evolutionary algorithm in 0–1 knapsack problem. In: International symposium on intelligence computation and applications. Springer, pp 142–151
52. Feng Y, Wang G-G, Dong J, Wang L (2018) Opposition-based learning monarch butterfly optimization with Gaussian perturbation for large-scale 0–1 knapsack problem. Comput Electr Eng 67:454–468
53. Nouioua M, Li Z, Jiang S (2018) New binary artificial bee colony for the 0–1 knapsack problem. In: International conference on swarm intelligence. Springer, pp 153–165
54. Buayen P, Werapun J (2018) Parallel time–space reduction by unbiased filtering for solving the 0/1-knapsack problem. J Parallel Distrib Comput 122:195–208
55. Huang Y, Wang P, Li J, Chen X, Li T (2019) A binary multi-scale quantum harmonic oscillator algorithm for 0–1 knapsack problem with genetic operator. IEEE Access 7:137251–137265
56. Xue J, Xiao J, Zhu J (2019) Binary fireworks algorithm for 0–1 knapsack problem. In: 2019 international conference on artificial intelligence and advanced manufacturing (AIAM). IEEE, pp 218–222
57. Ye L, Zheng J, Guo P, Pérez-Jiménez MJ (2019) Solving the 0–1 knapsack problem by using tissue P system with cell division. IEEE Access 7:66055–66067
58. Ezugwu AE, Pillay V, Hirasen D, Sivanarain K, Govender M (2019) A comparative study of meta-heuristic optimization algorithms for 0–1 knapsack problem: some initial results. IEEE Access 7:43979–44001
59. Czyżak P, Jaszkiewicz A (1998) Pareto simulated annealing—a metaheuristic technique for multiple-objective combinatorial optimization. J Multi-Criteria Decis Anal 7:34–47
60. Nahar S, Sahni S, Shragowitz E (1986) Simulated annealing and combinatorial optimization. In: 23rd ACM/IEEE design automation conference. IEEE, pp 293–299
61. Yip PP, Pao Y-H (1995) Combinatorial optimization with use of guided evolutionary simulated annealing. IEEE Trans Neural Netw 6:290–295


62. Glover FW, Kochenberger GA (2006) Handbook of metaheuristics. Springer Science & Business Media, Berlin
63. Kort BW, Bertsekas DP (1972) A new penalty function method for constrained minimization. In: Proceedings of the 1972 IEEE conference on decision and control and 11th symposium on adaptive processes. IEEE, pp 162–166
64. Silberholz J, Golden B (2010) Comparison of metaheuristics, Handbook of metaheuristics. Springer, Berlin, pp 625–640
65. Kayvanfar V, Moattar Husseini SM, Karimi B, Sajadieh MS (2017) Bi-objective intelligent water drops algorithm to a practical multi-echelon supply chain optimization problem. J Manuf Syst 44(1):93–114
66. Teymourian E, Kayvanfar V, Komaki GHM, Khodarahmi M (2016) An enhanced intelligent water drops algorithm for scheduling of an agile manufacturing system. Int J Inf Technol Decis Mak 15(2):239–266
67. Shahdi-Pashaki S, Teymourian E, Kayvanfar V, Komaki GHM, Sajadi A (2015) Group technology-based model and cuckoo optimization algorithm for resource allocation in cloud computing. In: Proceedings of 15th IFAC symposium on information control problems in manufacturing, Ottawa, Canada, May 2015

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.