
Information Processing Letters 120 (2017) 6–10


Notes on a hierarchical scheduling problem on identical machines ✩

Cheng He ∗, Hao Lin

School of Science, Henan University of Technology, Zhengzhou, Henan 450052, China

Article history:
Received 13 September 2015
Received in revised form 23 February 2016
Accepted 8 December 2016
Available online 21 December 2016
Communicated by Nathan Fisher

Keywords: Hierarchical scheduling; Identical machines; Total flowtime; Worst-case ratio; Approximation algorithms

Abstract: We consider the hierarchical scheduling problem on identical machines to minimize the maximum T-time of all machines under the condition that the total completion time of all jobs is minimum, where the T-time of a machine is defined as the total completion time of the jobs scheduled on that machine. The problem is NP-hard if the number of machines is fixed, and strongly NP-hard otherwise. When the number of machines is fixed, a forward dynamic programming algorithm and a fully polynomial-time approximation scheme (FPTAS) have been presented in the literature, where it is also shown that the worst-case ratio of the classical algorithm SPT is at most 11/6 and at least 5/3. In this paper, we give an improved worst-case ratio for the algorithm, which is at most 9/5 and at least 7/4. Another algorithm, whose worst-case ratio is at most 7/6 and at least 35/33, is provided for the two-machine case. In addition, we present a backward dynamic programming algorithm and an FPTAS with better time complexities.

© 2016 Elsevier B.V. All rights reserved.

✩ This work was supported by NSFC (11201121, 11571323) and NSFSTDOHN (162300410221).
∗ Corresponding author. E-mail address: hech202@163.com (C. He).

http://dx.doi.org/10.1016/j.ipl.2016.12.001

1. Introduction

There are n jobs J_1, J_2, ..., J_n, with processing times p_1, p_2, ..., p_n, to be processed on m identical machines M_1, M_2, ..., M_m without preemption. A feasible schedule is a schedule that processes the jobs on the machines non-preemptively. Let σ = (π_1, π_2, ..., π_m) be a schedule of the problem, where π_i is the sequence of jobs on machine M_i (1 ≤ i ≤ m). Denote by C_j(π_i) the completion time of job J_j on machine M_i. Then the flowtime of machine M_i is ∑_{j∈π_i} C_j(π_i). From this, the two objective functions considered in this paper are the total flowtime

  ∑C_j(σ) := ∑_{1≤i≤m} ∑_{j∈π_i} C_j(π_i)

and the maximum flowtime

  (∑C_j(σ))_max := max_{1≤i≤m} ∑_{j∈π_i} C_j(π_i).

The makespan of σ is C_max(σ) = max_{1≤i≤m, j∈π_i} C_j(π_i).

In this paper, we focus on the hierarchical scheduling problem on m identical machines to minimize the maximum flowtime under the condition that the total flowtime is minimum, denoted by Pm||Lex(∑C_j, (∑C_j)_max) (called problem P for short) following the three-field notation of [5]. When m is part of the input, the problem is denoted by P||Lex(∑C_j, (∑C_j)_max) (called problem P̄ for short). [1] showed that the problem Pm||(∑C_j)_max is NP-hard and that the worst-case ratio of algorithm SPT for the problem is at most 3 − 3/m + 1/m², and so at most 3 for P||(∑C_j)_max. [6] proved that P||(∑C_j)_max is strongly NP-hard and that the worst-case ratio of algorithm SPT for the problem is at most 2.608. [7] presented the following results for problem P and problem P̄.


(1) Problem P is NP-hard and problem P̄ is strongly NP-hard.
(2) The worst-case ratio of algorithm SPT for problem P is at most 11/6 and at least 5/3.
(3) The worst-case ratio of algorithm RSPT for problem P is at most 3/2 and at least 11/9.
(4) A forward dynamic programming algorithm with O(m!·n^{m+1}·P_s^{2m}) time and an FPTAS with O(m!·n^{m+1}·((m+n)n/ε)^{2m}) time for problem P, where P_s = ∑_{j=1}^{n} p_j.

In the present paper, we improve the worst-case ratio of algorithm SPT for problem P so that its upper bound and lower bound are 9/5 and 7/4, respectively. For m = 2, we present a better algorithm, called Algorithm DLPT, and deduce that its worst-case ratio is at most 7/6 and at least 35/33. Moreover, we present an O(m!·n^{m+1}·P_s^m)-time backward dynamic programming algorithm and an FPTAS with O(m!·n^{2m+1}/ε^m) time for problem P.

The paper is organized as follows. An improved worst-case ratio of Algorithm SPT is discussed in Section 2. In Section 3, we present a backward dynamic programming algorithm and an FPTAS for problem P. In Section 4, another algorithm, called Algorithm DLPT, is provided for m = 2. We deduce that the worst-case ratio of Algorithm DLPT is at most 7/6 and at least 35/33.

2. Algorithm SPT

Recall that we have n jobs and m machines. We may assume that n = km for some positive integer k (otherwise we may add some dummy jobs with processing time 0; the dummy jobs are scheduled first on the machines without affecting the two objectives in any schedule). Without loss of generality, we assume that p_1 ≥ p_2 ≥ ··· ≥ p_n. We partition the job set J = {J_1, J_2, ..., J_n} into k ranks, where R_j = {J_{(j−1)m+1}, J_{(j−1)m+2}, ..., J_{(j−1)m+m}} is the j-th rank, j = 1, ..., k.

Let π be the schedule obtained from a schedule σ by interchanging the positions of two jobs with the same processing time. Then π is essentially the same as σ. So, throughout the paper, we regard schedules that coincide up to permutations among jobs of the same processing time as the same schedule. Suppose that the optimal value of P||∑C_j is T*. Then P||Lex(∑C_j, (∑C_j)_max) ⇔ P|∑C_j ≤ T*|(∑C_j)_max.

Definition 2.1. A schedule σ is called Quasi-SPT (Q-SPT for short) if σ satisfies the following three conditions.

• Each machine receives exactly one job from R_j, j = 1, ..., k.
• Jobs on each machine are scheduled in non-decreasing order of processing time.
• There is no idle time.

Lemma 2.2. ([2]) A schedule σ is Q-SPT if and only if σ is an optimal schedule of problem P||∑C_j.

Lemma 2.2 implies that solving the problem P|∑C_j ≤ T*|(∑C_j)_max is equivalent to finding a Q-SPT optimal schedule of P||(∑C_j)_max. Hence we confine our attention to Q-SPT schedules in the following.

Let J(i) be the current set of jobs assigned to M_i and TM_i be the sum of the processing times of the jobs assigned to M_i at present, i.e., TM_i = ∑_{j∈J(i)} p_j.

Algorithm SPT.
Step 0: Let TM_i := 0 and J(i) := ∅ for i = 1, ..., m, and let j := n.
Step 1: Let TM_{i_0} = min_{1≤i≤m} {TM_i} (if there is a tie, choose the minimum such i_0 for which M_{i_0} is different from the last machine chosen to schedule a job). Let J(i_0) := J(i_0) ∪ {J_j} and TM_{i_0} := TM_{i_0} + p_j, and schedule job J_j at the end of the current schedule on M_{i_0}.
Step 2: If j > 1, then let j := j − 1 and go back to Step 1. Otherwise stop.

Lemma 2.3. The schedule derived by Algorithm SPT is a Q-SPT schedule.

Proof. Obviously, the m jobs J_n, J_{n−1}, ..., J_{n−m+1}, i.e., J_{km}, J_{km−1}, ..., J_{(k−1)m+1} (for n = km) in R_k are scheduled first on M_1, M_2, ..., M_m, respectively, by Algorithm SPT. Then by Algorithm SPT, we have

Claim 1. Job J_{lm−i} in R_l is scheduled on M_{i+1} for 1 ≤ l ≤ k and 0 ≤ i ≤ m − 1.

Proof of Claim 1. Algorithm SPT schedules the jobs in R_k, R_{k−1}, ..., R_1 rank by rank. We prove Claim 1 by downward induction on the rank index l. The base case, l = k, is obvious. Assuming that Claim 1 holds for 2 ≤ l ≤ k − 1, we show that it also holds for l = 1. By this assumption and p_1 ≥ p_2 ≥ ··· ≥ p_n, we have TM_1 ≤ TM_2 ≤ ··· ≤ TM_m just before the jobs in R_1 are scheduled. By Algorithm SPT, the jobs J_m, J_{m−1}, ..., J_1 in R_1 are scheduled one by one. So job J_m in R_1 is scheduled on M_1. Further, at this moment TM_1 + p_m = p_{km} + p_{(k−1)m} + ··· + p_m ≥ p_{km} + p_{(k−1)m+1} + ··· + p_{m+1} = p_{km} + TM_m ≥ TM_m ≥ TM_{m−1} ≥ ··· ≥ TM_2. Hence the next job, J_{m−1} in R_1, is scheduled on M_2. Similarly, we may prove that J_{m−2}, J_{m−3}, ..., J_1 in R_1 are scheduled on M_3, M_4, ..., M_m, respectively. Therefore Claim 1 also holds for l = 1. □

By Claim 1 and Algorithm SPT, the schedule derived by Algorithm SPT is a Q-SPT schedule. □

So the schedule derived by Algorithm SPT is a feasible schedule for problem P|∑C_j ≤ T*|(∑C_j)_max. By a more elaborate analysis of the upper bound on the worst-case ratio of Algorithm SPT in [7], we obtain better upper and lower bounds.

Theorem 2.4. The worst-case ratio of Algorithm SPT for the problem P|∑C_j ≤ T*|(∑C_j)_max is at most 9/5 and at least 7/4.

Proof. Without loss of generality, we may suppose that k ≥ 4 by adding some dummy jobs with processing time 0.
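As an aside, the rank-by-rank behaviour of Algorithm SPT established in Claim 1 can be implemented directly. The following sketch (all function and variable names are ours, not the paper's) assigns each rank's jobs shortest-first to the currently least-loaded machines, which is the assignment Claim 1 proves the algorithm produces, and computes the maximum flowtime of the resulting schedule.

```python
# Sketch of Algorithm SPT. By Claim 1 the algorithm fills the ranks
# R_k, ..., R_1 round-robin, so we assign each rank's jobs directly:
# shortest remaining job of the rank to the currently least-loaded machine.

def spt_schedule(p, m):
    """p: processing times; returns m job sequences in processing order."""
    p = sorted(p, reverse=True)          # p_1 >= p_2 >= ... >= p_n
    p += [0] * ((-len(p)) % m)           # dummy jobs so that n = k*m
    k = len(p) // m
    ranks = [p[l * m:(l + 1) * m] for l in range(k)]   # R_1, ..., R_k
    load = [0] * m                       # TM_i: total processing time on M_i
    seq = [[] for _ in range(m)]
    for rank in reversed(ranks):         # schedule R_k first, R_1 last
        order = sorted(range(m), key=lambda i: (load[i], i))
        for mach, job in zip(order, sorted(rank)):
            seq[mach].append(job)        # shortest job -> least-loaded machine
            load[mach] += job
    return seq

def max_flowtime(schedule):
    """(sum C_j)_max: the job in position q (0-based) counts len - q times."""
    return max(sum((len(pi) - q) * pj for q, pj in enumerate(pi))
               for pi in schedule)
```

On the 13-job instance used in the proof of Theorem 2.4 (processing times 75, four 39s, four 21s and four 9s, with m = 4) this sketch yields a maximum flowtime of 252, matching the value stated there; the optimal value is 144, giving the ratio 7/4.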

Let σ = (π_1, π_2, ..., π_m) be the schedule obtained by Algorithm SPT and σ* = (π*_1, π*_2, ..., π*_m) be an optimal schedule for P|∑C_j ≤ T*|(∑C_j)_max. From Claim 1, we have π_{i+1} = {J_{km−i}, J_{(k−1)m−i}, ..., J_{m−i}} for 0 ≤ i ≤ m − 1. For convenience, set x_j = p_{(j−1)m+1}, j = 1, ..., k. Then x_1 ≥ x_2 ≥ ··· ≥ x_k and, by p_1 ≥ p_2 ≥ ··· ≥ p_n,

  (∑C_j(σ))_max = ∑_{j=1}^{k} j·p_{(j−1)m+1} = ∑_{j=1}^{k} j·x_j.   (1)

For j ∈ {1, 2, ..., k}, assume that job J_{(j−1)m+1} is scheduled on machine M_{j'} in σ*. Let the processing times of the k jobs scheduled on M_{j'} be y_1, y_2, ..., y_k, respectively, with y_1 ≥ y_2 ≥ ··· ≥ y_k. Hence x_j = y_j, and y_i ≥ x_{i+1} for 1 ≤ i ≤ k − 1 by Lemma 2.2. Note that (∑C_j(σ*))_max ≥ ∑_{j∈π*_{j'}} C_j(π*_{j'}) = ∑_{i=1}^{k} i·y_i. In particular, for j = 1, 2, 3 we obtain

  (∑C_j(σ*))_max ≥ x_1 + ∑_{i=2}^{k−1} i·x_{i+1},   (2)

  (∑C_j(σ*))_max ≥ 3x_2 + ∑_{i=3}^{k−1} i·x_{i+1},   (3)

  (∑C_j(σ*))_max ≥ x_2 + 5x_3 + ∑_{i=4}^{k−1} i·x_{i+1}.   (4)

From (1)–(4), we have

  (∑C_j(σ))_max = (x_1 + ∑_{i=2}^{k−1} i·x_{i+1}) + (2x_2 + x_3 + ∑_{i=4}^{k} x_i)
    ≤ (∑C_j(σ*))_max + (3/5)·(3x_2 + ∑_{i=3}^{k−1} i·x_{i+1}) + (1/5)·(x_2 + 5x_3 + ∑_{i=4}^{k−1} i·x_{i+1})
    ≤ (∑C_j(σ*))_max + (3/5)·(∑C_j(σ*))_max + (1/5)·(∑C_j(σ*))_max
    ≤ (9/5)·(∑C_j(σ*))_max.

The following example shows that 7/4 is a lower bound on the worst-case ratio of Algorithm SPT. There are 13 jobs, with processing times 75, 39, 39, 39, 39, 21, 21, 21, 21, 9, 9, 9, 9, respectively, to be processed on 4 identical machines M_1, M_2, M_3 and M_4 without preemption. Fig. 1.1 and Fig. 1.2 show the schedule generated by Algorithm SPT and the optimal schedule, respectively. We have (∑C_j(σ))_max = max_{1≤i≤4} ∑_{j∈π_i} C_j(π_i) = 252 and (∑C_j(σ*))_max = max_{1≤i≤4} ∑_{j∈π*_i} C_j(π*_i) = 144. Therefore (∑C_j(σ))_max/(∑C_j(σ*))_max = 7/4, which yields the desired result. □

Fig. 1.1. The schedule σ = (π_1, π_2, π_3, π_4) derived by Algorithm SPT.
Fig. 1.2. The optimal schedule σ* = (π*_1, π*_2, π*_3, π*_4).

3. Dynamic programming algorithm and FPTAS

Let P_s = ∑_{j=1}^{n} p_j. [7] presented an O(m!·n^{m+1}·P_s^{2m})-time forward dynamic programming algorithm and an FPTAS with O(m!·n^{m+1}·((m+n)n/ε)^{2m}) time for the problem P|∑C_j ≤ T*|(∑C_j)_max. In this section, we provide a backward dynamic programming algorithm and an FPTAS whose time complexities are smaller.

Let F_j be the j-th (1 ≤ j ≤ k + 1) state space, with the initial state space F_{k+1} = {[0, 0, ..., 0]}. For j = 1, ..., k, F_j is the set of state vectors [t_1, t_2, ..., t_m] of a schedule for the job set ∪_{i=1}^{k−j+1} R_i, where t_i is the flowtime of the jobs on M_i (1 ≤ i ≤ m). F_{j−1} can be derived recursively from F_j, j = 2, ..., k + 1. By Lemma 2.2, there are m! assignments of the jobs of R_{k−j+2} for one state vector from F_j, so at most m! different state vectors of F_{j−1} are generated from each state vector. Suppose that [t_1, t_2, ..., t_m] ∈ F_j and π is a permutation of {1, 2, ..., m} that represents an assignment of the jobs of R_{k−j+2} to the machines. Then the derived state vector [t'_1, t'_2, ..., t'_m] in F_{j−1} satisfies t'_i = t_i + (k − j + 2)·p_{(k−j+1)m+π(i)}, i = 1, 2, ..., m. The optimal objective value is min{max{t_1, t_2, ..., t_m} | [t_1, t_2, ..., t_m] ∈ F_1}.

We see that t_i ≤ nP_s for any [t_1, t_2, ..., t_m] ∈ F_j and any 1 ≤ i ≤ m. So |F_j| ≤ (nP_s)^m. It takes O(m!·m) time to compute the m! state vectors of F_{j−1} generated from each state vector of F_j. So the overall complexity of the dynamic programming is O(m!·n^{m+1}·P_s^m). From the dynamic programming we can design an FPTAS for P|∑C_j ≤ T*|(∑C_j)_max by the rounding technique.

Let I be an instance on job set J and machine set M. Let P_0 be the objective value obtained by applying Algorithm SPT to I, and let γ = 5εP_0/(9n²), where 0 < ε < 1. Let I' be the new instance with the same job set and machine set as I, but in which the processing time p'_j of job J_j is p'_j = ⌈p_j/γ⌉·γ, j = 1, 2, ..., n. Therefore p_j ≤ p'_j ≤ p_j + γ.

Let σ* and σ'* be the optimal schedules obtained by performing the dynamic programming on I and I', respectively. Let σ be the schedule obtained from σ'* by replacing the processing time p'_j of each job J_j by p_j (1 ≤ j ≤ n). Then ∑C_j(σ) ≤ ∑C_j(σ*) + nγ and (∑C_j(σ))_max ≤ (∑C_j(σ'*))_max ≤ (∑C_j(σ*))_max + n²γ ≤ (∑C_j(σ*))_max + (5/9)εP_0 ≤ (1 + ε)(∑C_j(σ*))_max by p_j ≤ p'_j ≤ p_j + γ and Theorem 2.4.

As for the running time, since Algorithm SPT runs in O(n log n) time, P_0 can be derived in O(n log n) time. When performing the dynamic programming on I', we have t_i ≤ (∑C_j(σ'*))_max ≤ P_0, and all the processing times in I' are multiples of γ, so each t_i takes one of at most l values.
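As an aside, the backward dynamic programming described above can be sketched compactly. In the Python sketch below (names are ours), a rank-R_l job contributes l times its processing time to its machine's flowtime, so the ranks are added from R_1 (the longest jobs, weight 1) up to R_k. We additionally store each state vector as a sorted tuple; this is a simplification we add because the machines are identical, and it can only shrink the state spaces F_j.

```python
# Sketch of the backward dynamic programming over Q-SPT schedules.
# States are machine-flowtime vectors; ranks R_1, ..., R_k are added in
# order, a rank-R_l job contributing l times its processing time.

from itertools import permutations

def backward_dp(p, m):
    """Minimum (sum C_j)_max over Q-SPT schedules. len(p) must be k*m
    and p must satisfy p[0] >= p[1] >= ... >= p[-1]."""
    k = len(p) // m
    states = {(0,) * m}
    for l in range(1, k + 1):
        rank = p[(l - 1) * m : l * m]          # the m jobs of R_l
        states = {
            tuple(sorted(t[i] + l * rank[pi[i]] for i in range(m)))
            for t in states
            for pi in permutations(range(m))   # m! assignments per state
        }
    return min(max(t) for t in states)
```

On the 13-job instance from the proof of Theorem 2.4 (padded with three dummy jobs of processing time 0 so that n = km), the sketch returns 144, the optimal maximum flowtime quoted there.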

Since l = t_i/γ ≤ P_0/γ = 9n²/(5ε), the total running time of obtaining schedule σ is O(m!·n^{2m+1}/ε^m). Hence we have

Theorem 3.1. The above dynamic programming solves the problem P|∑C_j ≤ T*|(∑C_j)_max in O(m!·n^{m+1}·P_s^m) time.

Theorem 3.2. There exists an O(m!·n^{2m+1}/ε^m)-time FPTAS for problem P|∑C_j ≤ T*|(∑C_j)_max.

4. Algorithm DLPT for the two-machine case

In this section, we provide a better algorithm, called Algorithm DLPT, for the two-machine case (i.e., m = 2). We show that the worst-case ratio of Algorithm DLPT is at most 7/6 and at least 35/33. Let n = 2k.

Let σ = (π_1, π_2) be any feasible schedule of P2|∑C_j ≤ T*|(∑C_j)_max and let p_{ji} be the processing time of the job scheduled in the j-th position on machine M_i (j = 1, ..., k; i = 1, 2). A schedule σ' = (π'_1, π'_2) is called the derived schedule of σ if σ' is obtained from σ by replacing each processing time p_{ji} by p'_{ji} := (k − j + 1)·p_{ji}, j = 1, ..., k; i = 1, 2. Then

Lemma 4.1. C_max(σ') = (∑C_j(σ))_max.

Proof. By the definition of σ', we have C_max(σ') = max{∑_{1≤j≤k} p'_{j1}, ∑_{1≤j≤k} p'_{j2}} = max{∑_{1≤j≤k} (k − j + 1)p_{j1}, ∑_{1≤j≤k} (k − j + 1)p_{j2}} = max{∑_{j∈π_1} C_j(π_1), ∑_{j∈π_2} C_j(π_2)} = (∑C_j(σ))_max. This completes the proof. □

Lemma 4.1 implies that solving the problem P2|∑C_j ≤ T*|(∑C_j)_max can be transformed, by changing the processing times of all jobs, into finding an optimal partition schedule of P2||C_max, that is, a schedule in which the two jobs J_{2i−1} and J_{2i} from R_i (1 ≤ i ≤ k) are scheduled on the different machines M_1 and M_2.

• Algorithm Partition: Given an instance I' with job set J' = {J_1, J_2, ..., J_n} and machine set M = {M_1, M_2}, in which J_j has processing time p'_j (j = 1, ..., n) and p'_{2i−1} ≥ p'_{2i} for 1 ≤ i ≤ k, construct a new instance I'' on job set J'' := {J_1, J_3, J_5, ..., J_{n−1}} and machine set M := {M_1, M_2}, where the processing time of J_{2i−1} is p''_{2i−1} := p'_{2i−1} − p'_{2i} for 1 ≤ i ≤ k. Let σ'' = (π''_1, π''_2) be any feasible schedule of P2||C_max with respect to job set J''. Let J(i) = {J_{2j−1} | 1 ≤ j ≤ k and J_{2j−1} ∈ π''_i} ∪ {J_{2j} | 1 ≤ j ≤ k and J_{2j−1} ∈ π''_{3−i}} for i = 1, 2. Let σ' be the schedule, with respect to job set J', in which the jobs in J(i) are processed on machine M_i in the order of non-increasing subscripts (i = 1, 2).

Lemma 4.2. ([3]) Let σ' be the schedule derived by Algorithm Partition. Then σ' is a feasible partition schedule of P2||C_max with respect to job set J', that is, the two jobs J_{2i−1} and J_{2i} are scheduled on the different machines M_1 and M_2. Moreover, C_max(σ') = C_max(σ'') + ∑_{1≤i≤k} p'_{2i}.

Lemma 4.2 implies that σ' is an optimal partition schedule of P2||C_max with respect to job set J' if and only if σ'' is an optimal schedule of P2||C_max with respect to job set J''.

• Algorithm LPT: First sort all jobs in the order of non-increasing processing times; then assign the first unprocessed job in the sequence to the machine that can process it earliest, until all jobs are processed.

Lemma 4.3. ([4]) The worst-case ratio of Algorithm LPT for the problem Pm||C_max is at most 4/3 − 1/(3m). In particular, the worst-case ratio of Algorithm LPT for the problem P2||C_max is at most 7/6.

Algorithm DLPT.
Step 0: Let I be an instance on job set J = {J_1, J_2, ..., J_n}, with p_1 ≥ p_2 ≥ ··· ≥ p_n, and machine set M = {M_1, M_2}. Construct a new instance I' with job set J' := J and machine set M := M, where p'_{2i−1} := i·p_{2i−1} and p'_{2i} := i·p_{2i} (obviously, p'_{2i−1} ≥ p'_{2i}) for 1 ≤ i ≤ k.
Step 1: Perform Algorithm Partition on instance I' to get instance I''. Let σ'' = (π''_1, π''_2) be the schedule obtained by performing Algorithm LPT on I'', and let σ' be the schedule obtained from σ'' by Algorithm Partition.
Step 2: Let σ be the schedule of instance I obtained from σ' by replacing each p'_j by p_j (1 ≤ j ≤ n).

By Lemma 4.2 and the definition of σ, the schedule σ derived by Algorithm DLPT is a Q-SPT schedule of P2||(∑C_j)_max on instance I, and σ' is the derived schedule of σ. Therefore (∑C_j(σ))_max = C_max(σ') by Lemma 4.1.

Lemma 4.4. Let σ''* be an optimal schedule of P2||C_max on instance I'' and σ'' be the schedule derived by Algorithm DLPT. Then C_max(σ'')/C_max(σ''*) ≤ 7/6.

Proof. The result holds by Lemma 4.3. □

Theorem 4.5. Let σ be the schedule derived by Algorithm DLPT, and let σ* be an optimal schedule for problem P2|∑C_j ≤ T*|(∑C_j)_max. Then 35/33 ≤ (∑C_j(σ))_max/(∑C_j(σ*))_max ≤ 7/6.

Proof. Let σ'* be the derived schedule of σ*, let σ''* be an optimal schedule of P2||C_max on instance I'', and let σ0* be the optimal partition schedule of P2||C_max on instance I' corresponding to σ''*. Then (∑C_j(σ*))_max = C_max(σ'*) by Lemma 4.1, and σ'* is an optimal partition schedule of P2||C_max on instance I', since σ* is an optimal schedule for problem P2|∑C_j ≤ T*|(∑C_j)_max on instance I. So C_max(σ'*) = C_max(σ0*) = C_max(σ''*) + ∑_{1≤i≤k} p'_{2i} and C_max(σ') = C_max(σ'') + ∑_{1≤i≤k} p'_{2i} by Lemma 4.2. Hence

  (∑C_j(σ))_max/(∑C_j(σ*))_max = C_max(σ')/C_max(σ'*) = (C_max(σ'') + ∑_{1≤i≤k} p'_{2i})/(C_max(σ''*) + ∑_{1≤i≤k} p'_{2i}) ≤ C_max(σ'')/C_max(σ''*) ≤ 7/6

by Lemma 4.4.

The following example shows that (∑C_j(σ))_max/(∑C_j(σ*))_max ≥ 35/33. Assume that instance I contains 9 jobs with processing times p_1 = 91/30, p_2 = 23/15, p_3 = 23/15, p_4 = 47/60, p_5 =

47/60, p_6 = 9/20, p_7 = 9/20, p_8 = 3/15, p_9 = 3/15, respectively (a dummy job J_10 with processing time 0 is added so that n = 2k). So instance I' contains 9 jobs with processing times p'_1 = 91/30, p'_2 = 23/15, p'_3 = 46/15, p'_4 = 47/30, p'_5 = 47/20, p'_6 = 27/20, p'_7 = 9/5, p'_8 = 4/5, p'_9 = 1, respectively. Further, instance I'' contains 5 jobs with processing times p''_1 = 3/2, p''_3 = 3/2, p''_5 = 1, p''_7 = 1, p''_9 = 1, respectively. The jobs of each instance are processed on 2 identical machines M_1 and M_2. Figs. 2.1–2.4 show the running process of Algorithm DLPT. We have (∑C_j(σ))_max = 8.75 and (∑C_j(σ*))_max = 8.25. Therefore (∑C_j(σ))_max/(∑C_j(σ*))_max = 35/33, which yields the desired result. □

Fig. 2.1. The schedule σ'' derived by Algorithm DLPT.
Fig. 2.2. The schedule σ' derived by Algorithm DLPT.
Fig. 2.3. The schedule σ derived by Algorithm DLPT.
Fig. 2.4. The optimal schedule σ*.

References

[1] E. Angel, E. Bampis, F. Pascual, How good are SPT schedules for fair optimality criteria, Ann. Oper. Res. 159 (2008) 53–64.
[2] P. Brucker, Scheduling Algorithms, Springer, New York, 2007.
[3] B.T. Eck, M. Pinedo, On the minimization of the makespan subject to flowtime optimality, Oper. Res. 41 (1993) 797–801.
[4] R.L. Graham, Bounds on multiprocessing timing anomalies, SIAM J. Appl. Math. 17 (1969) 416–429.
[5] R.L. Graham, E.L. Lawler, J.K. Lenstra, A.H.G. Rinnooy Kan, Optimization and approximation in deterministic sequencing and scheduling: a survey, Ann. Discrete Math. 5 (1979) 287–326.
[6] L. Wan, Z.H. Ding, Q.Q. Chen, Z.Y. Tan, Scheduling to minimize the maximum total completion time per machine, Eur. J. Oper. Res. 242 (2015) 45–50.
[7] L. Wan, R. Ma, J.J. Yuan, Primary–secondary bicriteria scheduling on identical machines to minimize the total completion time of all jobs and the maximum T-time of all machines, Theor. Comput. Sci. 518 (2014) 117–123.
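Note added: the numbers in the example of Theorem 4.5 can be checked mechanically. The Python sketch below (names are ours; exact rational arithmetic via the standard fractions module) computes the maximum flowtime delivered by Algorithm DLPT through Lemmas 4.1 and 4.2: it equals the LPT makespan on the difference instance I'' plus ∑_{1≤i≤k} p'_{2i}.

```python
# Sketch of Algorithm DLPT for m = 2. The resulting maximum flowtime is
# computed via Lemmas 4.1-4.2: LPT makespan on the difference instance
# I'' plus the sum of the smaller scaled job of each pair.

from fractions import Fraction

def dlpt_max_flowtime(p):
    """p: processing times with p[0] >= p[1] >= ...; a zero-length
    dummy job is added if n is odd so that n = 2k."""
    p = sorted(p, reverse=True)
    p += [0] * (len(p) % 2)
    k = len(p) // 2
    # instance I': the two jobs of rank R_i are scaled by i
    ps = [(j // 2 + 1) * q for j, q in enumerate(p)]
    # instance I'': pairwise differences; the smaller halves are a constant
    diffs = [ps[2 * i] - ps[2 * i + 1] for i in range(k)]
    rest = sum(ps[2 * i + 1] for i in range(k))
    # Algorithm LPT on I'': longest job first, to the less-loaded machine
    loads = [Fraction(0), Fraction(0)]
    for d in sorted(diffs, reverse=True):
        loads[loads.index(min(loads))] += d
    return max(loads) + rest
```

On the 9-job instance of the example above, the sketch returns 35/4 = 8.75, matching (∑C_j(σ))_max in the text; with the optimal value 33/4 = 8.25 this gives the ratio 35/33.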
