Theoretical Computer Science 508 (2013) 35–40

2D knapsack: Packing squares✩

Yan Lan a, György Dósa b, Xin Han c,∗, Chenyang Zhou c, Attila Benko b

a Dalian Neusoft Institute of Information, China
b Department of Mathematics, University of Pannonia, Veszprém, Hungary
c School of Software of Dalian University of Technology, China

Abstract

In this paper, we study a two-dimensional knapsack problem: packing as many squares as possible into a unit square. Our results are the following: (i) we propose an algorithm called IHS (Increasing Height Shelf) and prove that its packing is optimal if there are at most 5 squares in an optimal packing, and this upper bound is sharp; (ii) if all the squares have side length at most $\frac{1}{k}$, we propose a simple and fast algorithm with an approximation ratio $\frac{k^2+3k+2}{k^2}$ in time $O(n \log n)$; (iii) we give an EPTAS for the problem, where the previous result in Jansen and Solis-Oba (2008) [16] is a PTAS, not an EPTAS. However, our approach does not work on the previous model of Jansen and Solis-Oba (2008) [16], where each square has an arbitrary weight.


1. Introduction

The knapsack problem is one of the most classical and well studied problems in the combinatorial optimization field and has a lot of applications in the real world [17]. In the (classical) knapsack problem, we are given a knapsack and a set of items with weights and sizes, and the goal is to maximize the total weight of the items selected into the knapsack subject to its capacity constraint.

In this paper, we study a geometric version of the 2D knapsack problem, where the items are squares with weight 1 and side length at most 1, the knapsack is a unit square, and the objective is to maximize the total number of squares packed in the knapsack. In the packing, the sides of the items should be parallel to the corresponding sides of the knapsack and overlapping is not allowed. The problem was first studied by Baker et al. [2], who gave an approximation algorithm with an asymptotic ratio 4/3. As mentioned in [16], this geometric packing problem has received a lot of attention recently [15,16,14,9,4], and has applications in stock cutting, advertisement placement, image processing, and VLSI design [15,9].

Related work: It is well known that the 1D knapsack problem is NP-hard and admits a fully polynomial time approximation scheme (FPTAS), and that the corresponding fractional problem can be solved by a greedy algorithm [1,5,11,17]. For the 2D geometric knapsack, Caprara and Monaci [3] gave a simple algorithm with an approximation ratio 3 + ϵ.

✩ Partially supported by the NSFC (11101065) and ‘‘the Fundamental Research Funds for the Central Universities’’. The last author is supported in part by Project TAMOP-4.2.2/B-10/1-2010-0025.
∗ Corresponding author. Tel.: +86 411 87571630.
E-mail addresses: lanyan@neusoft.edu.cn (Y. Lan), dosagy@almos.vein.hu (G. Dósa), hanxin.mail@gmail.com (X. Han), zcy1988@gmail.com (C. Zhou), benko.attila@almos.vein.hu (A. Benko).


Jansen and Zhang [14] improved the ratio to (2 + ϵ), where ϵ > 0 can be arbitrarily small, and they [15] also gave a simple (2 + ϵ)-approximation algorithm for another version of the problem, namely to maximize the total number of rectangles packed in a rectangular box. If each item (square) has its weight equal to its area, Fishkin et al. [4] and Han et al. [6] gave a PTAS independently. For packing squares with arbitrary weights, Harren [9] gave a (5/4 + ϵ)-approximation algorithm; then Jansen and Solis-Oba [16] proposed a (1 + ϵ)-approximation algorithm, i.e., a PTAS, but not an EPTAS. As for the online version of the knapsack problem, we refer to the papers [10,12,13,7,8,18,19,20].

Our contributions: We first propose a simple and fast algorithm called IHS (Increasing Height Shelf), which runs in time O(n log n), where n is the number of items in the input L, and prove that the packing by IHS is optimal if there are at most 5 squares packed in an optimal packing; this upper bound of 5 is sharp. Secondly, we propose a modified IHS algorithm and prove that its approximation ratio is $\frac{k^2+3k+2}{k^2}$ if all the items have size (side length) at most $\frac{1}{k}$, where k ≥ 1 is an integer. Finally, based on the Modified IHS, we obtain an efficient polynomial time approximation scheme (EPTAS) for the problem, which is simpler and more efficient than the previous result [16]. However, our approach does not work on the previous model in [16], where each square has an arbitrary weight.

2. Preliminaries and models

In this section, we formally define our problem.

Packing Squares into a Knapsack
Input: a square knapsack with a unit size, and a set of square items L = {a1, . . . , an}, where we denote an item by ai and also use ai to denote the side length of the item, i.e., ai ≤ 1.
Output: select a set of items F ⊆ L with a maximum number of squares which can be packed into the knapsack subject to the following constraints:
1. there is no overlapping between any two items;
2. the sides of items are parallel to the corresponding sides of the knapsack.

Our objective is to maximize the number of items packed in the knapsack.

We analyze algorithms by using one of the standard measures: the approximation ratio. Given an input sequence L, the approximation ratio of an approximation algorithm A is defined as follows:

$$R_A = \sup_{L} \frac{OPT(L)}{A(L)},$$

where OPT(L) is the optimal value and A(L) denotes the number of items packed by algorithm A.

We can use the following two lemmas to estimate the optimal solution.

Lemma 1. Assume $a_1 \le a_2 \le \cdots \le a_n$ in the input L. If $\sum_{i=1}^{k+1} a_i^2 > 1$, then $OPT(L) \le k$.

Proof. If $OPT(L) \ge k+1$ then we must have that the smallest k + 1 squares can be packed together, i.e., $\sum_{i=1}^{k+1} a_i^2 \le 1$, which causes a contradiction with the assumption. So, $OPT(L) \le k$. Hence this lemma holds. □

Lemma 2. Assume $a_1 \le a_2 \le \cdots \le a_n$ in the input L. If $\sum_{i=1}^{k} a_i^2 \ge \alpha$, where $0 < \alpha \le 1$, then $OPT(L) \le k \cdot \frac{1}{\alpha}$.

Proof. If $OPT(L) \le k$ then the lemma holds. Otherwise assume $OPT(L) = j > k$. It is not difficult to see that the smallest j items $a_1, \ldots, a_j$ can be packed in the knapsack, i.e., $\sum_{i=1}^{j} a_i^2 \le 1$. Due to $a_1 \le a_2 \le \cdots \le a_j$, the average area of the first k items is not larger than the average area of the first j items, that is,

$$\frac{\alpha}{k} \le \frac{1}{k}\sum_{i=1}^{k} a_i^2 \le \frac{1}{j}\sum_{i=1}^{j} a_i^2 \le \frac{1}{j}.$$

Then we have $j \le k \cdot \frac{1}{\alpha}$, i.e., $OPT(L) \le k \cdot \frac{1}{\alpha}$. Hence this lemma holds. □
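For concreteness, the following is a small Python sketch (ours, not part of the paper; the function names are hypothetical) showing how Lemmas 1 and 2 translate into numeric upper bounds on OPT(L).

```python
def lemma1_bound(sides):
    """Lemma 1: if the (k+1) smallest squares already have total area > 1,
    then OPT(L) <= k.  Returns the largest k for which the k smallest
    squares still have total area <= 1, a valid upper bound on OPT(L)."""
    areas = sorted(a * a for a in sides)
    total, k = 0.0, 0
    for a2 in areas:
        if total + a2 > 1.0:
            break
        total += a2
        k += 1
    return k


def lemma2_bound(m, alpha):
    """Lemma 2: if the m smallest squares have total area >= alpha
    (0 < alpha <= 1), then OPT(L) <= m / alpha."""
    assert 0 < alpha <= 1
    return m / alpha


# Lemma 1 on the instance used later against IHS (Theorem 1): OPT(L) <= 6.
print(lemma1_bound([1/3] * 5 + [2/3]))   # 6
# Lemma 2 as used in the proof of Theorem 2: with k = 10 and
# alpha = (1 - 1/k)(1 - 2/k) = 0.72, packing m = 72 small items
# of total area >= alpha implies OPT(L) <= 100.
print(lemma2_bound(72, 0.72))            # 100.0
```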
3. A simple algorithm IHS and its applications

In this section, we analyze algorithm IHS and a modified IHS algorithm, and prove that both algorithms work very well when all the items are small. Since our objective is to maximize the number of items packed in the knapsack, it is very natural to consider small items first.

3.1. Increasing height shelf

If a rectangular box has length 1, we call it a shelf. We first propose a simple algorithm called increasing height shelf (IHS), in which we sort all the items in ascending order of side length at first, then divide the knapsack into several layers (shelves) and put as many items as possible into each layer. The main ideas of shelf packing are (i) we cut the square knapsack into a set of boxes with length 1, i.e., a set of shelves, and (ii) in each shelf we pack square items in a greedy way, i.e., the bottoms of all the squares lie on the bottom of the shelf. The key point is how to select the heights of the shelves. In the algorithm IHS, we first sort squares in ascending order of their side length, then pack squares into shelves from small to large, and then pack shelves into the knapsack until all the squares are packed or there is no room for a shelf in the knapsack. The details are given in Table 1, and an example of an IHS packing is given in Fig. 1. Then we prove that (i) IHS is an optimal packing if OPT(L) ≤ 5, and (ii) IHS is not optimal for some instance L if OPT(L) ≥ 6.

Table 1
Algorithm: increasing height shelf (IHS).
1. Sort all the items in ascending order of their side length. Let $a_1 \le a_2 \le \cdots \le a_n$ be the items sorted.
2. Find a largest index $i$ such that $\sum_{j=1}^{i} a_j \le 1$, and pack these $i$ items into a shelf with width 1 and height $a_i$.
3. Pack the shelf into the knapsack: if there is no shelf in the knapsack, then pack the shelf on the bottom of the knapsack; else pack the shelf on the top of the last shelf in the knapsack. If the shelf exceeds over the top of the knapsack, then cancel packing the shelf and stop; else go to the next step.
4. Delete all the squares just packed from the input list. If there are some squares unpacked, then rename all the unpacked squares as $a_1 \le a_2 \le \cdots$ and go to step 2. Else stop.

Fig. 1. Increasing height shelf.

Theorem 1. Algorithm IHS produces an optimal packing if OPT(L) ≤ 5, and IHS is not optimal for some instance L if OPT(L) ≥ 6.

Proof. For the positive result, we only prove the case OPT(L) = 5, since the proof is similar for the other cases OPT(L) ≤ 4. Due to OPT(L) = 5, it is not difficult to see that the smallest squares $a_1, \ldots, a_5$ can be packed in the knapsack. Then for any $1 \le i < j \le 5$ we have

$$a_i + a_j \le 1, \qquad (1)$$

otherwise squares $a_i$ and $a_j$ could not be packed together. When OPT(L) = 5, we also have the following result:

$$a_1 + a_2 + a_3 \le 1. \qquad (2)$$

This result can be observed as below: given an optimal packing with OPT(L) = 5, by (1) we can repack the optimal packing so that at each corner of the knapsack there is one square packed, and we do not change the positions of the other squares in the optimal packing. Then for some $1 \le i < j < k \le 5$ we have $a_i + a_j + a_k \le 1$, that is, $a_1 + a_2 + a_3 \le 1$; otherwise OPT(L) ≤ 4.

Now consider the packing produced by IHS. If $\sum_{i=1}^{5} a_i \le 1$ then we are done, since all the five items can be packed in a single shelf of size $(1, a_5)$. Otherwise, if $\sum_{i=1}^{4} a_i \le 1$ then the five items are packed into two shelves: items $a_1, \ldots, a_4$ are packed into a shelf of size $(1, a_4)$ and item $a_5$ into a shelf of size $(1, a_5)$; by (1) we have $a_4 + a_5 \le 1$, so the two shelves can be packed into the knapsack. Otherwise, if $\sum_{i=1}^{3} a_i \le 1$ then the five items are also packed into two shelves: items $a_1, a_2, a_3$ are packed into a shelf of size $(1, a_3)$, and items $a_4, a_5$ are packed into a shelf of size $(1, a_5)$ due to $a_4 + a_5 \le 1$; by (1) we have $a_3 + a_5 \le 1$, so the two shelves can be packed into the knapsack. By (2), one of the above cases must occur, hence IHS packs all five items.

Next we prove the negative result: when OPT(L) ≥ 6, IHS may not produce an optimal packing. Consider the following input $L = \{a_1, \ldots, a_6\}$, where items $a_1, \ldots, a_5$ have size 1/3 and item $a_6$ has size 2/3. It is not difficult to see that all the items in L can be packed in the knapsack, whereas IHS only packs $a_1, \ldots, a_5$ into the knapsack (see Fig. 1). □
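As an illustration, the following is a minimal Python sketch of the IHS procedure of Table 1 (the code and its names are ours, not the authors'); it returns the packed shelves together with their heights.

```python
def ihs(sides):
    """Increasing Height Shelf (IHS), following Table 1.

    sides: side lengths of the square items (each <= 1).
    Returns a list of shelves; each shelf is (height, [sides packed in it]).
    Shelves are stacked until the next one would exceed the unit knapsack.
    """
    items = sorted(sides)                 # step 1: ascending side lengths
    shelves, used_height = [], 0.0
    while items:
        # step 2: largest prefix whose total width fits in the unit width
        width, count = 0.0, 0
        while count < len(items) and width + items[count] <= 1.0:
            width += items[count]
            count += 1
        shelf = items[:count]
        height = shelf[-1]                # shelf height = largest item in it
        # step 3: stop as soon as the new shelf would exceed the knapsack top
        if used_height + height > 1.0:
            break
        shelves.append((height, shelf))
        used_height += height
        items = items[count:]             # step 4: remove the packed squares
    return shelves


# The instance from Theorem 1: OPT = 6, but IHS packs only 5 squares.
packed = ihs([1/3] * 5 + [2/3])
print(sum(len(s) for _, s in packed))     # 5
```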

3.2. IHS algorithm for packing small items

In this subsection, we adopt algorithm IHS for packing small items and analyze its approximation ratio. In particular, all the small items have size (side length) at most 1/k.

Theorem 2. When all the items have side length at most 1/k, where integer k ≥ 3, the approximation ratio of IHS is at most $\frac{k^2}{(k-1)(k-2)}$.

Proof. Let $a_1 \le a_2 \le \cdots \le a_n$ be the input squares. If there is no item unpacked, then we are done and the approximation ratio of IHS is 1. Otherwise, after applying IHS for packing items, assume there are j shelves packed in the knapsack, say $S_1, S_2, \ldots, S_j$, where shelf $S_i$ has height $h_i$. According to algorithm IHS, we have $h_1 \le h_2 \le \cdots \le h_j \le \frac{1}{k}$. Then we have

$$\sum_{i=1}^{j} h_i > 1 - \frac{1}{k}, \qquad (3)$$

otherwise shelf $S_{j+1}$ would have been packed, and hence

$$\sum_{i=1}^{j-1} h_i > 1 - \frac{2}{k}. \qquad (4)$$

Observe that for $2 \le i \le j$, in shelf $S_i$ each square has side at least $h_{i-1}$ and in the horizontal dimension the total size of the side lengths is at least $1 - \frac{1}{k}$, hence the total area of squares packed in shelf $S_i$ is at least

$$h_{i-1} \cdot \Big(1 - \frac{1}{k}\Big). \qquad (5)$$

Then by (5), (4) and (3), the total area of squares packed in $S_2, \ldots, S_j$ is at least

$$\Big(1 - \frac{1}{k}\Big) \sum_{i=1}^{j-1} h_i \ge \Big(1 - \frac{1}{k}\Big)\Big(1 - \frac{2}{k}\Big).$$

Since IHS considers the items from small to large, the packed items are the smallest ones, so by Lemma 2 the approximation ratio of IHS is at most $\frac{k^2}{(k-1)(k-2)}$. □

According to Theorem 2, IHS gets a pretty good result for small items. For instance, when k = 10, i.e., there is no item bigger than 1/10, the approximation ratio is at most $100/72 \approx 1.389$. But we cannot estimate the approximation ratio for k = 1 or k = 2.

3.3. A modified IHS algorithm for packing small items

In this subsection we propose a modified IHS with an upper bound $1 + \frac{3}{k} + \frac{2}{k^2}$, which is better than the upper bound of algorithm IHS; also, we can estimate the approximation ratio for k = 1 and k = 2. In the algorithm, there are two phases: in phase 1, we construct an infeasible packing based on the algorithm IHS, packing items into shelves with width equal to or larger than 1 until all the items are packed into shelves (the infeasible packing helps us to estimate an upper bound of the optimal solution); then in phase 2, we discard some items to get a feasible packing such that the number of items discarded is small. The details are given in Table 2.

Table 2
A modified IHS algorithm.
1. Sort all the items in ascending order of their side length, i.e., let $L = \{a_1, \ldots, a_n\}$ be the items sorted, where $a_1 \le a_2 \le \cdots \le a_n$.
2. Construct an infeasible packing:
(a) Find a smallest index $u$ such that $\sum_{j=1}^{u} a_j \ge 1$, and pack these items into a shelf with width $\sum_{j=1}^{u} a_j$ and height $a_u$.
(b) Remove the items just packed from the input list, and repeat the above packing until all the items are packed into shelves.
3. Trim the infeasible packing:
(a) Let $S_k$ be the shelf generated in the k-th round. Assume the height of $S_k$ is $h_k$.
(b) Find a largest index $i$ such that $\sum_{j=1}^{i} h_j \le 1$.
(c) Shrink the width of each shelf to 1, i.e., if the last item is over-packed, then just remove that item. All the remaining items in shelves $S_j$ for $1 \le j \le i$ form a feasible packing.
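A minimal Python sketch of the modified IHS of Table 2 may help (ours, with hypothetical names, not the authors' code); it builds the over-wide shelves of phase 1 and then trims them to a feasible packing.

```python
def modified_ihs(sides):
    """Modified IHS, following Table 2.

    Phase 1 packs the sorted items into shelves whose total width is allowed
    to reach (or slightly exceed) 1; phase 2 keeps the smallest shelves whose
    heights sum to at most 1 and drops the one over-packed item per shelf.
    Returns the list of kept shelves as lists of side lengths.
    """
    items = sorted(sides)
    # Phase 1: construct the (infeasible) shelves.
    shelves = []
    while items:
        width, count = 0.0, 0
        while count < len(items) and width < 1.0:
            width += items[count]
            count += 1
        shelves.append(items[:count])     # width >= 1 unless the items ran out
        items = items[count:]
    # Phase 2: keep the smallest shelves fitting in the unit height ...
    kept, used_height = [], 0.0
    for shelf in shelves:                 # shelves are already in height order
        height = shelf[-1]
        if used_height + height > 1.0:
            break
        used_height += height
        # ... and shrink each kept shelf back to width 1 (drop the last item
        # if the shelf was over-packed).
        if sum(shelf) > 1.0:
            shelf = shelf[:-1]
        kept.append(shelf)
    return kept


# Nine squares of side 0.375: two shelves of two squares each are kept,
# which happens to be optimal here.
print(sum(len(s) for s in modified_ihs([0.375] * 9)))   # 4
```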

Theorem 3. When all the items have side length at most 1/k, the approximation ratio of Modified IHS is $1 + \frac{3}{k} + \frac{2}{k^2}$.

Proof. Let $S_1, S_2, \ldots, S_m$ be the shelves generated in the phase of constructing an infeasible solution, where shelf $S_j$ has height $h_j$, and let $|S_j|$ be the number of squares packed in $S_j$ after the second phase, i.e., in the infeasible packing. Of course this is an infeasible solution. Remember that in Step 3, when we construct a feasible solution from the infeasible solution, in each shelf we remove at most one item. According to the algorithm, we have $h_1 \le h_2 \le \cdots \le h_m \le \frac{1}{k}$, and $i$ is the maximal index such that $\sum_{j=1}^{i} h_j \le 1$, where $i \ge k$. To prove this theorem, we have two cases.

Case 1 ($m \le i + 1$): The optimal value OPT(L) is upper bounded by $\sum_{j=1}^{m} |S_j|$. Observe that in the vertical dimension we select the smallest $i$ shelves in our solution, so the approximation factor in the vertical dimension is

$$\frac{m}{i}. \qquad (6)$$

Let us consider the approximation factor in the horizontal dimension: in a shelf with $j$ items packed, the approximation factor in the horizontal dimension is at most

$$\frac{j+1}{j} \le \frac{k+1}{k}, \qquad (7)$$

where $j \ge k$ since each item has width at most $\frac{1}{k}$. Hence, by (6) and (7), the approximation ratio is

$$\frac{m}{i} \times \frac{k+1}{k} \le \frac{i+1}{i} \times \frac{k+1}{k} \le \frac{(k+1)^2}{k^2} = 1 + \frac{2}{k} + \frac{1}{k^2},$$

since $i \ge k$.

Case 2 ($m \ge i + 2$): We can prove that OPT(L) is upper bounded by the total number of squares packed in $\bigcup_{j=1}^{i+2} S_j$, before we delete some squares from the shelves. Imagine that all the squares in $\bigcup_{j=1}^{i+2} S_j$ are packed into $i+2$ shelves. According to the algorithm, we have $h_1 \le h_2 \le \cdots \le h_{i+2} \le \frac{1}{k}$, and we have

$$\sum_{j=1}^{i} h_j \le 1, \qquad (8)$$

$$\sum_{j=1}^{i+1} h_j > 1, \qquad (9)$$

otherwise shelf $S_{i+1}$ would have been selected. Observe that for $2 \le j \le i+2$, in shelf $S_j$ each square has side at least $h_{j-1}$ and in the horizontal dimension the total size of the side lengths is at least 1, hence the total area of squares packed in shelf $S_j$ is at least

$$h_{j-1} \cdot 1. \qquad (10)$$

Then by (10) and (9), before we shrink the shelves, the total area of squares packed in $S_2, \ldots, S_{i+2}$ is at least $\sum_{j=1}^{i+1} h_j > 1$. So the total area of all the squares in $\bigcup_{j=1}^{i+2} S_j$ is larger than 1, and by Lemma 2, OPT(L) is upper bounded by the total number of squares packed in $\bigcup_{j=1}^{i+2} S_j$. When we construct a feasible solution from the infeasible solution, in each shelf we remove at most one item. Observe that in the vertical dimension we select the smallest $i$ shelves in our solution, so the approximation factor in the vertical dimension is $\frac{i+2}{i} \le \frac{k+2}{k}$, since $i \ge k$; and similarly, the approximation factor in the horizontal dimension is at most $\frac{j+1}{j} \le \frac{k+1}{k}$, where $j \ge k$ since each item has width at most $\frac{1}{k}$. Hence the approximation ratio of our algorithm is

$$\frac{k+2}{k} \times \frac{k+1}{k} = 1 + \frac{3}{k} + \frac{2}{k^2}.$$

Hence, the theorem holds. □

3.4. An efficient polynomial time approximation scheme

Though algorithms IHS and Modified IHS are good for packing small items, this is not enough to get an efficient polynomial time approximation scheme (EPTAS) for the general case, so we need other techniques, which are described here. Take a sufficiently small constant $0 < \epsilon \le 1/7$ such that $\frac{1}{\epsilon}$ is an integer, and let $k = \frac{1}{\epsilon}$. The main ideas to produce an EPTAS are as follows: we first guess whether OPT(L) is larger than a constant $c$; if OPT(L) ≤ c then we enumerate all the cases to get an optimal packing by Lemma 3, else we remove the large items and then apply the Modified IHS algorithm to the remaining items. Next we give the details.

Lemma 3 ([14]). Given a set S with c squares, we can verify whether all the squares in S can be packed into the knapsack in time $4^{c^2}$.

The following proof is from [14], at the bottom of page 331. For the sake of completeness, we give the proof again.

Proof. The most naive way to find a feasible packing of c items (or determine that no such packing exists) is probably the following. Observe that if a packing exists, then there exists a packing in which each item is immediately adjacent to some item to its left (or to the bin's left border) and to some item below (or to the bottom of the bin). In such a packing, the x-coordinate (within the bin) of the bottom left corner of every item is the sum of widths of a subset of items, and similarly, the y-coordinate is the sum of heights of a subset of items. Thus there are at most $2^c \cdot 2^c$ potential positions for each item, and therefore at most $4^{c^2}$ possibilities to consider. □
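A brute-force check along the lines of Lemma 3 can be sketched in Python as follows (our illustration, assuming that backtracking over the subset-sum coordinates is an acceptable way to enumerate the at most $4^{c^2}$ candidate placements); it is only practical for very small c.

```python
def can_pack(squares, eps=1e-12):
    """Lemma 3 style check: can all the given squares (side lengths) be packed,
    axis-aligned and without overlap, into the unit square?

    By Lemma 3, if a packing exists there is one in which every x- and
    y-coordinate of a bottom-left corner is a sum of a subset of the side
    lengths, so it suffices to search over those candidate coordinates.
    """
    c = len(squares)
    # Candidate coordinates: all subset sums of the side lengths (<= 2^c many).
    sums = {0.0}
    for s in squares:
        sums |= {t + s for t in sums if t + s <= 1.0 + eps}
    coords = sorted(sums)

    def disjoint(a, b):
        (x1, y1, s1), (x2, y2, s2) = a, b
        return (x1 + s1 <= x2 + eps or x2 + s2 <= x1 + eps or
                y1 + s1 <= y2 + eps or y2 + s2 <= y1 + eps)

    def place(i, placed):
        if i == c:
            return True
        s = squares[i]
        for x in coords:
            if x + s > 1.0 + eps:
                break                      # coords are sorted ascending
            for y in coords:
                if y + s > 1.0 + eps:
                    break
                box = (x, y, s)
                if all(disjoint(box, other) for other in placed):
                    if place(i + 1, placed + [box]):
                        return True
        return False

    return place(0, [])


print(can_pack([1/3] * 5 + [2/3]))   # True: the OPT = 6 instance of Theorem 1
print(can_pack([0.6, 0.6]))          # False: two squares of side 0.6 must overlap
```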

If an item has side larger than $\epsilon = \frac{1}{k}$, then we call it large, else small. Based on the above observations, our scheme works as follows.

Polynomial time approximation scheme
1. Guess whether OPT(L) ≥ $k^3$ or not. This can be done by the following: take the smallest $k^3$ squares, then verify whether all these $k^3$ squares can be packed together or not, by Lemma 3.
2. If OPT(L) < $k^3$, then find a maximal $i$ such that the smallest $i$ squares can be packed together. Output the packing and $i$.
3. Else OPT(L) ≥ $k^3$: apply the modified IHS for all the items with side at most $\epsilon$, i.e., remove the large items and pack only the small ones.

Theorem 4. The approximation ratio of our algorithm is $1 + 5\epsilon$, where $\epsilon \le 1/7$, and the time complexity is $O(n \log n + 4^{\epsilon^{-6}} \epsilon^{-3})$.

Proof. It is not difficult to see that if OPT(L) < $k^3$ we can get an optimal solution, i.e., the approximation ratio is 1. Next we consider the case OPT(L) ≥ $k^3$. Then the input L can be divided into sublists $L_s$ for small items and $L_b$ for large items. Since each large item has side larger than $\frac{1}{k}$, at most $k^2$ large items can be packed into the knapsack, i.e., $OPT(L_b) \le k^2$. Then we have

$$k^3 \le OPT(L) \le OPT(L_s) + OPT(L_b) \le OPT(L_s) + k^2. \qquad (11)$$

By Eq. (11), we have $OPT(L_s) \ge OPT(L) - k^2 \ge (1-\epsilon)OPT(L)$, since $k^2 = \epsilon k^3 \le \epsilon \cdot OPT(L)$. By Theorem 3, the solution found by the modified IHS is at least $OPT(L_s)/(1 + 3\epsilon + 2\epsilon^2)$. So, the approximation ratio of our algorithm is at most

$$\frac{1 + 3\epsilon + 2\epsilon^2}{1 - \epsilon} \le 1 + 5\epsilon,$$

where the last inequality holds since $\epsilon \le 1/7$.

As for the time complexity: in step 1 of our algorithm, by Lemma 3 the time complexity is $O(4^{k^6}) = O(4^{\epsilon^{-6}})$; in step 2, the time complexity is at most $O(k^3 \cdot 4^{k^6}) = O(4^{\epsilon^{-6}} \epsilon^{-3})$; and in step 3, the time complexity is $O(n \log n)$. So, the total time complexity is $O(n \log n + 4^{\epsilon^{-6}} \epsilon^{-3})$, where $k = \epsilon^{-1}$. Hence the theorem holds. □
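Putting the pieces together, the scheme can be sketched as below (our illustration, not the authors' code; it reuses the `can_pack` and `modified_ihs` sketches given earlier, and the parameter handling is ours). For any fixed ϵ the enumeration only ever touches the $k^3$ smallest squares, which is where the "efficient" in EPTAS comes from; in practice the enumeration is only feasible for toy values of k.

```python
def eptas_pack(sides, eps=1/7):
    """EPTAS sketch following the three steps of the scheme:
    1. guess whether OPT(L) >= k^3 by testing the k^3 smallest squares,
    2. if OPT(L) < k^3, find the largest packable prefix by enumeration,
    3. otherwise drop the large items (side > eps) and run Modified IHS.
    Returns either ("enumerated", i) or a list of packed shelves."""
    k = round(1 / eps)
    items = sorted(sides)
    prefix = items[:k ** 3]
    if len(prefix) < k ** 3 or not can_pack(prefix):
        # Step 2: OPT(L) < k^3.  The largest packable prefix is optimal,
        # because replacing any packed square by a smaller unpacked one
        # keeps the packing feasible.
        best = 0
        for i in range(1, len(prefix) + 1):
            if can_pack(items[:i]):
                best = i
        return ("enumerated", best)
    # Step 3: OPT(L) >= k^3; the small items alone already give a
    # (1 + 5*eps)-approximation by Theorem 4.
    small = [a for a in items if a <= eps]
    return modified_ihs(small)
```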
Remarks. It seems that the upper bounds of both algorithms IHS and Modified IHS are not tight, i.e., it is possible to improve the upper bounds of the two algorithms further. We leave these problems as open questions.

Acknowledgments

The authors wish to thank the referees for their useful comments on the earlier draft of the paper.

References

[1] L. Babat, Linear functions on the N-dimensional unit cube, Doklady Akademii Nauk SSSR 222 (1975) 761–762 (in Russian).
[2] B.S. Baker, A.R. Calderbank, E.G. Coffman Jr., J.C. Lagarias, Approximation algorithms for maximizing the number of squares packed into a rectangle, SIAM Journal on Algebraic and Discrete Methods 4 (3) (1983) 383–397.
[3] A. Caprara, M. Monaci, On the two-dimensional knapsack problem, Operations Research Letters 32 (2004) 5–14.
[4] A.V. Fishkin, O. Gerber, K. Jansen, R. Solis-Oba, Packing weighted rectangles into a square, in: MFCS 2005, pp. 352–363.
[5] G. Gens, E. Levner, Complexity of approximation algorithms for combinatorial problems: a survey, ACM SIGACT News 12 (3) (1980) 52–65.
[6] X. Han, K. Iwama, G. Zhang, Online removable square packing, Theory of Computing Systems 43 (1) (2008) 38–55.
[7] X. Han, K. Makino, Online minimization knapsack problem, in: WAOA 2009, pp. 182–193.
[8] X. Han, K. Makino, Online removable knapsack with limited cuts, Theoretical Computer Science 411 (44–46) (2010) 3956–3964.
[9] R. Harren, Approximation algorithms for orthogonal packing problems for hypercubes, Theoretical Computer Science 410 (44) (2009) 4504–4532.
[10] T. Horiyama, K. Iwama, J. Kawahara, Finite-state online algorithms and their automated competitive analysis, in: ISAAC 2006, in: LNCS, vol. 4288, Springer, 2006, pp. 71–80.
[11] O.H. Ibarra, C.E. Kim, Fast approximation algorithms for the knapsack and sum of subset problems, Journal of the ACM 22 (1975) 463–468.
[12] K. Iwama, S. Taketomi, Removable online knapsack problems, in: Proc. ICALP 2002, in: LNCS, vol. 2380, Springer, 2002, pp. 293–305.
[13] K. Iwama, G. Zhang, Optimal resource augmentations for online knapsack, in: APPROX-RANDOM 2007, pp. 180–188.
[14] K. Jansen, G. Zhang, Maximizing the total profit of rectangles packed into a rectangle, Algorithmica 47 (3) (2007) 323–342.
[15] K. Jansen, G. Zhang, Maximizing the number of packed rectangles, in: SWAT 2004, pp. 362–371.
[16] K. Jansen, R. Solis-Oba, A polynomial time approximation scheme for the square packing problem, in: IPCO 2008, pp. 184–198.
[17] H. Kellerer, U. Pferschy, D. Pisinger, Knapsack Problems, Springer, 2004.
[18] G.S. Lueker, Average-case analysis of off-line and on-line knapsack problems, in: Proc. Sixth Annual ACM-SIAM SODA, 1995, pp. 179–188.
[19] A. Marchetti-Spaccamela, C. Vercellis, Stochastic on-line knapsack problems, Mathematical Programming 68 (1, Ser. A) (1995) 73–104.
[20] J. Noga, V. Sarbua, An online partially fractional knapsack problem, in: ISPAN 2005, pp. 108–112.