Design and Analysis of Algorithm

UNIT – V
BRANCH AND BOUND -- THE METHOD
The design technique known as branch and bound is very similar to backtracking (seen in Unit 4) in that it searches a tree model of the solution space and is applicable to a wide variety of discrete combinatorial problems.

Each node in the combinatorial tree generated in the last unit defines a problem state. All paths from the root to other nodes define the state space of the problem. Solution states are those problem states 's' for which the path from the root to 's' defines a tuple in the solution space. The leaf nodes in the combinatorial tree are the solution states. Answer states are those solution states 's' for which the path from the root to 's' defines a tuple that is a member of the set of solutions (i.e. it satisfies the implicit constraints) of the problem. The tree organization of the solution space is referred to as the state space tree.

A node which has been generated, and all of whose children have not yet been generated, is called a live node. The live node whose children are currently being generated is called the E-node (the node being expanded). A dead node is a generated node which is not to be expanded further or all of whose children have been generated. Bounding functions are used to kill live nodes without generating all their children.

Depth-first node generation with bounding functions is called backtracking. State generation methods in which the E-node remains the E-node until it is dead lead to the branch-and-bound method. The term branch-and-bound refers to all state space search methods in which all children of the E-node are generated before any other live node can become the E-node. In branch-and-bound terminology, a breadth-first (BFS-like) state space search is called FIFO (First In First Out) search, as the list of live nodes is a first-in-first-out list (or queue). A D-search (depth search) state space search is called LIFO (Last In First Out) search, as the list of live nodes is a last-in-first-out list (or stack).
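The only thing that changes between FIFO and LIFO branch and bound is the data structure holding the live nodes. The sketch below is a minimal FIFO branch-and-bound skeleton in Python (an illustration, not pseudocode from these notes; the callables children, is_answer and bound_ok stand in for the problem-specific pieces). Replacing popleft() with pop() turns the queue into a stack and gives the LIFO (D-search) variant.

    from collections import deque

    def fifo_branch_and_bound(root, children, is_answer, bound_ok):
        # live nodes are kept in a first-in-first-out list (a queue)
        live = deque([root])
        while live:
            e_node = live.popleft()          # next E-node, in generation order
            for child in children(e_node):
                if not bound_ok(child):      # killed by the bounding function
                    continue
                if is_answer(child):         # answer node generated
                    return child
                live.append(child)           # child becomes a live node
        return None                          # no answer node exists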


Bounding functions are used to help avoid the generation of subtrees that do not contain an answer node. The branch-and-bound algorithms search a tree model of the solution space to get the solution. However, this type of algorithm is oriented more toward optimization: an algorithm of this type specifies a real-valued cost function for each of the nodes that appear in the search tree, and usually the goal is to find a configuration for which the cost function is minimized. Branch-and-bound algorithms are rarely simple; they tend to be quite complicated in many cases.

Example [4-queens]: Let us see how a FIFO branch-and-bound algorithm would search the state space tree of the 4-queens problem (the tree generated in the last unit). Initially there is only one live node, node 1. This represents the case in which no queen has been placed on the chessboard. This node becomes the E-node. It is expanded and its children, nodes 2, 18, 34 and 50, are generated. These nodes represent a chessboard with queen 1 in row 1 and columns 1, 2, 3 and 4 respectively. The only live nodes now are nodes 2, 18, 34 and 50. If the nodes are generated in this order, then the next E-node is node 2.

It is expanded, and nodes 3, 8 and 13 are generated. Node 3 is immediately killed using the bounding function. Nodes 8 and 13 are added to the queue of live nodes. Node 18 becomes the next E-node. Nodes 19, 24 and 29 are generated. Nodes 19 and 24 are killed as a result of the bounding functions, and node 29 is added to the queue of live nodes. The next E-node is node 34.

The figure for this example shows the portion of the state space tree that is generated by the FIFO branch-and-bound search. Nodes that are killed as a result of the bounding functions have a "B" under them. Numbers inside the nodes correspond to the node numbers in the full state space tree; numbers outside the nodes give the order in which the nodes are generated by FIFO branch and bound. At the time the answer node, node 31, is reached, the only live nodes remaining are nodes 38 and 54.
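The search just described can be reproduced with a few lines of code. The sketch below (an illustration, not part of the notes) runs a FIFO branch and bound over the 4-queens state space tree, using the usual column and diagonal test as the bounding function; each live node is the tuple of column choices made so far, and the first answer node generated corresponds to the placement (2, 4, 1, 3).

    from collections import deque

    def safe(placement, col):
        # bounding function: a queen in the next row at column `col` must not
        # share a column or a diagonal with any queen already placed
        row = len(placement)
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(placement))

    def four_queens_fifo(n=4):
        live = deque([()])                   # root node: no queen placed yet
        while live:
            e_node = live.popleft()          # FIFO rule: oldest live node first
            for col in range(1, n + 1):
                if not safe(e_node, col):    # child killed by the bounding function
                    continue
                child = e_node + (col,)
                if len(child) == n:          # a leaf satisfying all constraints
                    return child             # the first answer node generated
                live.append(child)
        return None

    print(four_queens_fifo())                # (2, 4, 1, 3)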

Least Cost (LC) Search:
In both LIFO and FIFO branch and bound, the selection rule for the next E-node is rather rigid and in a sense blind. The selection rule does not give any preference to a node that has a very good chance of getting the search to an answer node quickly. Thus, in the 4-queens example, when node 30 is generated, it should have become obvious to the search algorithm that this node will lead to an answer node in one move. However, the rigid FIFO rule first requires the expansion of all live nodes generated before node 30 was expanded.

The search for an answer node can often be speeded up by using an "intelligent" ranking function ĉ(.) for live nodes. The next E-node is selected on the basis of this ranking function. The ideal way to assign ranks would be on the basis of the additional computational effort (or cost) needed to reach an answer node from the live node x. For any node x, this cost could be
(1) the number of nodes in the subtree x that need to be generated before any answer node is generated, or, more simply,
(2) the number of levels the nearest answer node (in the subtree x) is from x.
Using cost measure (2), the cost of the root of the tree generated by the FIFO search above is 4 (node 31 is four levels from node 1). The costs of nodes 18 and 34, 29 and 35, and 30 and 38 are respectively 3, 2 and 1. The costs of all remaining nodes on levels 2, 3 and 4 are respectively greater than 3, 2 and 1. Using these costs as a basis to select the next E-node, the E-nodes are nodes 1, 18, 29 and 30 (in that order). If in the 4-queens example we use a ranking function that assigns node 30 a better rank than all other live nodes, then node 30 will become the E-node, following node 29. The remaining live nodes will never become E-nodes, as the expansion of node 30 results in the generation of an answer node (node 31).

The difficulty of using the ideal cost function is that computing the cost of a node usually involves a search of the subtree x for an answer node. Hence, by the time the cost of a node is determined, that subtree has been searched and there is no need to explore x again. For this reason, search algorithms usually rank nodes only on the basis of an estimate of their cost. Let ĝ(x) be an estimate of the additional effort needed to reach an answer node from x. Node x is assigned a rank using a function ĉ(.) such that ĉ(x) = f(h(x)) + ĝ(x), where h(x) is the cost of reaching x from the root and f(.) is any non-decreasing function. The cost function c(.) is defined as follows: if x is an answer node, then c(x) is the cost (level, computational difficulty, etc.) of reaching x from the root of the state space tree; if x is not an answer node, then c(x) = infinity, provided the subtree x contains no answer node; otherwise c(x) equals the cost of a minimum-cost answer node in the subtree x. A search strategy that uses a cost function ĉ(x) = f(h(x)) + ĝ(x) to select the next E-node would always choose for its next E-node a live node with least ĉ(.). Hence, such a strategy is called an LC-search (least cost search).
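In implementation terms, an LC-search simply keeps the live nodes in a priority queue ordered by ĉ. The sketch below is a generic illustration (not the notes' algorithm; children, is_answer and c_hat are hypothetical problem-specific callables). With ĝ(x) = 0 and f(h(x)) equal to the level of x, it degenerates into a breadth-first, FIFO-like search.

    import heapq
    from itertools import count

    def lc_search(root, children, is_answer, c_hat):
        tie = count()                              # breaks ties between equal-cost nodes
        live = [(c_hat(root), next(tie), root)]    # min-heap of live nodes keyed by c_hat
        while live:
            _, _, e_node = heapq.heappop(live)     # least-cost live node becomes the E-node
            if is_answer(e_node):
                return e_node
            for child in children(e_node):
                heapq.heappush(live, (c_hat(child), next(tie), child))
        return None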

It should be easy to see that ĉ(.) with f(h(x)) = h(x) is an approximation to c(.). From now on ĉ(x) is referred to as the cost of x.

Bounding:
A branch-and-bound method searches the state space tree using any search mechanism in which all the children of the E-node are generated before another node becomes the E-node. We assume that each answer node x has a cost c(x) associated with it and that a minimum-cost answer node is to be found. Three common search strategies are FIFO, LIFO and LC. A cost function ĉ(.) such that ĉ(x) <= c(x) is used to provide lower bounds on solutions obtainable from any node x. If upper is an upper bound on the cost of a minimum-cost solution, then all live nodes x with ĉ(x) > upper may be killed, as all answer nodes reachable from x have cost c(x) >= ĉ(x) > upper. The starting value for upper can be set to infinity. Each time a new answer node is found, the value of upper can be updated. Clearly, so long as the initial value for upper is no less than the cost of a minimum-cost answer node, the above rule to kill live nodes will not result in the killing of a live node that can reach a minimum-cost answer node.

As an example optimization problem, consider the problem of job scheduling with deadlines. We generalize this problem to allow jobs with different processing times. We are given n jobs and one processor. Each job i has associated with it a three-tuple (pi, di, ti). Job i requires ti units of processing time. If its processing is not completed by the deadline di, then a penalty pi is incurred. The objective is to select a subset J of the n jobs such that all jobs in J can be completed by their deadlines. Hence, a penalty can be incurred only on those jobs not in J. The subset J should be such that the penalty incurred is minimum among all possible subsets J; such a J is optimal.

Consider the following instance: n = 4, (p1, d1, t1) = (5, 1, 1), (p2, d2, t2) = (10, 3, 2), (p3, d3, t3) = (6, 2, 1) and (p4, d4, t4) = (3, 1, 1). The solution space for this instance consists of all possible subsets of the job index set {1, 2, 3, 4}. The solution space can be organized into a tree by means of either of the two formulations used for the sum of subsets problem.
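Because the instance is tiny, the optimal subset can be checked directly. The sketch below (illustrative only, not part of the notes) enumerates every subset J, keeps the feasible ones (a subset is feasible when all of its jobs meet their deadlines if run in non-decreasing deadline order) and reports the minimum penalty; for this instance it prints (8, (2, 3)), i.e. J = {2, 3} with penalty 8.

    from itertools import combinations

    # instance from the notes: job i -> (p_i, d_i, t_i)
    jobs = {1: (5, 1, 1), 2: (10, 3, 2), 3: (6, 2, 1), 4: (3, 1, 1)}

    def feasible(J):
        # all jobs in J meet their deadlines when run earliest-deadline-first
        finish = 0
        for i in sorted(J, key=lambda i: jobs[i][1]):
            finish += jobs[i][2]          # add processing time t_i
            if finish > jobs[i][1]:       # deadline d_i missed
                return False
        return True

    def penalty(J):
        # penalty is the sum of p_i over the jobs left out of J
        return sum(p for i, (p, d, t) in jobs.items() if i not in J)

    best = min((penalty(J), J)
               for r in range(len(jobs) + 1)
               for J in combinations(jobs, r)
               if feasible(J))
    print(best)                           # (8, (2, 3))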

The two figures referred to in the notes show the corresponding solution space trees: one uses the variable tuple size formulation and the other the fixed tuple size formulation. In both figures, square nodes represent infeasible subsets. In the variable tuple size tree, all non-square nodes are answer nodes; node 9 represents an optimal solution and is the only minimum-cost answer node. For this node J = {2, 3} and the penalty (cost) is 8. In the fixed tuple size tree, only non-square leaf nodes are answer nodes; node 25 represents the optimal solution and is also a minimum-cost answer node. This node corresponds to J = {2, 3} and a penalty of 8. The costs of the answer nodes are given below the nodes in each figure.

TRAVELLING SALESMAN PROBLEM

INTRODUCTION:
This is an algorithmic procedure, similar to backtracking, in which a new branch is chosen and bounded, and the search stays with it until a new branch is chosen for advancing. The technique is applied here to the travelling salesman problem [TSP] with an asymmetric cost matrix (Cij need not equal Cji), for which it is an effective procedure.

STEPS INVOLVED IN THIS PROCEDURE ARE AS FOLLOWS:
STEP 0: Generate the cost matrix C [for the given graph G].
STEP 1: [ROW REDUCTION] For all rows, do step 2.
STEP 2: Find the least cost in the row and subtract it from the rest of the elements of that row.
STEP 3: [COLUMN REDUCTION] Using the row-reduced cost matrix, for all columns do step 4.
STEP 4: Find the least cost in the column and subtract it from the rest of the elements of that column.
STEP 5: Preserve the resulting cost matrix (row reduced first and then column reduced) as Ci for the i-th pass.
STEP 6: Enlist all edges (i, j) having cost = 0.
STEP 7: Calculate the effective cost of each such edge: effective cost of (i, j) = least cost in the i-th row excluding (i, j) + least cost in the j-th column excluding (i, j).
STEP 8: Compare all effective costs and pick the largest. If two or more edges have the same effective cost, arbitrarily choose any one among them.
STEP 9: Delete (i, j): that is, delete the i-th row and the j-th column, and change the (j, i) value to infinity (this is used to avoid infinite loop formation). If (j, i) is not present, leave it.
STEP 10: Repeat steps 1 to 9 until the resultant cost matrix is of order 2 x 2, and reduce it.
STEP 11: Use the preserved cost matrices C1, C2, ..., together with the zero-cost edges chosen at each pass, to generate a complete tour.
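The row and column reductions of steps 1 to 4 are mechanical. The following sketch (an illustration, assuming the matrix is held as a list of lists with float('inf') for the forbidden diagonal entries; it is not pseudocode from the notes) performs one full reduction pass in place:

    INF = float('inf')

    def reduce_matrix(cost):
        n = len(cost)
        # STEPS 1-2: row reduction - subtract each row's least cost from that row
        for i in range(n):
            least = min(cost[i])
            if 0 < least < INF:
                cost[i] = [c - least if c < INF else INF for c in cost[i]]
        # STEPS 3-4: column reduction - subtract each column's least cost from that column
        for j in range(n):
            least = min(cost[i][j] for i in range(n))
            if 0 < least < INF:
                for i in range(n):
                    if cost[i][j] < INF:
                        cost[i][j] -= least
        return cost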

EXAMPLE:
STEP 0: The cost matrix C for the given five-city graph, with α (infinity) along the diagonal since a city has no edge to itself.
[5 x 5 cost matrix C of the worked example]

PHASE I
STEP 1-2 (ROW REDUCTION): the least cost in each row is subtracted from the remaining elements of that row.
[row-reduced matrix]
STEP 3-4 (COLUMN REDUCTION): the least cost in each column of the row-reduced matrix is subtracted from the remaining elements of that column.
[column-reduced matrix]

STEP 5: Preserve the reduced matrix as C1.
STEP 6: Enlist all edges (i, j) whose reduced cost is 0; for this matrix the list L includes (1, 2) and (2, 1) among others.
STEP 7: Calculate the effective cost of each edge in L, i.e. the least cost remaining in its row plus the least cost remaining in its column; for example, the effective cost of (1, 2) is 2 + 1 = 3.
STEP 8: Comparing the effective costs, edge (2, 1) has the largest value, so it is chosen.
STEP 9: Delete (2, 1) from C1, i.e. delete row 2 and column 1, and change the (1, 2) entry to α if it exists.
STEP 10: The resulting cost matrix is 4 x 4, not 2 x 2, so go back to step 1.

PHASE II
STEP 1-4: Row reduction and column reduction are applied to the 4 x 4 matrix.
[reduced 4 x 4 matrix]
STEP 5: Preserve the reduced matrix as C2.
STEP 6: Enlist the zero-cost edges of C2.
STEP 7: Calculate their effective costs as in Phase I.
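The step 7 computation is the same in every phase, so it is worth seeing once in code. The sketch below (illustrative; it uses 0-based indices rather than the 1-based city numbers of the notes) finds every zero entry of a reduced matrix together with its effective cost; picking the entry with the largest effective cost is step 8, since, intuitively, leaving that edge out of the tour would force the largest extra cost.

    def effective_costs(cost):
        # effective cost of a zero entry (i, j) = least value in row i excluding
        # column j, plus least value in column j excluding row i
        n = len(cost)
        result = {}
        for i in range(n):
            for j in range(n):
                if cost[i][j] == 0:
                    row_min = min(cost[i][k] for k in range(n) if k != j)
                    col_min = min(cost[k][j] for k in range(n) if k != i)
                    result[(i, j)] = row_min + col_min
        return result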

STEP 8: Among these, edge (4, 5) has the largest effective cost, so it is chosen.
STEP 9: Delete (4, 5) from C2, i.e. delete row 4 and column 5, and change the (5, 4) entry to α if it exists.
STEP 10: The resulting 3 x 3 matrix is not 2 x 2, hence go to step 1.

PHASE III
STEP 1-4: Row reduction and column reduction are applied to the 3 x 3 matrix.
[reduced 3 x 3 matrix]

STEP 5: Preserve the reduced matrix as C3.
STEP 6: Enlist the zero-cost edges of C3.
STEP 7: Calculate their effective costs.
STEP 8: Here two edges, (1, 4) and (5, 3), share the largest effective cost; hence arbitrarily choose (1, 4).
STEP 9: Delete (1, 4) from C3, i.e. delete row 1 and column 4, and change the (4, 1) entry to α if it exists.
STEP 10: We have now got a 2 x 2 matrix, so the repetition stops and this last matrix is reduced.

The reduced 2 x 2 matrix is preserved as C4.

STEP 11: List the preserved matrices C1, C2, C3 and C4.
[the four preserved cost matrices C1, C2, C3 and C4]

To generate the complete tour, pick up from the preserved matrices an edge (i, j) = 0 having the least index, skipping any edge whose row or column index has already been used. Here (3, 2) = 0, hence T = (3, 2). Next (1, 4) = 0, hence T = (3, 2), (1, 4).

Next, (1, 4) has already been chosen and the remaining zero edges in rows 1 and 3 are not possible because those row indices are already used; (4, 5) = 0 is chosen, giving T = (3, 2), (1, 4), (4, 5). Finally, (1, 2) is not possible, but (2, 1) can be chosen, giving T = (3, 2), (1, 4), (4, 5), (2, 1).

SOLUTION:

From the above list the tour is 3 -> 2 -> 1 -> 4 -> 5, and we have to return to the same city where we started (here city 3).

Final result: the tour is 3 -> 2 -> 1 -> 4 -> 5 -> 3, and its cost is the sum of the corresponding entries of the original cost matrix C.
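A small helper makes this last step mechanical. The sketch below (illustrative; the closing edge (5, 3) and the cost lookup are written for the tour above, and the original numeric matrix would have to be supplied to evaluate the cost) chains the selected zero-cost edges into a closed tour and adds up its cost from the unreduced matrix:

    def build_tour(edges, start):
        # chain edges (i -> j) into a closed tour beginning and ending at `start`
        nxt = dict(edges)
        tour = [start]
        while True:
            tour.append(nxt[tour[-1]])
            if tour[-1] == start:
                return tour

    def tour_cost(tour, cost):
        # cities are numbered from 1, the matrix is 0-indexed
        return sum(cost[a - 1][b - 1] for a, b in zip(tour, tour[1:]))

    edges = [(3, 2), (2, 1), (1, 4), (4, 5), (5, 3)]   # selected edges plus the closing edge
    print(build_tour(edges, 3))                        # [3, 2, 1, 4, 5, 3]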