
Thomas Sharkey

April 23, 2013

Background

We have previously seen the core idea for constraint generation methods: we have a large number of constraints in the problem. We solve a relaxation of the problem with a reduced number of constraints. We then determine if the optimal solution to the relaxation satisfies all the constraints in the original problem. If so, then the current solution is optimal to the original problem. Otherwise, we add a subset of the violated constraints to our relaxation. Note that it actually may be beneficial to formulate the problem with a large number of constraints; this is a similar idea to the set-partitioning formulation of the GAP. We will see a method very similar to Dantzig-Wolfe decomposition.

Benders Decomposition: Traditional Motivation

Suppose that we are running a scheduling system where we have a set of customers j = 1, ..., n and a set of facilities i = 1, ..., m. In certain situations, the assignment of the customers to facilities (i.e., machines) must be done well before the characteristics of the jobs are known with certainty. However, unlike previous models discussed in class, the processing times of the jobs are uncertain. The actual scheduling of the jobs associated with the customers assigned to a facility, though, can be done after the characteristics of the jobs are known. In this situation, we wish to allocate the customers to the facilities in such a way as to minimize our expected costs over all possible scenarios of the characteristics of the jobs.

Benders Decomposition: Stochastic Scheduling

We will let x denote the 'first stage' decisions, i.e., the assignment of customers to facilities. We will denote y_k as the second stage scheduling decisions for scenario k (where k = 1, ..., K). Note that the 'cost' of the second stage decisions will be a function of (i) the first stage decisions x and (ii) the characteristics of the jobs. We will have 'first stage constraints' and 'second stage constraints.' Our objective function will be equal to:

    minimize  c⊤x + Σ_{k=1}^K ω_k f⊤y_k.

Benders Decomposition Formulation

The formulation of our problem becomes:

    minimize    c⊤x + Σ_{k=1}^K ω_k f⊤y_k              (P)
    subject to  Ax = b
                B_k x + D y_k = d_k   for k = 1, ..., K
                x ∈ X,  y_k ∈ Y.

The Idea

We will try to remove the variables y_k from the formulation. This is accomplished by introducing constraints on the x variables that bound the objective function (or some portion of the objective function) from the problem. We will define the function z_k(x) to be equal to the optimal objective function value in scenario k given first stage decisions x. We will see how we can reformulate (P). For now, let's assume that the y variables are continuous and that there always exists a feasible second-stage solution given any first stage solution x.

First Reformulation

The problem (P) now becomes:

    minimize    c⊤x + Σ_{k=1}^K ω_k z_k(x)             (P1)
    subject to  Ax = b
                x ∈ X.

The Second-Stage Problem

    minimize    f⊤y_k                                  (SP(k))
    subject to  D y_k = d_k − B_k x
                y_k ≥ 0.

Its dual problem is:

    maximize    p_k⊤(d_k − B_k x)                      (D(k))
    subject to  p_k⊤D ≤ f⊤.
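To make the primal-dual pair concrete, here is a small hypothetical instance (data invented for illustration, not from the slides): one equality constraint, two second-stage variables, and right-hand side r = d_k − B_k x assumed nonnegative. Both problems are small enough to solve by enumeration, and their optimal values coincide, as linear programming duality guarantees.

```python
# Hypothetical tiny instance of (SP(k)) and its dual (D(k)).
# Primal: min f'y  s.t.  Dy = r, y >= 0, with a single constraint row,
# so the dual has one variable p: max p*r  s.t.  p*D_j <= f_j for all j.
f = [2.0, 3.0]   # second-stage cost vector (invented data)
D = [1.0, 1.0]   # single-row constraint matrix
r = 5.0          # right-hand side d_k - B_k x, assumed >= 0

# Primal: with one constraint, each basic feasible solution sets a single
# variable y_j = r / D_j; take the cheapest one.
primal_opt = min(f[j] * (r / D[j]) for j in range(len(f)))

# Dual: p*r is maximized (for r >= 0) at the largest feasible p,
# namely p = min_j f_j / D_j.
dual_opt = min(f[j] / D[j] for j in range(len(f))) * r

print(primal_opt, dual_opt)   # both equal 10.0
```

In the Benders setting, the dual side is the useful one: its feasible region {p : p⊤D ≤ f⊤} does not depend on x, so its extreme points can be reused as x changes.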

Getting to the Second Reformulation

By the definition of z_k(x) and linear programming duality, we can express z_k(x) as the optimal objective value of (D(k)). We define EP to be the set of extreme points of the feasible region of (D(k)) and note that:

    z_k(x) = max_{i ∈ EP} (p^i)⊤(d_k − B_k x).

Alternatively, z_k(x) is the smallest value of z_k such that:

    (p^i)⊤(d_k − B_k x) ≤ z_k   for all i ∈ EP.
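For intuition, consider a hypothetical one-dimensional scenario (invented data): d_k = 4, B_k = 1, and dual extreme points EP = {0, 1}, which arise from a subproblem of the form min y_k s.t. y_k ≥ d_k − B_k x, y_k ≥ 0. Taking the maximum over EP reproduces z_k(x) = max(0, 4 − x), a piecewise-linear convex function of x.

```python
# Hypothetical one-dimensional illustration of
#   z_k(x) = max_{i in EP} (p^i)(d_k - B_k x).
d_k, B_k = 4.0, 1.0
EP = [0.0, 1.0]   # dual extreme points of (D(k)) for this toy instance

def z_k(x):
    """Second-stage value expressed through the dual extreme points."""
    return max(p * (d_k - B_k * x) for p in EP)

for x in [0.0, 2.0, 4.0, 6.0]:
    print(x, z_k(x))   # 4.0, 2.0, 0.0, 0.0 -- i.e. max(0, 4 - x)
```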

Second Reformulation

The problem (P1) now becomes:

    minimize    c⊤x + Σ_{k=1}^K ω_k z_k               (P2)
    subject to  Ax = b
                (p^i)⊤(d_k − B_k x) ≤ z_k   for all i ∈ EP, k = 1, ..., K
                x ∈ X.

Constraint Generation Approach/Cutting Plane Algorithm

We will relax (P2) by removing the extreme point constraints; we refer to this relaxation as (R-P2). We will then solve (R-P2) over the relaxed set of constraints and arrive at a solution (x*, z*). We then need to either verify that this solution is optimal or determine which constraints (i.e., cuts) need to be added to the problem.
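The relax/solve/check loop just described can be sketched for a hypothetical single-scenario instance (all data invented): the master problem (R-P2) is solved by enumerating a small finite candidate set for x, the subproblem value z(x) = max(0, d − x) is available in closed form, and its dual extreme points are {0, 1}. With z assumed bounded below by 0 in the master, the loop terminates once no cut is violated.

```python
# Cutting-plane sketch for a hypothetical one-scenario Benders instance.
c, d = 1.0, 4.0
candidates = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]   # assumed feasible set X
cuts = []                                      # generated dual points p^i

def solve_master():
    """Solve (R-P2) by enumeration: z is the largest generated cut value."""
    best = None
    for x in candidates:
        z = max([p * (d - x) for p in cuts], default=0.0)  # 0.0: assumed lower bound on z
        if best is None or c * x + z < best[2]:
            best = (x, z, c * x + z)
    return best

while True:
    x_star, z_star, obj = solve_master()
    sub_val = max(0.0, d - x_star)            # subproblem optimum z(x*)
    if sub_val <= z_star + 1e-9:              # no violated cut: (x*, z*) optimal
        break
    p_star = 1.0 if d - x_star > 0 else 0.0   # optimal dual extreme point
    cuts.append(p_star)                        # adds the cut z >= p*(d - x)

print(x_star, z_star, obj)   # an optimal solution with objective 4.0
```

Each pass adds the dual extreme point that certifies the violation; since the dual feasible region has finitely many extreme points, the loop terminates finitely.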

The "Pricing" Problem

We have the option of solving either the primal or dual problem associated with z_k(x*). The primal problem will give us an optimal solution y_k*. It is easy to verify if this solution has an objective function value greater than z_k*; we then would know that a constraint needs to be added to (R-P2), but it is not immediately clear which one. Recall that there is a complementary dual solution to y_k* that is optimal to the dual. The dual problem will immediately give us the constraint to be added to (R-P2), i.e., the one that is violated by (x*, z*).
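A hypothetical illustration of why the dual is the convenient side (invented data): subproblem value z(x) = max(0, d − x) with d = 4 and dual extreme points {0, 1}. At a relaxation solution (x*, z*) = (0, 0), the primal only reveals that z(x*) = 4 > z* = 0; the optimal dual point p* identifies the violated constraint itself.

```python
# Pricing via the dual for a hypothetical one-dimensional subproblem.
d = 4.0
x_star, z_star = 0.0, 0.0   # assumed current solution of (R-P2)

# Solve the dual by inspecting its two extreme points {0, 1}.
p_star = max((0.0, 1.0), key=lambda p: p * (d - x_star))
dual_val = p_star * (d - x_star)

if dual_val > z_star + 1e-9:
    # The cut to add to (R-P2):  z >= p_star * (d - x)
    print("violated cut: z >= %.1f * (d - x)" % p_star)
```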

The Relaxation at Any Point

    minimize    c⊤x + Σ_{k=1}^K ω_k z_k               (R-P2)
    subject to  Ax = b
                constraints on (portions of) the objective function
                x ∈ X.

A Generic Problem for Benders Decomposition

    minimize    f(x, y)                                (P)
    subject to  the constraints C(x, y) are satisfied
                x ∈ D_x,  y ∈ D_y.

Generic Relaxation

    minimize    z                                      (R-P2)
    subject to  z ≥ B_{x^h}(x)   for h = 1, ..., H
                x ∈ D_x,

where x^h is a previous optimal solution to the relaxation and B_{x^h}(x) is a cut on the objective function.
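In code, the generic relaxation suggests an interface in which each cut B_{x^h} is simply a function of x. The sketch below is hypothetical: D_x is taken as a small finite set so the relaxation can be solved by enumeration, the "true" objective is v(x) = (x − 3)², and the cuts are first-order (tangent) underestimates, which are valid because v is convex.

```python
# Generic Benders skeleton: cuts are callables bounding z from below.
D_x = [0, 1, 2, 3, 4, 5]   # assumed finite domain for x
cuts = []                   # list of cut functions B_{x^h}

def v(x):
    return (x - 3) ** 2     # hypothetical black-box objective

def subgrad(x):
    return 2 * (x - 3)      # its subgradient, used to build cuts

def solve_relaxation():
    """min z  s.t.  z >= B_h(x) for every generated cut, x in D_x."""
    return min((max((B(x) for B in cuts), default=0.0), x) for x in D_x)

z, x_h = solve_relaxation()
while v(x_h) > z + 1e-9:    # the relaxation still underestimates v at x^h
    xh, vh, gh = x_h, v(x_h), subgrad(x_h)
    cuts.append(lambda x, xh=xh, vh=vh, gh=gh: vh + gh * (x - xh))
    z, x_h = solve_relaxation()

print(x_h, z)   # terminates at the true minimizer x = 3 with z = 0
```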

The Cuts are Important

The driving factor in the success of a Benders decomposition is the quality of the cuts. Good cuts are readily available when the subproblems for a fixed x are continuous problems. But scheduling problems are often discrete (i.e., involving integer variables) and/or combinatorial. The question now becomes how we can develop quality cuts for these types of problems.

On constraint generation methods
