
DS424 – Multi-objective Programming – Prof. Dr. Tarek H.M. Abou-El-Enien

Notes on Lesson (2)


Methods for Multiple Objective Decision Making
Methods for No Articulation of Preference Information
- Methods following this approach do not need any inter-objective or other subjective
preference information from the DM once the problem constraints and objectives have been
defined.
- This approach requires that the DM be willing to accept the solution obtained.
- The advantage of this route is that, in the process of obtaining the solution, the DM is not
disturbed by the analyst, which is preferable from the DM's point of view.
- A major disadvantage is that the analyst must make many assumptions about the DM's
preferences.

Global Criterion Method

- Consider the following multi-objective programming problem:


\[
\text{Maximize } [f_1(X), f_2(X), \ldots, f_k(X)]
\quad \text{subject to} \quad g_i(X) \le 0, \; i = 1, 2, \ldots, m
\tag{1}
\]
where 𝑋 is an 𝑛-dimensional decision variable vector. The problem consists of 𝑛 decision
variables, 𝑚 constraints and 𝑘 objectives. Any or all of the functions may be nonlinear. In the
literature this problem is often referred to as the vector maximum problem (VMP) or the
vector optimization problem (VOP).


The algorithm (Alg-I):


Step (0):
Consider the following multi-objective programming problem:
\[
\text{Maximize } [f_1(X), f_2(X), \ldots, f_k(X)]
\quad \text{subject to} \quad g_i(X) \le 0, \; i = 1, 2, \ldots, m
\tag{1}
\]
Step (1):
Solve the following 𝑘 single-objective problems:
\[
\text{Maximize } f_j(X)
\quad \text{subject to} \quad g_i(X) \le 0, \; i = 1, 2, \ldots, m,
\qquad j = 1, 2, \ldots, k
\tag{2}
\]

to obtain the ideal solutions, 𝑋𝑗∗ , 𝑗 = 1,2, … , 𝑘.


Step (2):
Construct the pay-off table.
Step (3):
Solve the following problem for 𝑝 = 1, 2, …, etc.:
\[
\text{Minimize } \sum_{j=1}^{k} \left( \frac{f_j(X_j^*) - f_j(X)}{f_j(X_j^*)} \right)^{p}
\quad \text{subject to} \quad g_i(X) \le 0, \; i = 1, 2, \ldots, m
\tag{3}
\]
to find the preferred (best compromise) solution.
Note:
Problem (3) minimizes the sum of the 𝑝-th powers of the relative deviations of the criteria
(objective functions) from their ideal values; for 𝑝 = 2 this is the sum of squared relative deviations.
Step (4): Stop.
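
Alg-I can be summarized in code as follows. This is only a minimal sketch: the function name global_criterion, the choice of scipy.optimize.minimize with SLSQP as the underlying solver, and the returned values are illustrative assumptions, not part of the lesson.

# Minimal sketch of Alg-I with SciPy (solver choice and interface are assumptions).
import numpy as np
from scipy.optimize import minimize

def global_criterion(objectives, constraints, x0, p=2):
    """objectives  -- list of callables f_j(x) to be maximized
       constraints -- list of SciPy constraint dicts {'type': 'ineq', 'fun': ...}
       x0          -- starting point for the local solver
       p           -- exponent in the global criterion (p = 1, 2, ...)"""
    # Step (1): solve the k single-objective problems (2) for the ideal solutions X_j*
    ideal_x = []
    for f in objectives:
        res = minimize(lambda x, f=f: -f(x), x0, method='SLSQP',
                       constraints=constraints)
        ideal_x.append(res.x)

    # Step (2): pay-off table -- row j evaluates every objective at X_j*
    payoff = np.array([[f(xj) for f in objectives] for xj in ideal_x])
    ideal_vals = np.diag(payoff)          # f_j(X_j*), the ideal values

    # Step (3): minimize the sum of p-th powers of relative deviations, problem (3)
    def criterion(x):
        return sum(((fs - f(x)) / fs) ** p
                   for f, fs in zip(objectives, ideal_vals))

    best = minimize(criterion, x0, method='SLSQP', constraints=constraints)
    return best.x, payoff

For problem (4) below, for instance, objectives would be [lambda x: 0.4*x[0] + 0.3*x[1], lambda x: x[0]], and the two constraints would be written as 'ineq' dictionaries in the SciPy convention g(x) ≥ 0.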

Example:
𝑀𝑎𝑥𝑖𝑚𝑖𝑧𝑒 𝑓1 (𝑋) = 0.4𝑥1 + 0.3𝑥2
𝑀𝑎𝑥𝑖𝑚𝑖𝑧𝑒 𝑓2 (𝑋) = 𝑥1
𝑠𝑢𝑏𝑗𝑒𝑐𝑡 𝑡𝑜 (4)
𝑥1 + 𝑥2 ≤ 400,
2𝑥1 + 𝑥2 ≤ 500,
𝑥1 , 𝑥2 ≥ 0.
Use the Global Criterion method to solve the above problem for 𝑝 = 1 and 𝑝 = 2.
Solution:
Step (1): Obtain the ideal solutions
(I)
𝑀𝑎𝑥𝑖𝑚𝑖𝑧𝑒 𝑓1 (𝑋) = 0.4𝑥1 + 0.3𝑥2
𝑠𝑢𝑏𝑗𝑒𝑐𝑡 𝑡𝑜
𝑥1 + 𝑥2 ≤ 400,
2𝑥1 + 𝑥2 ≤ 500,
𝑥1 , 𝑥2 ≥ 0.

The ideal solution is: 𝑥1∗ = 100, 𝑥2∗ = 300, 𝑓1 (𝑋 ∗ ) = 130.


(II)
𝑀𝑎𝑥𝑖𝑚𝑖𝑧𝑒 𝑓2 (𝑋) = 𝑥1
𝑠𝑢𝑏𝑗𝑒𝑐𝑡 𝑡𝑜
𝑥1 + 𝑥2 ≤ 400,
2𝑥1 + 𝑥2 ≤ 500,
𝑥1 , 𝑥2 ≥ 0.

The ideal solution is: 𝑥1∗ = 250, 𝑥2∗ = 0, 𝑓2 (𝑋 ∗ ) = 250.
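
As a quick check of these two ideal solutions (my own addition, not part of the lesson), both LPs can be solved with scipy.optimize.linprog; since linprog minimizes, each objective is negated.

from scipy.optimize import linprog

A_ub = [[1, 1], [2, 1]]          # x1 + x2 <= 400,  2*x1 + x2 <= 500
b_ub = [400, 500]
bounds = [(0, None), (0, None)]  # x1, x2 >= 0

# (I)  Maximize f1 = 0.4*x1 + 0.3*x2
res1 = linprog(c=[-0.4, -0.3], A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res1.x, -res1.fun)         # -> [100. 300.]  130.0

# (II) Maximize f2 = x1
res2 = linprog(c=[-1, 0], A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res2.x, -res2.fun)         # -> [250.   0.]  250.0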



Step (2): Construct the pay-off table

                 f₁     f₂     x₁     x₂
X₁* (max f₁)    130    100    100    300
X₂* (max f₂)    100    250    250      0

Step (3): Obtain the preferred solution


Case (I): 𝑝 = 1
\[
\text{Minimize } \left[ \frac{130 - (0.4x_1 + 0.3x_2)}{130} + \frac{250 - x_1}{250} \right]
= \text{Minimize } \left[ 2 - 0.00708x_1 - 0.00231x_2 \right]
\]
\[
\text{subject to} \quad x_1 + x_2 \le 400, \quad 2x_1 + x_2 \le 500, \quad x_1, x_2 \ge 0.
\]

The preferred solution is 𝑥1∗ = 250, 𝑥2∗ = 0, 𝑓1 (𝑋 ∗ ) = 100, 𝑓2 (𝑋 ∗ ) = 250.
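
Because the 𝑝 = 1 criterion is linear, this case is again an LP. The brief check below is my own addition, not from the lesson: it drops the constant 2 and maximizes 0.00708x₁ + 0.00231x₂ with scipy.optimize.linprog.

from scipy.optimize import linprog

# Minimize 2 - 0.00708*x1 - 0.00231*x2  <=>  maximize 0.00708*x1 + 0.00231*x2
res = linprog(c=[-0.00708, -0.00231],
              A_ub=[[1, 1], [2, 1]], b_ub=[400, 500],
              bounds=[(0, None), (0, None)])
print(res.x, 2 + res.fun)        # -> [250.   0.]  and criterion value ~0.23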


Case (II): 𝑝 = 2
\[
\text{Minimize } \left[ \left( \frac{130 - (0.4x_1 + 0.3x_2)}{130} \right)^{2} + \left( \frac{250 - x_1}{250} \right)^{2} \right]
\]
\[
\text{subject to} \quad x_1 + x_2 \le 400, \quad 2x_1 + x_2 \le 500, \quad x_1, x_2 \ge 0.
\]

The preferred solution is 𝑥1∗ = 230.7, 𝑥2∗ = 38.6, 𝑓1 (𝑋 ∗ ) = 103.9, 𝑓2 (𝑋 ∗ ) = 230.7.
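
For 𝑝 = 2 the criterion is a convex quadratic, so a general nonlinear solver is needed. The sketch below reproduces the preferred solution with scipy.optimize.minimize; the SLSQP method and the starting point are my own choices, not part of the lesson.

from scipy.optimize import minimize

def criterion(x):
    d1 = (130 - (0.4 * x[0] + 0.3 * x[1])) / 130   # relative deviation of f1
    d2 = (250 - x[0]) / 250                        # relative deviation of f2
    return d1**2 + d2**2

cons = [{'type': 'ineq', 'fun': lambda x: 400 - x[0] - x[1]},
        {'type': 'ineq', 'fun': lambda x: 500 - 2 * x[0] - x[1]}]

res = minimize(criterion, x0=[100, 100], method='SLSQP',
               constraints=cons, bounds=[(0, None), (0, None)])
print(res.x)   # approximately [230.7, 38.6], matching the preferred solution above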
