NEGOTIATION
RAIS KANDAR
29115663
Reading Summary | Chapter 6
THE DECISION TREE
Decision trees give the decision maker a clear view of the structure of a problem and make it easier to determine the possible scenarios that can result if a particular course of action is chosen. This can stimulate creative thinking and generate options that were not previously considered.
Constructing a Decision Tree
Two symbols are used in decision trees. A square represents a decision node: the branches stemming from it are the courses of action open to the decision maker. A circle represents a chance node: the branches stemming from it are the possible outcomes of a given course of action, and the branch which is followed is determined, not by the decision maker, but by circumstances which lie beyond his or her control.
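A tree built from these two node types can be evaluated by "rolling back" from the payoffs: at a chance node take the probability-weighted average of the branches, and at a decision node take the best branch. The sketch below illustrates this with invented payoffs and probabilities; the dictionary-based tree representation is just one possible encoding.

```python
# Minimal decision-tree rollback sketch (illustrative figures only).
# Decision nodes (squares) take the best branch; chance nodes (circles)
# take the probability-weighted average of their branches.

def rollback(node):
    """Return the expected payoff of a node by backward induction."""
    kind = node["type"]
    if kind == "payoff":
        return node["value"]
    values = [rollback(child) for child in node["branches"]]
    if kind == "decision":          # square: choose the best branch
        return max(values)
    # chance node (circle): expectation over the outcomes
    return sum(p * v for p, v in zip(node["probs"], values))

# Hypothetical example: launch a risky product or keep the status quo.
tree = {
    "type": "decision",
    "branches": [
        {"type": "chance",
         "probs": [0.6, 0.4],
         "branches": [{"type": "payoff", "value": 100},
                      {"type": "payoff", "value": -40}]},
        {"type": "payoff", "value": 20},   # the safe option
    ],
}

print(rollback(tree))  # 0.6*100 + 0.4*(-40) = 44, which beats 20
```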
Ulvila used the technique to help the US Postal Service decide whether to continue with the nine-digit zip code for business users. The analysis was designed to compare the monetary returns that might result from the use of various types of automatic sorting equipment, either with or without the code.
An influence diagram can be converted into a decision tree as follows:
(1) Identify a node with no arrows pointing into it (since there can be no loops, at least one such node will exist).
(2) If there is a choice between a decision node and an event node, choose the decision node.
(3) Place the node at the beginning of the tree and remove it from the influence diagram.
(4) For the now-reduced diagram, choose another node with no arrows pointing into it, again preferring a decision node if there is a choice.
(5) Place this node next in the tree and remove it from the influence diagram.
(6) Repeat the above procedure until all the nodes have been removed from the influence diagram.
Posterior probabilities can be obtained by applying Bayes' theorem on a probability tree:
(1) Construct a tree with branches representing all the possible events which can occur, and write the prior probabilities for these events on the branches.
(2) Extend the tree by attaching to each branch a new branch which represents
the new information which you have obtained. On each branch write the
conditional probability of obtaining this information given the circumstance
represented by the preceding branch.
(3) Obtain the joint probabilities by multiplying each prior probability by the
conditional probability which follows it on the tree.
(4) Sum the joint probabilities.
(5) Divide the appropriate joint probability by the sum of the joint probabilities to
obtain the required posterior probability.
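The five steps above amount to a short calculation. In the sketch below, the prior and conditional probabilities are invented for illustration (a market survey that reports "favourable" before a sales outcome is known):

```python
# The five Bayes steps above, as a direct calculation.
priors = {"high sales": 0.4, "low sales": 0.6}       # step (1)
# step (2): P(survey says "favourable" | event)
likelihood = {"high sales": 0.9, "low sales": 0.2}

# step (3): joint probabilities = prior x conditional probability
joints = {e: priors[e] * likelihood[e] for e in priors}
# step (4): sum of the joints = P(favourable indication)
p_indication = sum(joints.values())
# step (5): posterior = joint / sum of the joints
posteriors = {e: joints[e] / p_indication for e in priors}

print(round(p_indication, 2))   # 0.4*0.9 + 0.6*0.2 = 0.48
print(posteriors["high sales"])  # 0.36 / 0.48 = 0.75
```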
The Expected Value of Perfect Information
In many decision situations it is not possible to obtain perfectly reliable
information, but nevertheless the concept of the expected value of perfect
information (EVPI) can still be useful.
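The EVPI is the difference between the expected payoff if a perfect forecast of the uncertain event were available and the best expected payoff obtainable using the prior probabilities alone. A numeric sketch, with invented payoffs and probabilities:

```python
# EVPI sketch: expected payoff with perfect information minus the best
# expected payoff using only the prior probabilities. Figures are invented.

probs = {"high": 0.4, "low": 0.6}    # prior probabilities of demand
payoffs = {                          # payoffs[action][state]
    "launch": {"high": 100, "low": -40},
    "hold":   {"high": 20,  "low": 20},
}

# best expected payoff under the priors alone
ev = {a: sum(probs[s] * payoffs[a][s] for s in probs) for a in payoffs}
ev_prior = max(ev.values())          # launch: 16, hold: 20 -> 20

# with perfect information we pick the best action for each state
ev_perfect = sum(probs[s] * max(payoffs[a][s] for a in payoffs)
                 for s in probs)     # 0.4*100 + 0.6*20 = 52

evpi = ev_perfect - ev_prior
print(evpi)  # 52 - 20 = 32
```

No imperfect source of information can be worth more than this figure, which is why the EVPI is a useful upper bound even when perfect information is unobtainable.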
The Expected Value of Imperfect Information
A summary of the main stages is given below:
(1) Determine the course of action which would be chosen using only the prior
probabilities and record the expected payoff of this course of action.
(2) Identify the possible indications which the new information can give.
(3) For each indication:
a. Determine the probability that this indication will occur.
b. Use Bayes' theorem to revise the probabilities in the light of this
indication.
c. Determine the best course of action in the light of this indication (i.e.
using the posterior probabilities) and the expected payoff of this course
of action.
(4) Multiply the probability of each indication occurring by the expected payoff of
the course of action which should be taken if that indication occurs and sum
the resulting products. This will give the expected payoff with imperfect
information.
(5) The expected value of the imperfect information is equal to the expected
payoff with imperfect information (derived in stage 4) less the expected
payoff of the course of action which would be selected using the prior
probabilities (which was derived in stage 1).
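The five stages can be worked through with invented numbers. Here a hypothetical survey gives a "favourable" or "unfavourable" indication before the launch decision; the priors and payoffs match the EVPI sketch so the two values can be compared:

```python
# The five EVII stages above, with invented probabilities and payoffs.
probs = {"high": 0.4, "low": 0.6}                 # prior probabilities
payoffs = {"launch": {"high": 100, "low": -40},
           "hold":   {"high": 20,  "low": 20}}
# P(indication | state) for the hypothetical survey
likelihood = {"favourable":   {"high": 0.9, "low": 0.2},
              "unfavourable": {"high": 0.1, "low": 0.8}}

def best_ev(p):
    """Expected payoff of the best course of action under probabilities p."""
    return max(sum(p[s] * payoffs[a][s] for s in p) for a in payoffs)

ev_prior = best_ev(probs)                         # stage (1): 20

ev_with_info = 0.0
for ind in likelihood:                            # stage (2): each indication
    joints = {s: probs[s] * likelihood[ind][s] for s in probs}
    p_ind = sum(joints.values())                  # stage (3a)
    posteriors = {s: joints[s] / p_ind for s in probs}  # (3b): Bayes
    ev_with_info += p_ind * best_ev(posteriors)   # (3c) and (4)

evii = ev_with_info - ev_prior                    # stage (5)
print(round(evii, 2))
```

With these figures the EVII comes to 21.6, below the EVPI of 32 for the same problem, as it must be: imperfect information can never be worth more than perfect information.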