
SUMMARY ASSIGNMENT: STRATEGIC DECISION MAKING AND NEGOTIATION
RAIS KANDAR
29115663
Reading Summary | Chapter 6
THE DECISION TREE
Decision trees can help the decision maker form a clear view of the structure of a problem and make it easier to determine the possible scenarios which can result if a particular course of action is chosen. This can lead to creative thinking and generate options which were not previously considered.
Constructing a Decision Tree
Two symbols are used in decision trees (a minimal code sketch follows):
- A square represents a decision node. Because each branch emanating from this node represents an option, the decision maker can choose which branch to follow.
- A circle represents a chance node. The branches which stem from this sort of node represent the possible outcomes of a given course of action, and the branch which is followed is determined, not by the decision maker, but by circumstances which lie beyond his or her control.
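
As a rough illustration of how these two node types combine, here is a minimal Python sketch (not from the chapter) that represents a small tree and rolls it back to expected monetary values; the product-launch scenario, payoffs, and probabilities are hypothetical.

```python
# Minimal sketch: a decision node (square) choosing between launching a
# product or not, and a chance node (circle) for the market outcome.
# All payoffs and probabilities here are hypothetical.

def expected_value(node):
    """Roll back the tree: average over chance branches, maximize over options."""
    kind = node["type"]
    if kind == "payoff":          # leaf: terminal payoff
        return node["value"]
    if kind == "chance":          # circle: weight each outcome by its probability
        return sum(p * expected_value(child) for p, child in node["branches"])
    if kind == "decision":        # square: the decision maker picks the best option
        return max(expected_value(child) for _, child in node["branches"])
    raise ValueError(f"unknown node type: {kind}")

launch_decision = {
    "type": "decision",
    "branches": [
        ("launch product", {
            "type": "chance",
            "branches": [
                (0.6, {"type": "payoff", "value": 80}),    # strong sales
                (0.4, {"type": "payoff", "value": -30}),   # weak sales
            ],
        }),
        ("do not launch", {"type": "payoff", "value": 0}),
    ],
}

print(expected_value(launch_decision))   # 0.6*80 + 0.4*(-30) = 36, so launching is preferred
```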

Practical Applications of Decision Trees

Ulvila used the technique to help the US Postal Service decide whether to continue with the nine-digit zip code for business users. The analysis was designed to compare the monetary returns which might result from the use of various types of automatic sorting equipment, either with or without the code.

Assessment of Decision Structure


Real Life Decision Problem
There is no obviously right or wrong representation of any real-life problem, and there is no normative technique for showing the structure of a decision problem. It is really a matter of the decision analyst's judgment whether the tree shown is a fair representation of the decision maker's decision problem.

Once a structure is agreed, the computation of expected utility is fairly straightforward. Structuring is therefore a major problem in decision analysis, because if the structuring is wrong, the assessments of utilities and probabilities may be inappropriate and the expected utility computations may be invalid.
Eliciting decision tree representations
One method that helps the decision analyst elicit decision tree representations is the influence diagram, which is designed to summarize the dependencies that exist among events and acts within a decision.
One step-by-step procedure for turning an influence diagram into a decision tree is as follows (a code sketch of this ordering appears after the list):
(1) Identify a node with no arrows pointing into it (since there can be no loops, at least one node will be such).
(2) If there is a choice between a decision node and an event node, choose the decision node.
(3) Place the node at the beginning of the tree and remove the node from the influence diagram.
(4) For the now-reduced diagram, choose another node with no arrows pointing into it. If there is a choice, a decision node should be chosen.
(5) Place this node next in the tree and remove it from the influence diagram.
(6) Repeat the above procedure until all the nodes have been removed from the influence diagram.
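
The following minimal Python sketch (not from the chapter) applies this ordering procedure to a small hypothetical influence diagram; the node names and arrows are assumptions chosen only to illustrate the mechanics, and the diagram is assumed to contain no loops.

```python
# Minimal sketch of the procedure above: repeatedly pick a node with no
# incoming arrows in the (reduced) diagram, preferring decision nodes, and
# append it to the ordering used to build the tree.

def influence_diagram_to_tree_order(nodes, arrows):
    """nodes: name -> 'decision' or 'event'; arrows: set of (source, target)."""
    remaining = dict(nodes)
    order = []
    while remaining:
        # nodes of the reduced diagram with no arrows pointing into them
        candidates = [n for n in remaining
                      if not any(dst == n and src in remaining for src, dst in arrows)]
        # if there is a choice, a decision node is preferred
        decisions = [n for n in candidates if remaining[n] == "decision"]
        chosen = (decisions or candidates)[0]
        order.append(chosen)
        del remaining[chosen]     # remove the node from the influence diagram
    return order

nodes = {"test decision": "decision", "test result": "event",
         "launch decision": "decision", "market outcome": "event"}
arrows = {("test decision", "test result"),
          ("test result", "launch decision"),
          ("launch decision", "market outcome")}
print(influence_diagram_to_tree_order(nodes, arrows))
# ['test decision', 'test result', 'launch decision', 'market outcome']
```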

Reading Summary | Chapter 8


REVISING JUDGMENTS IN THE LIGHT OF NEW INFORMATION
In Bayes' theorem an initial probability estimate is known as a prior probability. When Bayes' theorem is used to modify a prior probability in the light of new information, the result is known as a posterior probability.
BAYES' THEOREM
The steps in applying Bayes' theorem are summarized below (a code sketch of these steps follows the list):

(1) Construct a tree with branches representing all the possible events which can
occur and write the prior probabilities for these events on the branches.
(2) Extend the tree by attaching to each branch a new branch which represents
the new information which you have obtained. On each branch write the
conditional probability of obtaining this information given the circumstance
represented by the preceding branch.
(3) Obtain the joint probabilities by multiplying each prior probability by the
conditional probability which follows it on the tree.
(4) Sum the joint probabilities.
(5) Divide the appropriate joint probability by the sum of the joint probabilities to
obtain the required posterior probability.
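
As an illustration of steps (1)-(5), here is a minimal Python sketch with hypothetical prior and conditional probabilities (a defective-component example that is not taken from the chapter).

```python
# Minimal sketch of steps (1)-(5) with hypothetical figures: suppose 2% of
# components are defective, and a test flags 95% of defective components
# but also (wrongly) flags 10% of the good ones.

priors = {"defective": 0.02, "ok": 0.98}            # step 1: prior probabilities
p_flag_given = {"defective": 0.95, "ok": 0.10}      # step 2: P(test flags | event)

# step 3: joint probabilities (prior x conditional)
joints = {event: priors[event] * p_flag_given[event] for event in priors}

# step 4: sum of the joint probabilities (the overall probability of a flag)
total = sum(joints.values())

# step 5: posterior probabilities
posteriors = {event: joint / total for event, joint in joints.items()}

print(posteriors["defective"])   # about 0.162: most flagged components are still ok
```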
The Expected Value of Perfect Information
In many decision situations it is not possible to obtain perfectly reliable
information, but nevertheless the concept of the expected value of perfect
information (EVPI) can still be useful.
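
A minimal sketch of the standard EVPI calculation may make the concept concrete: EVPI is the expected payoff with perfect information less the best expected payoff obtainable using only the prior probabilities. The two-action, two-state payoff table below is hypothetical, not taken from the chapter.

```python
# Minimal sketch of the EVPI calculation with a hypothetical payoff table.

priors = {"high demand": 0.7, "low demand": 0.3}
payoffs = {                                   # payoffs[action][state]
    "build large plant": {"high demand": 100, "low demand": -40},
    "build small plant": {"high demand": 40, "low demand": 10},
}

# best expected payoff using only the prior probabilities
ev_prior = max(sum(priors[s] * row[s] for s in priors) for row in payoffs.values())

# with perfect information the best action would be chosen for each state
ev_perfect = sum(priors[s] * max(row[s] for row in payoffs.values()) for s in priors)

evpi = ev_perfect - ev_prior
print(ev_prior, ev_perfect, evpi)             # 58.0 73.0 15.0
```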
The Expected Value of Imperfect Information
A summary of the main stages is given below (a code sketch of these stages follows the list):
(1) Determine the course of action which would be chosen using only the prior
probabilities and record the expected payoff of this course of action.
(2) Identify the possible indications which the new information can give.
(3) For each indication:
a. Determine the probability that this indication will occur.
b. Use Bayes' theorem to revise the probabilities in the light of this
indication.
c. Determine the best course of action in the light of this indication (i.e.
using the posterior probabilities) and the expected payoff of this course
of action.
(4) Multiply the probability of each indication occurring by the expected payoff of
the course of action which should be taken if that indication occurs and sum
the resulting products. This will give the expected payoff with imperfect
information.
(5) The expected value of the imperfect information is equal to the expected payoff with imperfect information (derived in stage 4) less the expected payoff of the course of action which would be selected using the prior probabilities (which was derived in stage 1).
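
A minimal Python sketch of stages (1)-(5), reusing the hypothetical payoff table from the EVPI sketch above and adding an imperfect market survey whose reliabilities are also assumed figures:

```python
# Minimal sketch of stages (1)-(5) with the same hypothetical payoff table
# as above plus an imperfect market survey.

priors = {"high demand": 0.7, "low demand": 0.3}
payoffs = {                                   # payoffs[action][state]
    "build large plant": {"high demand": 100, "low demand": -40},
    "build small plant": {"high demand": 40, "low demand": 10},
}
likelihood = {                                # P(survey indication | state)
    "favourable":   {"high demand": 0.8, "low demand": 0.2},
    "unfavourable": {"high demand": 0.2, "low demand": 0.8},
}

def best_expected_payoff(probs):
    """Expected payoff of the best course of action under the given probabilities."""
    return max(sum(probs[s] * row[s] for s in probs) for row in payoffs.values())

# stage 1: best expected payoff using only the prior probabilities
ev_prior = best_expected_payoff(priors)

# stages 2-4: for each indication, find its probability, revise the priors
# with Bayes' theorem, take the expected payoff of the best action given
# that indication, weight by the probability of the indication, and sum
ev_with_info = 0.0
for indication, cond in likelihood.items():
    joints = {s: priors[s] * cond[s] for s in priors}
    p_indication = sum(joints.values())
    posteriors = {s: j / p_indication for s, j in joints.items()}
    ev_with_info += p_indication * best_expected_payoff(posteriors)

# stage 5: expected value of the imperfect information
evii = ev_with_info - ev_prior
print(ev_prior, ev_with_info, evii)           # roughly 58.0, 61.6, 3.6
```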
