
Decision trees and influence diagrams

Decision trees can serve a number of purposes when complex multi-stage problems are encountered. They can help a decision maker to develop a clear view of the structure of a problem and make it easier to determine the possible scenarios which can result if a particular course of action is chosen. Decision trees can also help a decision maker to judge the nature of the information which needs to be gathered in order to tackle a problem and, because they are generally easy to understand, they can be an excellent medium for communicating one person's perception of a problem to other individuals.
Constructing a decision tree
The decision problem is now perceived to have two stages. At stage one a decision has to be made between developing one of the designs or not proceeding with development at all. At stage two a decision may have to be made on whether the design should be modified.
Determining the optimal policy
A policy is a plan of action stating which option is to be chosen at each decision node that might be reached under that policy. The decision tree can be used to identify the optimal policy. The technique for determining the optimal policy in a decision tree is known as the rollback method. To apply this method, we analyze the tree from right to left by considering the later decisions first. The rollback method allows a complex decision problem to be analyzed as a series of smaller decision problems.

Decision trees involving continuous probability distributions


In the decision problem we considered above there were only two possible outcomes for each course of action, namely success and failure. In many decision trees the probability distributions will be dependent. There are clear advantages in using the EP-T approximation. Above all, it is simple: each distribution requires only three estimates to be made, which has the obvious effect of reducing the decision maker's judgmental task.
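The three estimates in question are percentiles of the continuous distribution. As a sketch, assuming the extended Pearson-Tukey form of the approximation (5th, 50th and 95th percentiles weighted 0.185, 0.63 and 0.185, the weights usually quoted for EP-T), the expected value is a simple weighted sum:

```python
# Extended Pearson-Tukey (EP-T) three-point approximation: replace a
# continuous distribution by its 5th, 50th and 95th percentiles with
# probabilities 0.185, 0.63 and 0.185 (the commonly quoted EP-T weights).
def ept_expected_value(p5, p50, p95):
    return 0.185 * p5 + 0.63 * p50 + 0.185 * p95

# Illustrative judgmental estimates for an uncertain cost
# (hypothetical numbers, not from the text):
print(ept_expected_value(100, 150, 240))  # 157.4
```

The three-point discretization also lets a continuous quantity sit on an ordinary event node with three branches, which is why it combines so naturally with decision trees.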

Practical applications of decision trees

The EP-T approximation was used to represent probability distributions of the extent to which the code would be used, and the savings which would result from the various options in the tree. Decision trees are the major analytical structures underlying the application of decision analysis to problems involving uncertainty.

Assessment of decision structure

Structuring is therefore a major problem in decision analysis, for if the structuring is wrong then it is a necessary consequence that assessments of utilities and probabilities may be inappropriate and the expected utility computations may be invalid.

Decision trees constructed early in the analyst/decision maker interaction may be incomplete representations of the decision problem facing the decision maker.

Eliciting decision tree representations

To complete the tree, the possible choices at each decision node and the possible
events at each event node must now be specified. Very complex decision trees
can be represented as one-page influence diagrams.

Applying simulation to decision problems

Monte Carlo simulation

Applying simulation to a decision problem

The application of simulation to a problem like this involves the following stages:
(1) Identify the factors that will affect the payoffs of each course of action.
(2) Formulate a model to show how the factors are related.
(3) Carry out a preliminary sensitivity analysis to establish the factors
for which probability distributions should be assessed.
(4) Assess probability distributions for the factors which were identified
in stage 3.
(5) Perform the simulation.
(6) Apply sensitivity analysis to the results of the simulation.
(7) Compare the simulation results for the alternative courses of action
and use these to identify the preferred course of action.
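Stages (2), (4) and (5) can be sketched in Python. The model and the distributions below are hypothetical, invented purely to show the mechanics: payoff is driven by an uncertain unit cost and uncertain sales, each given a judgmental triangular distribution, and the simulation builds up a payoff distribution by repeated sampling.

```python
import random

random.seed(1)  # fixed seed so repeated runs give the same result

# Stage 2 (hypothetical model): payoff depends on uncertain unit cost
# and sales; price and fixed cost are treated as known.
def simulate_payoff():
    # Stage 4: judgmental distributions for the uncertain factors.
    unit_cost = random.triangular(20, 30, 24)        # low, high, mode
    sales = random.triangular(8_000, 15_000, 10_000)
    price, fixed_cost = 35.0, 60_000
    return (price - unit_cost) * sales - fixed_cost

# Stage 5: run many trials to build up the payoff distribution.
payoffs = [simulate_payoff() for _ in range(10_000)]
mean = sum(payoffs) / len(payoffs)
print(f"mean simulated payoff: {mean:,.0f}")
```

Stage (6), sensitivity analysis, would then rerun the simulation with the judgmental inputs perturbed to see how much the payoff distribution moves.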

Plotting the two distributions

By inspecting graphs of the two probability distributions the decision maker can
compare the probabilities of each product making a loss or the probabilities that
each product would reach a target level of profit.
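The same comparisons can be read directly off simulated payoff samples rather than off the plotted curves. A minimal sketch, with purely illustrative samples for two hypothetical products:

```python
# Risk measures from simulated payoff samples: the probability of a
# loss and the probability of reaching a target profit level.
def risk_profile(samples, target):
    n = len(samples)
    return {
        "P(loss)": sum(s < 0 for s in samples) / n,
        "P(profit >= target)": sum(s >= target for s in samples) / n,
    }

product_a = [-20, 10, 40, 60, 90]   # illustrative payoff samples
product_b = [5, 15, 25, 35, 45]

print(risk_profile(product_a, target=50))
print(risk_profile(product_b, target=50))
```

Here product A offers a chance of reaching the target but also a chance of a loss, while product B offers neither, which is exactly the kind of trade-off the plotted distributions are meant to expose.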

Determining the option with the highest expected utility

Utility theory can be used to identify the option which the decision maker should choose.

Stochastic dominance

Stochastic dominance can be recognized by plotting the cumulative probability distribution functions (CDFs). First- and second-degree stochastic dominance are the two most useful forms which the CDFs can reveal.
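First-degree stochastic dominance has a simple computational test: option A dominates option B if A's CDF never lies above B's. A minimal sketch using empirical CDFs built from illustrative payoff samples (the numbers are hypothetical):

```python
# Empirical CDF: fraction of samples at or below x.
def cdf(samples, x):
    return sum(s <= x for s in samples) / len(samples)

# First-degree stochastic dominance: a dominates b if a's CDF is
# nowhere above b's (a gives at least as good a chance of exceeding
# every payoff level).
def first_degree_dominates(a, b):
    grid = sorted(set(a) | set(b))
    return all(cdf(a, x) <= cdf(b, x) for x in grid)

b = [10, 20, 30, 40]
a = [20, 30, 40, 50]   # a is b shifted up by 10, so a dominates b

print(first_degree_dominates(a, b))  # True
print(first_degree_dominates(b, a))  # False
```

When one CDF lies entirely to the right of the other, as here, any decision maker who prefers more payoff to less should prefer the dominating option, which is why the plot alone can settle the choice.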

The mean-standard deviation approach

When a decision problem involves a large number of alternative courses of action it is helpful if inferior options can be screened out at an early stage. In these situations the mean-standard deviation approach can be useful.
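One common reading of the screening rule (a sketch, not necessarily the text's exact formulation) is: discard any option for which some other option has at least as high a mean payoff and no higher a standard deviation, with at least one of the two comparisons strict. With illustrative payoff samples:

```python
from statistics import mean, stdev

# Mean-standard deviation screening: keep only options that are not
# dominated (another option with mean >= and stdev <=, one strict).
def efficient_set(options):
    stats = {name: (mean(s), stdev(s)) for name, s in options.items()}
    keep = {}
    for name, (m, sd) in stats.items():
        dominated = any(
            (m2 >= m and sd2 <= sd) and (m2 > m or sd2 < sd)
            for other, (m2, sd2) in stats.items() if other != name
        )
        if not dominated:
            keep[name] = stats[name]
    return keep

options = {                 # illustrative payoff samples
    "A": [40, 60, 80],      # mean 60, stdev 20
    "B": [30, 60, 90],      # mean 60, stdev 30 -> dominated by A
    "C": [50, 90, 130],     # mean 90, stdev 40 -> higher mean, higher risk
}
print(sorted(efficient_set(options)))  # ['A', 'C']
```

B is screened out because A offers the same mean with less spread; A and C survive because neither beats the other on both criteria, so the final choice between them still needs the decision maker's risk attitude.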

Applying simulation to investment decisions


The net present value (NPV) method

The NPV figures are obviously only as good as the estimates on which the calculations are based. In general, there will be uncertainty about the size of the future cash flows and about the lifetime of the project.
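The NPV calculation itself is a one-liner. A sketch with a hypothetical discount rate and cash-flow stream:

```python
# NPV of a cash-flow stream: flows[0] occurs now, flows[t] after t years.
def npv(rate, flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

# Illustrative project: 1000 outlay now, then 400 a year for three years,
# discounted at 10%.
print(round(npv(0.10, [-1000, 400, 400, 400]), 2))  # -5.26
```

A marginally negative NPV like this is exactly the kind of result the surrounding text warns about: a small error in any estimated cash flow, or one extra year of project life, would flip the sign, which is what motivates simulating the inputs rather than trusting point estimates.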

Using simulation

The simulation approach can be extended to handle investment problems involving sequences of decisions using a method known as stochastic decision tree analysis.

Utility and net present value

If either of the NPV assumptions is seriously violated then the NPV will not accurately represent the decision maker's preferences between sums of money arriving at different points in time.

Modeling dependence relationships

Revising judgments in the light of new information

Bayes' theorem

In Bayes' theorem an initial probability estimate is known as a prior probability. When Bayes' theorem is used to modify a prior probability in the light of new information the result is known as a posterior probability.
The steps in the process which we have just applied are summarized
below:
(1) Construct a tree with branches representing all the possible events
which can occur and write the prior probabilities for these events on
the branches.
(2) Extend the tree by attaching to each branch a new branch which
represents the new information which you have obtained. On each
branch write the conditional probability of obtaining this information
given the circumstance represented by the preceding branch.
(3) Obtain the joint probabilities by multiplying each prior probability
by the conditional probability which follows it on the tree.
(4) Sum the joint probabilities.
(5) Divide the appropriate joint probability by the sum of the joint
probabilities to obtain the required posterior probability.
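The five steps above can be sketched directly in code. The events and the numbers below are hypothetical, chosen only to show the arithmetic: priors on two sales outcomes, and the conditional probability of receiving a favorable indication under each outcome.

```python
# Bayes' theorem via the tree steps above (illustrative numbers).
priors = {"high sales": 0.4, "low sales": 0.6}          # step 1
likelihood = {"high sales": 0.8, "low sales": 0.3}      # step 2: P(info | event)

# Step 3: joint probabilities (prior x conditional along each path).
joint = {e: priors[e] * likelihood[e] for e in priors}

# Step 4: sum of the joint probabilities (probability of the information).
total = sum(joint.values())

# Step 5: divide each joint probability by the sum.
posterior = {e: joint[e] / total for e in priors}
print(posterior)  # {'high sales': 0.64, 'low sales': 0.36}
```

Notice how the favorable indication shifts the probability of high sales from 0.4 to 0.64: the prior is revised, not replaced.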

The effect of new information on the revision of probability judgments

Even if the new information has less than a 0.5 chance of being reliable, its result is still of interest, and the more unreliable it is, the greater the effect it will have on the prior probability.

Applying Bayes' theorem to a decision problem

We will now consider the application of Bayes' theorem to a decision problem: a process which is sometimes referred to as posterior analysis. This simply involves the use of the posterior probabilities, rather than the prior probabilities, in the decision model.
A retailer has to decide whether to hold a large

Assessing the value of new information

New information can remove or reduce the uncertainty involved in a decision and thereby increase the expected payoff.

The expected value of perfect information

In many decision situations it is not possible to obtain perfectly reliable information, but the concept of the expected value of perfect information (EVPI) can still be useful, since the EVPI gives an upper bound to the value of any new information.
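The EVPI is the difference between the expected payoff a decision maker could achieve by always choosing the best action for the true state, and the expected payoff of the best action under the prior probabilities alone. A sketch with a hypothetical payoff table (the numbers are illustrative):

```python
# EVPI from a payoff table: actions x states, with prior probabilities.
payoff = {
    "large stock": {"high demand": 500, "low demand": -100},
    "small stock": {"high demand": 200, "low demand": 100},
}
prior = {"high demand": 0.6, "low demand": 0.4}

# Best expected payoff using only the prior probabilities.
ev_prior = max(sum(prior[s] * row[s] for s in prior)
               for row in payoff.values())

# With perfect information the best action is chosen for each state,
# so we take the best payoff in each state, weighted by its prior.
ev_perfect = sum(prior[s] * max(row[s] for row in payoff.values())
                 for s in prior)

evpi = ev_perfect - ev_prior
print(ev_prior, ev_perfect, evpi)  # 260.0 340.0 80.0
```

No survey, test or forecast based on these priors could be worth more than 80 here, which is the sense in which the EVPI bounds the value of any new information.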

The expected value of imperfect information

To determine the expected value of the imperfect information, we will need to consider the possible indications the test will give, what the probabilities of these indications are and the decision the manager should take in the light of a given indication. The more reliable the new information, the closer its expected value will be to the EVPI.
A summary of the main stages in the above analysis is given below:
(1) Determine the course of action which would be chosen using only
the prior probabilities and record the expected payoff of this course
of action;
(2) Identify the possible indications which the new information can give;
(3) For each indication:
(a) Determine the probability that this indication will occur;
(b) Use Bayes' theorem to revise the probabilities in the light of
this indication;
(c) Determine the best course of action in the light of this indication
(i.e. using the posterior probabilities) and the expected payoff of
this course of action;
(4) Multiply the probability of each indication occurring by the expected
payoff of the course of action which should be taken if that indication
occurs and sum the resulting products. This will give the expected
payoff with imperfect information;
(5) The expected value of the imperfect information is equal to the
expected payoff with imperfect information (derived in stage 4) less
the expected payoff of the course of action which would be selected
using the prior probabilities (which was derived in stage 1).
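The five stages can be sketched in code. The payoff table, priors and test reliabilities below are hypothetical (the same illustrative numbers as in the EVPI sketch, plus a test that indicates "positive" or "negative" with stated conditional probabilities):

```python
# Expected value of imperfect information (EVII), following the stages.
payoff = {
    "large stock": {"high": 500, "low": -100},
    "small stock": {"high": 200, "low": 100},
}
prior = {"high": 0.6, "low": 0.4}
likelihood = {"positive": {"high": 0.8, "low": 0.3},   # P(indication | state)
              "negative": {"high": 0.2, "low": 0.7}}

def best_ev(probs):
    """Expected payoff of the best action under the given probabilities."""
    return max(sum(probs[s] * row[s] for s in probs)
               for row in payoff.values())

# Stage 1: best expected payoff using only the priors.
ev_prior = best_ev(prior)

# Stages 2-4: for each indication, find its probability, the posterior
# probabilities (Bayes' theorem), and the best posterior expected payoff.
ev_with_info = 0.0
for indication, lik in likelihood.items():
    joint = {s: prior[s] * lik[s] for s in prior}
    p_indication = sum(joint.values())                    # stage 3(a)
    posterior = {s: joint[s] / p_indication for s in prior}  # stage 3(b)
    ev_with_info += p_indication * best_ev(posterior)     # stages 3(c), 4

# Stage 5: expected value of the imperfect information.
evii = ev_with_info - ev_prior
print(round(evii, 2))  # 20.0
```

With these numbers the imperfect test is worth 20, well below the EVPI of 80 for the same problem, illustrating the point above: the less reliable the information, the further its expected value falls below the EVPI.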

Practical considerations
