You are on page 1of 17

Markov Models in Medical Decision Making

:
A Practical Guide

FRANK A. SONNENBERG, MD, J. ROBERT BECK, MD

Markov models are useful when a decision problem involves risk that is continuous over
time, when the timing of events is important, and when important events may happen more
than once. Representing such clinical settings with conventional decision trees is difficult
and may require unrealistic simplifying assumptions. Markov models assume that a patient
is always in one of a finite number of discrete health states, called Markov states. All events
are represented as transitions from one state to another. A Markov model may be evaluated

by matrix algebra, as a cohort simulation, or as a Monte Carlo simulation. A newer repre-
sentation of Markov models, the Markov-cycle tree, uses a tree representation of clinical
events and may be evaluated either as a cohort simulation or as a Monte Carlo simulation.
The ability of the Markov model to represent repetitive events and the time dependence of
both probabilities and utilities allows for more accurate representation of clinical settings that
involve these issues. Key words: Markov models; Markov-cycle decision tree; decision mak-
ing. (Med Decis Making 1993;13:322-338)

A decision tree models the prognosis of a patient sub- of survival’ or from standard life tables.’ This paper
sequent to the choice of a management strategy. For explores another method for estimating life expec-
example, a strategy involving surgery may model the tancy, the Markov model.
events of surgical death, surgical complications, and In 1983, Beck and Pauker described the use of Mar-
various outcomes of the surgical treatment itself. For kov models for determining prognosis in medical ap-
practical reasons, the analysis must be restricted to a plications.’ Since that introduction, Markov models
finite time frame, often referred to as the time horizon have been applied with increasing frequency in pub-
of the analysis. This means that, aside from death, the lished decision analyses.’-9 Microcomputer software
outcomes chosen to be represented by terminal nodes has been developed to permit constructing and eval-
of the tree may not be final outcomes, but may simply uating Markov models more easily. For these reasons,
represent convenient stopping points for the scope of a revisit of the Markov model is timely. This paper
the analysis. Thus, every tree contains terminal nodes serves both as a review of the theory behind the Mar-
that represent &dquo;subsequent prognosis&dquo; for a particular kov model of prognosis and as a practical guide for
combination of patient characteristics and events. the construction of Markov models using microcom-
There are various ways in which a decision analyst puter decision-analytic software.
can assign values to these terminal nodes of the de- Markov models are particularly useful when a de-
cision tree. In some cases the outcome measure is a cision problem involves a risk that is ongoing over time.
crude life expectancy; in others it is a quality-adjusted Some clinical examples are the risk of hemorrhage
life expectancy.’ One method for estimating life ex- while on anticoagulant therapy, the risk of rupture of
pectancy is the declining exponential approximation an abdominal aortic aneurysm, and the risk of mor-
of life expectancy (DEALE),2 which calculates a patient- tality in any person, whether sick or healthy. There
specific mortality rate for a given combination of pa- are two important consequences of events that have
tient characteristics and comorbid diseases. Life ex- ongoing risk. First, the times at which the events will
pectancies may also be obtained from Gompertz models occur are uncertain. This has important
implications
because the utility of an outcome often depends on
when it occurs. For example, a stroke that occurs im-
Received February 23, 1993, from the Division of General Internal
mediately may have a different impact on the patient
Medicine, Department of Medicine, UMDNJ Robert Wood Johnson than one that occurs ten years later. For economic
Medical School, New Brunswick, New Jersey (FAS) and the Infor- analyses, both costs and utilities are discounted&dquo;,&dquo;
mation Technology Program, Baylor College of Medicine, Houston, such that later events have less impact than earlier
Texas (JRB). Supported in part by Grant LM05266 from the National ones. The second consequence is that a
given event
Library of Medicine and Grant HS06396 from the Agency for Health than As the
Care Policy and Research. may occur more once.
following example
Address correspondence and reprint requests to Dr. Sonnenberg: shows, representing events that are repetitive or that
occur with uncertain timing is difficult
Division of General Internal Medicine, UMDNJ Robert Wood John- using a simple
son Medical School, 97 Paterson Street, New Brunswick, NJ 08903. tree model.

322
Downloaded from http://mdm.sagepub.com at National Institutes of Health Library on January 21, 2009

labelled ANTICOAG.&dquo. increments of time. Other approaches.05 per person referred to as Markov states. agulant therapy. Figure 3 shows a commonly used dressed by using a recursive decision tree. Either kind of event causes morbidity (short-term and/ or chronic) and may result in the patient’s death. Each repetition of the by a circle. labelled BLEED. transition from DISABLED to WELL is not allowed. Downloaded from http://mdm. transition diagram. the patient may make a transition from events that may occur more than once can be ad. EMBOLUS. In our example of a patient the patient’s normal life expectancy may be less than with a prosthetic heart valve. and DEAD. in which each state is represented peared previously in the tree. cannot make a transition to any other state. such as assuming that embolus will result in the same state (DISABLED) and the stroke occurs halfway through the patient’s nor.sagepub. of time spent in the state.com at National Institutes of Health Library on January 21. a person who is in the DEAD state. some nodes have branches that have ap. either may oc. Only certain transitions is depicted in figure 2. a short time horizon. Sec. Such a patient may have an embolic or hemorrhagic event at any time. then cycle. A per- placed by the chance node ANTICOAG. labelled POST EMBOLUS. the occurrence of a stroke would have DISABLED. The first problem. one state to another. The Markov Model may be addressed by using the tree structure in figure The Markov model provides a far more convenient 1 and making the assumption that either BLEED or way of modelling prognosis for clinical problems with EMBOLUS occurs at the average time consistent with ongoing risk. 20 years. a person in the WELL state Here. the analyst Thus. In fact. a single arrow emanates from the DEAD state. and No EVENT are re. For the sake of simplicity in this the paradoxical effect of improving the patient’s life example. with 17 terminal branches. the patient remains WELL. are arbitrary and may lessen the The time horizon of the analysis is divided into equal fidelity of the analysis.12 In a re. are allowed. Each occurrence of die and thus make a transition to the DEAD state. However. if always in one of a finite number of states of health the rate of hemorrhage is a constant 0. The first chance node. representation of Markov processes. Therefore. to itself indicate that the patient may remain in that cursive tree that models the anticoagulation problem state in consecutive cycles. the structure implies that either hemorrhage or embolus may occur only once. Finally. the nodes representing the previous terminal may make a transition to the DISABLED state. the back to itself. Arrows leading from a state and any event may be considered repeatedly. All events of interest are per year. ond. and WELL. the recursive model can represent when events occur. It is assumed that a patient in a given tree in figure 2 is &dquo. the event state is assigned a utility. and the contribution of this of having a fatal hemorrhage will be associated with utility to the overall prognosis depends on the length a utility of 20 years of normal-quality survival. and NO EVENT.05 or 20 years. If NO EVENT occurs. The decision tree fragment in figure 1 shows one way of representing the prognosis for such a patient. POST BLEED. The model assumes that the patient is the known rate of each complication. Both BLEED and EMBOLUS may be either FATAL or NON-FATAL. Thus. POST-EMBOLUS. we assume that either a bleed or a non-fatal expectancy. There are several shortcomings with this model.bushy. However. Each of a hemorrhage is 1/0. mal life expectancy. then the average time before the occurrence modelled as transitions from one state to another. so ever. that the disability is permanent. specifying when events occur. referred to as Markov cycles. First. task equivalent to specifying the prognosis for each of these non-fatal outcomes. Thus. these states are WELL. 2009 . How- BLEED or EMBOLUS represents a distinct time period. For example. leading rying out the recursion for only two time periods. Simple tree fragment modeling complications of antico- the model does not specify when events occur. carrying out this analysis for even five years would cur more than once. 323 A Specific Example Consider a patient who has a prosthetic heart valve and is receiving anticoagulant therapy. a recursive model is tractable only for a very still is faced with the problem of assigning utilities. FIGURE 1. at the terminal nodes result in a tree with hundreds of terminal branches. obviously. Dur- Both the timing of events and the representation of ing each cycle. Arrows connecting two different states in- tree structure represents a convenient length of time dicate allowed transitions. which appeared son in either the WELL state or the DISABLED state may previously at the root of the tree. state can make only a single state transition during a If each level of recursion represents one year. has three branches. despite this relatively simple model and car. but a nodes POST-BLEED. For example. A re. called a state- cursive tree.

patient is &dquo. For a model that ample.given credit&dquo. the average amount perioperative myocardial infarction (MI) following pre. there spans the entire life history of a patient and relatively is little advantage to using a monthly cycle length. Recursive tree mod- eling complications of antico- agulant therapy. if only yearly probabilities are available. The length of the cycle is chosen to represent a determined by the available probability data. On the other hand. rare events the cycle length can be one year. For ex- clinically meaningful time interval. of time) spent in each state. for example monthly or even weekly. An example is the risk of number of cycles (or analogously. for the time spent in each The rapidity of this change in risk dictates a monthly state. 2009 .&dquo. The cycle time also must be shorter if a rate Evaluation of a Markov process yields the average changes rapidly over time. the vious MI that declines to a stable value over six months.324 FIGURE 2. Often the choice of a cycle time will be survival.com at National Institutes of Health Library on January 21. if the time frame is shorter and models events that may occur much more frequently. the cycle INCREMENTAL UTILITY time must be shorter. Seen another way.sagepub. then one need only add together the average Downloaded from http://mdm. If the only attribute of interest is duration of cycle time.

The DEALE with spending one cycle in a particular state is referred can be used to derive the constant mortality rates to as the incremental utility. as shown in cremental utility of 1. * For medical the incremental utility of the absorbing examples. If it has an absorbing state. they can be represented by an n x n matrix.sagepub. However. the transition probability for the FIGURE 3.5 solution of Markov chains described in detail by Beck cycles in the WELL state and 1. of course.7. In general. If the patient spends. a separate incremental utility may be specified for each state. be zero.5 X 1) + (1. there will be n2 transition probabilities. The model is evaluated separately for cost and survival. as discussed below.0. Markov-state diagram. the utility assigned would be (2. expected the probability of dying from unrelated causes will increase continuously.7 quality-adjusted cycles specific mortality rates have resulted in greater reli- to the expected utility. Arrows indicate allowed transitions.7). 325 When performing cost-effectiveness analyses. each multiplied by the incremental utility The net probability of making a transition from one for that state. as the patient gets older. The second component is the n probability of suffering a fatal hemorrhage or embolus Expected utility = ~ tss=l i during the cycle. called the P quantity of utility equal to the duration of a single matrix. For example. state to another during a single cycle is called a tran- sition probability. or 3. the cess depicted in figure 3. Each circle represents a Markov state. the net utility for the Markov process would be infinite. This means that for every cycle table 1. Downloaded from http://mdm. This may or may not be constant over time. 2.9 quality-ad- justed cycles.25 cycles in the DISABLED and Pauker. P Matrix expectancy of the patient. needed to implement a Markov chain.* and that the WELL state has an in. each state. its behavior factor representing the quality of life in that state rel. The Markov process is completely n defined by the probability distribution among the Expected utility = ~ s=l i ts X Us starting states and the probabilities for the individual allowed transitions. representing the financial cost of being in that state for one cycle. Consider the Markov pro. For a Markov model of n states. This number is the quality-adjusted life TaMe 1 . Utility accrued for the entire ance on Markov processes with time-variant proba- Markov process is the total number of cycles spent in bilities. amount of time in the DEAD state and if the incremental utility were non-zero.10. If the incremental utility of availability of specialized software to evaluate Markov the DISABLED state is 0. 2009 . on average. Each state is associated with a quality Markov chain. where ts is the time in state A of Markov process in which the tran- special type spent s. The utility that is associated simple matrix algebra.25 X 0. Cost-effectiveness ratios are cal- culated as for a standard decision tree. changes over time because.’ state before entering the DEAD state. the transition probabilities may change over time. quality of survival is consid- however. forms the basis for the fundamental matrix Markov cycle.com at National Institutes of Health Library on January 21. The first component is the probability of dying from unrelated causes. This matrix. In the most general type of Markov process. When these Let us assume that the DEAD state has an incremen- tal utility of zero. then spending the cycle in processes and the greater accuracy afforded by age- the DISABLED state contributes 0. sition are constant over time is called a probabilities Usually. this probability times spent in the individual states to arrive at an survival for the process. over time can be determined as an exact solution by ative to perfect health. Probabilities representing disallowed transi- spent in the WELL state the patient is credited with a tions will. probabilities are constant with respect to time. the ered important. DEAD state must be because the patient will spend an infinite zero. transition from WELL to DEAD consists of two compo- nents.11 TYPES OF MARKOV PROCESSES Markov processes are categorized according to whether the state-transition probabilities are constant over time or not.

cycle. the state WELL actually contains two Such states are called absorbing states because. and there is no arrow from the STROKE back to itself. The second use is represents a temporary state. lowed strictly in medical problems. That is. cycles. it is not pos- a bleed a different utility or risk of death than someone sible for the prognosis of a patient in a given state to disabled from an embolus. is WELL but has a history of gallstones. 2009 . may actually have different prognoses depending on sumption is necessary in order to model prognosis previous events. In for the model to reflect the different prognoses for medical examples the absorbing states must represent these two classes of well patients. the process has memory for no earlier An example of tunnel states is depicted in figure 5.com at National Institutes of Health Library on January 21. after distinct populations of people. however. if one wishes to keep track of the causes of death. The Markovian assumption is not fol. The shaded circle labeled &dquo. Such states are defined by having transitions only to other states and not to themselves. it must has the same probability of developing biliary com- have at least one state that the patient cannot leave. POST MI2 and POST spent in the WELL state before becoming DISABLED. For example. those with gallstones a sufficient number of cycles have passed. The POST Mil state is associated with the n + 1. Thus. The model illustrated in figure 3 is compatible with special arrangement of temporary states consists A a number of different models collectively referred to of an array of temporary states arranged so that each as finite stochastic processes. These states are called to represent a Markov process. in our example. temporary state for a single cycle. one additional restric. he or she enters For this reason. the patient has a certain probability of developing com- MARKOV STATES plications from the gallstones. tunnel states without having surgery. In order for this model has a transition only to the next. it must contain two death because it is the only state a patient cannot distinct well states. Thus. Temporary states have two uses. in which the risk of perioperative each subset of the cohort that has a distinct utility or death is constant. then more than one DEAD state may be used. Put M13 are associated with successively lower risks of per- another way. one representing WELL WITH GALL- leave. prognosis. STATUS-POST Downloaded from http://mdm. If we want to assign someone disabled from Because of the Markovian assumption. Each cycle. all patients in the DISABLED state have the ioperative death.326 state. labeled STROKE. to assign temporarily different transition probabilities. the as. This restriction. In order cohort will have been absorbed by those states. we must create two dis. the probability of death may be higher in the STROKE state than in either the WELL state or the THE MARKOV PROPERTY DISABLED state. This ensures that a patient may spend no more than a single cycle in the STROKE state. If a patient passes through all three same prognosis regardless of their previous histories. sometimes referred to as sequence. It does not matter how much time the person highest risk of perioperative death. Markov-state diagram. for example WELL. represent the first three months fol- that he or she will end up in the DEAD state after cycle lowing an MI.sagepub. a separate state must be created for the POST Mi state. However. This guarantees that the patient can spend. tunnel states because they can be visited only in a fixed tion applies. Figure 4 illustrates a Markov process that is the same as that shown in figure 3 except that a temporary state has been added. the patient will again be WELL but no longer In order for a Markov process to terminate. abled states. An arrow leads to STROKE only from the WELL state. at most. if someone is in the The three tunnel states. Often. one cycle in that state. However.STROKE&dquo. plications. For example. the entire and those who have had a cholecystectomy. because the incremental utility for the DEAD state is zero. patients in a given state. depend on events prior to arriving in that state. we know the probability through POST M13. analogous to passing through a tunnel. Following a cholecys- tectomy. The the Markovian assumption’ or the Markov property) 14 purpose of an array of tunnel states is to apply to specifies that the behavior of the process subsequent incremental utility or to transition probabilities a tem- to any cycle depends only on its in that description porary adjustment that lasts more than one cycle. consider a patient who with a finite number of states. Temporary states are required whenever there is an event that has only short-term effects. shaded and labelled POST Mil DISABLED state after cycle n. The first use is to apply a utility or cost adjustment specific to the FIGURE 4. There is usually no need for more than one DEAD STONES and the other representing WELL.

are modelled by tree structure &dquo. The Markov process is being used simply to calculate survival for a terminal node of the tree. At least for medical ap- plications. the Markov process is used to represent all events. The second disadvantage is the restriction to constant transition probabilities. The third disadvan- tage is the need to represent all the possible ways of making a transition from one state to another as a single transition probability. the Markov process incorporates all events of interest and the decision analysis is reduced simply to comparing the values of two Markov processes. if prognosis depends in In any way on past history.exact&dquo. general. the expected utility may be calculated by matrix algebra to yield the fundamental matrix. For more details of the matrix algebra solution the reader is referred to Beck and Pauker. In this case. In panel A below) permits representing all relevant events within (top). CHOLECYSTECTOMY. Tunnel states: the three shaded circles represent tem- porary statesthat can be visited only in a fixed sequence. it requires that there be one distinct state to represent the different histories. However. solution that is not affected by the cycle length. USE OF THE MARKOV PROCESS IN DECISION ANALYSIS The Markov process models prognosis for a given patient and thus is analogous to a utility in an ordinaiy decision tree. the Markov process. if we are trying to choose between surgeiy and medical therapy. 327 1#nufllati’is of Markov Models THE FUNDAMENTAL MATRIX SOLUTION When the Markov process has constant transition probabilities (and constant incremental utilities) for all states. events of interest. A far more efficient structure is shown in figure 6B.’ FIGURE 5. the Markov process. 2009 . which shows.outside&dquo. the matrix algebra solution has been largely relegated to the history books. the expected length of time spent in the state. In panel B (bottom). Use of Markov processes in a decision model. because it requires that an entire Markov process be run for each terminal node. Downloaded from http://mdm. such as operative death and cure. the Markov process is used only as a utility.sagepub.com at National Institutes of Health Library on January 21. we may con- struct a decision tree like that shown in figure 6A. for each starting state. For example. There are three main disadvantages of the matrix formation. This structure is inefficient. In this case. this is less of a problem than when Beck and Pauker’ described the technique. because many commonly available micro- computer spreadsheet programs now perform matrix algebra. The first is the difficulty in per- forming matrix inversion. of which there may be dozens or even hundreds. The use of the cycle tree representation (discussed in detail FIGURE 6. The matrix solution is fast and provides an &dquo.

state is 0. Panel C 2. does not contribute to the cycle sum because its in- ing it into blocks and summing their areas versus cal- culating the area by solving the integral of the function Table 2 . as shown in table 2.000 X 1) + (2.sagepub. The simulation is run for enough cycles so that the entire cohort is in the DEAD state (fig. and U.328 simulation in the DEAD state as a result of operative mortality. In ac- cordance with the transition probabilities specified in FIGURE 7. fs is the fraction of the cohort in state s. This process is repeated in subsequent cycles. because the incremental utility of the DISABLED between a cohort simulation and the matrix formu. is the incremental utility of state s. The first row of the table represents the start- ing distribution. Markov cohort simulation. PanelA (top). which is the sum of MARKOV COHORT SIMULATION the number of cohort members in each state multi- The Markov cohort simulation is the most intuitive plied by the incremental utility for that state. the fraction of the cohort initially in each state is par- titioned among all states according to the transition probabilities specified by the P matrix. For each cycle. The cohort simulation can be represented in tabular form. For example. with the entire cohort in the DEAD state. This results in a new distribution of the cohort among the various states for the subsequent cycle.000 X 0. The simulation considers a hy- pothetical cohort of patients beginning the process with some distribution among the starting states.400. The cycle sum is added to a running total that is referred to as the cumulative utility.000 pa- tients begins in the WELL state. Figure 7B shows the distribution of the cohort after a few cycles.000 patients (20% of the original distribution with all patients in the WELL state.com at National Institutes of Health Library on January 21. The fifth column in table 2 shows the calculation of the cycle sum. For ex- representation of a Markov process. This method may be im- plemented easily using a microcomputer spreadsheet program. The simulation is &dquo. 2009 . The difference ample.7) 7. it is not necessary to have all patients in the same state at the beginning of the simulation. if the strategy repre- sents surgery. The second row shows the distribution at the end of the first cycle. represented by the Markov-state dia- gram in figure 3. shows the initial the P-matrix (table 1). a fraction of the cohort may begin the Downloaded from http://mdm. 7C). In this example. Markov Cohort Simulation describing the curve. However. 2.000 patients to the DEAD state. as follows. Fifty percent of the cohort remains in the WELL state. Thirty percent of the cohort is in the SICK state and 20% in the DEAD state. Con- sider again the prognosis of a patient who has a pros- thetic heart valve. remaining in the WELL state. A hypothetical cohort of 10.run&dquo. the cycle sum during cycle 1 is equal to lation may be thought of as analogous to the difference (6. The utility accrued for the cycle is referred to as the cycle sum and is cal- culated by the formula: where n is the number of states. all pa- tients are in the WELL state.7.000 (60%) (bottom) shows the final distribution. Figure 7A illustrates the cohort at the beginning of the simulation. This leaves 6. Panel B (middle) cohort) have moved to the DISABLED state and another shows the distribution partway through the simulation. The DEAD state = between determining the area under a curve by divid.

the expected utility is 23. Notice that the cohort memberships at the start do not contribute to these sums. Figure 8 shows a sur- vival curve for members of a state. 329 ‘ cremental utility is zero. At each tick. there is a finite probability of a patient’s remaining alive. Thus. consistently overestimates survival. Counting membership cycle. for a net unadjusted life expectancy of 2. Counting cohort membership at the beginning of each each cycle to give credit according to the fraction of cycle. cohort at the end of each FIGURE 8.5 cycles in the WELL state and 1. as in figure 9.com at National Institutes of Health Library on January 21.sagepub. In this case. 1. the distribution of states is ad- justed to reflect the transitions made during the pre- ceding cycle. The unadjusted life expectancy may be found by summing the entries in the columns for the WELL and DISABLED states and dividing by the cohort size.3752 quality- adjusted cycles. 1 person-cycle) or when the fraction of the cohort remaining alive falls below a certain amount.752/10. counting the membership only at the beginning or at the end of the cycle will lead to errors. To more accurately reflect the continuous nature of the state transitions. Because the probabilities of leaving the WELL and DEAD states are finite and the probability of leaving the DEAD sate is zero.g. The sixth column shows the cumulative utility following each cycle. In this case. In the example illustrated in table 2. There- fore. the cohort in each state. THE HALF-CYCLE CORRECTION The Markov model assumes that during a single cycle.. the simu- lation is stopped when the cycle sum falls below some arbitrarily small threshold (e. The fraction of the cohort in the DEAD state actually is always less than 100% because. 2009 . FIGURE 10.75 cycles. the cohort members will spend. each patient undergoes no more than one state transition. The process of carrying out a Markov simulation is anal- ogous to calculating expected survival that is equal to the area under a survival curve. on average. we make the assump. One way to visualize the Markov process is to imagine that a clock makes one &dquo. for each cycle length.tick&dquo. or 2. during each cycle. the bookkeeping was performed at the end of each cycle. Illustration of the half-cycle correction. For this reason. transitions occur not only at the clock ticks. Counting at the beginning of each cycle. but continuously throughout each cycle. Each rectangle under the curve represents the accounting of the cohort membership during one cycle when the count is performed at the end of each cycle. the cycle sum falls below 1 after 24 cycles. Downloaded from http://mdm. The smoothness of the curve reflects the continuous nature of state tran- sitions. The area of the rectangles consistently underestimates the area under the curve. In reality. more and more of the cohort ends up in the DEAD state. The expected utility for this Markov cohort simulation is equal to the cumulative utility when the cohort has been completely absorbed divided by the original size of the cohort.000.25 cycles in the DISABLED state. The Markov cohort simulation requires explicit bookkeeping (as illustrated in table 2) during FIGURE 9.

25 cycles. a patient overestimations will be balanced. However. Thus. calculate because transitions from one state to another sequent cycle. the patient portance of the half cycle correction depends on cycle remains WELL. as in figure 10. The subtrees are attached to a special type of node designated a Markov node as depicted in figure 12. for actual clinical settings. the sum of the probabilities of all paths leading to terminal nodes labelled with the name of a particular ending state is equal to the transition probability from the beginning state to the ending state. of the simulation if the cohort is completely absorbed The probability tree corresponding to the WELL state because the state membership at that time is infini. Downloaded from http://mdm. if the simulation is terminated prior to the absorption of the cohort. then the under. and a patient in the DEAD state will always remain in that state. Therefore. The prob- the beginning of each cycle. There is no way to determine the In the preceding discussion. the highlighted paths in figure 12 show all transitions from FIGURE 11. by having an accident. There is one branch of the Markov node for each Mar- kov state. then add a half cycle for the starting membership at or by dying of complications of a coexisting disease. However. However.330 tion that state transitions occur. The prob- is subtraction of one half cycle from the membership of each ability tree for patients beginning in the DEAD state starting state. The Mar- kov node together with its attached subtrees is referred to as a Markov-cycle tree15 and. the branch labelled SURVIVE leads to a chance node sorption. tree is labelled with the name of the state in which a erage survival. For example. ASR. The interested reader A patient surviving either an embolus or a bleed will should note that the fundamental matrix represen- tation is equivalent to counting state membership at begin the next cycle in the DISABLED state. an additional correction must be made by modelling whether the patient has a BLEED or an EM- subtracting a half cycle for members of the state who BOLUS. This is equivalent to in the WELL state may make a transition to the DEAD shifting all cycles one half cycle to the right. is illustrated in figure 11. If the patient does not die from natural causes. For example. We must stateby having a fatal stroke. and race (ASR)-specific mortality. the beginning to compensate for this shift to the right. a patient reaching any terminal node la- belled DEAD will begin the next cycle in the DEAD state. If the cycle length is veiy short relative to av. Starting at any state branch. the difference between actual survival patient reaching that terminal node will begin the next and simulated survival (as shown in figure 8) will be small. sex. either of which may be fatal. Probability tree corresponding to the WELL state. difference will be more significant. possible events taking place during each cycle are rep- The shift to the right makes no difference at the end resented by a probability tree. be in the middle of a cycle that begins halfway through transition probabilities may be quite complicated to the previous cycle and ends halfway through the sub. WELL to DISABLED. the shift to the right modeling the occurrence of death from age. transition probabilities state membership in the middle of the cycle. If neither BLEED are still alive at the end of the simulation. It contains a chance node tesimal. the branch labelled DIE will result in overestimation of the expected survival.com at National Institutes of Health Library on January 21. 2009 . except that pa- that should be applied to the result of a matrix solution tients having NO EVENT will still be DISABLED. If the cycle time is larger relative to suivival. Each probability from the Markov node to one of its branches is equal to the probability that the patient will start in the corresponding state. The im- nor EMBOLUS occurs (the branch NO EVENT). consists only of the terminal node labelled with the name of the DEAD state since no event is possible. hayway THE MARKOV-CYCLE TREE through each cycle. Hollenberg15 devised an el- in an expected utility of 2. Each transition probability must take into account all Adding a half cycle for the example in table 2 results of these transition paths.and may happen in a variety of ways. if we consider the count at the end of each cycle to were provided as if they were elemental data supplied with a problem.875 quality-adjusted cycles egant representation of Markov processes in which the and a life expectancy of 3. the cycle. for simulations that terminate prior to ab. on average. Each terminal node in the probability length. along with the incre- mental utilities and the probabilities of the branches of chance nodes. is a complete representation of a Markov process.sagepub. Therefore) the correction ability tree for patients beginning in the DISABLED state is identical to that for the WELL state.

that a Markov cycle tree may be evaluated as a Monte the patient is given credit for each cycle spent in a Carlo simulation. 2009 .. which reflects how the cohort appears after a single cycle.). which is added to the cumulative utility. resulting in a utility of (2 X 1) + (3 X 0. large number of trials. Each As an alternative to simulating the prognosis of a trial generates a quality-adjusted survival time. However. the Monte Carlo sim. usually when the quantity of utility accumulating for each state drops below some specified small quantity. Finally. The mean value of this distribution of individual patients. The process is repeated a MONTE CARLO SIMULATION very large number (on the order of 104) of times. these constitute a distribution ulation determines the prognoses of a large number of survival values. Complete Markov-cycle tree corresponding to the an- fining a Markov model.ab- sorbed. If a single component proba.com at National Institutes of Health Library on January 21. They allow the analyst to break up a large problem into smaller. ticoagulation problem. The model pro- vides a great deal of flexibility when changing or re. For the example in figure used most often in recently published Markov deci. This occurs when the fraction of the cohort in the DEAD state approaches one. Just as for the cohort simulation.e. 13. FIGURE 12. Because of quality of life.&dquo. The fraction of the cohort currently in each state is then credited with the appropriate in- cremental utility to form the cycle sum. This is illustrated in figure 13. thus enhancing the fidelity of the model. ADVANTAGES OF THE CYCLE TREE REPRESENTATION Cycle trees have many of the same advantages that decision trees have for modelling complex clinical sit- uations. standard deviation of the expected utility may be de- bilities to determine in which state the patient will termined from this distribution. a random-number survival. After a hypothetical cohort of patients.sagepub. Each sub- tree is then traced from its root to its termini (&dquo. will be similar to the expected utility obtained by a Each patient begins in the starting state (i. the WELL cohort simulation. partitioning the subcohort for the corre- sponding state according to the probability tree. statistical measures such as variance and generator is used together with the transition proba. It should be noted begin the next cycle. the cycle tree representation has been the simulation is stopped. The result is a new distribution of the cohort among the states. 331 EVALUATING CYCLE TREES A cycle tree may be evaluated as a Markov cohort simulation. in addition to the mean state). this can be done without recalculating the aggregate tran- sition probabilities. and at the end of each cycle. the disaggregation of tran- sition probabilities permits sensitivity analysis to be state and each state may be non-DEAD adjusted for performed on any component probability. The process is repeated until some pre- determined criterion is reached. more manageable ones. its advantages. First.folding forward&dquo.’-’ three cycles in the DISABLED state before being &dquo. bility or a detail of a subtree needs to be changed. This clarifies issues for the analyst and for others trying to understand the results. Downloaded from http://mdm. The use of subtrees promotes appropriate symmetiy among the various states. When the patient enters the DEAD state. the starting composition of the co- hort is determined by partitioning the cohort among the states according to the probabilities leading from the Markov node to the individual branches.1 quality-adjusted cycles.7) or 4. The new distribution of the cohort is then used as the starting distribution for the next cycle. the patient spends two cycles in the WELL state and sion analyses.

com at National Institutes of Health Library on January 21. The features of the three representations are summarized in table 3. where f is the fraction surviving at time t and r is the constant transition rate. on the other hand. and cycle tree) per- mit the analyst to specify transition probabilities and incremental utilities that vary with time. Monte Carlo. Rates may be converted to probabilities if their proper relationship is considered. solution that is not sensitive to cycle time (as in the cohort simulation) or number of trials (as in . . Tpmsiflon Probabilides CONVERSION OF RATES TO PROBABILITIES The tendency of a patient to make a transition from one state to another is described by the rate of tran- sition. The major disadvantages * Adapted from Beck et al. - the Monte Carlo simulation). the fraction COMPARING THE DIFFERENT REPRESENTATIONS Each has Table 3 9 Characteristics of Markov Approaches* representation specific advantages and disadvantages for particular purposes. if these are desired. describes the like- lihood that an event will occur in a given length of time. and Monte Carlo) is the necessity for repetitive and time-consuming calculations.. A probability. Monte Carlo simulation. At any given time.5 Downloaded from http://mdm. The Monte Carlo method and the matrix solutions provide measures of variability.332 of the matrix formulation are the restriction to prob- lems with constant transition probabilities. Such variation is necessary to model certain clinical realities. The matrix manipulations required for a Mar- kov process with a large number of states may require special computational resources.sagepub. The simula- tions (Markov cohort.--. 2009 . However. Such measures are not possible with a cohort simulation. The rate describes the number of occurrences of an event (such as death) for a given number of patients per unit of time and is analogous to an in- stantaneous velocity. -. such as the increase in baseline mortality rate with age.exact&dquo. Probabilities range from zero to 1. The figure shows the state tran- sitions of a single person until death occurs during cycle 6. The fundamental matrix solution is very fast because it involves only matrix algebra and provides an &dquo. the need to express each composite transition probability as a single number. and the difficulty of performing matrix algebra. Rates range from zero to infinity. A dis- advantage common to all simulations (cohort. the avail- ability of specialized microcomputer software to per- form these simulations has made this much less of an issue. The probability of an event that occurs at a constant rate (r) in a specified time (t) is given by the equation: This equation can be easily understood by examining the survival curve for a process defined by a constant rate. cycle tree.. The equation describing this survival curve is: FIGURE 13.

Thus. and then uation of therapy. the rate and probability are highly than those occurring in the future. The following example is a Markov implementation priate transition probabilities for the shorter cycle. the patient still may reject tion. patient reject the cycle. the transition England Medical Center. fewer are at risk or benefits occurring immediately are valued more for the transition later in the time period. where Ut is the increment utility at time t.sagepub. the therapy will be stopped at gradually increases during adulthood. When the rate is small or t is short. as above. The dis- counting formula is: very similar. model is a Markov-cycle tree as used by two specific tion 2 for r: decision analysis microcomputer programs Decision Maker1s and SMLTREE.6 Downloaded from http://mdm. a ther- one. If of a decision analysis that has been published in detail the original rate is a yearly rate. the mortality rate will increase significantly dur. possibly lethal melanoma. therapy ensures that the patient’s kidney will be re- hort ages. a formula and converted to a transition probability. Often. If therapy is continued.7 and that of life with a func- and using such tables. Cessation of this is the probability of death. kidney function. the appropriate mortality rate is calculated from kidney immediately. 333 that has experienced the event is equal to 1 . and retrieved as the Markov model is eval. DISCOUNTING: TIME DEPENDENCE OF UTILITIES the curve describing the probability that the event will Incremental utilities. Some rates are not easily described as a simple func. two synchronous malignant melanomas ap- TIME DEPENDENCE OF PROBABILITIES peared and required wide resection. A 42-year-old man had had a cadaveric kidney transplant 18 months previously and had done well except for an early rejection episode. lifetime. been treated successfully. Some computer software used for evaluating ity assessment from the patient. indexed by cycle 5. the transition probabilities immunosuppressive therapy increases the chance of in a Markov model vary with time.3 For each clock 1. Uo is the cycle length for t in equation 1. like transition probabilities. or 1 . Another ex- the time the rejection occurs. these rates must be converted to the correspond- ing transition probabilities by substituting the Markov.e . While he was receiving standard treatment with azathioprine and prednisone. For use in a Markov anal- ysis.f. Continuation of In the most general case. may occur in time t is simply 1 . One example is the actual mortality rate over a the kidney.If one has only the yearly = ysis was performed for an actual patient at the New transition probability and not the rate. A second will not be considered. The probability of transition in time t is always less than the corresponding rate per time t dependence is the discounting used in cost-effec- tiveness analyses. tioning transplant 1.’. to recal.10 Because of the time variance. 2. data supplied for an analysis pro- vide rates of complications. which initially is high during early childhood. In 4. If the time horizon for the analysis is a long jected and will require his return to dialysis. The key assumptions in the construction of this ing later cycles. 2009 . functioning transplant.r/12. which had Then.’° This is based on the fact that costs because as the cohort members die. There are two ways of handling such model are: changing probabilities. discounting cannot be PRECAUTIONS IN CHANGING THE CYCLE LENGTH used when the fundamental matrix solution is used.com at National Institutes of Health Library on January 21. such as the Gompertz function. how- ever. If the patient rejects the kidney despite contin- falls to a minimum during late childhood. The implementation of the probability can be converted to a rate by solving equa. An obvious example another.’9 Case history. which increases as the co. the utility of life Markov processes provides facilities for constructing on dialysis was 0.e .f. the calculated rate is used. then the monthly elsewhere as an ordinary decision tree. One important application of this time in equation 1.l6. such cases. and d is the discount rate. but with a lower probability. One is with a continuous func- tion.rt as shown vary with time.0&dquo. He had maintained normal culate the transition probability. This anal- probability is p 1 . Quality of life is lower on dialysis than with a number. apeutic modality he prefers to avoid. one cannot simply divide the calculated transition probabilities by 12 to arrive at the appro. 3. If the will therapy is stopped. ample is the risk of acquiring a disease (such as Hodg- kins’ disease) that has a bimodal age distribution. Based on the original util- uated. Whenchanging the Markov-cycle duration from yearly A Med Example to monthly. initial incremental utility. the necessary rates (or corresponding transplant probabilities) may be stored in a table.

The first was that recurrent melanoma occurred at a fixed time in the future (one year)16 although. it may occur at any time. APY and NO REJECT are assumed not to occur. If we assume that the occurrence of REJECT or NO REJECT. Root of the tree representing the Markov model of the fication. renal status. the original model required several simplifying assumptions. Downloaded from http://mdm. picted in figure 16 consists of a Markov node with one In the caseof CONTINUE) a chance node models the branch for each Markov state. FIGURE 15. the midpoint of the patient’s life expectancy. Therefore. five states are required to represent the scenario. Because it was a simple tree. The lowest probability is for patients whose ther. then experiencing transplant rejection was assigned the average of the utilities for transplant and dialysis. The Markov-cycle tree de- representing CONTINUE and STOP therapy. DIALWELL (di- bilities in this model must be assigned to reflect the alysis. These anoma. respectively.com at National Institutes of Health Library on January 21. melanoma case. the utility of continuing therapy. Simple decision tree for the kid- ney transplant.sagepub. are (from top to bottom) TRANSWEL (transplant well). and DEAD. anoma depends only on whether or not the patient minal nodes represent the six possible combinations is receiving immunosuppressive therapy. on treatment because it is assumed that patients in ued. 6. If the patient values time on dialysis differ- ently now compared with later. It will be further reduced if the patient address all of these issues. is for those whose therapy is continued indefinitely. melanoma case. this is an oversimplification because the patient ac- 7. this is an oversimpli.334 FIGURE 14. 2009 . ’I’he root of the tree in figure 15 is a The simple tree modeling this problem is shown in decision node with two branches representing the two figure 14. the transplant states are on immunosuppressive ther- apy is stopped immediately. noma). The highest probability apy and those in the dialysis states are not. Ter. Separate states are not needed based pending on whether or not therapy has been contin. Because therapy will be stopped. then only of therapy. no melanoma). The development utility of a state depends only on whether the patient of a new melanoma is modelled by the chance node is on dialysis or not and that the probability of mel- with the branches MELANOMA and No MELANOMA. Again. TRANSMEL (transplant with melanoma). No adjustment is made to quality of life for having of melanoma recurrence in this intermediate scenario recurrent melanoma. DIALMEL (dialysis and mela- different risks of developing a new melanoma de. There are two branches of the decision node. The patient’s life expectancy is reduced because tually has a high probability while on the therapy and ofhaving had a renal transplant and a previous a low probability while off it. choices CONTINUE and sTOP. in reality. was the average of the high and low probabilities. and occurrence of a new mel. The two combinations representing sTOY Trrrr~. Proba. patients who ex- perience rejection after an initial period of continuing therapy will have an intermediate risk of melanoma recurrence. goes on dialysis or if melanoma recurs. The second was that transplant rejection occurred at a fixed time. ’I’he Markov decision model is shown in figure 15 and figure 16. The Markov model can melanoma. The third assumption was that the probability kidney transplant.

For Recur following Reject. A monthly cycle length will result simply a terminal node assigned to the state DEAD. respectively. if the risk of allograft rejection in figure 16. so the probability of that state should be 1. in a 12-fold increase in evaluation time over a yearly since no event is possible and all patients return to cycle length. Downloaded from http://mdm. For NoRecur following Reject and NoRecur2 following FIGURE 16. The event tree for the TRANSMEL state is also shown cycles. In this case the utility is DEAD. all patients begin in the TRANSWEL state. For patients who do not die (the branch Survive). if the appropriate currence and that for the DIALMEL state models only half-cycle corrections are made. length is one year or one month. It is simpler than that for TRANSWEL be. the appropriate state is DIALMEL. Only the latter branch repre- sents a return to the starting state. for Recur2 fol- lowing NoReject the appropriate sate is TRANSMEL. For this example. all patients begin in the oraLwELL state. for the STOP strategy. Similarly. Thus. For the CONTINUE strategy.exact&dquo. the same Markov-cycle tree can be used as a subtree&dquo. For example. NoReject. In practice. 2009 . The most complex is for the TRANSWEL state. the analyst must decide on the cycle length. the probability tree for the DIALWELL state however. Assignment of terminal states is similar to sideration is that the cohort simulation is an approx- &dquo. Similarly. TRANSWEL. Markov-cycle tree representing the kidney transplant. were markedly different in month 3 than in month 1. since there is no im- the DEAD state in the subsequent cycle. A bind- ing set between the strategy branch and the Markov node can set the appropriate variable to 1. Each of these branches leads to a terminal node. because a patient who dies during one cycle will begin the next cycle in the DEAD state. The first event modelled is the chance of dying from all causes (the branch Die). cause the risk of melanoma recurrence need not be then a monthly cycle should be used. a yearly cycle length is used. Another con- modeled. 335 INITIAL PROBABILITIES The first task is to assign probabilities to the branches of the Markov node. it makes little difference whether the cycle models only the risks of death and of melanoma re. We can implement these as- sumptions by assigning the probabilities of the TRANSWEL and DIALWELL branches as variables. SUBSEQUENT PROGNOSIS Each branch of the Markov node is attached to a subtree that models the possible events for each Mar- kov state. The cycle length The next task is to assign probabilities to the events should be short enough so that events that change in each event tree. Recall that these probabilities rep- resent the probabilities of starting in the individual states.20 A final consideration the risk of death. solution when the cycle length is short. the next chance node models the chance of transplant rejection (the branches Reject and NoReject). to represent the prognosis for both strategies. not make a transition to the TRANSWEL or DIALWELL state.com at National Institutes of Health Library on January 21. portant change during a year. CHOICE OF CYCLE LENGTH ASSIGNMENT OF PROBABILITIES Before the probabilities can be assigned. Die leads to a terminal node. The event tree for the DEAD state is is evaluation time. shown at the top of figure 16. Subsequent to each of these branches is a chance node modelling the risk of recurrent melanoma (the branches Recur and NoRecur and Recur2 and NoRecur2).sagepub. the appropriate states are DIALWELL and melanoma case. Each state has a chance node rep- over time can be represented by changes in successive resenting the occurrence of death during a given cycle. that for the TRANSWEL state except that patients may imation and will more closely approximate the &dquo.

However. Moreover. the value of the expression for pDIE table of age-specific mortality rates and look up the should change for each cycle of the simulation as mASR appropriate value for each cycle. The the values of mEXCESS are different for the individual most convenient way to implement this is to use a states. pReject. With a yearly cycle increases. each state must added to the baseline mortality rate to produce a total be assigned an incremental utility that reflects the compound mortality rate. We can above can be placed in a binding proximal to the Mar- assign a different mASR for each cycle to reflect the kov node. However. on average. quires entering the expression four times (it isn’t needed culate the cycle-specific age. when the value of (when all members are older). the age should be reduced by 0. this is cumbersome. Therefore. Finally. where StartAge and therefore must be specified for each state. the in. because it re- There are three refinements to the age used to cal.19 retrieve the appropriate rate. and slows evaluation.. sex.2 The total mortality rate may value of being in that state for one cycle. 2009 . Empirically. Second. halfway through a entire expression must be placed on the binding stack cycle.2 years corrects for these effects. utilities in a Markov cohort sim- example.0036/ year (for a 43-year-old male). thus ensuring that the evaluation will use the published rate for age 50 actually applies to a group current value of m.CYCLE and mEXCESS. Each branch of the Markov node then needs a binding For a detailed discussion of these corrections. At first glance. the starting age should be corrected according The required binding expression is: to the formula: where cyclen is the length of the Markov cycle in years. increasing mortality rate as patients get older.5 cycle.’O the probabilities of pReject and pRecur depend on The mortality rate may be retrieved from the table whether the patient is on immunosuppressive therapy (MTABLE) by the following expression. for the appropriate value of mEXCESS. it may seem that the expression depends on the patient’s age. the patient’s age at the end of cycle n is: once and thus would not allow the value of the expres- sion to change. deaths are slightly This function tells the computer program to place the more likely to occur among the older members of a entire expression on the binding stack instead of eval- heterogeneous cohort and toward the end of a year uating the expression first. published mortality rates for patients of a given age An ideal solution is provided by deferring the eval- (e.g. Maker18 and SML TREE) 19 this is accomplished by setting ing any cycle: the values of three special variables. because the that transitions occur. be entered in a binding proximal to the Markov node. A simple binding would be evaluated only length.CYCLE and the appropriate local with an average age of 50. during the evaluation of each state during each cycle. and race.5 year to PASSBIND function in Decision Maker&dquo. reducing the age by an of m. Thus. The variable m. rather than with of mortality due to the patient’s coexisting diseases is terminal nodes of the tree. The is corrected as above and m. and pRecur for each state number: are shown in table 4. First. since it is shared by all branches. One solution is to place a binding with the above expression on each branch of the Markov node. anywhere in the tree.5 years and the cycle-specific value of mEXCESS. and SMLTREE. 50 years old) are derived from all patients between uation of the expression until the value of pDIE is that age and the next (50 and 51 years).com at National Institutes of Health Library on January 21.uINCR represents the incremental utility of a state for one cycle. The expression thus can additional 0.CYCLE is the Markov-cycle values of mEXCESS. the death rate applies to patients who are slightly older expression will be evaluated with the prevailing values than the average. the initial value of mASR is 0. The excess component ulation are associated with a state. Thus.336 This probability is based on the mortality rate for each DEFERRED EVALUATION state. In Decision then be used to calculate the probability of death dur. Thus. which consists of two components. the baseline There are two ways to introduce the expression de- mortality rate and the excess mortality due to any fining the probability of death into the Markov decision comorbid diseases. The values of terested reader is referred to Sonnenberg and Wong. the needed. It is used to im- Downloaded from http://mdm.1 to 0. ASSIGNING UTILITIES For this As described above.sagepub. The baseline mortality rate (mASR) model. because we assume for the DEAD state). so that the observed pDIE is needed at any time. This is accomplished using the age should be reduced by an additional 0.uINIT is a one-time ad- justment to the incremental utility that is made at the beginning of the Markov simulation. The variable m.

as in figure 17.sagepub.uINCR is 1 because there is no utility decrement for the TRANSPLANT states.2 the cohort. because no utility accrues for membership in the DEAD The assignment of quality adjustments to incre- state. adjusted life expectancy.uTAIL will represent the life ex- pectancy of the patient beyond the first six months. The matrix algebra solution requires the least com- because the DIALYSIS states are associated with a utility of only 0. variable on the left side wherever it appears.TAIL is 0 be. When Markov-state names plement the half-cycle correction and therefore its value are used on the right side of a binding expression. the analysis favors continuing therapy by a large X m.5 Thus. 2009 . a chance node with branches Recur and NoRecur appears in three places in the tree.com at National Institutes of Health Library on January 21.19 has been developed to implement Markov-cycle trees.0.7 relative to perfect health. The problem is that several of the branches are ter- minal nodes and their utilities apply only to one spe- cific context. In this case. these bindings may be omitted for mental utility permits Markov analyses to yield quality- the DEAD state because if their values are not specified. but can be used only when transition prob- abilities are constant. 337 TO* 4 o Mortality Rates and Probabilities places in the tree. the expected utilities are: The second use is to apply the half-cycle correction CONTINUE THERAPY 7. For the (by setting quality of life on dialysis to unity). In practice. if a Markov process is used only to represent the events EVALUATION taking place during the first six months following an When the Markov model is evaluated as a cohort operation.uINCR. With the at the end of the simulation. Similarly. . m.u.uTAIL program substitutes the state on the right side for the is used when the Markov cohort simulation is termi. . multiple sub- jects). and when with branches Reject and NoReject appears in two the utility of an outcome depends on when it occurs. Markov processes may be repre- sented by a cohort simulation (one trial. Examination of figure 16 reveals that a chance node in which events may occur more than once. For example.. the is usually set to 0. then m. the bindings are: is a formalism that combines the modelling power of the Markov process with the clarity and convenience . to incremental utilities in cost-effectiveness analyses. margin. or by a matrix algebra solution. by a Monte Carlo simulation (many trials. This per- nated before the entire cohort is in the absorbing state. All clinically For the DwLwELL and DIALMEL states. mits representing the prognoses of all four non-dead Its value is added to the incremental utility for a state states with a single subtree. the tail utility must be set to . of a decision-tree representation.. The Markov-cycle tree DEAD state. simulation. We would like to use a common subtree to represent these events in all portions of the tree.4 to a simulation that is stopped prior to absorption of STOP THERAPY 5.5 X m. a sin- gle subject for each). One is to represent the prognosis beyond the functionally identical to the one in figure 16. Conelusion cause we are planning to run the simulation until the Markov models consider a patient to be in one of a cohort is completely absorbed. The tail utility has two state bindings shown. even if the subsequent prognosis is of no interest. Discounting may be applied they will be assumed to be zero. The Markov model provides a means of modelling clin- MARKOV-STATE BINDINGS: A REFINEMENT ical problems in which risk is continuous over time. this Markov-cycle tree will be uses. stopping point in the Markov simulation. Downloaded from http://mdm. a special case of the Markov For the process called a Markov chain.. Specialized com- puter software18. The variable m. more than two quality-adjusted life years. If The values of these special variables are set with the quality adjustment is removed from the analysis bindings on each branch of the Markov node. the bindings important events are modelled as transitions from one are: state to another.16 putation. finite number of discrete states of health. The solution to this problem is the use of Markov-state bindings.uINCR. then the TRANSWEL and TRANSMEL states the bindings are: results are: The value of m.

In Kelley WN. Pauker SG. 12. Textbook of Internal Medicine. Med Decis Making.113:852-71. 16. Pauker SG. 1991. Ann Intern Med.sagepub.267:2055-61. 1990. Fine-tuning Markov models for life- data available in 1989.115:513.1987. Salem D. Sonnenberg FA. 1977. Kassirer JP. 2009 . SMLTREE: The All Purpose Decision Tree Builder. Markov-cycle tree using subtrees and state bindings. Proceedings prosthetic heart valves. 1. Myocardial re. Decision Maker: an advanced per- SG. has the potential to permit the development of deci.3:419-58.113:147-54. I. 1992. Goldman L. 1976. Gafni A. 1990. 1982. Does low risk mean high cost? JAMA. 7. Pauker SG. Hollenberg JP. Smith TJ.98:504-13. expectancy calculations. Philadelphia: J.73:883-8. Lippincott. 6. 1984. A convenient approximation of 14. Desch CE. Levey AS. 107. Vital Statistics of the United tient with Malignant Melanoma? Med Decis Making. Snell JL. Decision analysis. 2. and 13. Beck JR. vascularization for chronic stable angina: an analysis of the role Boston: Pratt Medical Group. Med Decis Making. On the nature of the function expressive of the law plex Markov processes (abstr).com at National Institutes of Health Library on January 21. Pauker SG. Pauker 18. Hillner BE. 1983.Sonnenberg FA. Lau J. 1984.296:716-21. of autologous bone marrow transplantation in metastatic breast cancer. Pauker SG. JAMA. Marrin CA. Stason WB. Wong JB. Anticoagulation for noncardiac procedures in patients with sonal computer tool for clinical decision analysis. 17. Am J Verlag. 5.C. Kassirer JP. Eckman MH. of human mortality. justified simplifying assumptions and may be com. Kassirer JP. 11. Validation of the model. N Engl J Med. Foundations of cost-effectiveness sion models that more faithfully represent clinical analysis for health and medical practices.338 FIGURE 17. The Markov process in medical prognosis. Weinstein MC. Estimates using decision analysis while awaiting clinical ventional decision trees may require unrealistic or un- trial results. D. Efficacy and cost-effectiveness considerations. Med. Wong JB. A clinician’s guide to cost-effectiveness analysis. Naglie IG. Sonnenberg FA. Ann Intern Med. 1991. New York: Springer- life expectancy (the "DEALE"). Washington. Ann Intern Med. Durand-Zaleski I.340:520-3. Thus.0: improved de- cision analysis by personal computer. Med Decis Making. Cucharal GJ.263:1513-21. Beck JR. Part A. 9. of the Eleventh Annual Symposium on Computer Applications 1990. ed. Washington. Finite Markov Chains. DECISION MAKER 3. Cardiac risks and complications of noncardiac sur- healthy-years equivalents. Mehrez A.4:529. 1988. of percutaneous transluminal coronary angioplasty based on 20. utility theory. in Medical Care. 1985. Most analytic problems involve at least one of these 8. gery. Beshansky JR. Detsky AS. 1993.: IEEE Computer Society. Mortality. Should patients with putationally intractable.4:82- States. Public Health Service. 1992. 1825. Med Decis Making. Markov cycle trees: a new representation for com- 3. Philos Trans R Soc London. problems. Pauker SG. Should Immunosuppression Be Continued in a Transplant Pa- 4. Referenced 1983.Birkmeyer JD. 1988. Kidney Failure or Cancer: 85. Kemeny JB. Med Decis Making. 10. 1983. Hollenberg J.13:170-2. Sonnenberg FA.3:39-43. Vol II. National Center for Health Statistics. Levine HJ. B. Downloaded from http://mdm. Section 6. the use of Markov models Bjork-Shiley valves undergo prophylactic replacement? Lancet. Modelling such problems with con.9:142-9. 1989. Gompertz B. 19.Quality-adjusted life years. 15. O’Connor G-T.