2440127104
LHDA
Renewal Process
• Renewal process
➢ A renewal process as a special case of a counting process
➢ A counting process {N(t); t ≥ 0} generated by {T(n); n ≥ 1} is called a renewal
process if {T(n); n ≥ 1} is a sequence of nonnegative iid random variables.
➢ A counting process for which the times between successive events are
independent and identically distributed with an arbitrary distribution is called
a renewal process.
➢ Thus, a renewal process is a counting process such that the time until the
first event occurs has some distribution F, the time between the first and
second event has, independently of the time of the first event, the same
distribution F, and so on. When an event occurs, we say that a renewal has
taken place.
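The definition above can be simulated directly: accumulate iid interarrival times and count how many events land in [0, t]. A minimal sketch (function and variable names are my own); taking exponential interarrivals with rate 1 makes N(t) a Poisson process, so the average of N(100) should be close to 100:

```python
import random

def renewal_count(t, interarrival, seed=None):
    """Count N(t): the number of renewals whose cumulative iid
    interarrival times T(1) + ... + T(n) fall within [0, t]."""
    rng = random.Random(seed)
    n, s = 0, 0.0
    while True:
        s += interarrival(rng)   # next iid interarrival time
        if s > t:
            return n
        n += 1

# Special case: exponential interarrivals with rate 1 give a
# Poisson process, so E[N(t)] = t; average 200 replications.
avg_count = sum(renewal_count(100.0, lambda r: r.expovariate(1.0), seed=k)
                for k in range(200)) / 200
print(avg_count)   # close to 100
```

Any other nonnegative interarrival distribution (uniform, gamma, ...) can be dropped in for `interarrival`; only the iid assumption matters for the process to be a renewal process.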
STAT6096-Stochastic Processes
Rizal Hafidhillisan
• Semi-Markov Processes
➢ Consider a process that can be in state 1 or state 2 or state 3. It is initially in
state 1 where it remains for a random amount of time having mean μ1, then
it goes to state 2 where it remains for a random amount of time having mean
μ2, then it goes to state 3 where it remains for a mean time μ3, then back to
state 1, and so on. What proportion of time is the process in state i, i = 1, 2, 3?
➢ Pi, the proportion of time that the process is in state i, is
Pi = μi/(μ1 + μ2 + μ3), i = 1, 2, 3.
➢ Similarly, if we had a process that could be in any of N states 1, 2, . . . , N and that
moved from state 1 → 2 → 3 → · · · → N − 1 → N → 1, then the long-run
proportion of time that the process spends in state i is
o Pi = μi/(μ1 + μ2 + · · · + μN), i = 1, 2, . . . , N,
where μi is the expected amount of time the process spends in state i during
each visit.
➢ Let us now generalize the preceding to the following situation.
➢ Suppose that a process can be in any one of N states 1, 2, . . . , N, and that
each time it enters state i it remains there for a random amount of time
having mean μi and then makes a transition into state j with probability Pij.
Such a process is called a semi-Markov process.
➢ Note that if the amount of time that the process spends in each state before
making a transition is identically 1, then the semi-Markov process is just a
Markov chain.
➢ Consider πi, the proportion of transitions that take the process into state i.
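With πi, the proportion of transitions into state i, the standard semi-Markov result is that the long-run proportion of time spent in state i equals πi μi / Σj πj μj, where π is the stationary distribution of the embedded Markov chain with transition matrix Pij. A minimal sketch with a hypothetical 3-state matrix (all numerical values are illustrative assumptions, not from the notes):

```python
# Hypothetical transition probabilities Pij of the embedded chain
# and hypothetical mean holding times mu_i per visit to state i.
P = [[0.0, 0.5, 0.5],
     [0.7, 0.0, 0.3],
     [0.4, 0.6, 0.0]]
mu = [1.0, 2.0, 4.0]

# Power iteration for the stationary distribution: pi = pi P.
pi = [1 / 3, 1 / 3, 1 / 3]
for _ in range(1000):
    pi = [sum(pi[j] * P[j][i] for j in range(3)) for i in range(3)]

# Time proportions weight each state's visit rate by its mean
# holding time: P_i = pi_i * mu_i / sum_j pi_j * mu_j.
w = [pi[i] * mu[i] for i in range(3)]
time_props = [x / sum(w) for x in w]
print([round(p, 4) for p in pi], [round(p, 4) for p in time_props])
```

Note that if every μi = 1 (the Markov-chain special case above), `time_props` collapses to π itself, as expected.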