
Simple random walk :

Let X1, . . ., Xt, . . . be a sequence of independent Rademacher random variables,

    Xt = { +1 with probability 1/2
         { −1 with probability 1/2

so E(Xt) = 0 and Var(Xt) = 1, for t = 1, 2, . . .
Example :
Imagine a game played between Hetty and Taylor, in which a fair coin is tossed repeatedly. When Heads occurs, Hetty wins a dollar from Taylor, and when Tails occurs Taylor wins a dollar from Hetty. Then Xt is the change in Hetty's net winnings on the t-th coin toss.
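The coin-toss game above can be sketched in a few lines of Python. This is a minimal illustration, not part of the notes; the function name `simple_random_walk` is my own, and the walk is just the running sum of Rademacher steps (Hetty's net winnings after each toss):

```python
import random

def simple_random_walk(steps, seed=None):
    """Simulate a simple random walk: the running sum of Rademacher steps.

    Each step X_t is +1 or -1 with probability 1/2 each (Heads or Tails).
    Returns the list of partial sums S_1, ..., S_steps.
    """
    rng = random.Random(seed)
    position = 0
    path = []
    for _ in range(steps):
        position += rng.choice((1, -1))  # one fair coin toss
        path.append(position)
    return path

walk = simple_random_walk(10, seed=0)
# each partial sum differs from the previous one by exactly 1
```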
Markov chain :

Basic theory :
A Markov chain is a stochastic process, but it differs from a general stochastic process in that a Markov chain must be "memoryless": the probability of future actions does not depend on the steps that led up to the present state. This is called the Markov property. While the theory of Markov chains is important precisely because so many "everyday" processes satisfy the Markov property, there are also many common examples of stochastic processes that do not satisfy it.
Model :
Example:
Gambling :
Suppose that you start with $10 and wager $1 on each toss of a fair coin, repeating indefinitely or until you lose all of your money. If X_n represents the number of dollars you have after n tosses, with X_0 = 10, then the sequence {X_n : n ∈ N} is a Markov process. If I know that you have $12 now, then I would expect that, with even odds, you will have either $11 or $13 after the next toss. This guess is not improved by the added knowledge that you started with $10, then went up to $11, down to $10, up to $11, and then to $12. The fact that the guess is not improved by knowledge of the earlier tosses showcases the Markov property, the memoryless property of a stochastic process.
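The gambling chain can be simulated directly. This is a minimal sketch with an assumed function name `gamble`; note that the next fortune is computed from the current fortune alone, which is exactly the Markov property described above. State 0 (ruin) is absorbing, so the simulation stops there:

```python
import random

def gamble(start=10, max_tosses=1000, seed=None):
    """Track a gambler's fortune X_n under repeated $1 fair-coin wagers.

    The next fortune depends only on the current fortune (the Markov
    property), never on the path taken to reach it. Stops early if the
    fortune hits 0 (ruin).
    """
    rng = random.Random(seed)
    fortune = start
    history = [fortune]
    for _ in range(max_tosses):
        if fortune == 0:      # ruin: the gambler cannot keep playing
            break
        fortune += rng.choice((1, -1))  # win or lose $1 on a fair toss
        history.append(fortune)
    return history

history = gamble(start=10, max_tosses=100, seed=2)
# history[0] == 10, and consecutive entries differ by exactly 1
```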
Ehrenfest chains :

Basic theory :
Suppose that we have two urns, labeled 0 and 1, that contain a total of m balls. The state of the system at time n ∈ N is the number of balls in urn 1, which we will denote by Xn. Our stochastic process is X = (X0, X1, X2, …) with state space S = {0, 1, …, m}. Of course, the number of balls in urn 0 at time n is m − Xn.
Model :
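As a sketch of the model, the code below builds the transition matrix under the usual Ehrenfest dynamics, which the notes do not spell out here: at each step a ball is chosen uniformly at random and moved to the other urn. From state x, the chain then moves to x − 1 with probability x/m and to x + 1 with probability (m − x)/m. The function name `ehrenfest_matrix` is my own:

```python
def ehrenfest_matrix(m):
    """Transition matrix for the Ehrenfest chain on states {0, ..., m}.

    Assumes the standard dynamics: a ball is picked uniformly at random
    and moved to the other urn, so from state x (balls in urn 1) the
    chain moves to x-1 with probability x/m and to x+1 with (m-x)/m.
    """
    P = [[0.0] * (m + 1) for _ in range(m + 1)]
    for x in range(m + 1):
        if x > 0:
            P[x][x - 1] = x / m        # a ball leaves urn 1
        if x < m:
            P[x][x + 1] = (m - x) / m  # a ball enters urn 1
    return P

P = ehrenfest_matrix(4)
# every row of P sums to 1, as a transition matrix must
```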
