
Statistics & Decisions 99, 1001–1020 (2010) / DOI 10.1524/stnd.2010.

© R. Oldenbourg Verlag, München 2010


Comparison of Markov processes via infinitesimal generators

Ludger Rüschendorf, Viktor Wolf

Received: 10.2010; Accepted: 01.2011
Summary: We derive comparison results for Markov processes with respect to stochastic orderings induced by function classes. Our main result states that stochastic monotonicity of one process together with comparability of the infinitesimal generators implies ordering of the processes. Unlike in previous work, no boundedness assumptions on the function classes are needed. We also present an integral version of the comparison result which does not need the local comparability assumption on the generators. The method of proof is also used to derive comparison results for time-discrete Markov processes.
1 Introduction
Ordering conditions for Markov processes in terms of conditions on the infinitesimal generators have been given in several papers in the literature. Massey (1987), Herbst and Pitt (1991), Chen and Wang (1993) and Chen (2004) describe stochastic ordering for discrete state spaces, for diffusions and for diffusions with jumps in terms of their infinitesimal generators. For bounded generators and in the case of discrete state spaces, Daduna and Szekli (2006) give a comparison result for the stochastic ordering of Markov processes in terms of comparison of their generators. Rüschendorf (2008) (abbreviated in the sequel as [Ru]) established a comparison result for Markov processes on Polish spaces using boundedness conditions on the order-defining function classes. These boundedness conditions arise from the method of proof used in that paper, which makes essential use of an idea in Liggett's characterization result for association of Markov processes (see Liggett (1985)). A similar idea was also used in a paper of Cox et al. (1996) and in Greven et al. (2002) for the special case of directionally convex ordering of a system of interacting diffusions. For Lévy processes some general ordering results were derived in Bergenthum and Rüschendorf (2007) (abbreviated in the sequel as [BeRu]) and, for the case of supermodular ordering, in Bäuerle et al. (2008). The comparison results in [BeRu, 2007] go beyond the frame of Markov processes to semimartingales and are based on stochastic analysis (Itô's formula and the generalized Kolmogorov backward equation).
AMS 2000 subject classification: 60E15, 60J25, 90B15
Key words and phrases: Stochastic ordering, stochastic monotonicity, Markov processes, infinitesimal generators
In this paper we extend the approach in [Ru, 2008] based on strongly continuous semigroups and their infinitesimal generators. In Section 2 we recollect the necessary notation and results on strongly continuous semigroups on Banach spaces B and their generators. This generality allows us to omit the restrictive boundedness conditions in previous papers, which prevent applications to interesting orderings defined by unbounded function classes such as convex orderings. As a consequence we obtain general ordering results for Markov processes in Section 3 by the same simple method as in [Ru, 2008]. Furthermore we give a variant of this comparison result where the conditions appear in integrated form. This integral version of the comparison result can be useful in those cases where a local comparability of the generators is difficult to achieve. We discuss several applications that can be dealt with by the generalized approach in this paper. Moreover, the simple method of proof in continuous time which is used in Section 3 is also adapted to the discrete time case. This in fact gives a new proof of a classical comparison result for discrete time processes.
2 Strongly continuous semigroups and their infinitesimal generators
In this section we recollect some important notions and results from semigroup theory on general Banach spaces. Our main references are Engel and Nagel (2000) and Pazy (1983).
Let T = (T_t)_{t ∈ ℝ₊} be a semigroup of bounded linear operators on a Banach space (B, ‖·‖), i.e.

    T_0 = id,   T_{t+s} = T_t T_s   for s, t ≥ 0.   (2.1)

T is called uniformly continuous if

    lim_{t↓0} ‖T_t − id‖ = 0.   (2.2)

T is called a contraction if

    ‖T_t f‖ ≤ ‖f‖,   t ≥ 0, f ∈ B.   (2.3)

T is called a strongly continuous semigroup on B, also called C_0-semigroup, if

    lim_{t↓0} T_t f = f,   f ∈ B.   (2.4)
If T is strongly continuous then

    lim_{h↓0} (1/h) ∫_t^{t+h} T_s f ds = T_t f,   f ∈ B.   (2.5)
The infinitesimal generator A : D_A ⊂ B → B of a strongly continuous semigroup is defined as

    Af := lim_{t↓0} (1/t)(T_t f − f)   (2.6)

for its domain

    D_A := { f ∈ B : lim_{t↓0} (1/t)(T_t f − f) exists }.
We list two classical examples of semigroups.

Examples 2.1 (1) For the Banach space B = C_b([0, ∞)) of bounded continuous functions on [0, ∞) with sup-norm, the translation semigroup T is defined by

    (T_t f)(x) := f(x + t).   (2.7)

T is a C_0-contraction semigroup on C_b([0, ∞)) with infinitesimal generator

    Af = (d/dx) f = f′   (2.8)

and with domain D_A, the set of all f ∈ C_b with f′ ∈ C_b. Obviously A is not a bounded operator.
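As a quick numerical sanity check of (2.6)–(2.8), the difference quotient (T_t f − f)/t for the translation semigroup can be compared with f′. This is a sketch of ours, not from the paper; the test function and step size are arbitrary choices:

```python
import numpy as np

def translation_semigroup(f, t):
    """(T_t f)(x) = f(x + t), the translation semigroup of (2.7)."""
    return lambda x: f(x + t)

def generator_quotient(f, t, x):
    """Difference quotient (T_t f - f)/t from (2.6), evaluated at x."""
    return (translation_semigroup(f, t)(x) - f(x)) / t

# f(x) = exp(-x) is bounded and continuous on [0, infinity) with f' = -f,
# so the quotient should approach Af(x) = f'(x) = -exp(-x) as t -> 0.
f = lambda x: np.exp(-x)
x0 = 1.0
approx_Af = generator_quotient(f, 1e-6, x0)
exact_Af = -np.exp(-x0)
```

For this f the quotient agrees with f′ to roughly the size of the time step, as (2.6) predicts.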
(2) Consider the diffusion semigroup T on B = L^p(ℝ^d), 1 ≤ p < ∞, defined by

    T_t f(x) := (4πt)^{−d/2} ∫_{ℝ^d} e^{−|x−y|²/(4t)} f(y) dy.   (2.9)

T is a strongly continuous semigroup on L^p(ℝ^d), the class of measurable, p-times integrable functions on ℝ^d w.r.t. Lebesgue measure. The infinitesimal generator of T is the closure of the Laplace operator

    Δf(x) = Σ_{i=1}^d ∂²_{ii} f(x_1, …, x_d).   (2.10)

Δ is defined for all f in the Schwartz space

    𝒮(ℝ^d) = { f ∈ C^∞(ℝ^d) : f rapidly decreasing }.
The following result is basic (see Engel and Nagel (2000, Lemma 1.3 in Chapter II)).

Lemma 2.2 Let (A, D_A) be the infinitesimal generator of a strongly continuous semigroup T. Then it holds:

(1) A : D_A ⊂ B → B is a closed, densely defined linear operator, i.e. D_A ⊂ B is dense.

(2) If f ∈ D_A, then T_t f ∈ D_A and

    (d/dt) T_t f = T_t Af = A T_t f,   t ≥ 0.   (2.11)
(3) For f ∈ B, t ≥ 0 it holds that

    ∫_0^t T_s f ds ∈ D_A.   (2.12)

(4) For t ≥ 0 it holds that

    T_t f − f = A ∫_0^t T_s f ds,   if f ∈ B,   (2.13)
            = ∫_0^t T_s Af ds,   if f ∈ D_A.   (2.14)
For a time homogeneous Markov process X = (X_t)_{t≥0} on some measure space (E, 𝓔) let P_t denote the transition kernel

    P_t(x, B) = P(X_t ∈ B | X_0 = x),   x ∈ E, B ∈ 𝓔,   (2.15)

and T = (T_t) the corresponding transition semigroup defined by

    T_t f(x) = ∫_E P_t(x, dy) f(y) = E(f(X_t) | X_0 = x)   (2.16)

for f in a suitable Banach space B of functions on E. This puts Markov processes in the framework of semigroups.
Examples 2.3 (1) Let B_t = (B_t^1, …, B_t^d) be a Brownian motion on ℝ^d; then

    P_t(x, V) = (2πt)^{−d/2} ∫_V e^{−|y−x|²/(2t)} dy   (2.17)

for t > 0, V ∈ 𝓑(ℝ^d). The corresponding semigroup T can be considered e.g. on C_0 or on C_b.
(2) Let X = (X_t)_{t≥0} be a Lévy process on ℝ^d with semigroup T = (T_t) and infinitesimal generator A; then C_0² ⊂ D_A and

    Af(x) = Σ_{i=1}^d b_i ∂_i f(x) + (1/2) Σ_{i,j=1}^d c_{i,j} ∂²_{ij} f(x)
            + ∫_{ℝ^d} ( f(x+y) − f(x) − Σ_{i=1}^d ∂_i f(x) y_i 1_{{y_i : |y_i| ≤ 1}} ) F(dy)   (2.18)

for f ∈ C_0², where (b, c, F) is the local characteristic of X with drift vector b = (b_i)_{1≤i≤d}, diffusion characteristic c = (c_{i,j})_{i,j≤d}, a symmetric, positive semidefinite matrix, and F the Lévy measure of X, i.e. F({0}) = 0 and ∫_{ℝ^d} (|x|² ∧ 1) F(dx) < ∞, characterizing its jumps.
(3) For countable state spaces E we denote by

    p_t(x, y) := P_t(x, {y}),   x, y ∈ E, t ≥ 0,   (2.19)

the transition function, by

    P_t := (p_t(x, y))_{x,y∈E}

the transition matrix, and by

    Q := (q(x, y))_{x,y∈E}

the infinitesimal generator or intensity matrix, where

    q(x, y) = lim_{t↓0} (1/t) p_t(x, y)   for x ≠ y,   (2.20)
    q(x, x) = lim_{t↓0} (1/t)(p_t(x, x) − 1),   (2.21)

assuming existence and finiteness of the limits. For many applications of ordering results to queuing networks the theory of continuous time Markov chains with transition function p_t resp. induced semigroup T = (T_t)_{t≥0} is essential.
The following proposition from Sato (1999, E 34.10) gives an example for the usefulness of the generalized frame of semigroups on Banach spaces different from C_0 or C_b.

Proposition 2.4 Let (X_t)_{t≥0} be a time homogeneous translation invariant Markov process with X_0 = x and transition function P_t. Then the corresponding semigroup T = (T_t)_{t≥0},

    T_t f(x) = ∫_{ℝ^d} P_t(x, dy) f(y) = ∫_{ℝ^d} P_t(dy) f(x + y),   f ∈ L^p,   (2.22)

is a C_0-contraction semigroup on L^p = L^p(ℝ^d), 1 ≤ p < ∞.
For a strongly continuous semigroup T = (T_t)_{t≥0} with infinitesimal generator A define F(t) := T_t f, t ≥ 0. Then by (2.11) F solves the homogeneous Cauchy problem

    F′(t) = A F(t),   F(0) = f,   for f ∈ D_A.   (2.23)

The link to the inhomogeneous Cauchy problem and the characterization of its solution in the following theorem is the fundamental tool for the comparison result in the following section. We adapt the proof of Liggett (1985, Chapter I, Theorem 2.15) to the general class of strongly continuous semigroups on a Banach space as considered in this section. Some related results can be found in Pazy (1983) or in Yan (1987) as well.
Theorem 2.5 (Inhomogeneous Cauchy problem) Let T = (T_t)_{t≥0} be a strongly continuous semigroup on B with infinitesimal generator A and let F, G : [0, ∞) → B be functions such that

(1) F(t) ∈ D_A for all t ≥ 0,

(2) ∫_0^t T_{t−s} G(s) ds exists for all t ≥ 0,

(3) F solves the inhomogeneous Cauchy problem, i.e.

    F′(t) = A F(t) + G(t)   for t ≥ 0.

Then

    F(t) = T_t F(0) + ∫_0^t T_{t−s} G(s) ds.   (2.24)
Proof: We consider the B-valued function v(s) = T_{t−s} F(s), which is well-defined since T is a C_0-semigroup on B. Furthermore we have

    (T_{t−s−h} F(s+h) − T_{t−s} F(s)) / h
      = T_{t−s} ( (F(s+h) − F(s)) / h )
        + ( (T_{t−s−h} − T_{t−s}) / h ) F(s)
        + (T_{t−s−h} − T_{t−s}) F′(s)
        + (T_{t−s−h} − T_{t−s}) ( (F(s+h) − F(s)) / h − F′(s) ).

The first term converges to T_{t−s} F′(s) as h tends to zero, since T_{t−s} is a bounded linear operator. To see that the second term converges to −T_{t−s} A F(s) as h tends to zero, one has to use the first assumption and (2.11). The third expression converges to zero as h tends to zero, since T is a strongly continuous semigroup, that is, it fulfills (2.4). Treating the last term in the same way and using additionally the linearity of T, we see that it disappears as h tends to zero. As a result v(s) is differentiable for 0 < s < t and, using the third assumption, we obtain

    (d/ds) v(s) = T_{t−s} F′(s) − T_{t−s} A F(s) = T_{t−s} G(s).

Moreover T_{t−s} G(s) is integrable by assumption. Integrating this equality yields

    F(t) − T_t F(0) = ∫_0^t T_{t−s} G(s) ds   for all t ≥ 0,

which proves the claim. □
Remark 2.6 Assumption 2.5-(2) holds in particular if G is continuous on [0, ∞). If G is not continuous but in L¹([0, ∞), B), then 2.5-(2) holds as well, since we have ‖T_t G‖_{L¹([0,∞),B)} ≤ c ‖G‖_{L¹([0,∞),B)} for t ∈ [0, ∞) and some c > 0.
3 Comparison of Markov processes
3.1 Continuous time Markov processes
We assume that X and Y are two time homogeneous Markov processes with values in (E, 𝓔). Let S = (S_t) and T = (T_t) denote their semigroups, which we assume are strongly continuous semigroups on some Banach function space B on E such as C_0(E), C_b(E) or L^p(E, μ), where E allows to define these structures as Banach spaces. Denote by A and B the corresponding infinitesimal generators of S and T respectively. Let 𝓕 ⊂ B be a set of real functions on E and let ≤_𝓕 denote the corresponding stochastic order on M¹(E), the set of probability measures on E, defined by

    μ ≤_𝓕 ν   if   ∫ f dμ ≤ ∫ f dν   for all f ∈ 𝓕.   (3.1)

We assume that

    𝓕 ⊂ D_A ∩ D_B.   (3.2)
Theorem 3.1 (Conditional comparison result) Assume that

(1) X is stochastically monotone w.r.t. 𝓕, i.e. f ∈ 𝓕 implies S_t f ∈ 𝓕 for all t ≥ 0, and

(2) Af ≤ Bf [P^{X_0}], for all f ∈ 𝓕.   (3.3)

Then

    S_t f ≤ T_t f [P^{X_0}],   f ∈ 𝓕.   (3.4)
Proof: Define for f ∈ 𝓕 the function F : [0, ∞) → B by F(t) := T_t f − S_t f. Then F satisfies the differential equation

    F′(t) = B T_t f − A S_t f
          = B(T_t f − S_t f) + (B − A)(S_t f).

By the stochastic monotonicity assumption, S_t f ∈ 𝓕 and thus H(t) := (B − A)(S_t f) is well-defined in B and H(t) ≥ 0 by (3.3). Thus F solves the nonhomogeneous Cauchy problem, that is,

    F′(t) = B F(t) + H(t),   F(0) = 0.   (3.5)

Since H(t) ≥ 0 it follows that ∫_0^t T_{t−s} H(s) ds exists and is finite. Further, by Lemma 2.2-(2) and by the stochastic monotonicity assumption we obtain

    F(t) = T_t f − S_t f ∈ D_B.
Thus the assumptions of Theorem 2.5 are satisfied and imply that the solution F(t) has an integral representation of the form

    F(t) = T_t F(0) + ∫_0^t T_{t−s} H(s) ds
         = ∫_0^t T_{t−s} H(s) ds,   as F(0) = 0.

Since H(s) ≥ 0 it follows that F(t) ≥ 0 for all t > 0, and thus the statement of the theorem. □

For diffusion processes with jumps a related comparison theorem is given by Zhang (2006). He applies this comparison result to discuss the existence and uniqueness of invariant probability measures for uniformly elliptic diffusion processes with jumps. Several examples where the local comparison condition for the infinitesimal generators is easy to verify were given in [Ru, 2008].
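Theorem 3.1 can be illustrated on a finite state space, where generators are intensity matrices and the semigroups are matrix exponentials. In the sketch below (rates and sizes are our own choices), X and Y are birth-death chains with Y having faster births; for increasing f the generators compare as in (3.3), and the conclusion (3.4) appears as the entrywise inequality S_t f ≤ T_t f:

```python
import numpy as np

def expm(A, scaling=30, terms=25):
    """Matrix exponential via scaling-and-squaring with a Taylor series."""
    B = A / 2.0 ** scaling
    E, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ B / k
        E = E + term
    for _ in range(scaling):
        E = E @ E
    return E

def birth_death_generator(n, birth, death):
    """Intensity matrix of a birth-death chain on {0, ..., n-1}."""
    Q = np.zeros((n, n))
    for i in range(n):
        if i < n - 1:
            Q[i, i + 1] = birth
        if i > 0:
            Q[i, i - 1] = death
        Q[i, i] = -Q[i].sum()
    return Q

n = 8
A = birth_death_generator(n, birth=1.0, death=1.0)  # generator of X
B = birth_death_generator(n, birth=2.0, death=1.0)  # generator of Y (faster births)

f = np.arange(n, dtype=float)       # an increasing function on the state space
local_comparison = np.all(A @ f <= B @ f + 1e-12)   # condition (3.3)
t = 1.5
Stf = expm(t * A) @ f               # S_t f
Ttf = expm(t * B) @ f               # T_t f; (3.4) predicts Stf <= Ttf
```

Birth-death chains are stochastically monotone with respect to increasing functions, which is what makes assumption (1) of the theorem available here.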
Example 3.2 For pure diffusion processes X and Y in ℝ^d with respective diffusion matrices a := (a_{ij}(x)) and b := (b_{ij}(x)), the infinitesimal generators are given by

    Af(x) = (1/2) Σ_{ij} a_{ij}(x) ∂²f/(∂x_i ∂x_j)   and   Bf(x) = (1/2) Σ_{ij} b_{ij}(x) ∂²f/(∂x_i ∂x_j).   (3.6)

Thus for convex ordering the comparison condition (3.3) is implied by

    Af(x) ≤ Bf(x),   f ∈ 𝓕_cx ∩ C²,

which is equivalent to the positive semidefiniteness of the matrix b − a. Note that since the choice of L¹(E) as the considered Banach space is included, an application of the comparison result to convex ordering in the unbounded case is justified by our extension of the comparison results.
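For constant coefficients, Example 3.2 can be checked in closed form in d = 1: with a = 1 and b = 4 the matrix b − a is positive, the processes started at 0 satisfy X_t ~ N(0, a·t) and Y_t ~ N(0, b·t), and convex ordering predicts E f(X_t) ≤ E f(Y_t) for convex f. A quadrature sketch of ours (the test functions are arbitrary convex choices):

```python
import numpy as np

def expected_value(f, var, n_grid=100001, n_std=12.0):
    """E f(Z) for Z ~ N(0, var), by a Riemann sum over +/- n_std deviations."""
    half_width = n_std * np.sqrt(var)
    z = np.linspace(-half_width, half_width, n_grid)
    dz = z[1] - z[0]
    density = np.exp(-z ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return np.sum(f(z) * density) * dz

# Diffusion coefficients a = 1 and b = 4, so b - a >= 0; the convex
# ordering of X_t ~ N(0, a t) and Y_t ~ N(0, b t) should follow.
a, b, t = 1.0, 4.0, 2.0
convex_fns = [np.abs, lambda z: z ** 2, lambda z: np.maximum(z - 1.0, 0.0)]
gaps = [expected_value(f, b * t) - expected_value(f, a * t) for f in convex_fns]
```

All gaps come out nonnegative, matching the ordering; the quadrature itself can be validated against E Z² = var.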
Define the componentwise ordering of processes X, Y by

    (X) ≤_𝓕 (Y)   if   E h(X_{t_1}, …, X_{t_k}) ≤ E h(Y_{t_1}, …, Y_{t_k})   (3.7)

for all 0 ≤ t_1 < ⋯ < t_k and for all functions h that are componentwise in 𝓕 and are integrable. Thus componentwise ordering is an ordering of the finite dimensional distributions. As in [Ru, 2008, Corollary 2.4] we obtain the following comparison result as a consequence of the conditional ordering theorem (Theorem 3.1) and the separation theorem (see [BeRu, 2007, Proposition 3.1]) for the ordering of Markov processes.

Corollary 3.3 (Componentwise ordering result) Assume that the conditions of Theorem 3.1 hold true and that X_0 ≤_𝓕 Y_0. Then the componentwise ordering (X) ≤_𝓕 (Y) of the processes X and Y holds.
For some applications the pointwise ordering conditions on the infinitesimal generators as in (3.3) are too strong. For instance, in the pure diffusion case with generators as in equation (3.6) and coefficients given by

    a_{ij}(x) = 1   and   b_{ij}(x) = 2 (1_{ℝ^d_+}(x) − 1_{ℝ^d_−}(x))   for all i, j ∈ {1, …, d},   (3.8)

the pointwise ordering condition is not true for the class of directionally convex functions. It turns out that a weaker integral condition allows to obtain an integral comparison result. In order to describe the development of stochastic dependence between two processes X and Y we establish the integral comparison result for more general function classes. For example, we would like to have conditions that imply that over the time t the positive dependence in process Y between Y_0 and Y_t becomes stronger than in process X between X_0 and X_t, that is,

    E h(X_0, X_t) ≤ E h(Y_0, Y_t)   for all t > 0   (3.9)
and for all integrable dependence functions h ∈ 𝓕^{[2]}. In order to compare w.r.t. such functions h we need the information about the type of comparison, which is contained in a tuple of function classes (𝓕, 𝓕^{(2)}). Therefore let 𝓕^{(2)} be a function class on E², let 𝓕 be a function class on E, and define

    𝓕^{[2]} := { f ∈ 𝓕^{(2)} | f(x, ·), f(·, x) ∈ 𝓕 }.

Typical examples for 𝓕^{(2)} are 𝓕_cx(ℝ^{2d}), d ∈ ℕ, the class of convex functions on ℝ^{2d}, and 𝓕_sm(ℝ^{2d}), d ∈ ℕ, the class of supermodular functions on ℝ^{2d}; for 𝓕 they are 𝓕_cx(ℝ^d) resp. 𝓕_sm(ℝ^d) (for definitions and properties, see Müller and Stoyan (2002)). Then 𝓕^{[2]} = 𝓕_cx(ℝ^{2d}) resp. 𝓕^{[2]} = 𝓕_sm(ℝ^{2d}), since the functions in 𝓕^{(2)} are componentwise in 𝓕_cx(ℝ^d) and 𝓕_sm(ℝ^d), respectively.

Let X and Y be two time homogeneous Markov processes as introduced at the beginning of this section. Additionally we assume for every f ∈ 𝓕^{[2]} the existence of the integrals E f(X_0, X_t), E f(Y_0, Y_t) and the domain condition (3.2).
Theorem 3.4 (Integral comparison result w.r.t. 𝓕^{[2]}) Assume that

(1) X is stochastically monotone w.r.t. 𝓕^{[2]}, i.e. f ∈ 𝓕^{[2]} implies S_t f(x, ·) ∈ 𝓕 for x ∈ E and t ≥ 0,

(2) for t ∈ ℝ₊ the map s ↦ T_s B f(x, ·) is integrable on [0, t) for all f ∈ 𝓕^{[2]}, x ∈ E, and

(3) for all f ∈ 𝓕^{[2]} and all t ≥ s it holds that

    ∫_E ∫_E (A f(x, ·))(y) P^Y_{t−s}(x, dy) P^{X_0}(dx) ≤ ∫_E ∫_E (B f(x, ·))(y) P^Y_{t−s}(x, dy) P^{X_0}(dx).   (3.10)
Then

    ∫_E (S_t f(x, ·))(x) P^{X_0}(dx) ≤ ∫_E (T_t f(x, ·))(x) P^{X_0}(dx)   for all f ∈ 𝓕^{[2]} and t ≥ 0.   (3.11)
Proof: Let x ∈ E and define for f ∈ 𝓕^{[2]} the function F_x : [0, ∞) → B by F_x(t)(·) := (T_t f(x, ·) − S_t f(x, ·))(·). Then F_x satisfies the differential equation

    F′_x(t) = (d/dt)(T_t f(x, ·)) − (d/dt)(S_t f(x, ·))
            = B(T_t f(x, ·)) − A(S_t f(x, ·))
            = B(T_t f(x, ·) − S_t f(x, ·)) + (B − A)(S_t f(x, ·)).

By the stochastic monotonicity assumption, S_t f(x, ·) ∈ 𝓕 and thus H_x(t)(·) := ((B − A)(S_t f(x, ·)))(·) is well-defined in B. Thus F_x solves the nonhomogeneous Cauchy problem, i.e.

    F′_x(t) = B F_x(t) + H_x(t)   and   F_x(0) = 0.
Further we obtain

    ∫_0^t ‖T_{t−s} H_x(s)‖ ds = ∫_0^t ‖T_{t−s}(B S_s f(x, ·)) − T_{t−s}(A S_s f(x, ·))‖ ds
      ≤ ∫_0^t ‖T_{t−s}(B S_s f(x, ·))‖ ds + ∫_0^t ‖T_{t−s}(A S_s f(x, ·))‖ ds < ∞.

This follows from the stochastic monotonicity assumption, the integrability of T_t B f(x, ·) and property (2.14) from Lemma 2.2. Hence the expression ∫_0^t T_{t−s} H_x(s) ds exists and is finite. Moreover, by Lemma 2.2-(2) and the stochastic monotonicity assumption we have

    F_x(t) = T_t f(x, ·) − S_t f(x, ·) ∈ D_B.
Thus the assumptions of Theorem 2.5 are fulfilled and imply that the solution F_x(t) has an integral representation of the form

    F_x(t) = T_t F_x(0) + ∫_0^t T_{t−s} H_x(s) ds
           = ∫_0^t T_{t−s} H_x(s) ds,   as F_x(0) = 0.

By assumption we can integrate both sides to obtain

    ∫_E F_x(t) P^{X_0}(dx) = ∫_E ∫_0^t T_{t−s} H_x(s) ds P^{X_0}(dx)
                           = ∫_0^t ∫_E T_{t−s} H_x(s) P^{X_0}(dx) ds,
where we apply the theorem of Fubini. But since the inner integral has a representation of the form

    ∫_E ∫_E H_x(s)(y) P^Y_{t−s}(x, dy) P^{X_0}(dx) = ∫_E ∫_E ((B − A)(S_s f(x, ·)))(y) P^Y_{t−s}(x, dy) P^{X_0}(dx),

condition (3.10) delivers the positivity of ∫_E F_x(t) P^{X_0}(dx), which proves the statement of the theorem. □
In the case of equal initial distributions we obtain from (3.11) for all t > 0

    E f(X_0, X_t) = ∫ E(f(x, X_t) | X_0 = x) P^{X_0}(dx)
                  = ∫ [ (S_t f(x, ·))(z) ]_{z=x} P^{X_0}(dx)
                  ≤ ∫ [ (T_t f(x, ·))(z) ]_{z=x} P^{X_0}(dx)
                  = ∫ E(f(x, Y_t) | Y_0 = x) P^{Y_0}(dx)
                  = E f(Y_0, Y_t).

The latter theorem is also true for functions in 𝓕, i.e. for function classes on E. Thus in the case of equal initial distributions we obtain the following corollary.
Corollary 3.5 (Integral comparison result w.r.t. 𝓕) Assume that P^{X_0} = P^{Y_0} and

(1) X is stochastically monotone w.r.t. 𝓕,

(2) for t ∈ ℝ₊ the map s ↦ T_s B f is integrable on [0, t) for all f ∈ 𝓕, and

(3) for all f ∈ 𝓕 and all t ≥ s it holds that

    ∫_E ∫_E A f(y) P^Y_{t−s}(x, dy) P^{X_0}(dx) ≤ ∫_E ∫_E B f(y) P^Y_{t−s}(x, dy) P^{X_0}(dx).   (3.12)

Then

    X_t ≤_𝓕 Y_t   for all t ≥ 0.   (3.13)
Remark 3.6 (a) In the particular case when Y is a stationary homogeneous Markov process with invariant distribution π and X is a homogeneous Markov process with initial distribution π, condition (3.12) becomes

    ∫_E A f(x) π(dx) ≤ 0,   (3.14)

since it holds that ∫ P^Y_t(x, dy) π(dx) = π(dy) for all t > 0. In situations like (3.8) the local comparability of the generators is not true, so Theorem 3.1 is not applicable; however, condition (3.14) is satisfied. Thus we obtain X_t ≤_𝓕 Y_t for all t ≥ 0 from Corollary 3.5.
(b) One further application of Corollary 3.5 is the comparison of bivariate distribution functions of Markov processes. As is easily seen from the proof of Theorem 3.4, condition (3.10) can be replaced by

    ∫_V ∫_E (A f(y)) P^Y_{t−s}(x, dy) P^{X_0}(dx) ≤ ∫_V ∫_E (B f(y)) P^Y_{t−s}(x, dy) P^{X_0}(dx),   (3.15)

where V ∈ 𝓔, in order to obtain

    ∫_V (S_t f(x)) P^{X_0}(dx) ≤ ∫_V (T_t f(x)) P^{X_0}(dx)

for all f ∈ 𝓕 and t ≥ 0, if both expressions exist.

Now consider E = ℝ and V := (−∞, u]. From Corollary 3.5 we have a comparison of the two dimensional distribution functions of (X_0, X_t) and (Y_0, Y_t) for all s, u ∈ ℝ and f := 1_{(−∞,s]}:

    P(X_0 ≤ u, X_t ≤ s) = ∫_{−∞}^u S_t f(x) P^{X_0}(dx)
                        ≤ ∫_{−∞}^u T_t f(x) P^{X_0}(dx) = P(Y_0 ≤ u, Y_t ≤ s),   (3.16)

provided that the assumptions of Corollary 3.5 are fulfilled. If X and Y additionally have the same marginals, then (3.16) implies the concordance order.
In several cases condition (3.10) is not verifiable. In general one is interested in comparing Markov processes without knowing much about their transition kernels. The integral comparison theorem implies however that a bivariate comparison of X and Y is possible if a suitable local comparison condition on the infinitesimal generators holds true. The next result implies in particular the integral comparison result in (3.11).

Corollary 3.7 (Bivariate comparison result) Assume that P^{X_0} = P^{Y_0} and

(1) X is stochastically monotone w.r.t. 𝓕^{[2]}, i.e. f ∈ 𝓕^{[2]} implies S_t f(x, ·) ∈ 𝓕 for x ∈ E and t ≥ 0,

(2) for t ∈ ℝ₊ the map s ↦ T_s B f(x, ·) is integrable on [0, t) for all f ∈ 𝓕^{[2]}, x ∈ E, and

(3) for all f ∈ 𝓕^{[2]} and all x ∈ E it holds that

    A f(x, ·) ≤ B f(x, ·).   (3.17)

Then

    E f(X_0, X_t) ≤ E f(Y_0, Y_t)   for all f ∈ 𝓕^{[2]} and t ≥ 0.   (3.18)
Note that in general it is not enough to assume condition (3.3) in order to get (3.18).

In networks, for example telecommunication networks or company supply networks, where different velocities of evolution are present, it is valuable to understand their internal dependencies in order to respond reasonably to changes in the evolution of the network. Is it possible to transfer the knowledge about the dependencies in the faster evolving network to the slower evolving network and vice versa? It is of interest to know how the dependence of a network driven by a Markov process X alters when changing the speed of development. An intuitive approach is to study speeding-down versions of a Markov process.

For f ∈ 𝓕^{[2]}, x ∈ E and c ∈ (0, 1) consider the speeding-down Markov process X̃^c induced by the following infinitesimal generator:

    B f(x, ·)(y) = { c · (A f(x, ·))(y),   for y ∈ E s.t. (A f(x, ·))(y) ≥ 0,
                     0,                    for y ∈ E s.t. (A f(x, ·))(y) < 0.   (3.19)

Thus in general both inequalities

    A f(x, ·) < B f(x, ·)   on {A f(x, ·) < 0}   and
    A f(x, ·) ≥ c A f(x, ·) = B f(x, ·)   on {A f(x, ·) ≥ 0}

hold true. In consequence, Corollary 3.7 is not applicable, since condition (3.17) does not hold for the infinitesimal generators of the Markov processes X and X̃^c.
The following result generalizes the speeding-down property in (3.19). Assuming that the infinitesimal generators of the corresponding Markov processes fulfill condition (3.10) allows to apply the integral comparison result, Theorem 3.4.

Corollary 3.8 Let X be a homogeneous Markov process. Assume that X is stochastically monotone w.r.t. 𝓕^{[2]}. Moreover, for x ∈ E and t ∈ ℝ₊ let the map s ↦ T_s A f(x, ·) be integrable on [0, t) for all f ∈ 𝓕^{[2]}. If for some c = (c_1, c_2) ∈ [0, 1)² and all f ∈ 𝓕^{[2]} it holds that

    ∫ ∫_{{A f(x,·) < 0}} (1 − c_2)(A f(x, ·))(y) P^X_{t−s}(x, dy) P^{X_0}(dx)
      ≤ ∫ ∫_{{A f(x,·) ≥ 0}} (c_1 − 1)(A f(x, ·))(y) P^X_{t−s}(x, dy) P^{X_0}(dx),   (3.20)

then

    E f(X_0, X_t) ≤ E f(X̃^c_0, X̃^c_t)   for all f ∈ 𝓕^{[2]},

where X̃^c is the speeding-down version of X induced by the infinitesimal generator

    B f(x, ·)(y) = { c_1 (A f(x, ·))(y),   for y ∈ E s.t. (A f(x, ·))(y) ≥ 0,
                     c_2 (A f(x, ·))(y),   for y ∈ E s.t. (A f(x, ·))(y) < 0.   (3.21)

Proof: For proving this statement we use Theorem 3.4. Due to the construction in (3.21) the properties of X are transferred to X̃^c. Thus the only thing to do is to check condition (3.10). But this follows from (3.20). □
3.2 Discrete time Markov processes
Now we transfer the method of proof of the comparison result from the continuous time case to the discrete time case. Let 𝓕 ⊂ B be a set of real functions on E and let X = (X_n)_{n∈ℕ₀} and Y = (Y_n)_{n∈ℕ₀} be real-valued, discrete-time homogeneous Markov processes. Denote the one-step transition kernels for X and Y by K_X : E × 𝓔 → [0, 1] and K_Y : E × 𝓔 → [0, 1]. Also define as usual for a kernel K

    K f(x) = ∫_E f(y) K(x, dy).

By a modification of the method of proof of Theorem 3.1 we obtain a discrete time version of the conditional comparison result. This in fact gives a new proof of a classical result (see e.g. Müller and Stoyan (2002, Theorem 5.2.11)).

Proposition 3.9 (Discrete time conditional comparison result; f ∈ 𝓕) (a) Assume that

(1) K_Y is stochastically monotone, i.e. f ∈ 𝓕 implies K_Y f ∈ 𝓕, and

(2) K_X(x, ·) ≤_𝓕 K_Y(x, ·) for all x ∈ E.   (3.22)

Then we have

    ∫ f(x) K_X^n(y, dx) ≤ ∫ f(x) K_Y^n(y, dx),   for all f ∈ 𝓕, n ∈ ℕ₀,

where K^n denotes the n-step transition kernel (K)^n.

(b) If in addition X and Y possess the same initial distribution, then it holds that

    X_n ≤_𝓕 Y_n,   for all n ∈ ℕ₀.
Proof: (a) As in the proof of the conditional comparison result in Theorem 3.1, define for f ∈ 𝓕 the function F : ℕ₀ → B by F(n) := K_Y^n f − K_X^n f. Then we obtain, similarly to the case of continuous time, a recursive equation for F:

    F(n+1) − F(n) = (K_Y − id)(K_Y^n f) − (K_X − id)(K_X^n f)
                  = (K_X − id)(K_Y^n f − K_X^n f) + (K_Y − K_X)(K_Y^n f)
                  = (K_X − id) F(n) + (K_Y − K_X)(K_Y^n f).

This yields

    F(n+1) = K_X F(n) + (K_Y − K_X)(K_Y^n f),   (3.23)
    F(0) = 0.   (3.24)

For n = 0 we have F(1) = (K_Y − K_X) f ≥ 0, by condition (3.22).
For n = 1 this implies that F(2) = K_X F(1) + (K_Y − K_X)(K_Y f) ≥ 0, by using the ≤_𝓕-monotonicity of K_Y and the positivity preserving property of K_X. Proceeding inductively, we get for all n ∈ ℕ₀ that

    0 ≤ F(n) = K_Y^n f − K_X^n f,

i.e. we have

    ∫ f(x) K_X^n(y, dx) ≤ ∫ f(x) K_Y^n(y, dx).

(b) Taking expectations, bearing in mind that X and Y have the same initial distribution, we get X_n ≤_𝓕 Y_n for all n ∈ ℕ₀. □
Observe that equation (3.23) can be rewritten as

    F(n+1) = Σ_{k=0}^n K_X^k ( (K_Y − K_X)(K_Y^{n−k} f) )   for all n ∈ ℕ₀,   (3.25)

using the recursive definition of F(n) and the fact that F(0) = 0. This enables us to establish a comparison result for f ∈ 𝓕 where the conditions on the transition kernels appear in integrated form.
Proposition 3.10 (Discrete integral comparison result w.r.t. 𝓕) Let X = (X_n)_{n∈ℕ₀} and Y = (Y_n)_{n∈ℕ₀} be discrete-time homogeneous Markov processes. Assume that P^{X_0} = P^{Y_0} =: μ, and

(1) K_Y is stochastically monotone w.r.t. 𝓕, and

(2) for all f ∈ 𝓕 it holds that

    0 ≤ ∫_E ( K_X^n ( (K_Y − K_X) f ) )(x) μ(dx)   for all n ∈ ℕ₀.   (3.26)

Then

    X_n ≤_𝓕 Y_n   for all n ∈ ℕ₀.
Proof: As in the proof of our integral comparison result Theorem 3.4, define for f ∈ 𝓕 the function F : ℕ₀ → B by F(n)(·) := ((K_Y^n − K_X^n) f)(·). Then by integrating both sides with respect to μ and using (3.25) we obtain

    ∫ F(n+1)(x) μ(dx) = ∫ Σ_{k=0}^n ( K_X^k ( (K_Y − K_X)(K_Y^{n−k} f) ) )(x) μ(dx),
which is non-negative due to assumption (3.26) and the stochastic monotonicity of Y. Moreover, for all f ∈ 𝓕 it holds that

    E f(Y_{n+1}) − E f(X_{n+1}) = ∫ ( E(f(Y_{n+1}) | Y_0 = x) − E(f(X_{n+1}) | X_0 = x) ) μ(dx)
                                = ∫ ( K_Y^{n+1} f(x) − K_X^{n+1} f(x) ) μ(dx)
                                = ∫ F(n+1)(x) μ(dx),

and the statement of the proposition follows. □

Note that condition (3.26) reduces to E f(X_0) ≤ E f(Y_1) for all f ∈ 𝓕 if X is stationary with invariant distribution μ. Then in this case X_n ≤_𝓕 Y_n is obvious, since Y is stochastically monotone w.r.t. 𝓕.
The recursive formula in (3.25) is also true for f ∈ 𝓕^{[2]} if for x ∈ E we define the function F_x : ℕ₀ → B by F_x(n)(·) := ((K_Y^n − K_X^n) f(x, ·))(·). Then we obtain similarly the following corollary.

Corollary 3.11 (Discrete integral comparison result w.r.t. 𝓕^{[2]}) Let X = (X_n)_{n∈ℕ₀} and Y = (Y_n)_{n∈ℕ₀} be discrete-time, homogeneous Markov processes. Assume that P^{X_0} = P^{Y_0} =: μ, and

(1) K_Y is stochastically monotone w.r.t. 𝓕^{[2]}, i.e. f ∈ 𝓕^{[2]} implies K_Y f(x, ·) ∈ 𝓕 for all x ∈ E, and

(2) for all f ∈ 𝓕^{[2]} it holds that

    0 ≤ ∫_E ( K_X^n ( (K_Y − K_X) f(x, ·) ) )(x) μ(dx)   for all n ∈ ℕ₀.   (3.27)

Then

    E f(X_0, X_n) ≤ E f(Y_0, Y_n)   for all f ∈ 𝓕^{[2]}, n ∈ ℕ₀.
Acknowledgements. We are thankful to an anonymous referee for carefully reading the
manuscript and posing the question whether an extension of the continuous time results
to the discrete time case is possible.
References
J. Bergenthum and L. Rüschendorf. Comparison of semimartingales and Lévy processes. Annals of Probability, 35:228–254, 2007.
N. Bäuerle, A. Müller, and A. Blatter. Dependence properties and comparison results for Lévy processes. Mathematical Methods of Operations Research, 67:161–186, 2008.
M.-F. Chen. From Markov Chains to Non-equilibrium Particle Systems. World Scientific, Singapore, 2004.
M.-F. Chen and F.-Y. Wang. On order preservation and positive correlations for multidimensional diffusion processes. Probability Theory and Related Fields, 85:421–428, 1993.
T. Cox, K. Fleischmann, and A. Greven. Comparison of interacting diffusions and an application to their ergodic theory. Probability Theory and Related Fields, 105:513–528, 1996.
H. Daduna and R. Szekli. Dependence ordering for Markov processes on partially ordered spaces. Journal of Applied Probability, 43:793–814, 2006.
K. Engel and R. Nagel. One-Parameter Semigroups for Linear Evolution Equations. Number 194 in Graduate Texts in Mathematics. Springer, 2000.
A. Greven, A. Klenke, and A. Wakolbinger. Interacting diffusions in a random medium: comparison and long-time behavior. Stochastic Processes and Their Applications, 98:23–41, 2002.
I. Herbst and L. Pitt. Diffusion equation techniques in stochastic monotonicity and positive correlations. Probability Theory and Related Fields, 87:275–312, 1991.
T. Liggett. Interacting Particle Systems, volume 276 of Comprehensive Studies in Mathematics. Springer, 1985.
W. A. Massey. Stochastic ordering for Markov processes on partially ordered spaces. Mathematics of Operations Research, 12:350–367, 1987.
A. Müller and D. Stoyan. Comparison Methods for Stochastic Models and Risks. John Wiley and Sons, New York, 1 edition, 2002.
A. Pazy. Semigroups of Linear Operators and Applications to Partial Differential Equations, volume 44 of Applied Mathematical Sciences. Springer, 1983.
L. Rüschendorf. On a comparison result for Markov processes. Journal of Applied Probability, 45:279–286, 2008.
K. I. Sato. Lévy Processes and Infinitely Divisible Distributions, volume 68 of Cambridge Studies in Advanced Mathematics. Cambridge University Press, 1999.
J. A. Yan. A perturbation theorem for semigroups of linear operators. In Lecture Notes in Mathematics, volume 1321, pages 89–91, 1987.
X. Zhang. On stochastic ordering for diffusion with jumps and applications. Statistics & Probability Letters, 77:614–620, 2006.
Ludger Rüschendorf
Mathematische Stochastik
University of Freiburg
Eckerstr. 1
79104 Freiburg
Germany
ruschen@stochastik.uni-freiburg.de

Viktor Wolf
Mathematische Stochastik
University of Freiburg
Eckerstr. 1
79104 Freiburg
Germany