Chapter 16: Markov Chains


Objectives: Students should be able to:
- Identify and describe a Stochastic Process
- Identify and formulate a problem as a Markov Chain, and be able to identify the main characteristics of the chain
- Find long-term probabilities and mean first passage times for Ergodic chains
- Determine expected system costs/rewards (and other measures of interest) for a Markov Chain
- Find the probability of absorption for Absorbing Chains
- Understand the basics of a Continuous Time Markov Chain
Some Definitions
- Stochastic
  - Opposite of deterministic
  - Random, or involves some randomness
- State (of a system)
  - A characteristic that describes the "condition" of the system
  - At any given time (stage), the system must be in one of the states

Examples:
What is a Stochastic Process?
- Suppose we observe some characteristic of a system at discrete points in time.
- Let X_t be the value of the system characteristic at time t. In most situations, X_t is not known with certainty before time t and may be viewed as a random variable.
- A discrete-time stochastic process is simply a description of the relation between the random variables X_0, X_1, X_2, ...

- A continuous-time stochastic process is simply a stochastic process in which the state of the system can be viewed at any time, not just at discrete instants in time.
- For example, the number of people in a supermarket t minutes after the store opens for business may be viewed as a continuous-time stochastic process.
Moving through the process
- In such processes, we know the probability of moving from one state to another: based on the previous states of the system,

  P(X_{t+1} = i_{t+1} | X_t = i_t, X_{t-1} = i_{t-1}, ..., X_0 = i_0)

  is known.
- We will refer to moving from a time t to t + 1 as a transition.
So, why do we care?
- The probability that a system will be in a certain state after a number of transitions
- The average number of times a system will be in a state over a certain number of transitions
- The average number of times the system will need to transition in order to get to a particular state
- The likelihood of reaching a certain state within a specified period
Example of a Stochastic Process
- Weather in Halifax
  - Say the chance of it being dry tomorrow is 80% if it is dry today, but only 60% if it is raining today
  - Assume these probabilities are the same regardless of what the weather was yesterday
- The evolution of the weather could be viewed as a stochastic process, starting at t = 0, for each day t = 0, 1, 2, ...
- The state of the weather can be dry or raining. Let State 0 = Dry, State 1 = Rain
Example (continued)
- Random variable X_t takes on the values:

  X_t = 0 if day t is dry
        1 if day t is rainy

- The stochastic process is then given by {X_t} = {X_0, X_1, X_2, ...}
Another Example
- Gambler's Ruin Problem:
  - You start a game with $2, so X_0 = 2. In each round of the game, you have a probability p of winning $1, and a probability 1 - p of losing $1. The game ends when you run out of money, or have accumulated $4
  - This type of problem is known as a Random Walk
  - Given this system, what might you want to know?
What is a Markov Chain?
- Definition: A discrete-time stochastic process is a Markov chain if, for t = 0, 1, 2, ... and all states i = 0, 1, ..., M,

  P(X_{t+1} = i_{t+1} | X_t = i_t, X_{t-1} = i_{t-1}, ..., X_1 = i_1, X_0 = i_0)
      = P(X_{t+1} = i_{t+1} | X_t = i_t)

  - This says that the probability distribution of the state at time t+1 depends on the state at time t only, and does not depend on the states the chain passed through on the way to i_t at time t.
Transition Probabilities
- The probability that the system will be in state j at time t + 1 given that it is currently in state i
- If this probability is the same no matter what time t it is - in other words:

  P(X_{t+1} = j | X_t = i) = P(X_1 = j | X_0 = i)   for all t

  then we call these probabilities stationary and denote them as p_ij.
Examples - Stationarity
- If we assume that 10 years from now, the probability that if it is dry today, it will be dry tomorrow is still 80%, then this process would be considered stationary
- You have a box of balls: 30 are red, 30 are blue. At each time t you pick a ball from the box.

  X_t = 0 if the ball picked at time t is blue
        1 if the ball picked at time t is red

  Is {X_t} a stationary process?

Transition Probability Matrix
- The transition probabilities are displayed as an (M + 1) by (M + 1) transition probability matrix P:

      [ p_00  p_01  ...  p_0M ]
  P = [ p_10  p_11  ...  p_1M ]
      [ ...   ...        ...  ]
      [ p_M0  p_M1  ...  p_MM ]

- M + 1 = the number of possible states
- What is the minimum condition required for the probabilities?

- For each i and j:

  p_ij ≥ 0

- For each i:

  p_i0 + p_i1 + ... + p_iM = 1

  You have to be in some state in the next stage. Note that this could include state i (you could stay where you are).
  - Each entry in P must be nonnegative.
  - Hence, all entries in the transition probability matrix are nonnegative, and the entries in each row must sum to 1.
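The two row conditions above can be checked mechanically. A minimal NumPy sketch (not from the slides; the matrix used is the weather example's, and the function name is ours):

```python
import numpy as np

# The weather example's matrix, used here to illustrate the two conditions.
P = np.array([[0.80, 0.20],
              [0.60, 0.40]])

def is_stochastic(P, tol=1e-9):
    """True if every entry is nonnegative and every row sums to 1."""
    return bool(np.all(P >= 0) and np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_stochastic(P))  # True
```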
Gambler's Ruin Problem
- What are the states?

Complete the matrix:

  P = (a 5 × 5 matrix, left blank to fill in)
Graph of the weather problem
the chance of it being dry tomorrow is 80% if it is dry today, but only 60% if it is raining today

Transition matrix:

  P = [ 0.80  0.20 ]   (rows and columns ordered Dry = 0, Rain = 1)
      [ 0.60  0.40 ]

The graph: [state diagram: Dry (0) and Rain (1), with arcs 0→0: 0.80, 0→1: 0.20, 1→0: 0.60, 1→1: 0.40]
Gambler's Ruin Problem
- Say p = 0.60 (and therefore 1 - p = 0.40)

  P =
         $0    $1    $2    $3    $4
    $0 [ 1     0     0     0     0    ]
    $1 [ 0.40  0     0.60  0     0    ]
    $2 [ 0     0.40  0     0.60  0    ]
    $3 [ 0     0     0.40  0     0.60 ]
    $4 [ 0     0     0     0     1    ]
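The matrix above has a simple pattern, so it can be generated for any p and any target bankroll. A sketch (the function name and parameters are ours, not from the slides):

```python
import numpy as np

def gamblers_ruin_matrix(target=4, p=0.60):
    """Transition matrix for the Gambler's Ruin with states $0..$target.
    $0 and $target are absorbing; from $i you win $1 w.p. p, lose $1 w.p. 1 - p."""
    P = np.zeros((target + 1, target + 1))
    P[0, 0] = 1.0                # ruined: stay at $0
    P[target, target] = 1.0      # reached the goal: stay at $target
    for i in range(1, target):
        P[i, i + 1] = p          # win $1
        P[i, i - 1] = 1 - p      # lose $1
    return P

print(gamblers_ruin_matrix())
```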
Graph of this Process
States?

Transition probabilities?
Back to The Weather Example
- If it is dry today, what is the probability that it will be raining 2 days from now?
  - How can I get from State 0 to State 1 in two transitions?

[State diagram: Dry (0) and Rain (1); arcs 0→0: 0.80, 0→1: 0.20, 1→0: 0.60, 1→1: 0.40]
N-Step Transition Probabilities
the chance of it being dry tomorrow is 80% if it is dry today, but only 60% if it is raining today

[Tree diagram: starting from Dry today, each successive day (today + 1, today + 2, ..., today + N) branches into Dry (80% from Dry, 40%... i.e. 60% from Rain) or Rain]

  Pr(Rain in 2 days | it is Dry today) = 0.80 × 0.20 + 0.20 × 0.40 = 0.24
N-Step Transition Probabilities
- If a stationary Markov chain is in state i at time m, what is the probability that n periods later the Markov chain will be in state j?
- This probability will be independent of m, so we may write

  P(X_{m+n} = j | X_m = i) = P(X_n = j | X_0 = i) = p_ij^(n)

  where p_ij^(n) is called the n-step transition probability of a transition from state i to state j.
The Weather Example
- If p_01^(2) is the probability that in 2 days it will be raining, given that it is dry today, then

  p_01^(2) = (0.8)(0.2) + (0.2)(0.4) = 0.24

- ...and the probability that it will be dry in 2 days is

  p_00^(2) = (0.8)(0.8) + (0.2)(0.6) = 0.76
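The two-step sum above is exactly a row-times-column product, which can be spot-checked in NumPy (a sketch, not from the slides):

```python
import numpy as np

P = np.array([[0.80, 0.20],    # row 0: Dry  -> (Dry, Rain)
              [0.60, 0.40]])   # row 1: Rain -> (Dry, Rain)

# Sum over the intermediate day's state: p_01^(2) = p_00 p_01 + p_01 p_11
p01_2 = P[0, 0] * P[0, 1] + P[0, 1] * P[1, 1]
print(round(float(p01_2), 4))          # 0.24

# The same number is the (0, 1) entry of P squared.
print(round(float((P @ P)[0, 1]), 4))  # 0.24
```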
What about after 3 days?
- We already know p_01^(2) = 0.24 and p_00^(2) = 0.76
- So the probability it is dry after 3 days if it is dry now:

  p_00^(3) = p_00^(2) p_00 + p_01^(2) p_10 = (0.76)(0.8) + (0.24)(0.6) = 0.752
What if there were 3 states?

[State diagram: three states 0, 1, 2]

  p_01^(2) = ?

General case for three states:

  p_01^(2) = p_00 p_01 + p_01 p_11 + p_02 p_21
16.3 Chapman-Kolmogorov Equations
- In general, for any m with 0 < m < n:

  p_ij^(n) = Σ_{k=0}^{M} p_ik^(m) p_kj^(n-m)

- So, if m = 1, then

  p_ij^(n) = Σ_{k=0}^{M} p_ik p_kj^(n-1)

- And if m = n - 1, then

  p_ij^(n) = Σ_{k=0}^{M} p_ik^(n-1) p_kj

- For n = 2:

  p_ij^(2) = Σ_{k=0}^{M} p_ik p_kj = p_i0 p_0j + p_i1 p_1j + ... + p_iM p_Mj

  This sum is exactly the (i, j)th entry of the matrix product P · P:

          [ p_00 ... p_0M ]   [ p_00 ... p_0M ]   [ Σ_k p_0k p_k0 ... Σ_k p_0k p_kM ]
  P · P = [ ...       ...  ] × [ ...       ...  ] = [ ...                        ...  ]
          [ p_M0 ... p_MM ]   [ p_M0 ... p_MM ]   [ Σ_k p_Mk p_k0 ... Σ_k p_Mk p_kM ]

Therefore, p_ij^(2) is the (i, j)th element of P^2 (if you count the first row and column as 0).
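The Chapman-Kolmogorov identity in matrix form, P^(n+m) = P^n · P^m, can be spot-checked numerically with the weather matrix (a sketch, not from the slides):

```python
import numpy as np

P = np.array([[0.80, 0.20],
              [0.60, 0.40]])

# Chapman-Kolmogorov in matrix form: P^(n+m) = P^n . P^m (here n = m = 2).
P2 = np.linalg.matrix_power(P, 2)
P4 = np.linalg.matrix_power(P, 4)
print(np.allclose(P4, P2 @ P2))  # True
```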
16.3 Chapman-Kolmogorov Equations
- So, if m = 1 then

  p_ij^(n) = Σ_{k=0}^{M} p_ik p_kj^(n-1)

- And if m = n - 1, then

  p_ij^(n) = Σ_{k=0}^{M} p_ik^(n-1) p_kj

- Which indicate:

  P^(n) = P · P^(n-1) = P^(n-1) · P   ⇒   P^(n) = P^n

- and p_ij^(n) is the (i, j)th element of P^n
- and P^(n+m) = P^n · P^m
The Weather Example
- Question: If you start on a dry day ( i = 0 ), the probability that it will be dry ( j = 0 ) 3 days later is?

  P^3 = P^2 · P = [ 0.76  0.24 ] [ 0.80  0.20 ] = [ 0.752  0.248 ]
                  [ 0.72  0.28 ] [ 0.60  0.40 ]   [ 0.744  0.256 ]

  so p_00^(3) = 0.752

Unconditional State Probabilities
- When you do not know the state of the Markov chain at time t = 0 (we know the probabilities of being in different states, instead)
- Define q_i = probability that the chain is in state i at time 0; in other words, P(X_0 = i) = q_i
  - The vector q = [q_0, q_1, ..., q_M] is the initial probability distribution for the Markov chain.
- Probability of being in state j at time n:

  P(X_n = j) = Σ_{i=0}^{M} q_i p_ij^(n)

- Example?
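The unconditional distribution at time n is just the row vector q times the n-step matrix. A sketch for the weather chain; the 50/50 initial split q = [0.5, 0.5] is an assumed example, not from the slides:

```python
import numpy as np

P = np.array([[0.80, 0.20],
              [0.60, 0.40]])

# Assumed example: a 50/50 initial split between Dry and Rain.
q = np.array([0.5, 0.5])

# P(X_n = j) = sum_i q_i p_ij^(n): a row vector times the n-step matrix.
n = 3
dist_n = q @ np.linalg.matrix_power(P, n)
print(dist_n.round(4))  # [0.748 0.252]
```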
16.4 Classification of States in a Markov Chain
- A path from state i to state j is a sequence of transitions that begins in i and ends in j, such that each transition in the sequence has a positive probability of occurring.
- A state j is accessible from state i if there is a path leading from i to j.
- Identify all possible paths of

  P =
        0    1    2    3    4
    0 [ .4   .6   0    0    0  ]
    1 [ .5   .5   0    0    0  ]
    2 [ 0    0    .3   .7   0  ]
    3 [ 0    0    .5   .4   .1 ]
    4 [ 0    0    0    .8   .2 ]

- Two states i and j are said to communicate if j is reachable from i, and i is reachable from j.
- Example:
- A set of states S in a Markov chain is in a separate class if no state outside of S is reachable from any state in S.

Identify the classes in matrix P:

  P =
        0    1    2    3    4
    0 [ .4   .6   0    0    0  ]
    1 [ .5   .5   0    0    0  ]
    2 [ 0    0    .3   .7   0  ]
    3 [ 0    0    .5   .4   .1 ]
    4 [ 0    0    0    .8   .2 ]

- If all states in the chain communicate, the Markov chain is said to be irreducible
- A state i is a transient state if there exists a state j that is accessible from i, but state i is not accessible from state j.
  - There is a chance that the process will never return to this state again, upon entering the state.
- A state i is a recurrent state if, upon entering it, the system will return to state i at some future point
- Note: a state is either transient or recurrent
- A state j is an absorbing state if, once the system enters state j, it will never leave. This means that p_jj = 1.
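One way to identify the communicating classes of the 5-state matrix above is to compute a reachability relation and intersect it with its transpose. This closure approach is our own sketch, not a method given in the slides:

```python
import numpy as np

# The 5-state matrix from these slides.
P = np.array([
    [.4, .6,  0,  0,  0],
    [.5, .5,  0,  0,  0],
    [ 0,  0, .3, .7,  0],
    [ 0,  0, .5, .4, .1],
    [ 0,  0,  0, .8, .2],
])

def reachability(P):
    """R[i, j] is True if j is reachable from i in zero or more positive-probability steps."""
    n = len(P)
    R = ((P > 0) | np.eye(n, dtype=bool)).astype(int)
    for _ in range(n):                 # repeated squaring of the adjacency relation
        R = ((R @ R) > 0).astype(int)
    return R.astype(bool)

R = reachability(P)
comm = R & R.T                         # i and j communicate iff each can reach the other
classes = {tuple(sorted(map(int, np.flatnonzero(comm[i])))) for i in range(len(P))}
print(sorted(classes))                 # [(0, 1), (2, 3, 4)]
```

Here both classes turn out to be closed, so neither contains transient states.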
Gambler's Ruin
- Identify the states' type:

  P =
         $0    $1    $2    $3    $4
    $0 [ 1     0     0     0     0    ]
    $1 [ 0.40  0     0.60  0     0    ]
    $2 [ 0     0.40  0     0.60  0    ]
    $3 [ 0     0     0.40  0     0.60 ]
    $4 [ 0     0     0     0     1    ]
Transient State Example
- Identify type of states

  P = [ +  +  + ]
      [ 0  +  + ]
      [ 0  +  + ]

  ("+" marks a positive entry; the exact numeric values are not legible in the source)
More definitions: Periodicity
- A recurrent state i is periodic with period k > 1 if k is the smallest number such that all paths leading from state i back to state i have a length that is a multiple of k.
  - I.e.: p_ii^(n) = 0 for all values of n other than k, 2k, 3k, ... (k > 1 is the smallest integer with this property).
- If a recurrent state is not periodic, it is called aperiodic.

      [ 0  1  0 ]
  P = [ 0  0  1 ]   (every path from a state back to itself has length a multiple of 3)
      [ 1  0  0 ]

  P = [ +  +  + ]
      [ 0  +  + ]   (the matrix from the Transient State Example; entries not legible in the source)
      [ 0  +  + ]
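The definition p_ii^(n) = 0 except at multiples of k suggests one way to find a period numerically: take the gcd of the return times n for which p_ii^(n) > 0, scanned up to a cutoff. This is our own sketch (function name and cutoff are assumptions, not from the slides):

```python
import numpy as np
from math import gcd
from functools import reduce

# The cyclic chain from the slide: 0 -> 1 -> 2 -> 0 with probability 1.
P = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

def period(P, i, n_max=50):
    """Period of state i: gcd of all n <= n_max with p_ii^(n) > 0."""
    returns = []
    Pn = np.eye(len(P))
    for n in range(1, n_max + 1):
        Pn = Pn @ P                  # Pn now holds P^n
        if Pn[i, i] > 0:
            returns.append(n)
    return reduce(gcd, returns)

print(period(P, 0))  # 3
```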
Ergodic Markov Chain
- A finite-state irreducible Markov Chain where all states are recurrent and aperiodic is said to be ergodic

  P =
          Dry    Rain
    Dry [ 0.80   0.20 ]
   Rain [ 0.60   0.40 ]

  P =
         $0    $1    $2    $3    $4
    $0 [ 1     0     0     0     0    ]
    $1 [ 0.40  0     0.60  0     0    ]
    $2 [ 0     0.40  0     0.60  0    ]
    $3 [ 0     0     0.40  0     0.60 ]
    $4 [ 0     0     0     0     1    ]
16.5 Long Run Properties of Markov Chains
- Steady-state probabilities are used to describe the long-run behavior of a Markov chain
- In some far-off future time, what is the probability that if I observe the system I will find it in state j?
- In other words, find p_ij^(n) for a very large n
- What is the characteristic of these probabilities?
Steady-State Probabilities
- Theorem: Let P be the transition matrix for an (M + 1)-state Ergodic chain. Then there exists a vector

  π = [ π_0  π_1  ...  π_M ]

  such that

                 [ π_0  π_1  ...  π_M ]
  lim  P^n   =   [ π_0  π_1  ...  π_M ]
  n→∞            [ ...             ... ]
                 [ π_0  π_1  ...  π_M ]

  (long-term future state is independent of the current state)

- This theorem tells us that for any initial state i,

  lim_{n→∞} p_ij^(n) = π_j

- The vector π = [π_0, π_1, ..., π_M] is often called the steady-state distribution, or equilibrium distribution, for the Markov chain.
- Each π_j represents the long-run probability of being in state j in some distant period n, regardless of where you started!
- The behavior of a Markov chain before the steady state is reached is often called transient behavior
Back to the Weather Example

  P^3 = P^2 · P = [ 0.76   0.24  ] [ 0.80  0.20 ] = [ 0.752  0.248 ]
                  [ 0.72   0.28  ] [ 0.60  0.40 ]   [ 0.744  0.256 ]

  P^4 = P^3 · P = [ 0.752  0.248 ] [ 0.80  0.20 ] ≈ [ 0.750  0.250 ]
                  [ 0.744  0.256 ] [ 0.60  0.40 ]   [ 0.749  0.251 ]

  P^5 = P^4 · P ≈ [ 0.750  0.250 ]
                  [ 0.750  0.250 ]

  P^6 = P^5 · P ≈ [ 0.75   0.25  ]
                  [ 0.75   0.25  ]

The rows are converging to the same vector [0.75 0.25]: after many days, the probability of Dry is about 0.75 no matter what today's weather is.
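The convergence of the powers shown above can be reproduced directly with matrix powers (a sketch, not from the slides):

```python
import numpy as np

P = np.array([[0.80, 0.20],
              [0.60, 0.40]])

# Rows of P^n become identical as n grows: the limiting row is [pi_0, pi_1].
for n in (2, 4, 8, 16):
    print(n)
    print(np.linalg.matrix_power(P, n).round(4))
```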
Finding the Steady-State Probabilities
- Back to the Chapman-Kolmogorov Equations:

  p_ij^(n) = Σ_{k=0}^{M} p_ik^(n-1) p_kj,   i, j = 0, ..., M

- Therefore, when n gets really large (n → ∞), lim_{n→∞} p_ij^(n) = π_j gives

  π_j = Σ_{k=0}^{M} π_k p_kj

- Since we have to be in some state at any point in time, in the long term we have

  Σ_{j=0}^{M} π_j = 1
Back to the Weather Example

  π_j = Σ_{i=0}^{M} π_i p_ij,  j = 0, ..., M    and    Σ_{j=0}^{M} π_j = 1

State 0 = Dry and State 1 = Rain

  P = [ 0.8  0.2 ]
      [ 0.6  0.4 ]

  π_0 = 0.8 π_0 + 0.6 π_1
  π_1 = 0.2 π_0 + 0.4 π_1
  π_0 + π_1 = 1

  ⇒  π_0 = 3/4,  π_1 = 1/4
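The balance equations plus the normalization form a small linear system. One standard way to solve it numerically is to drop one redundant balance equation and substitute the normalization row (a sketch, not from the slides):

```python
import numpy as np

P = np.array([[0.80, 0.20],
              [0.60, 0.40]])

# Steady-state equations: pi_j = sum_k pi_k p_kj, plus sum_j pi_j = 1.
# One balance equation is redundant, so replace it with the normalization row.
n = len(P)
A = np.vstack([(P.T - np.eye(n))[:-1],   # (P^T - I) pi = 0, minus one redundant row
               np.ones(n)])              # sum of pi equals 1
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi.round(4))  # [0.75 0.25]
```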
Monopoly Example
- Assuming that if you land in jail, you have to stay in jail for 3 turns or until you roll doubles, the long-term probabilities of each space on the Monopoly board have been calculated
- Ash and Bishop (1972)
- http://www.bewersdorff-online.de/amonopoly/

  Position  Property             Long-run probability
  1         Mediterranean Ave    0.0237
  9         Connecticut Ave      0.0237
  3         Baltic Ave           0.0241
  37        Park Place           0.0243
  6         Oriental Ave         0.0233
  8         Vermont Ave          0.0238
  13        States Ave           0.0238
  34        Pennsylvania Ave     0.0279
  14        Virginia Ave         0.0288
  29        Marvin Gardens       0.0289
  32        North Carolina Ave   0.0294
  39        Boardwalk            0.0293
  27        Ventnor Ave          0.0299
  31        Pacific Ave          0.0300
  26        Atlantic Ave         0.0301
  11        St. Charles Place    0.0304
  23        Indiana Ave          0.0303
  21        Kentucky Ave         0.0310
  16        St. James Place      0.0318
  19        New York Ave         0.0334
  18        Tennessee Ave        0.0333
  24        Illinois Ave         0.0333
Two more things
- If i and j are recurrent states from different classes:

  p_ij^(n) = 0 for all n

- If a state j is transient, then

  lim_{n→∞} p_ij^(n) = 0

- The probability that at some future time, n, you are leaving state j is equal to the probability that at some future time you are entering state j!
Expected Cost/Reward
- If I know the long-run probabilities, I can calculate the long-run expected average cost (or reward) per unit time:

  C(j) = cost (reward) for being in state j = 0, ..., M
  X_t  = state of the system at time t

  average cost = lim_{n→∞} E[ (1/n) Σ_{t=1}^{n} C(X_t) ]
               = lim_{n→∞} (1/n) Σ_{t=1}^{n} Σ_{j=0}^{M} Pr(X_t = j) C(j)
               = Σ_{j=0}^{M} π_j C(j)
Example
- Weather Example: Assume that on average, sales at an outdoor market are $300 on a rainy day, and $2000 when it doesn't rain. What's the average sales level per day? If the market runs for 30 days next year, what are the total expected sales?

  π_0 = 0.75, π_1 = 0.25
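Using average cost (reward) = Σ_j π_j C(j) with the figures above, the answer works out as follows (a sketch; variable names are ours):

```python
import numpy as np

# Steady-state probabilities and per-state sales from the slides.
pi = np.array([0.75, 0.25])      # state 0 = Dry, state 1 = Rain
C = np.array([2000.0, 300.0])    # C(0) = $2000 dry-day sales, C(1) = $300 rainy-day sales

avg_daily = pi @ C               # sum_j pi_j C(j)
print(avg_daily)                 # 1575.0
print(30 * avg_daily)            # 47250.0 total over the 30 market days
```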
Expected Cost/Reward for Complex Problems
- What if the cost depends not only on the state, but some other variable?
  - On rainy days, sales are $300 if no cruise ship is in port, $1300 if there is a cruise ship in port
  - On dry days, sales are expected to be $2000 if no cruise ship is in port, $4000 if there is one
- The probability that on any random day there is a cruise ship in port is 0.30
More complex average costs

Y_t: a random variable indicating the presence of a cruise ship

  Expected Sale (daily) = Σ_{j=0}^{M} π_j E[ C(X_t = j, Y_t) ] = Σ_{j=0}^{M} π_j k(j)

Find the values for k(j) first, then compute the average cost per unit time.

Back to the problem

  C(X_t, Y_t):           Rain    Dry
  Y_t:  No-Ship           300    2000
        Ship             1300    4000
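Averaging over the ship variable first gives k(j), and the steady-state weights then give the daily expectation. A sketch with the slide's numbers (variable names are ours):

```python
import numpy as np

pi = np.array([0.75, 0.25])          # steady-state: Dry, Rain
p_ship = 0.30                        # P(cruise ship in port) from the slides

# Sales table C(state, ship): rows = weather state (Dry, Rain),
# columns = (no ship in port, ship in port).
C = np.array([[2000.0, 4000.0],
              [ 300.0, 1300.0]])

# k(j) = E[C(j, Y_t)]: average over the ship variable first.
k = (1 - p_ship) * C[:, 0] + p_ship * C[:, 1]
print(k.round(2))                    # expected sales for a dry day and a rainy day

avg_daily = pi @ k                   # sum_j pi_j k(j)
print(round(float(avg_daily), 2))    # 2100.0
```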