
The Kelly Criterion

Adam Capulong

Williams College

Hudson River Undergraduate Math Conference 2010

Investment Math Spring 2010

Professor Frank Morgan


Introduction

Good morning everyone. My name is Adam Capulong. I am a senior mathematics and economics double major at Williams College. I am here to talk to you about the Kelly Criterion, which has applications to gambling, like sitting down at the casino table or going to the horse track. When I came across this formula, I was surprised to learn that it had been discovered as recently as 1956 at Bell Labs, merely 54 years ago. Also, given the money-making implications of this formula, it is unfortunate that J.L. Kelly never used his work to make money and died at a relatively young age.
So, suppose that you have a given amount of money, your bankroll. You sit down to play a game over and over. There are two outcomes to this game: you either win or lose, with respective probabilities p of winning and q of losing, which you know beforehand. You can bet any fraction f of your money on the chance that you win. The payout odds are β : 1; these are net odds, meaning that if you place a bet of one dollar at 7 : 1 odds, winning gains you seven dollars on top of your returned stake, while losing loses you the dollar. The Kelly Criterion says that you should bet
f = (βp − q)/β
Let us do an example. Suppose at a casino there is a game called “single die”. Rolling a 6 pays 10 : 1. Any other face loses. Assuming that it is a fair die, the probability of winning is 1/6. You lose with a chance of 5/6. Therefore, for each roll, you should bet

f = (10 · (1/6) − 5/6)/10 = 1/12
one twelfth of your money at each roll. Doing so will make your bankroll
increase at the fastest rate possible.
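To make the arithmetic concrete, here is a minimal sketch in Python (my own illustration, not part of the original talk) that evaluates the Kelly fraction f = (βp − q)/β; the function name kelly_fraction and the printed output are illustrative choices.

```python
# Sketch: Kelly fraction f = (beta*p - q)/beta, where beta is the net payout
# per unit bet and p is the probability of winning. (Illustrative only.)

def kelly_fraction(beta, p):
    """Fraction of the bankroll to bet at beta:1 odds with win probability p."""
    q = 1.0 - p
    return (beta * p - q) / beta

if __name__ == "__main__":
    # The "single die" game: rolling a 6 pays 10:1, so beta = 10 and p = 1/6.
    f = kelly_fraction(beta=10, p=1 / 6)
    print(f"Kelly fraction: {f:.4f}  (1/12 = {1 / 12:.4f})")
```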

Coin Tossing and Kelly’s Criterion Derived

Suppose you are playing a coin tossing game with a friend. This is not a fair coin. That is, the probability of landing heads p is 1 ≥ p > 1/2. The probability of tails appearing q is 1/2 > q = 1 − p ≥ 0 by the law of total probability. Suppose further that your opponent-friend has an infinite bankroll. Whenever heads comes up, you win the amount you bet Bk on the kth coin flip; whenever tails comes up, you lose the same amount of money (thus an even money bet). You start with X0. Say you want to maximize E(Xn) for n coin flips. Setting indicator variables Tk = +1 for a win and Tk = −1 for a loss, for k = 1, 2, 3, . . .,
E(Xn) = X0 + Σ_{k=1}^{n} E(Bk Tk) = X0 + Σ_{k=1}^{n} (p − q) E(Bk)

Since p − q > 0, maximizing E(Xn) means maximizing E(Bk), that is, betting the full amount of the bankroll. But you would quickly lose all your money and be unable to bet further, since the probability of ruin is

lim_{n→∞} (1 − p^n) = 1.

You could instead play so as to minimize the chance of ruin (a formula is given by Feller 1966), but that implies betting 0 at each stage. Therefore, you take the middle road of betting a fraction of current capital: Bk = f Xk−1 (a fraction of the capital after the first k − 1 flips), where 0 ≤ f ≤ 1. This means we are assuming that the bankroll is infinitely divisible, so you can never be wiped out in finitely many bets; instead, ruin is redefined to mean that for arbitrarily small ε > 0, lim_{n→∞} Pr(Xn ≤ ε) = 1. Our bankroll after n games is

Xn = X0 (1 + f)^S (1 − f)^F

where S is the number of wins and F is the number of losses and S + F = n.
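The compounding formula is easy to see in a short simulation. The sketch below (my own, with an assumed edge of p = 0.55 on an even-money bet) plays n flips at several fixed fractions f; betting everything (f = 1) is ruined by the first loss, while a moderate fraction grows.

```python
# Sketch: bet a fixed fraction f of current capital on a biased even-money coin,
# so the bankroll evolves as Xn = X0 * (1 + f)^S * (1 - f)^F.  (Illustrative only;
# p = 0.55, n = 1000 and the fractions tried are my own choices.)

import random

def simulate(f, p=0.55, x0=100.0, n=1000, seed=1):
    rng = random.Random(seed)
    x = x0
    for _ in range(n):
        bet = f * x
        x += bet if rng.random() < p else -bet
    return x

if __name__ == "__main__":
    p = 0.55
    for f in (0.0, 0.05, p - (1 - p), 0.50, 1.0):  # p - q = 0.10 is the Kelly fraction
        print(f"f = {f:.2f}: final bankroll ~ {simulate(f, p):.3e}")
```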

Kelly’s breakthrough was stating the gambler’s dilemma as a maximization problem. The ratio of Xn to X0 is growing (or decaying) exponentially, so we take the logarithm of the right-hand side. Let us analyze (1/n) log(Xn/X0).
There are two reasons why we want to divide by n. First, we want to make
sure that we are not penalized by repeating the game. The second reason
will be clear later. You want to maximize

G(f) = (1/n) log(Xn/X0) = (S/n) log(1 + f) + (F/n) log(1 − f)
the exponential growth rate of our capital stock. For large n, S/n → p and F/n → q by the law of large numbers (this is the second reason for dividing by n), so in the long run you are maximizing G(f) = p log(1 + f) + q log(1 − f). Then

G′(f) = p/(1 + f) − q/(1 − f) = 0
when f = p − q. Also,
G″(f) = −p/(1 + f)^2 − q/(1 − f)^2 < 0

so G′(f) is decreasing on [0, 1). Therefore, G(f) has a unique maximum.
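As a numerical sanity check (my own sketch, using the long-run form with S/n replaced by p), a grid search over f recovers the maximizer f = p − q:

```python
# Sketch: the long-run growth rate g(f) = p*log(1+f) + q*log(1-f) for an
# even-money bet is maximized at f = p - q.  (p = 0.6 is an arbitrary example.)

import math

def growth_rate(f, p):
    q = 1.0 - p
    return p * math.log(1 + f) + q * math.log(1 - f)

if __name__ == "__main__":
    p = 0.6
    grid = [i / 10000 for i in range(9999)]          # f in [0, 1)
    f_star = max(grid, key=lambda f: growth_rate(f, p))
    print(f"numerical argmax ~ {f_star:.4f},  p - q = {p - (1 - p):.4f}")
```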


For games where the odds are not 1 : 1, our money after n games is

Xn = X0 (1 + βf)^S (1 − f)^F

for β : 1 betting odds. In this case, the optimal fraction is

f = (βp − q)/β
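The general formula follows by the same calculation as the even-money case; here is a short derivation (my own sketch, again replacing S/n and F/n by p and q in the long run):

```latex
% Derivation of the Kelly fraction for beta:1 odds (sketch).
\[
  G(f) = p\log(1+\beta f) + q\log(1-f), \qquad
  G'(f) = \frac{\beta p}{1+\beta f} - \frac{q}{1-f} = 0 .
\]
\[
  \beta p(1-f) = q(1+\beta f)
  \;\Longrightarrow\;
  \beta p - q = \beta f(p+q) = \beta f
  \;\Longrightarrow\;
  f^{*} = \frac{\beta p - q}{\beta}.
\]
```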
Recall that we assumed that our bankroll is infinitely divisible. If there is
a minimum bet, then ruin is more likely; but if it is small relative to the
starting bankroll, then the chance of ruin is negligible.

Thorp’s Note on “Long Run”

The Kelly Criterion will always dominate in the long run. Kelly provides many examples (in all sections of his paper) of gambles where the Kelly Criterion does not win or cannot provide a winning strategy. Thorp notes that to compare two strategies, one must make the comparison on the same set of data (like the same sequence of hands). Comparing two strategies only on two games that are identically distributed, rather than on the same realized outcomes, is illegitimate.

Application to Blackjack (Two Approaches)

A card game is vastly different from coin tossing. Compared to coin tossing, the even-money blackjack game requires a lower fraction than the optimal Kelly fraction from the coin-tossing analysis, due to its higher variance. Also, during the game there are special events like splitting pairs and doubling down. How many decks are used and how many players are at the table also change the advantage the gambler has at the table.
Thorp states that half of the time (situations that occur with probability 0.5) the game is favorable: per unit bet, the gain X satisfies P(X = 1) = 0.51 and P(X = −1) = 0.49. There is a 0.5 probability of being in an unfavorable situation, where P(X = 1) = 0.49 and P(X = −1) = 0.51. Before putting down money, the player should know whether the situation is favorable or unfavorable. Thus, another caveat is that the player must put down minimum “waiting” bets until a favorable circumstance arises on which the player can put down a lot of money.
For one approach, suppose that for the bad situations, the gambler bets a token minimum of f0. The problem then is to maximize g(f):

g(f) = .5(.51 log(1 + f) + .49 log(1 − f)) + .5(.49 log(1 + f0) + .51 log(1 − f0))

The second term is constant, so there is an f∗ = p − q = 0.02 that maximizes g(f). In practice, bets are integer multiples of the minimum f0, and a smart player caps his largest bet at kf0 for some integer k. Therefore, the bet in favorable situations is min(f∗, kf0).
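A small numerical sketch of this first approach (the minimum bet f0 = 0.001 and the cap k = 10 below are hypothetical values of my own, not from Thorp):

```python
# Sketch of approach 1: bet a token minimum f0 in unfavorable situations and
# min(f*, k*f0) in favorable ones.  (f0, k and the grid are illustrative.)

import math

def g1(f, f0, p=0.51):
    q = 1.0 - p
    favorable = p * math.log(1 + f) + q * math.log(1 - f)
    unfavorable = q * math.log(1 + f0) + p * math.log(1 - f0)   # constant in f
    return 0.5 * favorable + 0.5 * unfavorable

if __name__ == "__main__":
    f0, k = 0.001, 10
    grid = [i / 100000 for i in range(20000)]                   # f in [0, 0.2)
    f_star = max(grid, key=lambda f: g1(f, f0))
    print(f"f* ~ {f_star:.4f} (p - q = 0.02); favorable-situation bet: {min(f_star, k * f0):.4f}")
```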
Another approach is to bet f for the good situations and af, 0 ≤ a ≤ 1, for the bad situations. Thus we are maximizing

g(f) = .5(.51 log(1 + f) + .49 log(1 − f)) + .5(.49 log(1 + af) + .51 log(1 − af))

The optimal fraction is approximately f∗ = 0.02(1 − a)/(1 + a²). When a = 0, f∗ = 0.02; when a = 1, f∗ = 0. As we decrease f from 0.02, the first term decreases only slightly (it is near its maximum there), but the second term increases (becomes less negative) faster, so the optimum lies below 0.02 whenever a > 0. This approach is closer to what a professional blackjack player will do. His maximum bet should be a reasonable integer multiple of his minimum bet.
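A grid search over f for several values of a (again my own sketch, not Thorp's code) illustrates how the optimal fraction falls from 0.02 at a = 0 to 0 at a = 1:

```python
# Sketch of approach 2: bet f in favorable situations and a*f in unfavorable
# ones, then maximize the overall growth rate over f.  (Grid and a-values are
# illustrative choices.)

import math

def g2(f, a, p=0.51):
    q = 1.0 - p
    favorable = p * math.log(1 + f) + q * math.log(1 - f)
    unfavorable = q * math.log(1 + a * f) + p * math.log(1 - a * f)
    return 0.5 * favorable + 0.5 * unfavorable

if __name__ == "__main__":
    grid = [i / 100000 for i in range(5000)]          # f in [0, 0.05)
    for a in (0.0, 0.25, 0.5, 0.75, 1.0):
        f_star = max(grid, key=lambda f: g2(f, a))
        print(f"a = {a:.2f}: f* ~ {f_star:.4f}")
```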

Conclusion

There are lots of applications for this formula. So, if you ever find yourself having to bet on an outcome (it could be as simple as a bet between you and your friend), you will want to remember the Kelly Criterion to make sure that you make the most money in the safest way possible. Thank you for listening to my presentation. Have a good day.
