Department of Mathematics
Dissertation
In making this submission I declare that the information
Contents.
Abstract.
1. Introduction.
2. The uses of Bayes' theorem.
3. An alternative theorem.
4. Uses for this alternative theorem.
Conclusion.
References.
Abstract:
value. These are applications that Bayes' theorem cannot handle
on its own.
Introduction:
lending money to potential borrowers. Bayes’ theorem
profitability.
betting. [2] The Bayes model remains a useful model for
Chapter 2: The uses of Bayes' theorem.
coin that is known to be fair. And there are times when past
\[
p(a \mid b) = \frac{p(b \mid a)\,p(a)}{p(b)}
\]
occurrences.
For example, say there is a disease with a frequency of one in
100, and a test for the disease that is 90% accurate (it is
positive for 90% of carriers and negative for 90% of
non-carriers). If the test is positive, how likely are you to
have the disease? The total probability of a positive test is
$p(b) = 0.9 \times 0.01 + 0.1 \times 0.99 = 0.108$, so
$p(a \mid b) = 0.9 \times 0.01 / 0.108 = 1/12$.
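This arithmetic can be checked with a short script. A minimal sketch in Python using exact rationals (it assumes, as above, that "90% accurate" means a 90% true-positive rate and a 90% true-negative rate):

```python
from fractions import Fraction

# p(a): prior probability of having the disease (frequency 1 in 100)
p_a = Fraction(1, 100)
# p(b|a): positive test given disease; p(b|not a): false-positive rate
p_b_given_a = Fraction(9, 10)
p_b_given_not_a = Fraction(1, 10)

# Total probability of a positive test: p(b) = p(b|a)p(a) + p(b|not a)p(not a)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: p(a|b) = p(b|a) p(a) / p(b)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # 1/12
```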
\[
f(x \mid y) = \frac{f(y \mid x)\,f(x)}{\int f(y \mid s)\,f(s)\,ds}
\]
Chapter 3: An alternative theorem.
One question that can be asked is this: when there are two
outcomes (a and b), and after a while one of them (a) has
happened more often than the other, what is the probability that
after some further number of trials both outcomes have happened
equally often?
After $n$ flips there have been $\frac{n+m}{2}$ heads and
$\frac{n-m}{2}$ tails. We will ask: if we fix $t$ and $m$, what
value of $n$ will maximize the probability of the same number of
heads and tails after $t$ flips? We first look at the
probability of $\frac{n+m}{2}$ heads and $\frac{n-m}{2}$ tails
after $n$ flips, then at the probability that, given this, the
numbers are equal after $t$ flips. The probability of
$\frac{n+m}{2}$ heads and $\frac{n-m}{2}$ tails after $n$ flips
is
\[
\frac{n!}{a!\,b!\,2^{n}}, \qquad \text{where } a=\frac{n+m}{2},\; b=\frac{n-m}{2}.
\]
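The fraction $\frac{n!}{a!\,b!\,2^n}$ is just $\binom{n}{a}/2^n$. A quick sketch in Python cross-checking the formula against brute-force enumeration of all $2^n$ equally likely sequences (the values $n=10$, $m=4$ are arbitrary):

```python
from math import factorial
from itertools import product

def prob_heads(n, m):
    # Probability of a = (n+m)/2 heads and b = (n-m)/2 tails in n fair flips
    a, b = (n + m) // 2, (n - m) // 2
    return factorial(n) / (factorial(a) * factorial(b) * 2**n)

# Brute-force check: count sequences of n flips with exactly a heads
n, m = 10, 4
a = (n + m) // 2
count = sum(1 for seq in product((0, 1), repeat=n) if sum(seq) == a)
assert abs(prob_heads(n, m) - count / 2**n) < 1e-12
print(prob_heads(n, m))  # 0.1171875
```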
To have the same number of heads and tails after $t$ flips,
there have to be $\frac{t-n-m}{2} = 0.5t - a$ further heads and
$\frac{t-n+m}{2} = 0.5t - b$ further tails. If each flip comes up
heads with probability $p$, the probability of this is
\[
\frac{(t-n)!\;p^{\frac{t-n-m}{2}}\,(1-p)^{\frac{t-n+m}{2}}}{\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!}.
\]
Bayes' theorem states that the probability of a given b equals
the probability of b given a, times the probability of a,
divided by the probability of b. Let L(a, b) mean that after a
flips there have been b heads, and let L(a, b : d, f) mean that
after a flips there have been b heads and after d flips there
have been f heads. Then P(L(d, f) | L(a, b : d, f)) = 1.
If there have been $\frac{n+m}{2}$ heads and $\frac{n-m}{2}$
tails after $n$ flips, then the probability of the same number
of heads and tails after $t$ flips is
\[
P\Big(L\big(t,\tfrac{t}{2}\big)\,\Big|\,L\big(n,\tfrac{m+n}{2}\big)\Big)
= \frac{P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}\big)\Big)\,P\Big(L\big(t,\tfrac{t}{2}\big)\Big)}{P\Big(L\big(n,\tfrac{m+n}{2}\big)\Big)}.
\]
The term $P\big(L(n,\tfrac{n+m}{2})\,\big|\,L(t,\tfrac{t}{2})\big)$
can be determined by the following. If you have $\frac{t}{2}$
heads and $\frac{t}{2}$ tails there are
$\frac{t!}{(\frac{t}{2})!\,(\frac{t}{2})!}$ possible ways to
arrange them. The number of these arrangements in which the
first $n$ flips contain $\frac{n+m}{2}$ heads is (the number of
ways of choosing $\frac{n+m}{2}$ flips out of $n$) times (the
number of ways of choosing $\frac{t-n-m}{2}$ flips out of
$t-n$):
\[
\frac{n!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!}\cdot
\frac{(t-n)!}{\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!}
= \frac{n!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!}.
\]
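The counting argument above can be checked by enumeration for small values. A sketch in Python (the values $t=12$, $n=6$, $m=2$ are arbitrary, chosen so all the half-integer quantities are whole numbers):

```python
from itertools import combinations
from math import comb

t, n, m = 12, 6, 2
total_heads = t // 2          # t/2 heads overall
first_heads = (n + m) // 2    # (n+m)/2 heads among the first n flips

# Enumerate all placements of t/2 heads among t flips and count those
# whose first n flips contain exactly (n+m)/2 heads
count = sum(
    1
    for pos in combinations(range(t), total_heads)
    if sum(1 for q in pos if q < n) == first_heads
)

# Compare with (ways to choose heads in the first n) * (ways in the last t-n)
assert count == comb(n, first_heads) * comb(t - n, total_heads - first_heads)
print(count)  # 225
```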
This gives
\[
P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}\big)\Big)
= \frac{n!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!}
\bigg/ \frac{t!}{\big(\frac{t}{2}\big)!\,\big(\frac{t}{2}\big)!}
= \frac{\big(\big(\frac{t}{2}\big)!\big)^{2}\, n!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!\;t!}.
\]
Treating $p$ as uniformly distributed on $[0,1]$ and using
$\int_0^1 p^{x}(1-p)^{y}\,dp = \frac{x!\,y!}{(x+y+1)!}$,
\[
P\Big(L\big(t,\tfrac{t}{2}\big)\Big)
= \int_0^1 \frac{t!\;p^{t/2}(1-p)^{t/2}}{\big(\frac{t}{2}\big)!\,\big(\frac{t}{2}\big)!}\,dp
= \frac{t!}{(t+1)!}
= \frac{1}{t+1}.
\]
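The integral can be verified exactly without assuming the identity, by expanding $(1-p)^{t/2}$ binomially and integrating term by term. A sketch in Python using exact rationals ($t=10$ is arbitrary):

```python
from fractions import Fraction
from math import comb

def beta_integral(x, y):
    # Exact value of ∫_0^1 p^x (1-p)^y dp via binomial expansion of (1-p)^y
    return sum(Fraction(comb(y, j) * (-1) ** j, x + j + 1) for j in range(y + 1))

t = 10  # any even number of flips
p_equal = comb(t, t // 2) * beta_integral(t // 2, t // 2)
print(p_equal)  # 1/11, i.e. 1/(t+1)
```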
Similarly,
\[
P\Big(L\big(n,\tfrac{m+n}{2}\big)\Big)
= \int_0^1 \frac{n!\;p^{\frac{n+m}{2}}(1-p)^{\frac{n-m}{2}}}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!}\,dp
= \frac{n!}{(n+1)!}
= \frac{1}{n+1}.
\]
Bayes' theorem then gives
\[
P\Big(L\big(t,\tfrac{t}{2}\big)\,\Big|\,L\big(n,\tfrac{m+n}{2}\big)\Big)
= \frac{P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}\big)\Big)\,P\Big(L\big(t,\tfrac{t}{2}\big)\Big)}{P\Big(L\big(n,\tfrac{m+n}{2}\big)\Big)}.
\]
Substituting these in,
\[
P\Big(L\big(t,\tfrac{t}{2}\big)\,\Big|\,L\big(n,\tfrac{m+n}{2}\big)\Big)
= \frac{\big(\big(\frac{t}{2}\big)!\big)^{2}\, n!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!\;t!}\cdot\frac{n+1}{t+1}
= \frac{\big(\big(\frac{t}{2}\big)!\big)^{2}\,(n+1)!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!\,(t+1)!}.
\]
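As a cross-check, this closed form agrees with computing the posterior of $p$ directly and then the chance of reaching exactly $t/2$ heads. A sketch in Python with exact rationals (the values $t=20$, $n=10$, $m=4$ are arbitrary admissible choices):

```python
from fractions import Fraction
from math import comb, factorial

def beta_integral(x, y):
    # ∫_0^1 p^x (1-p)^y dp = x! y! / (x+y+1)!
    return Fraction(factorial(x) * factorial(y), factorial(x + y + 1))

def closed_form(t, n, m):
    # ((t/2)!)^2 (n+1)! (t-n)! / [((n+m)/2)!((n-m)/2)!((t-n-m)/2)!((t-n+m)/2)!(t+1)!]
    num = factorial(t // 2) ** 2 * factorial(n + 1) * factorial(t - n)
    den = (factorial((n + m) // 2) * factorial((n - m) // 2)
           * factorial((t - n - m) // 2) * factorial((t - n + m) // 2)
           * factorial(t + 1))
    return Fraction(num, den)

def direct(t, n, m):
    # Posterior of p after (n+m)/2 heads in n flips (uniform prior) has
    # density (n+1) C(n,h) p^h (1-p)^(n-h); integrate the chance of the
    # remaining t-n flips bringing the total to exactly t/2 heads.
    h, k = (n + m) // 2, t // 2
    return (n + 1) * comb(n, h) * comb(t - n, k - h) * beta_integral(k, t - k)

t, n, m = 20, 10, 4
assert closed_form(t, n, m) == direct(t, n, m)
print(closed_form(t, n, m))
```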
To see how this probability changes as $n$ increases by 2 (with
$m$ fixed), consider the difference
\[
P\Big(L\big(n+2,\tfrac{m+n+2}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}\big)\Big)\,\frac{P\big(L(t,\frac{t}{2})\big)}{P\big(L(n+2,\frac{m+n+2}{2})\big)}
- P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}\big)\Big)\,\frac{P\big(L(t,\frac{t}{2})\big)}{P\big(L(n,\frac{m+n}{2})\big)}
\]
\[
= \frac{\big(\big(\frac{t}{2}\big)!\big)^{2}\,(n+3)!\,(t-n-2)!}{\big(\frac{n+2+m}{2}\big)!\,\big(\frac{n+2-m}{2}\big)!\,\big(\frac{t-n-2-m}{2}\big)!\,\big(\frac{t-n-2+m}{2}\big)!\,(t+1)!}
- \frac{\big(\big(\frac{t}{2}\big)!\big)^{2}\,(n+1)!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!\,(t+1)!}
\]
\[
= -\frac{\big(\big(\frac{t}{2}\big)!\big)^{2}\,(t+2)\,(n+1)!\,\big(m^{2}(2n-t+3)+(n+2)(n-t)\big)\,(t-n-2)!}{4\,\big(\frac{n-m+2}{2}\big)!\,\big(\frac{n+m+2}{2}\big)!\,\big(\frac{t-m-n}{2}\big)!\,\big(\frac{t-n+m}{2}\big)!\,(t+1)!}.
\]
The probability is therefore increasing in $n$ while
$m^{2}(2n-t+3)+(n+2)(n-t)<0$ and decreasing once this bracket
becomes positive, which locates the maximizing value of $n$.
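In the last expression the factor $m^{2}$ is reconstructed from carrying out the subtraction of the two fractions. A sketch in Python verifying the identity exactly for a few admissible parameter choices (each chosen so every half-integer argument is a whole number):

```python
from fractions import Fraction
from math import factorial

def F(t, n, m):
    # P(L(t, t/2) | L(n, (m+n)/2)) from the closed form above
    num = factorial(t // 2) ** 2 * factorial(n + 1) * factorial(t - n)
    den = (factorial((n + m) // 2) * factorial((n - m) // 2)
           * factorial((t - n - m) // 2) * factorial((t - n + m) // 2)
           * factorial(t + 1))
    return Fraction(num, den)

def diff_closed_form(t, n, m):
    # -((t/2)!)^2 (t+2)(n+1)! (m^2(2n-t+3)+(n+2)(n-t)) (t-n-2)! /
    #  [4 ((n-m+2)/2)! ((n+m+2)/2)! ((t-m-n)/2)! ((t-n+m)/2)! (t+1)!]
    num = (factorial(t // 2) ** 2 * (t + 2) * factorial(n + 1)
           * (m * m * (2 * n - t + 3) + (n + 2) * (n - t))
           * factorial(t - n - 2))
    den = (4 * factorial((n - m + 2) // 2) * factorial((n + m + 2) // 2)
           * factorial((t - m - n) // 2) * factorial((t - n + m) // 2)
           * factorial(t + 1))
    return -Fraction(num, den)

for t, n, m in [(20, 8, 4), (40, 10, 6), (30, 12, 2)]:
    assert F(t, n + 2, m) - F(t, n, m) == diff_closed_form(t, n, m)
print("difference identity verified")
```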
More generally, for $\frac{t}{2}-a$ heads after $t$ flips, Bayes'
theorem gives
\[
P\Big(L\big(t,\tfrac{t}{2}-a\big)\,\Big|\,L\big(n,\tfrac{m+n}{2}\big)\Big)
= \frac{P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}-a\big)\Big)\,P\Big(L\big(t,\tfrac{t}{2}-a\big)\Big)}{P\Big(L\big(n,\tfrac{m+n}{2}\big)\Big)}.
\]
As before,
\[
P\Big(L\big(t,\tfrac{t}{2}-a\big)\Big)
= \int_0^1 \frac{t!\;p^{\frac{t}{2}-a}(1-p)^{\frac{t}{2}+a}}{\big(\frac{t}{2}-a\big)!\,\big(\frac{t}{2}+a\big)!}\,dp
= \frac{t!}{(t+1)!} = \frac{1}{t+1},
\qquad
P\Big(L\big(n,\tfrac{m+n}{2}\big)\Big) = \frac{1}{n+1}.
\]
The term $P\big(L(n,\tfrac{n+m}{2})\,\big|\,L(t,\tfrac{t}{2}-a)\big)$
can be found in the same way as before. If you have
$\frac{t}{2}-a$ heads and $\frac{t}{2}+a$ tails there are
$\frac{t!}{(\frac{t}{2}-a)!\,(\frac{t}{2}+a)!}$ possible ways to
arrange them. The number of ways to arrange the flips so that
the first $n$ flips contain $\frac{n+m}{2}$ heads is (the number
of ways of choosing $\frac{n+m}{2}$ flips out of $n$) times (the
number of ways of choosing $\frac{t-n-m}{2}-a$ flips out of
$t-n$):
\[
\frac{n!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!}\cdot
\frac{(t-n)!}{\big(\frac{t-n-m}{2}-a\big)!\,\big(\frac{t-n+m}{2}+a\big)!}
= \frac{n!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}-a\big)!\,\big(\frac{t-n+m}{2}+a\big)!}.
\]
This gives
\[
P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}-a\big)\Big)
= \frac{n!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}-a\big)!\,\big(\frac{t-n+m}{2}+a\big)!}
\bigg/ \frac{t!}{\big(\frac{t}{2}-a\big)!\,\big(\frac{t}{2}+a\big)!}.
\]
Hence
\[
P\Big(L\big(t,\tfrac{t}{2}-a\big)\,\Big|\,L\big(n,\tfrac{m+n}{2}\big)\Big)
= \frac{P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}-a\big)\Big)\,P\Big(L\big(t,\tfrac{t}{2}-a\big)\Big)}{P\Big(L\big(n,\tfrac{m+n}{2}\big)\Big)}.
\]
Summing over the possible final counts,
\[
P\Big(L\big(t,<\tfrac{t}{2}\big)\,\Big|\,L\big(n,\tfrac{m+n}{2}\big)\Big)
= \sum_{a=0}^{\lfloor 0.25(t-m-n)\rfloor} P\Big(L\big(t,\tfrac{t}{2}-2a\big)\,\Big|\,L\big(n,\tfrac{m+n}{2}\big)\Big)
= \sum_{a=0}^{\lfloor 0.25(t-m-n)\rfloor}
\frac{P\Big(L\big(n,\tfrac{m+n}{2}\big)\,\Big|\,L\big(t,\tfrac{t}{2}-2a\big)\Big)\,P\Big(L\big(t,\tfrac{t}{2}-2a\big)\Big)}{P\Big(L\big(n,\tfrac{m+n}{2}\big)\Big)}
\]
\[
= \sum_{a=0}^{\lfloor 0.25(t-m-n)\rfloor}
\frac{\big(\frac{t}{2}-2a\big)!\,\big(\frac{t}{2}+2a\big)!\,(n+1)!\,(t-n)!}{\big(\frac{n+m}{2}\big)!\,\big(\frac{n-m}{2}\big)!\,\big(\frac{t-n-m}{2}-2a\big)!\,\big(\frac{t-n+m}{2}+2a\big)!\,(t+1)!}.
\]
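The sum can be evaluated directly. A sketch in Python with exact rationals, mirroring the MATLAB listings in the Graphs section ($t=40$, $n=10$, $m=4$ are arbitrary admissible values):

```python
from fractions import Fraction
from math import factorial

def term(t, n, m, a):
    # P(L(t, t/2 - 2a) | L(n, (m+n)/2)), one term of the sum above
    num = (factorial(t // 2 - 2 * a) * factorial(t // 2 + 2 * a)
           * factorial(n + 1) * factorial(t - n))
    den = (factorial((n + m) // 2) * factorial((n - m) // 2)
           * factorial((t - n - m) // 2 - 2 * a)
           * factorial((t - n + m) // 2 + 2 * a) * factorial(t + 1))
    return Fraction(num, den)

def alternative_theorem_sum(t, n, m):
    upper = (t - m - n) // 4  # floor(0.25 (t - m - n))
    return sum(term(t, n, m, a) for a in range(upper + 1))

s = alternative_theorem_sum(40, 10, 4)
# The terms are probabilities of disjoint outcomes, so the sum lies in (0, 1)
assert 0 < s < 1
print(float(s))
```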
Chapter 4: Uses for this alternative theorem.
has two possible outcomes and has had one of them happen
Graphs.
t=40;
m=4;
clear f;
for I=m/2:(t-2)/2
    n=2*I;
    clear d;
    d=0;
    % sum over a, following the final formula of chapter 3;
    % upper limit is floor(0.25(t-m-n))
    for a=0:floor((t-n-m)/4)
        % numerator (t/2-2a)!(t/2+2a)!(n+1)!(t-n)! of each term
        b=factorial(t/2-2*a) * factorial(t/2+2*a) * factorial(n+1) * factorial(t-n);
        % denominator ((n+m)/2)!((n-m)/2)!((t-n-m)/2-2a)!((t-n+m)/2+2a)!(t+1)!
        c=factorial((n+m)/2) * factorial((n-m)/2);
        c=c * factorial((t-n-m)/2-2*a) * factorial((t-n+m)/2+2*a) * factorial(t+1);
        d(a+1)=b/c;
    end
    f(n/2-1)=sum(d);
end
P=linspace(4,38,18);
plot(P,f);
xlabel('n');
ylabel('probability');
t=40;
m=6;
clear g;
for I=m/2:(t-2)/2
    n=2*I;
    clear d;
    d=0;
    % sum over a, upper limit floor(0.25(t-m-n)) as in the formula
    for a=0:floor((t-n-m)/4)
        % numerator (t/2-2a)!(t/2+2a)!(n+1)!(t-n)! of each term
        b=factorial(t/2-2*a) * factorial(t/2+2*a) * factorial(n+1) * factorial(t-n);
        % denominator ((n+m)/2)!((n-m)/2)!((t-n-m)/2-2a)!((t-n+m)/2+2a)!(t+1)!
        c=factorial((n+m)/2) * factorial((n-m)/2);
        c=c * factorial((t-n-m)/2-2*a) * factorial((t-n+m)/2+2*a) * factorial(t+1);
        d(a+1)=b/c;
    end
    g(n/2-2)=sum(d);
end
P=linspace(6,38,17);
plot(P,g);
xlabel('n');
ylabel('probability');
t=40;
m=8;
clear h;
for I=m/2:(t-2)/2
    n=2*I;
    clear d;
    d=0;
    % sum over a, upper limit floor(0.25(t-m-n)) as in the formula
    for a=0:floor((t-n-m)/4)
        % numerator (t/2-2a)!(t/2+2a)!(n+1)!(t-n)! of each term
        b=factorial(t/2-2*a) * factorial(t/2+2*a) * factorial(n+1) * factorial(t-n);
        % denominator ((n+m)/2)!((n-m)/2)!((t-n-m)/2-2a)!((t-n+m)/2+2a)!(t+1)!
        c=factorial((n+m)/2) * factorial((n-m)/2);
        c=c * factorial((t-n-m)/2-2*a) * factorial((t-n+m)/2+2*a) * factorial(t+1);
        d(a+1)=b/c;
    end
    h(n/2-3)=sum(d);
end
P=linspace(8,38,16);
plot(P,h);
xlabel('n');
ylabel('probability');
As the graphs show, increasing the value of m decreases the
probability.
Conclusion.
and medicine.
References.
[1] ... 2019.
[2] http://www.eafit.edu.co/programas-academicos/pregrados/ingenieria-matematica/practicas-investigativas/Documents/sports-betting-odds.pdf
[3] ... accessed on 29 August, 2019.
[4] https://www.investopedia.com/articles/financial-theory/09/bayesian-methods-financial-modeling.asp