Bayesian Methods
Outline

1. Probability
   - Conditional Probability
   - Example: Special coin
2. Bayesian statistics
   - Posterior distribution
   - Coin example in Bayesian framework
3. Activity
Probability
Conditional Probability

Given events A and B in some sample space S,

P(A \mid B) = \frac{P(A \text{ and } B)}{P(B)}

Bayes' Rule:

P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
Am I fair?

To investigate, we toss the coin! How many times? When are we convinced whether the coin is fair or not?
Coin example

You have 4 visually identical coins in your pocket: 3 are standard quarters and the 4th is a special coin. The special coin appears identical to the 3 quarters, but has a 70% chance of landing heads up.

You reach into your pocket, randomly select a coin, and toss it. Suppose it lands heads up. What is the probability that the coin is the special coin?
Let S denote "the coin is special" and H "the toss lands heads". Then

P(S \mid H) = \frac{P(H \mid S)\,P(S)}{P(H)} = \frac{(0.70)(0.25)}{P(H)}

where, by the law of total probability,

P(H) = P(H \mid S)\,P(S) + P(H \mid S^c)\,P(S^c) = (0.70)(0.25) + (0.50)(0.75) = 0.55

so that

P(S \mid H) = \frac{(0.70)(0.25)}{0.55} = 0.318

In general, expanding the denominator with the law of total probability gives

P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B \mid A)\,P(A) + P(B \mid A^c)\,P(A^c)}
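The coin calculation can be checked with a few lines of Python; a minimal sketch, with variable names of my own choosing:

```python
# Checking the coin example with Bayes' rule.
p_special = 0.25        # prior: 1 of the 4 coins is special
p_h_special = 0.70      # special coin: P(heads) = 0.70
p_h_quarter = 0.50      # standard quarter: P(heads) = 0.50

# Law of total probability for the denominator
p_heads = p_h_special * p_special + p_h_quarter * (1 - p_special)

# Bayes' rule: P(S | H) = P(H | S) P(S) / P(H)
posterior_special = p_h_special * p_special / p_heads

print(round(p_heads, 2))            # 0.55
print(round(posterior_special, 3))  # 0.318
```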
Bayesian Statistics
Posterior distribution

Bayes' Rule:

P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B \mid A)\,P(A) + P(B \mid A^c)\,P(A^c)}

Posterior distribution:

\pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{\int f(x \mid \theta)\,\pi(\theta)\,d\theta}
Posterior after observing HHH, with the updated probabilities 0.682 (quarter) and 0.318 (special) as the prior:

P(Q \mid HHH) = \frac{(0.5)^3\,(0.682)}{(0.5)^3\,(0.682) + (0.7)^3\,(0.318)} = 0.439

P(S \mid HHH) = \frac{(0.7)^3\,(0.318)}{(0.7)^3\,(0.318) + (0.5)^3\,(0.682)} = 0.561
In the first round we had a 1/4 chance of having the special coin.

Then we got more information through our first experiment (H), and the chance of having the special coin increased to 31.8%.

After obtaining more data (HHH) we again updated our knowledge, and the chance that we have the special coin is 56.1%.

Notice how we used the result of the first step, i.e. the posterior probability, as our prior in the next step. This is an example of a sequential Bayesian analysis: today's posterior distribution is tomorrow's prior.
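The sequential update can be sketched in Python; a minimal sketch, with function and variable names of my own choosing:

```python
# Sequential Bayesian updating for the special-coin example.

def update(prior_special, tosses):
    """Posterior P(special) after observing a toss string like 'HHH'."""
    like_s = like_q = 1.0
    for t in tosses:
        like_s *= 0.70 if t == "H" else 0.30  # special coin: P(H) = 0.70
        like_q *= 0.50                        # standard quarter: P(H) = 0.50
    num = like_s * prior_special
    return num / (num + like_q * (1 - prior_special))

p1 = update(0.25, "H")   # first toss: ~0.318
p2 = update(p1, "HHH")   # yesterday's posterior is today's prior: ~0.561
print(p1, p2)
```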
Normal Distribution

Normal density N(\mu, \sigma^2):

f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)

[Figure: normal density curves for several values of \mu and \sigma]
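The density formula can be transcribed directly; a minimal sketch using only the standard library (the check value 1/\sqrt{2\pi} \approx 0.3989 is standard):

```python
import math

def normal_pdf(x, mu, sigma2):
    """Density of N(mu, sigma2) evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# The standard normal density at 0 is 1/sqrt(2*pi)
print(round(normal_pdf(0.0, 0.0, 1.0), 4))  # 0.3989
```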
Example: Jeremy's IQ

Jeremy, an enthusiastic Georgia Tech student, poses a statistical model for his scores on standard IQ tests. He thinks that, in general, his scores (Y) are normally distributed with unknown mean \theta and variance 80. Prior (and expert) opinion is that the IQ of Georgia Tech students is a normal random variable with mean 110 and variance 120. Jeremy took two IQ tests and scored 98 in the first and 104 in the second. What is the Bayesian estimate of Jeremy's IQ?

The frequentist estimator of \theta would be \bar{y} = 101.

To find the Bayesian estimate we want to find the mean of the posterior distribution of Jeremy's IQ.

Jenn & Yifang (SAMSI & NCSU)
Deriving the posterior for the normal model:

p(\theta \mid X) = \frac{f(\bar{Y} \mid \theta)\,\pi(\theta)}{\int f(\bar{Y} \mid \theta)\,\pi(\theta)\,d\theta} \propto \exp\left( -\frac{(\bar{Y} - \theta)^2}{2\,\sigma^2/n} \right) \exp\left( -\frac{(\theta - \mu_0)^2}{2\,\tau_0^2} \right)

Completing the square in \theta, define

\frac{1}{\tau_*^2} = \frac{1}{\sigma^2/n} + \frac{1}{\tau_0^2}, \qquad \text{so that} \qquad \tau_*^2 = \frac{\tau_0^2\,\sigma^2/n}{\tau_0^2 + \sigma^2/n}

Then we have

p(\theta \mid X) \propto \exp\left( -\frac{1}{2\tau_*^2} \left( \theta - \tau_*^2 \left( \frac{\bar{Y}}{\sigma^2/n} + \frac{\mu_0}{\tau_0^2} \right) \right)^2 \right)

and we see that

\tau_*^2 \left( \frac{\bar{Y}}{\sigma^2/n} + \frac{\mu_0}{\tau_0^2} \right) = \frac{\tau_0^2\,\sigma^2/n}{\tau_0^2 + \sigma^2/n} \left( \frac{\bar{Y}}{\sigma^2/n} + \frac{\mu_0}{\tau_0^2} \right) = \frac{\tau_0^2}{\tau_0^2 + \sigma^2/n}\,\bar{Y} + \frac{\sigma^2/n}{\tau_0^2 + \sigma^2/n}\,\mu_0
For

Likelihood: \bar{Y} \mid \theta \sim N(\theta, \sigma^2/n), where \bar{Y} is the average of Y_1, \ldots, Y_n, and

Prior: \theta \sim N(\mu_0, \tau_0^2)

we have

Posterior: \theta \mid X \sim N(\mu_*, \tau_*^2)

where

\mu_* = \frac{\tau_0^2}{\tau_0^2 + \sigma^2/n}\,\bar{Y} + \frac{\sigma^2/n}{\tau_0^2 + \sigma^2/n}\,\mu_0, \qquad \tau_*^2 = \frac{\tau_0^2\,\sigma^2/n}{\tau_0^2 + \sigma^2/n}
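The conjugate formulas above can be verified against a brute-force grid approximation of the posterior; a minimal sketch with names and test numbers of my own choosing:

```python
import math

def conjugate_posterior(ybar, s2n, mu0, tau0_sq):
    """Posterior mean and variance for ybar ~ N(theta, s2n), theta ~ N(mu0, tau0_sq)."""
    denom = tau0_sq + s2n
    return (tau0_sq * ybar + s2n * mu0) / denom, tau0_sq * s2n / denom

# Grid approximation: posterior is proportional to likelihood * prior
ybar, s2n, mu0, tau0_sq = 3.0, 2.0, 0.0, 4.0
thetas = [-20 + 0.001 * i for i in range(40001)]
w = [math.exp(-(ybar - t) ** 2 / (2 * s2n)) * math.exp(-(t - mu0) ** 2 / (2 * tau0_sq))
     for t in thetas]
Z = sum(w)
grid_mean = sum(t * wi for t, wi in zip(thetas, w)) / Z
grid_var = sum((t - grid_mean) ** 2 * wi for t, wi in zip(thetas, w)) / Z

m, v = conjugate_posterior(ybar, s2n, mu0, tau0_sq)
print(m, grid_mean)  # both ≈ 2.0
print(v, grid_var)   # both ≈ 1.333
```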
Recall:

Likelihood: \bar{Y} \mid \theta \sim N(\theta, \sigma^2/n = 80/2)

Prior: \theta \sim N(\mu_0 = 110, \tau_0^2 = 120)

The posterior distribution of Jeremy's IQ is normal with mean and variance

\mu_* = \frac{120}{120 + 80/2}\,(101) + \frac{80/2}{120 + 80/2}\,(110) = 103.25

\tau_*^2 = \frac{120 \times 80/2}{120 + 80/2} = 30
A 95% credible interval for \theta:

\mu_* \pm 1.96\,\tau_* = 103.25 \pm 1.96\sqrt{30} = (92.5, 114.0)

Interpretation: there is a 95% chance that Jeremy's IQ (\theta) is between 92.5 and 114.

That is NOT the interpretation of the frequentist confidence interval!
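Jeremy's posterior and credible interval can be reproduced numerically; a minimal sketch, with variable names of my own choosing:

```python
import math

sigma2_over_n = 80 / 2   # likelihood variance of the mean of the 2 tests
mu0, tau0_sq = 110, 120  # prior mean and variance
ybar = (98 + 104) / 2    # observed average score, 101

denom = tau0_sq + sigma2_over_n
post_mean = (tau0_sq * ybar + sigma2_over_n * mu0) / denom
post_var = tau0_sq * sigma2_over_n / denom

half = 1.96 * math.sqrt(post_var)  # half-width of the 95% interval
print(post_mean, post_var)                                     # 103.25 30.0
print(round(post_mean - half, 1), round(post_mean + half, 1))  # 92.5 114.0
```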
[Figure: prior, likelihood, and posterior densities]
The posterior variance can alternatively be written as

\tau_*^2 = \frac{\tau_0^2\,\sigma^2/n}{\tau_0^2 + \sigma^2/n} = \left(1 - \frac{\tau_0^2}{\tau_0^2 + \sigma^2/n}\right)\tau_0^2

and the posterior mean as

\mu_* = \frac{\tau_0^2}{\tau_0^2 + \sigma^2/n}\,\bar{Y} + \frac{\sigma^2/n}{\tau_0^2 + \sigma^2/n}\,\mu_0 = \mu_0 + \frac{\tau_0^2}{\tau_0^2 + \sigma^2/n}\left(\bar{Y} - \mu_0\right)

These alternative forms are what you will see in Nate's lecture on the Kalman Filter, but for the multivariate case.
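The equivalence of the two forms can be checked with Jeremy's numbers; a minimal sketch, with names of my own choosing (the scalar `gain` plays the role of the Kalman gain):

```python
# Posterior mean as prior mean plus a gain times the residual.
ybar, s2n, mu0, tau0_sq = 101.0, 40.0, 110.0, 120.0

gain = tau0_sq / (tau0_sq + s2n)          # shrinkage factor K
mean_direct = (tau0_sq * ybar + s2n * mu0) / (tau0_sq + s2n)
mean_update = mu0 + gain * (ybar - mu0)   # mu_0 + K (ybar - mu_0)

var_direct = tau0_sq * s2n / (tau0_sq + s2n)
var_update = (1 - gain) * tau0_sq         # (1 - K) tau_0^2

print(mean_direct, mean_update)  # both 103.25
print(var_direct, var_update)    # both 30.0
```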
Recall the posterior distribution:

\pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{\int f(x \mid \theta)\,\pi(\theta)\,d\theta}
Activity
What is the average height of NBA players?
Hint for determining the prior: for a normal distribution, about 99.7% of the probability lies within \mu \pm 3\sigma of the mean.
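The 99.7% rule turns a guessed range into a normal prior; a minimal sketch, where the NBA height range is my own illustrative guess, not data from the slides:

```python
# Prior elicitation via the 99.7% (three-sigma) rule.
low, high = 1.85, 2.15  # guessed range (meters) believed to hold ~99.7% of the mass

mu0 = (low + high) / 2        # center the prior on the middle of the range
sigma0 = (high - low) / 6     # the range spans mu +/- 3 sigma
print(mu0, round(sigma0, 3))  # 2.0 0.05
```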
The end