CHAPTER 3. CONTINUOUS RANDOM VARIABLES 83
Theorem 3.7
Suppose that X ~ N(μ, σ²). If Y = aX + b, where a and b are constants with a ≠ 0, then
Y ~ N(aμ + b, a²σ²).
Proof. Suppose that X ~ N(μ, σ²) and, for definiteness, that a > 0. We find the pdf of Y = aX + b by first finding the distribution function of Y. The same technique will be used extensively in a later section to find distributions of functions of random variables. Let G and g be the distribution function and pdf of Y respectively, and let F and f be those of X. Note that, like X, the range of values of Y is the entire real line R. Then, for real y,

    G(y) = P(Y ≤ y) = P(aX + b ≤ y) = P(X ≤ (y − b)/a) = F((y − b)/a).

Differentiating with respect to y,

    g(y) = (1/a) f((y − b)/a)
         = (1/(aσ√(2π))) exp(−((y − b)/a − μ)²/(2σ²))
         = (1/(aσ√(2π))) exp(−(y − (aμ + b))²/(2a²σ²)),

which is the pdf of the N(aμ + b, a²σ²) distribution. The case a < 0 is handled similarly.

Theorem 3.8 The Normal Approximation to the Binomial
If X ~ B(n, p), then for large n the distribution of X is approximately N(np, np(1 − p)). The approximation is reasonable when n > 30 and is quite good when np(1 − p) > 10.
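The change-of-variable step in the proof of Theorem 3.7 can be checked numerically. The sketch below (Python, standard library only; the values of μ, σ, a, b and the test points are arbitrary choices) compares G(y) = F((y − b)/a) with the N(aμ + b, a²σ²) cdf directly, using the identity Φ(z) = ½(1 + erf(z/√2)).

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2), expressed through the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# X ~ N(mu, sigma^2); Y = aX + b.  Theorem 3.7 says Y ~ N(a*mu + b, a^2 sigma^2).
mu, sigma = 3.0, 2.0
a, b = 1.5, -4.0

# For a > 0, G(y) = P(Y <= y) = F((y - b)/a); this should agree with the
# N(a*mu + b, (a*sigma)^2) cdf evaluated at y.
for y in [-10.0, -2.0, 0.0, 0.5, 7.0]:
    via_F = norm_cdf((y - b) / a, mu, sigma)
    direct = norm_cdf(y, a * mu + b, abs(a) * sigma)
    assert abs(via_F - direct) < 1e-12
print("Theorem 3.7 check passed")
```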
The Continuity Correction for the Normal Approximation to the Binomial
It should be noted that whereas the Binomial distribution is discrete, the normal is continuous. Thus, although it makes sense to talk about P(X = 10) if X ~ B(20, 0.5), this probability would be 0 if X were normal. If k is an integer (0 ≤ k ≤ n), we therefore approximate P(X = k) by the normal probability of the interval (k − 0.5, k + 0.5); this adjustment is called the continuity correction. Likewise, P(X ≥ k) is approximated by the normal probability of the interval (k − 0.5, ∞).

Example
Suppose that X ~ B(300, 0.7), so that np = 210 and np(1 − p) = 63. Then

    P(X > 230) = P(X ≥ 231) = P(X ≥ 230.5),

since we are approximating the discrete Binomial distribution by the continuous normal. Thus

    P(X > 230) ≈ P(Z ≥ (230.5 − 210)/√63)
              = P(Z ≥ 2.58)
              = 1 − P(Z < 2.58)
              = 1 − 0.9951 = 0.0049.
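The worked computation can be reproduced, and compared against the exact Binomial tail, with a short script (Python, standard library only; `phi` is a small helper for the standard normal cdf, not a library routine):

```python
import math

def phi(z):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p = 300, 0.7                       # X ~ B(300, 0.7)
mu = n * p                            # 210
sigma = math.sqrt(n * p * (1 - p))    # sqrt(63)

# P(X > 230) = P(X >= 231) ~ P(X >= 230.5) with the continuity correction.
z = (230.5 - mu) / sigma
approx = 1.0 - phi(z)

# Exact Binomial tail probability, for comparison.
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(231, n + 1))

print(round(z, 2), round(approx, 4))  # 2.58 0.0049
print(round(exact, 4))
```

The continuity-corrected normal value agrees with the exact tail probability to within about a thousandth here.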
3.8 The Exponential Distribution
Definition 3.6
A random variable X is said to have an exponential distribution with parameter λ, written X ~ exp(λ), if its pdf f is given by

    f(x) = (1/λ) e^(−x/λ)   for x ≥ 0
    f(x) = 0                otherwise.
Theorem 3.9
If X ~ exp(λ) then E(X) = λ and Var(X) = λ².
Proof: We prove only that E(X) = λ. By definition,

    E(X) = ∫₀^∞ x f(x) dx = ∫₀^∞ (x/λ) e^(−x/λ) dx.

Integrating by parts, with u = x and dv = (1/λ) e^(−x/λ) dx (so that v = −e^(−x/λ)):

    E(X) = [−x e^(−x/λ)]₀^∞ + ∫₀^∞ e^(−x/λ) dx.

Now

    lim_{x→∞} x e^(−x/λ) = lim_{x→∞} x / e^(x/λ) = 0   (use L'Hospital's rule),

so the first term vanishes. Moreover,

    ∫₀^∞ e^(−x/λ) dx = λ ∫₀^∞ (1/λ) e^(−x/λ) dx = λ,

since the integrand in the last integral is the exp(λ) pdf, which integrates to 1. Hence E(X) = λ.
Problem: Verify that the exp(λ) pdf really is a density function.
Problem: Show that Var(X) = λ².
Problem: Find a formula for the kth moment, μₖ = E(X^k), of X if X ~ exp(λ).
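The claims of Theorem 3.9 can be spot-checked numerically. The sketch below (Python, standard library only; λ = 2 is an arbitrary choice, and `integrate` is a simple trapezoidal-rule helper) evaluates the total mass, the mean and the variance of the exp(λ) density:

```python
import math

lam = 2.0  # the parameter lambda of exp(lambda); expect E(X) = lam, Var(X) = lam^2

def pdf(x):
    """pdf of exp(lambda): (1/lambda) e^(-x/lambda) for x >= 0."""
    return math.exp(-x / lam) / lam

def integrate(g, a, b, n=200_000):
    """Composite trapezoidal rule on [a, b]."""
    h = (b - a) / n
    return (0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))) * h

upper = 60.0 * lam  # e^(-60) is negligible, so truncating the integral is safe
total = integrate(pdf, 0.0, upper)                        # should be ~1
mean = integrate(lambda x: x * pdf(x), 0.0, upper)        # should be ~lam
second = integrate(lambda x: x * x * pdf(x), 0.0, upper)
var = second - mean**2                                    # should be ~lam^2

print(round(total, 4), round(mean, 4), round(var, 4))
```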
3.9 Residual life at age x₀ and the memoryless property of the exponential
Suppose that two men aged 20 and 50 take out life insurance policies with the same company. Both policies mature at age 60. One of the factors that the company will assess (as part of the overall risk assessment) is the probability that each man will live to 60. These probabilities are clearly not the same. That for the 20-year-old is Prob(living to age 60 | man has already lived 20 years). The other is Prob(living to age 60 | man has lived 50 years). In general, if x₀ is the age of the person taking out an insurance policy, the insurance company will be interested in the conditional probability that the person lives an additional number of years x beyond age x₀, i.e. to age x₀ + x and beyond. These probabilities will determine the premiums paid by the person. If X is the life-length of a randomly selected person, we are interested in the conditional probability Pr(x₀ < X ≤ x₀ + x | X > x₀).
Definition 3.7 Truncated Distributions
Let X be a r.v. with pdf f and df F, and let x₀ be a particular value of X. Define the function F₀ as follows:

    F₀(x) = Pr(x₀ < X ≤ x₀ + x | X > x₀)   for x ≥ 0
    F₀(x) = 0                              otherwise.

Then F₀ is a distribution function. It is called the distribution of X truncated at x₀.
To prove that F₀ is a valid distribution function we show that

(i) 0 ≤ F₀(x) ≤ 1 for all x ∈ R;

(ii) F₀(x) is a non-decreasing function;

(iii) lim_{x→∞} F₀(x) = 1 and lim_{x→−∞} F₀(x) = 0.

(i) This is clear since F₀(x) is a conditional probability.

(ii) To prove this we must show that x₂ > x₁ ⟹ F₀(x₂) ≥ F₀(x₁). But if x₂ > x₁ then

    F₀(x₂) − F₀(x₁) = P(x₀ < X ≤ x₀ + x₂ | X > x₀) − P(x₀ < X ≤ x₀ + x₁ | X > x₀)
                    = P(x₀ + x₁ < X ≤ x₀ + x₂ | X > x₀)
                    ≥ 0.

The result follows.

(iii) follows from the definition of F₀(x). Hence F₀(x) is a distribution function.
When X denotes the life-length of a person, the extra amount x that the person lives beyond some prescribed age x₀ is called the residual life at age x₀, or the future lifetime at age x₀. Thus F₀ is the distribution function of the residual life at age x₀.
Theorem 3.10 Memoryless Property of the Exponential
Let X ~ exp(λ). Then the distribution of X truncated at the point x₀ is the same as the distribution of X.

Proof. We first determine the distribution function F of the exponential: for x ≥ 0,

    F(x) = P(X ≤ x) = 1 − e^(−x/λ).

Hence the survival function is S(x₀) = P(X > x₀) = e^(−x₀/λ). Note that the interval B = (x₀ < X ≤ x₀ + x) is a subset of the interval A = (X > x₀), so that B ∩ A = B. Then

    F₀(x) = P(B | A)
          = P(x₀ < X ≤ x₀ + x) / P(X > x₀)
          = (F(x₀ + x) − F(x₀)) / S(x₀)
          = ((1 − e^(−(x₀+x)/λ)) − (1 − e^(−x₀/λ))) / e^(−x₀/λ)
          = (e^(−x₀/λ) − e^(−(x₀+x)/λ)) / e^(−x₀/λ)
          = 1 − e^(−x/λ)
          = F(x).
This is called the memoryless property of the exponential distribution. By this theorem, if the life-distribution T of a light bulb is exponential then, no matter how old it is, its life-expectancy is the same as it was when it was just manufactured! If at the time of manufacture we expect it to live 1000 days, then after 900 days of use we should still expect it to 'live' 1000 more days. The light-bulb does not age!
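The light-bulb remark is easy to verify numerically: for the exponential survival function S(x) = e^(−x/λ), the conditional probability P(X > x₀ + t | X > x₀) = S(x₀ + t)/S(x₀) equals the unconditional S(t). A minimal check (Python, standard library; the 1000-day mean matches the example):

```python
import math

lam = 1000.0   # expected lifetime in days, as in the light-bulb example

def survival(x):
    """S(x) = P(X > x) for X ~ exp(lam)."""
    return math.exp(-x / lam)

# P(X > 900 + t | X > 900) should equal the unconditional P(X > t):
for t in [100.0, 500.0, 1000.0, 2500.0]:
    conditional = survival(900.0 + t) / survival(900.0)
    assert abs(conditional - survival(t)) < 1e-12
print("memoryless property verified numerically")
```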
3.10 The Poisson Process
A point process is a sequence of point events which occur randomly in an interval. The Poisson process is the simplest model that is widely used to describe or model these processes. Roughly speaking, a Poisson process is one in which the 'density' of events (i.e. the rate at which events occur) is uniform in an interval and the process is 'memoryless': the probability of an event is unaffected by what happened before. A more precise characterization is as follows:
A point process is said to be a Poisson process if:

(i) The probability that exactly one event occurs in a given small interval of length h is proportional to the length of the interval.

(ii) The probability that 2 or more events occur in a small interval is negligible.

(iii) Non-overlapping or disjoint intervals behave independently, i.e. let I₁, I₂, ..., Iₙ be a collection of disjoint intervals and let Eᵢ be the event that kᵢ point events occur in interval Iᵢ. Then the events E₁, E₂, ..., Eₙ are independent.

If the constant of proportionality in (i) is λ then the Poisson process is said to have rate λ.
Theorem 3.11
Let X_t denote the number of events occurring in an interval of length t in a Poisson process with rate λ. Then

    X_t ~ Poisson(λt).
Theorem 3.12
If a Poisson process has rate λ then the mean number of events occurring in a unit interval is λ.
Theorem 3.13 Time Between Events in a Poisson Process
Let T be the time to first occurrence of an event in a Poisson process with rate λ. Then

    T ~ exp(1/λ).

Proof: Let the density function and df of T be f and F respectively. Then for any given t > 0,

    Pr(T > t) = Pr(no events occur in the interval (0, t])
              = e^(−λt) (λt)⁰ / 0!     (by Theorem 3.11)
              = e^(−λt).

Hence

    F(t) = Pr(T ≤ t) = 1 − e^(−λt).

Therefore

    f(t) = F′(t) = λ e^(−λt).

Thus T ~ exp(1/λ).
Theorem 3.14 In a Poisson process with rate λ the mean time between successive occurrences of events is 1/λ.
Example 3.7 The occurrence of earthquakes in California can be modelled by a Poisson process in which these events occur at a rate of 2 per week. What is the probability that at least 3 earthquakes occur during the next 2 weeks?

Let X be the number of earthquakes that occur over the next two weeks. Then by Theorem 3.11,

    X ~ Poisson(2 × 2) = Poisson(4).

Hence

    P(X ≥ 3) = 1 − P(X = 0, 1 or 2) = 1 − e⁻⁴(1 + 4 + 4²/2!) = 1 − 13e⁻⁴ ≈ 0.7619.
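The arithmetic in Example 3.7 can be confirmed with a few lines (Python, standard library only; `poisson_pmf` is a small helper, not a library routine):

```python
import math

rate, weeks = 2.0, 2.0
mu = rate * weeks   # X ~ Poisson(4) by Theorem 3.11

def poisson_pmf(k, mu):
    """P(X = k) for X ~ Poisson(mu)."""
    return math.exp(-mu) * mu**k / math.factorial(k)

# P(X >= 3) = 1 - P(X = 0, 1 or 2)
p_at_least_3 = 1.0 - sum(poisson_pmf(k, mu) for k in range(3))
print(round(p_at_least_3, 4))   # 0.7619
```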
3.11 The Gamma Distribution
The Gamma Function:
The factorial function is defined for all non-negative integers. The Gamma function is a generalization of the factorial function to the set of all positive real numbers.

Definition 3.8 The Gamma Function

    Γ(α) = ∫₀^∞ x^(α−1) e^(−x) dx   for α > 0.
Theorem 3.15
If α > 1 then Γ(α) = (α − 1) Γ(α − 1).

Proof: We use integration by parts with u = x^(α−1) and dv = e^(−x) dx, so that du = (α − 1) x^(α−2) dx and v = −e^(−x):

    Γ(α) = ∫₀^∞ x^(α−1) e^(−x) dx = [−x^(α−1) e^(−x)]₀^∞ + (α − 1) ∫₀^∞ x^(α−2) e^(−x) dx.

Now

    [−x^(α−1) e^(−x)]₀^∞ = 0

and

    ∫₀^∞ x^(α−2) e^(−x) dx = Γ(α − 1).

Hence Γ(α) = (α − 1) Γ(α − 1).
Theorem 3.16 Γ(1) = 1.

Proof: By definition,

    Γ(1) = ∫₀^∞ e^(−x) dx = [−e^(−x)]₀^∞ = 1.
Theorem 3.17
If n is a positive integer then Γ(n) = (n − 1)!.

Theorem 3.18

    Γ(1/2) = √π.

Proof: Γ(1/2) = ∫₀^∞ x^(−1/2) e^(−x) dx. Substitute x = y². Try this as a problem.
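Theorems 3.15-3.18 can be spot-checked numerically with the standard library's `math.gamma` (the test values of a and n below are arbitrary):

```python
import math

# Recurrence (Theorem 3.15): Gamma(a) = (a - 1) * Gamma(a - 1) for a > 1.
for a in [1.5, 2.0, 3.7, 10.0]:
    assert abs(math.gamma(a) - (a - 1) * math.gamma(a - 1)) <= 1e-12 * math.gamma(a)

# Gamma(n) = (n - 1)! for positive integers n (Theorems 3.16 and 3.17).
for n in range(1, 10):
    assert abs(math.gamma(n) - math.factorial(n - 1)) <= 1e-9 * math.factorial(n - 1)

# Gamma(1/2) = sqrt(pi) (Theorem 3.18).
assert abs(math.gamma(0.5) - math.sqrt(math.pi)) < 1e-12
print("Gamma function identities hold")
```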
The Gamma Distribution
Definition 3.9 The continuous r.v. X is said to have a Gamma distribution with parameters α and β, written X ~ Gamma(α, β) (α > 0, β > 0), if the pdf f of X is

    f(x; α, β) = gamma(x; α, β) = (1 / (β^α Γ(α))) x^(α−1) e^(−x/β)   for x > 0
    f(x; α, β) = 0                                                    otherwise.
Theorem 3.19
If X ~ Gamma(α, β) then E(X) = αβ and Var(X) = αβ².

Proof: We prove the theorem for E(X):

    E(X) = ∫₀^∞ x · (1 / (β^α Γ(α))) x^(α−1) e^(−x/β) dx
         = ∫₀^∞ (1 / (β^α Γ(α))) x^α e^(−x/β) dx

(note that the integrand closely resembles the pdf of a Gamma(α + 1, β)). Hence

    E(X) = (β^(α+1) Γ(α + 1) / (β^α Γ(α))) ∫₀^∞ (1 / (β^(α+1) Γ(α + 1))) x^α e^(−x/β) dx.

But the integrand in the last integral is gamma(x; α + 1, β), hence the integral is 1. Therefore

    E(X) = β^(α+1) Γ(α + 1) / (β^α Γ(α)) = β Γ(α + 1)/Γ(α) = β α Γ(α)/Γ(α) = αβ.
Problem: For any k ≥ 1, determine E(X^k), the kth moment of the Gamma(α, β) random variable.
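The conclusion of Theorem 3.19 can be checked by numerical integration of the Gamma pdf. A sketch (Python, standard library only; α = 2.5 and β = 3 are arbitrary choices, and `integrate` is a simple trapezoidal-rule helper):

```python
import math

alpha, beta = 2.5, 3.0   # illustrative parameter values

def gamma_pdf(x):
    """pdf of Gamma(alpha, beta): x^(alpha-1) e^(-x/beta) / (beta^alpha Gamma(alpha))."""
    return x**(alpha - 1) * math.exp(-x / beta) / (beta**alpha * math.gamma(alpha))

def integrate(g, a, b, n=200_000):
    """Composite trapezoidal rule on [a, b]."""
    h = (b - a) / n
    return ((g(a) + g(b)) / 2 + sum(g(a + i * h) for i in range(1, n))) * h

upper = 200.0   # the Gamma(2.5, 3) tail beyond 200 is negligible
mean = integrate(lambda x: x * gamma_pdf(x), 0.0, upper)
second = integrate(lambda x: x * x * gamma_pdf(x), 0.0, upper)

# Expect E(X) = alpha*beta = 7.5 and Var(X) = alpha*beta^2 = 22.5.
print(round(mean, 3), round(second - mean**2, 3))
```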
There are two special cases of the Gamma distribution. The first is the exponential distribution: it is easy to see that the exponential distribution with parameter θ is the same as a Gamma(1, θ) distribution. The second is very important in Statistics: it is the Chi-squared distribution.
Definition 3.10 The Chi-Squared Distribution
The random variable X is said to have a Chi-squared distribution with n degrees of freedom, written X ~ χ²(n), if X ~ Gamma(n/2, 2). Thus if X ~ χ²(n) then it has pdf f where

    f(x) = (1 / (2^(n/2) Γ(n/2))) x^(n/2 − 1) e^(−x/2)   for x > 0
    f(x) = 0                                             otherwise.
The Exponential and Gamma distributions are often used to model various random phenomena. The Chi-squared distribution is of great importance in Statistics.

Theorem 3.20
If X ~ χ²(n) then E(X) = n and Var(X) = 2n.
Definition 3.11 The Beta Distribution
A continuous random variable X is said to have a Beta distribution with parameters α and β (α > 0 and β > 0) if X has pdf f, where

    f(x) = (Γ(α + β) / (Γ(α) Γ(β))) x^(α−1) (1 − x)^(β−1)   for 0 < x < 1
    f(x) = 0                                                otherwise.
Theorem 3.21
If X has a Beta distribution with parameters α and β then

    E(X) = α / (α + β)

and

    Var(X) = αβ / ((α + β)² (α + β + 1)).

Proof: See Exercises.
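The formulas of Theorem 3.21 can be checked numerically in the same spirit as before (Python, standard library only; α = 2 and β = 5 are arbitrary choices, and `integrate` is a simple trapezoidal-rule helper):

```python
import math

a, b = 2.0, 5.0   # illustrative Beta parameters

def beta_pdf(x):
    """Beta(a, b) pdf: Gamma(a+b)/(Gamma(a)Gamma(b)) x^(a-1) (1-x)^(b-1) on (0, 1)."""
    c = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return c * x**(a - 1) * (1 - x)**(b - 1)

def integrate(g, lo, hi, n=100_000):
    """Composite trapezoidal rule on [lo, hi]."""
    h = (hi - lo) / n
    return ((g(lo) + g(hi)) / 2 + sum(g(lo + i * h) for i in range(1, n))) * h

mean = integrate(lambda x: x * beta_pdf(x), 0.0, 1.0)
second = integrate(lambda x: x * x * beta_pdf(x), 0.0, 1.0)
var = second - mean**2

# Expect E(X) = a/(a+b) = 2/7 and Var(X) = ab/((a+b)^2 (a+b+1)) = 10/392.
print(round(mean, 6), round(var, 6))
```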