
ACTL2002/ACTL5101 Probability and Statistics: Week 4 Video Lecture Notes

ACTL2002/ACTL5101 Probability and Statistics
© Katja Ignatieva

School of Risk and Actuarial Studies
Australian School of Business
University of New South Wales
k.ignatieva@unsw.edu.au

Week 4 Video Lecture Notes
[Course map: Probability (Weeks 1–4); Estimation (Weeks 5–6); Hypothesis testing (Weeks 7–9); Linear regression (Weeks 10–12); Review; Video lectures (Weeks 1–5).]


Sampling

Sampling with and without replacement
Population parameters
Random sampling: general
Sampling with replacement
Sampling without replacement
Example

Properties of the sample mean & variance
Background
Sample mean: moments
Sample variance: mean


Population parameters
Recall:
- Population: the large body of data;
- Sample: a subset of the population.

Population of size N (you have full information about the
population you are interested in).
x1 , x2 , . . . , xN are characteristics of interest, which can be:
- continuous;
- discrete.

The population mean is given by:
µ = (1/N) · Σ_{i=1}^{N} xi.


Population parameters
Recall: the population variance is given by:
σ² = (1/N) · Σ_{i=1}^{N} (xi − µ)²
   = (1/N) · ( Σ_{i=1}^{N} xi² − 2·µ·Σ_{i=1}^{N} xi + N·µ² )
   = (1/N) · ( Σ_{i=1}^{N} xi² − 2·µ·N·µ + N·µ² )
   = (1/N) · Σ_{i=1}^{N} xi² − µ².
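As a quick numerical illustration of the two equivalent expressions for σ² above, here is a minimal Python sketch (Python/NumPy are not part of the slides, and the claim values are invented for illustration):

```python
import numpy as np

# Hypothetical population of N = 6 claim sizes (illustrative values only).
x = np.array([10.0, 12.0, 9.0, 15.0, 11.0, 13.0])
N = len(x)

mu = x.sum() / N                       # population mean
var_def = ((x - mu) ** 2).sum() / N    # definition: (1/N) * sum (x_i - mu)^2
var_alt = (x ** 2).sum() / N - mu**2   # shortcut:   (1/N) * sum x_i^2 - mu^2

print(mu, var_def, var_alt)            # the two variance formulas agree
print(np.var(x))                       # np.var divides by N by default, same value
```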


Random sampling: general
Now, consider a sample (of size n) from the whole underlying population (of size N). Select from the population with or without replacement; each sample of size n has the same probability of occurring.
The values of the sample, X1, X2, ..., Xn, are random variables.
- With replacement: the Xi are i.i.d. for i = 1, ..., n.
- Without replacement: Xi and Xj are dependent, i.e., once xi is selected, Pr(Xj = xi) = 0.
In both cases we have: E[Xi] = µ.

ACTL2002/ACTL5101 Probability and Statistics: Week 4 Video Lecture Notes Sampling with and without replacement Random sampling: general Random sampling: general The sample mean is given by: X = n 1 X · Xi . Above result is for both with and without sampling with replacement. n i=1 Recall that the sample mean is a random variable with a sampling distribution and with mean: E[X ] = µ. X is —correct “on average”— this is called unbiased (more coverage of this later in the course). 605/618 .

Random sampling: general
In general (for sampling both with and without replacement), the variance of the sample mean is:
Var(X̄) = Var( (1/n) · Σ_{i=1}^{n} Xi )
        =* Cov( (1/n) · Σ_{i=1}^{n} Xi , (1/n) · Σ_{j=1}^{n} Xj )
        = (1/n²) · Σ_{i=1}^{n} Σ_{j=1}^{n} Cov(Xi, Xj).
* using the properties of the covariance.


Sampling with replacement
Now, consider sampling with replacement. Then we have:
Cov(Xi, Xj) =* 0, if i ≠ j;  Cov(Xi, Xj) = Var(Xi) = σ², if i = j.
* using independence between Xi and Xj.
Thus we have:
Var(X̄) = (1/n²) · Σ_{i=1}^{n} Σ_{j=1}^{n} Cov(Xi, Xj) = (1/n²) · Σ_{i=1}^{n} Var(Xi) = n·σ²/n² = σ²/n.
The standard deviation of X̄, i.e., the standard error, is: σ_X̄ = σ/√n.


Sampling without replacement
Now, consider sampling without replacement ⇒ this causes dependence in the Xi. How does that affect
Var(X̄) = (1/n²) · Σ_{i=1}^{n} Σ_{j=1}^{n} Cov(Xi, Xj)?
Consider the simple case where all the xi are different. For two different draws (i ≠ j) we have:
Cov(Xi, Xj) = Σ_{i=1}^{N} Σ_{j=1}^{N} (xi − µ) · (xj − µ) · Pr(Xi = xi, Xj = xj),
where, due to sampling without replacement,
Pr(Xi = xi, Xj = xj) = (1/N) · (1/(N−1)), if xi ≠ xj, and 0, if xi = xj.

Sampling without replacement
We have (note: {1, ..., N}\i = {1, ..., i−1, i+1, ..., N}):
Cov(Xi, Xj) =* Σ_{i=1}^{N} Σ_{j∈{1,...,N}\i} (xi − µ) · (xj − µ) · (1/N) · (1/(N−1))
            = (1/N) · (1/(N−1)) · Σ_{i=1}^{N} (xi − µ) · Σ_{j∈{1,...,N}\i} (xj − µ)
            = (1/N) · (1/(N−1)) · Σ_{i=1}^{N} (xi − µ) · ( Σ_{j=1}^{N} (xj − µ) − (xi − µ) )
            =** (1/N) · (1/(N−1)) · Σ_{i=1}^{N} ( −(xi − µ)² ) = −σ²/(N−1).
* j ∈ {1, ..., N}\i because Pr(Xi = xi, Xj = xi) = 0.
** using Σ_{j=1}^{N} (xj − µ) = 0.

Sampling without replacement
Thus, in the case of sampling without replacement, Cov(Xi, Xj) = −σ²/(N−1) for i ≠ j. Given this dependence, what is Var(X̄)?
Var(X̄) =* (1/n²) · Σ_{i=1}^{n} Σ_{j=1}^{n} Cov(Xi, Xj)
        = (1/n²) · Σ_{i=1}^{n} ( Var(Xi) + Σ_{j∈{1,...,n}\i} Cov(Xi, Xj) )
* writing Var(X̄) as the double sum of covariances derived above and splitting it into the i = j and i ≠ j terms; continues on the next slide.

Sampling without replacement
Thus:
Var(X̄) = (1/n²) · ( n · σ² + n·(n−1) · ( −σ²/(N−1) ) )    (n variance terms Var(Xi) and n·(n−1) covariance terms Cov(Xi, Xj))
        = (σ²/n) · ( 1 − (n−1)/(N−1) ),
which differs from the simple sampling-with-replacement case above.
Thus, when we sample without replacement we need a finite population correction of:
( 1 − (n−1)/(N−1) ).

Sampling without replacement
If we sample without replacement, this means we would need to build a whole new theory involving these finite population corrections. Often this is not necessary: if N is large relative to n we have 1 − (n−1)/(N−1) ≈ 1. This implies that sampling with or without replacement is approximately the same if N is large relative to n. The approximation for the sampling distribution of X̄ without the correction is usually accurate enough (when n is small relative to N).


Example sampling with and without replacement
Suppose an insurer has 10 observations of terrorism insurance claims. The claim sizes do not fit a special distribution. Summarizing statistics: Σ_{i=1}^{10} xi = 100 and Σ_{i=1}^{10} xi² = 1250, so σ² = 1250/10 − (100/10)² = 25.
The insurer is going to forecast the 4-year-ahead variance in the average claim size.
- Using sampling with replacement the variance is: σ²/4 = 25/4 = 6.25.
- Using sampling without replacement the variance is: (σ²/4) · ( 1 − (4−1)/(10−1) ) = (25/4) · (6/9) = 25/6 ≈ 4.17.
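The numbers in this example can be reproduced with the following sketch; the simulation at the end is only a sanity check of the finite population correction and uses a hypothetical population (five claims of 5 and five of 15) chosen to match the summary statistics Σxi = 100 and Σxi² = 1250:

```python
import numpy as np

N, n = 10, 4
sum_x, sum_x2 = 100.0, 1250.0
sigma2 = sum_x2 / N - (sum_x / N) ** 2             # population variance = 25

print(sigma2 / n)                                  # with replacement: 6.25
print(sigma2 / n * (1 - (n - 1) / (N - 1)))        # without replacement: 25/6 ≈ 4.17

# Sanity check by simulation with a hypothetical matching population.
rng = np.random.default_rng(0)
pop = np.array([5.0] * 5 + [15.0] * 5)
with_r = rng.choice(pop, size=(50_000, n), replace=True).mean(axis=1)
without_r = np.array([rng.choice(pop, size=n, replace=False).mean() for _ in range(50_000)])
print(with_r.var(), without_r.var())               # ≈ 6.25 and ≈ 4.17
```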


Properties of the sample mean and sample variance
Suppose you select a sample randomly from a population. Assume the selection is with replacement or, alternatively, from a large population. The outcomes (x1, ..., xn) are then random variables: suppose X1, X2, ..., Xn are n independent r.v. with identical distribution.
Define the sample mean by: X̄ = (1/n) · Σ_{k=1}^{n} Xk,
and recall that the sample variance is defined by: S² = (1/(n−1)) · Σ_{k=1}^{n} (Xk − X̄)².


Sample mean: moments
Recall: the sample mean is given by: X̄ = (Σ_{i=1}^{n} Xi)/n. Note: X̄ is a random variable!
Using the i.i.d. property, the expected value of the sample mean is:
E[X̄] = E[ (Σ_{i=1}^{n} Xi)/n ] = (Σ_{i=1}^{n} E[Xi])/n = n·µ/n = µ.

Sample mean: moments
The variance of the sample mean is given by:
Var(X̄) = Var( (Σ_{i=1}^{n} Xi)/n ) = (Σ_{i=1}^{n} Var(Xi))/n² = n·σ²/n² = σ²/n.
The standard deviation of the sample mean is: √Var(X̄) = σ/√n.
Hence, the uncertainty in the sample mean decreases as n increases.


Sample variance: mean
Recall: the sample variance is defined by:
S² = ( Σ_{i=1}^{n} (Xi − X̄)² )/(n−1) = ( Σ_{i=1}^{n} Xi² − n·X̄² )/(n−1).
Question: why do we divide by n − 1 rather than n?
Solution: because then the expectation of the sample variance equals the population variance. Proof: see next slide.

Sample variance: mean
Proof:
E[S²] = (1/(n−1)) · E[ Σ_{i=1}^{n} Xi² − n·X̄² ]
      = (1/(n−1)) · ( Σ_{i=1}^{n} E[Xi²] − n·E[X̄²] )
      =* (1/(n−1)) · ( Σ_{i=1}^{n} (σ² + µ²) − n·(σ²/n + µ²) )
      = (1/(n−1)) · ( n·σ² + n·µ² − σ² − n·µ² )
      = (1/(n−1)) · (n−1)·σ² = σ².
* using E[X̄²] = Var(X̄) + (E[X̄])², i.e., the mean and variance of the r.v. X̄, and E[Xi²] = Var(Xi) + (E[Xi])².
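A small simulation consistent with this proof: dividing by n − 1 makes the sample variance unbiased for σ², while dividing by n underestimates it on average. The normal distribution, parameter values and seed below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 2.0, 8, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
s2_unbiased = samples.var(axis=1, ddof=1)   # divide by n - 1
s2_biased = samples.var(axis=1, ddof=0)     # divide by n

print(sigma**2)                  # true variance: 4
print(s2_unbiased.mean())        # ≈ 4
print(s2_biased.mean())          # ≈ 4 * (n - 1)/n = 3.5
```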

ACTL2002/ACTL5101 Probability and Statistics: Week 4

ACTL2002/ACTL5101 Probability and Statistics
© Katja Ignatieva

School of Risk and Actuarial Studies
Australian School of Business
University of New South Wales
k.ignatieva@unsw.edu.au



Last three weeks
Introduction to probability;
Distribution function;
Moments: (non)-central moments, mean, variance (standard
deviation), skewness & kurtosis;
Special univariate distributions (discrete & continuous);
Joint distributions;
Dependence of multivariate distributions.


This week
Functions of random variables:
- Univariate case: CDF-technique;
- Univariate case: order statistics (min/max/range);
- Multivariate case: Jacobian transformation;
- Multivariate case: MGF-technique;
- Multivariate case: convolutions (sums of independent r.v.).

Distributions of functions of random variables
- Introduction
- Order statistics: distributions of order statistics; example, application & exercise
- The CDF technique: preface; examples; exercises
- The Jacobian transformation technique: fundamentals; example
- The MGF technique: motivation; applications & exercise
- Sums (convolutions): convolutions; exercise
- Approximate methods: delta-method
- Distribution characteristics in samples: exercise

Introduction
Often in practice we need to consider functions of random variables (the simplest cases are Y = log(X) for security returns or Y = Σ_i Xi for portfolios of risks).
Techniques we will consider:
- CDF/PMF/PDF technique;
- Jacobian transformation technique;
- MGF technique;
- Convolutions (for sums of random variables).

Introduction
Suppose X1, X2, ..., Xn are n random variables. We consider techniques that may be used to find the distribution of functions of these random variables, say Y = u(X1, ..., Xn). This will allow us to consider some very important topics for insurance and financial modelling:
- Special sampling distributions: the chi-squared, t- and F-distributions (an application of the CDF technique, MGF technique and Jacobian transformation; see online lecture week 5);
- Distributions of order statistics (the maximum claim, the minimum loss, the 95% quantile of a profit distribution).


Distributions of order statistics
Assume X1, X2, ..., Xn are n independent, identically distributed (i.i.d.) random variables with common distribution function FX(x) and density fX(x). Sort these variables and denote by
X(1) < X(2) < ... < X(n)
the order statistics (also seen in week 3). In particular, X(1) = min{X1, ..., Xn} is the minimum and X(n) = max{X1, ..., Xn} is the maximum. For simplicity, denote U = X(n) and V = X(1).

Distribution of the maximum
Interested in the largest possible claim? Derive the distribution of the maximum. We have:
FU(u) = Pr(U ≤ u) =* Pr(X1 ≤ u) · Pr(X2 ≤ u) · ... · Pr(Xn ≤ u) = (FX(u))^n,
and the density function is:
fU(u) =*** n · fX(u) · (FX(u))^{n−1}.
* Using Pr(U ≤ u) = Pr(max{X1, ..., Xn} ≤ u) = Pr(X1 ≤ u ∩ ... ∩ Xn ≤ u) =** Pr(X1 ≤ u) · ... · Pr(Xn ≤ u).
** Using independence.
*** Using the chain rule.

Distribution of the minimum
Interested in the maximum loss? Derive the distribution of the minimum. We have:
FV(v) = Pr(V ≤ v) = 1 − Pr(V > v) =* 1 − ( Pr(X1 > v) · ... · Pr(Xn > v) ) = 1 − (1 − FX(v))^n,
and the corresponding density function is:
fV(v) =*** n · fX(v) · (1 − FX(v))^{n−1}.
* Using Pr(V > v) = Pr(min{X1, ..., Xn} > v) = Pr(X1 > v ∩ ... ∩ Xn > v) and the fact that the Xi are independent.
*** Using the chain rule.

Distribution of the k-th order statistic
What is the joint density of all the order statistics?
Let x1, ..., xn be a random (non-ordered) outcome of n draws from the random variable X, and let y1, ..., yn be the ordered values of x1, ..., xn.
Question: What is the probability (density) of one such sequence? Solution: fX(y1) · fX(y2) · ... · fX(yn) (using independence).
Question: How many sequences x1, ..., xn lead to the same ordered sequence y1, ..., yn? Solution: n!.
The joint probability density of the order statistics is therefore given by:
f_{X(1),X(2),...,X(n)}(y1, ..., yn) = n! · fX(y1) · fX(y2) · ... · fX(yn).

Distribution of the k-th order statistic
Question: How many ways can you order k − 1 observations smaller than x, one observation equal to x, and n − k observations larger than x?
Solution: Use the multinomial coefficient with n1 = k − 1, n2 = 1 and n3 = n − k; the number of ways is n!/((k−1)! · 1! · (n−k)!).
Hence, in general, the probability density of the k-th order statistic is given by:
f_{X(k)}(x) = n!/((k−1)!(n−k)!) · fX(x) · (FX(x))^{k−1} · (1 − FX(x))^{n−k},
where (FX(x))^{k−1} corresponds to the k − 1 observations smaller than x, fX(x) to the one observation equal to x, (1 − FX(x))^{n−k} to the n − k observations larger than x, and n!/((k−1)!(n−k)!) counts the possible orderings.
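As a sketch of how this density can be checked numerically, the following compares the formula with the empirical density of the k-th order statistic of n i.i.d. draws; the Exp(1) distribution and the values of n and k are arbitrary choices, not part of the slides:

```python
import numpy as np
from math import factorial
from scipy import stats

rng = np.random.default_rng(2)
n, k, reps = 10, 3, 100_000

# Simulate the k-th order statistic of n i.i.d. Exp(1) draws.
kth = np.sort(rng.exponential(scale=1.0, size=(reps, n)), axis=1)[:, k - 1]

# Evaluate f_(k)(x) = n!/((k-1)!(n-k)!) * f(x) * F(x)^(k-1) * (1-F(x))^(n-k).
grid = np.linspace(0.05, 1.5, 4)
coef = factorial(n) / (factorial(k - 1) * factorial(n - k))
f, F = stats.expon.pdf(grid), stats.expon.cdf(grid)
formula = coef * f * F ** (k - 1) * (1 - F) ** (n - k)

# Histogram-based estimate of the same density at the grid points.
hist, edges = np.histogram(kth, bins=300, range=(0.0, 3.0), density=True)
centers = (edges[:-1] + edges[1:]) / 2
empirical = np.interp(grid, centers, hist)

print(np.round(formula, 3))
print(np.round(empirical, 3))   # close to the formula values
```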


Example order statistics
Distribution of the range. Let X1, X2, ..., Xn be independent continuous random variables, each with cumulative distribution function FX(x), and write X = X(1) and Y = X(n). Note: the maximum and the minimum are not independent.
Explain in words why the joint cumulative distribution function of the minimum X(1) and the maximum X(n) is given by (for x ≤ y):
F_{X,Y}(x, y) = Pr(X(1) ≤ x, X(n) ≤ y) = (FX(y))^n − (FX(y) − FX(x))^n.

First, consider the maximum. In order for the maximum to be at most y we require that ALL of the X's be less than y; the probability of this is (FX(y))^n.
Now, if the minimum is less than x then at least one of the observations must be less than x. We must therefore exclude all the cases where all the observations are between x and y, because in those cases the minimum is NOT less than x. The probability that a single random variable X is between x and y is FX(y) − FX(x), and the probability that all n are between x and y is (FX(y) − FX(x))^n. Hence, by subtracting off the probability that they are all between x and y, we obtain the probability that the maximum is at most y and the minimum is at most x.

Application order statistics
Consider an insurance company with n branches. Assume that the lifetimes of the branches are T1, T2, ..., Tn, which are i.i.d. with exponential distribution with parameter λ.
Suppose that the branches in the system are connected in "series", that is, the insurance company goes bankrupt if any one of the branches goes bankrupt. The lifetime V of the insurance company is therefore the minimum of the Tk, i.e., V = min{T1, ..., Tn}. Therefore, the density of V is (exponential with parameter n·λ):
fV(v) = n · fT(v) · (1 − FT(v))^{n−1} = n · λ · e^{−λ·v} · (e^{−λ·v})^{n−1} = (n·λ) · e^{−(n·λ)·v}.

Application order statistics
Suppose instead that the branches are connected in "parallel", that is, the insurance company goes bankrupt only if all of the branches go bankrupt. The lifetime U of the system is therefore the maximum of the Tk, i.e., U = max{T1, ..., Tn}. Therefore, the density of U is given by:
fU(u) = n · fT(u) · (FT(u))^{n−1} = n · λ · e^{−λ·u} · (1 − e^{−λ·u})^{n−1}.

Exercise order statistics
Let the claim sizes be Uniformly distributed between 0 and 1. The insurer knows that he will receive 100 claims next year. The insurer receives 1 from the reinsurer for each claim larger than 0.995.
Question: What is the probability that the reinsurer has to make at least one payment?
Solution: 1 − Pr(no payment) = 1 − (0.995)^100 = 0.3942.
In a proposed new contract the reinsurer would instead pay twice the second largest claim.
Question: What is the price of this contract, when it is set to the expected value plus half a standard deviation?

Exercise order statistics
Solution: the second largest of n = 100 claims is the k = 99-th order statistic, with density
f_{X(k)}(x) = n!/((k−1)!(n−k)!) · fX(x) · (FX(x))^{k−1} · (1 − FX(x))^{n−k}
            = n!/((k−1)!(n−k)!) · 1 · x^{k−1} · (1 − x)^{n−k}
            = Γ(n+1)/(Γ(k)·Γ(n−k+1)) · x^{k−1} · (1 − x)^{(n−k+1)−1}.
This is the p.d.f. of a Beta(α = k, β = n − k + 1) distribution (with k = 99 and n = 100). Hence,
E[X(k)] = α/(α+β) = k/(n+1)  and  Var(X(k)) = α·β/( (α+β)²·(α+β+1) ) = k·(n−k+1)/( (n+1)²·(n+2) ).
Therefore the price is:
Price = 2 · ( 99/(100+1) + (1/2) · √( 99·(100−99+1)/((100+1)²·(100+2)) ) ) = 1.9742.
Alternatively, use simulated quantiles; see the Excel file.
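A minimal sketch reproducing the 1.9742 price, first from the Beta(k, n − k + 1) moments derived above and then by brute-force simulation of the second-largest of 100 uniform claims (the simulation replaces the Excel quantile approach mentioned on the slide):

```python
import numpy as np

n, k = 100, 99                      # second-largest claim = 99th order statistic
alpha, beta = k, n - k + 1          # X_(99) ~ Beta(99, 2)

mean = alpha / (alpha + beta)
var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
print(2 * (mean + 0.5 * np.sqrt(var)))       # ≈ 1.9742

rng = np.random.default_rng(3)
second_largest = np.sort(rng.uniform(size=(200_000, n)), axis=1)[:, k - 1]
print(2 * (second_largest.mean() + 0.5 * second_largest.std()))   # ≈ 1.97
```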


The CDF Technique
Let X be a continuous random variable with cumulative
distribution function FX (·) and density function fX (·).
Suppose that Y = g (X ) is a function of X where g (X ) is
differentiable and strictly increasing. Thus, its inverse g −1 (Y )
uniquely exists. Then, we can apply the CDF technique. The
CDF of Y can be derived using:
FY(y) = Pr(Y ≤ y) = Pr(g(X) ≤ y) = Pr(X ≤ g^{−1}(y)) = FX(g^{−1}(y)),

and its density is given by:

fY(y) = ∂/∂y FY(y) = ∂/∂y FX(g^{−1}(y)) = fX(g^{−1}(y)) · ∂/∂y g^{−1}(y).

Now suppose instead that Y = g(X) is a function of X where g(X) is differentiable and strictly decreasing. Its inverse g^{−1}(Y) again uniquely exists, so we can apply the CDF technique. The CDF of Y can be derived using:
FY(y) = Pr(Y ≤ y) = Pr(g(X) ≤ y) = Pr(X ≥ g^{−1}(y)) = 1 − FX(g^{−1}(y)),
and its density is given by:
fY(y) = ∂/∂y FY(y) = ∂/∂y ( 1 − FX(g^{−1}(y)) ) = −fX(g^{−1}(y)) · ∂/∂y g^{−1}(y).

Summarizing: if g(X) is a strictly monotonic function, then:
fY(y) = fX(g^{−1}(y)) · | ∂/∂y g^{−1}(y) |.
Common transformations that arise in applications:
1. Affine transformations: Y = m·X + b. Example (strictly increasing, m > 0):
   g^{−1}(y) = (y − b)/m,  FY(y) = FX( (y − b)/m ),
   fY(y) = fX( (y − b)/m ) · | ∂/∂y g^{−1}(y) | = fX( (y − b)/m )/|m|,  since | ∂/∂y g^{−1}(y) | = 1/m.
2. Power transformations: Y = X^n, with n > 0 and y > 0:
   g^{−1}(y) = y^{1/n},  FY(y) = FX( y^{1/n} ),
   fY(y) = fX( y^{1/n} ) · | ∂/∂y g^{−1}(y) | = fX( y^{1/n} ) · (1/n) · y^{1/n − 1}.

Recall: if g(X) is strictly monotonic, then:
fY(y) = fX(g^{−1}(y)) · | ∂/∂y g^{−1}(y) |.
Common transformations that arise in applications (cont.):
3. Exponential transformation: Y = e^{a·X}, with a > 0:
   g^{−1}(y) = log(y)/a,  FY(y) = FX( log(y)/a ),
   fY(y) = fX( log(y)/a ) · | ∂/∂y g^{−1}(y) | = fX( log(y)/a )/(a·y),  since | ∂/∂y g^{−1}(y) | = 1/(a·y).
4. Inverse transformation: Y = 1/X, with x > 0:
   g^{−1}(y) = 1/y,  FY(y) = 1 − FX( 1/y ),
   fY(y) = fX( 1/y ) · | ∂/∂y g^{−1}(y) | = fX( 1/y )/y²,  since | ∂/∂y g^{−1}(y) | = 1/y².


Example: Affine transformations: Y = mX + b
Example: Let Y = −3·X + 4. Find FY(y) and fY(y) if X ∼ U(1, 9). We know that:
fX(x) = 1/8, if 1 ≤ x ≤ 9, and 0 otherwise;
FX(x) = 0, if x < 1; (x − 1)/8, if 1 ≤ x ≤ 9; and 1, if x > 9.
Solution: Can we apply the special case of the CDF technique? Note that m < 0, so g(X) = −3·X + 4 is strictly decreasing, with g^{−1}(Y) = −(Y − 4)/3.

Support of Y: g(1) = −3·1 + 4 = 1 and g(9) = −3·9 + 4 = −23.
Distribution function of Y (be careful with Pr(X ≤ g^{−1}(y)), because m < 0): for −23 ≤ y ≤ 1,
FY(y) = Pr(Y ≤ y) = Pr(−3·X + 4 ≤ y) = Pr(−3·X ≤ y − 4) = Pr( X ≥ −(y − 4)/3 )
      = ∫ from −(y−4)/3 to 9 of (1/8) dx = [x/8] from −(y−4)/3 to 9
      = (1/8) · ( 9 + (y − 4)/3 ) = (1/24) · (23 + y),
and FY(y) is zero if y < −23 and one if y > 1.

Or, alternatively, use g^{−1}(y) = (y − 4)/(−3). Then, for −23 ≤ y ≤ 1:
FY(y) = 1 − FX( g^{−1}(y) ) = 1 − FX( (y − 4)/(−3) )
      = 1 − ( (y − 4)/(−3) − 1 )/8 = 1 − (y − 1)/(−24) = (23 + y)/24 = (y − (−23))/24,
and FY(y) is zero if y < −23 and one if y > 1. Thus:
fY(y) = 1/24, if −23 ≤ y ≤ 1, and zero otherwise.
Equivalently, using | ∂/∂y g^{−1}(y) | = 1/3:
fY(y) = fX( g^{−1}(y) ) · | ∂/∂y g^{−1}(y) | = (1/8) · (1/3) = 1/24, if −23 ≤ y ≤ 1, and zero otherwise.
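A quick simulation sketch (not part of the slides) checking the distribution function just derived for Y = −3X + 4 with X ∼ U(1, 9):

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.uniform(1, 9, size=500_000)
y = -3 * x + 4                      # support of Y is [-23, 1]

for q in (-20.0, -10.0, 0.0):
    print(np.mean(y <= q), (23 + q) / 24)   # empirical CDF vs (23 + y)/24
```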

Example: Power transformations: Y = X^n, n > 0
Example: Let Z ∼ N(0, 1) with p.d.f.:
fZ(z) = (1/√(2π)) · e^{−z²/2},  −∞ < z < ∞.
Question: Find fY(y), where Y = Z². Can you apply the CDF technique?
Solution: Can we apply the special case of the CDF technique? No: g(Z) = Z² is not a monotonic function (decreasing for z < 0, increasing for z > 0).
Solution: use the symmetry of the standard normal distribution!

We have:
FY(y) = Pr(Y ≤ y) = Pr(Z² ≤ y) = Pr(−√y ≤ Z ≤ √y) = FZ(√y) − FZ(−√y), for y ≥ 0.
Using FY(y) = FZ(√y) − FZ(−√y), we have:
fY(y) = fZ(√y) · (1/2)·y^{−1/2} − fZ(−√y) · ( −(1/2)·y^{−1/2} )
      = (1/2)·y^{−1/2} · ( fZ(√y) + fZ(−√y) )
      =* fZ(√y) · y^{−1/2}
      = (1/√(2π)) · y^{−1/2} · e^{−y/2}, if y ≥ 0,
and zero otherwise.
* Using symmetry, i.e., fZ(−a) = fZ(a).
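The density just derived is that of a chi-squared distribution with one degree of freedom; a one-line check against scipy's chi2 density (grid values arbitrary):

```python
import numpy as np
from scipy import stats

y = np.array([0.1, 0.5, 1.0, 2.0, 4.0])
derived = y ** (-0.5) * np.exp(-y / 2) / np.sqrt(2 * np.pi)
print(np.allclose(derived, stats.chi2.pdf(y, df=1)))   # True
```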

Example: Exponential transformation: Y = e^{aX}, a > 0
Example: Let X ∼ N(µ, σ²) and Y = e^X; then Y is log-normal with density
fY(y) = ( 1/(y·σ·√(2π)) ) · exp( −(1/2)·((log(y) − µ)/σ)² ), if y > 0,
and zero otherwise.
Question: Derive this result.
Support of Y: g(−∞) = exp(−∞) = 0 and g(∞) = exp(∞) = ∞.
Solution: Can we apply the special case of the CDF technique? Yes.

We have X ∼ N(µ, σ²), thus:
fX(x) = ( 1/(σ·√(2π)) ) · exp( −(1/2)·((x − µ)/σ)² ), if −∞ ≤ x ≤ ∞,
and g(X) = exp(X). Now, using g^{−1}(y) = log(y):
FY(y) = Pr(Y ≤ y) = Pr(e^X ≤ y) = Pr(X ≤ log(y)) = FX(log(y)), if y ≥ 0.
Using fY(y) = fX(log(y)) · ∂/∂y g^{−1}(y) and ∂/∂y g^{−1}(y) = 1/y, we have:
fY(y) = ( 1/(y·σ·√(2π)) ) · exp( −(1/2)·((log(y) − µ)/σ)² ), if y > 0,
and zero otherwise.
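A short sketch checking the derived log-normal density against scipy.stats.lognorm, which is parameterised by s = σ and scale = e^µ; the values of µ and σ are arbitrary:

```python
import numpy as np
from scipy import stats

mu, sigma = 0.05, 0.2
y = np.array([0.5, 1.0, 1.5, 2.0])

derived = np.exp(-0.5 * ((np.log(y) - mu) / sigma) ** 2) / (y * sigma * np.sqrt(2 * np.pi))
print(np.allclose(derived, stats.lognorm.pdf(y, s=sigma, scale=np.exp(mu))))   # True
```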


Exercises: Exponential transformation (generally)
Let X be a continuous random variable with probability density function fX(x) and cumulative distribution function FX(x). Let Y = exp(a·X), a > 0.
Question: Find the probability density function and cumulative distribution function of Y. Can we use the CDF technique?

Exercises: Exponential transformation (generally)
Solution: Support of Y: g(−∞) = 0, g(∞) = ∞, i.e., y ≥ 0. We have:
FY(y) = Pr(Y ≤ y) = Pr(e^{a·X} ≤ y) = Pr(a·X ≤ log(y)) = Pr(X ≤ log(y)/a) = Pr(X ≤ log(y^{1/a})) = FX(log(y^{1/a})).
So we have: FY(y) = 0, if y < 0, and FY(y) = FX(log(y^{1/a})), if y ≥ 0;
and thus: fY(y) = 0, if y ≤ 0, and fY(y) = fX(log(y^{1/a})) · 1/(a·y), if y > 0.

Exercises: Inverse transformation: Y = 1/X, x > 0
Example: the inverse Gaussian (Wald) distribution.
Application: (OPTIONAL) first passage time of a Brownian motion at a fixed level α.
Let X ∼ N(µ, σ²) be a normally distributed random variable with probability density function fX(x) and cumulative distribution function FX(x), and let Y = 1/X.
Question: Find the distribution of Y.

Exercises: Inverse transformation: Y = 1/X, x > 0
Solution: Support of Y: g(x → 0) = lim_{x→0} 1/x = ∞ and g(x → ∞) = 0.
In general, we have:
FY(y) = Pr(Y ≤ y) = Pr(1/X ≤ y) = Pr(X ≥ 1/y) = 1 − Pr(X < 1/y) = 1 − FX(1/y), if y > 0,
and FY(y) = 0 if y ≤ 0. So we have: fY(y) = 0, if y ≤ 0, and
fY(y) = fX(1/y) · 1/y², if y > 0.

Exercise
Let X be a random variable with p.d.f.:
fX(x) = e^{−x}/(1 + e^{−x})², for −∞ ≤ x ≤ ∞.
Question: Find the probability density function of Y = e^{−X}. Can we use the special case of the CDF technique?

Solution: Yes: g(X) = e^{−X} is a strictly decreasing function. Support of Y: g(−∞) = e^{∞} = ∞, g(∞) = e^{−∞} = 0.
We have g^{−1}(y) = −log(y), so that ∂/∂y g^{−1}(y) = −1/y. We have:
fY(y) = fX( g^{−1}(y) ) · | ∂/∂y g^{−1}(y) |
      = ( y/(1 + y)² ) · | −1/y |
      = 1/(1 + y)², if 0 < y < ∞,
and zero otherwise.


Fundamentals (bivariate case)
Consider the case of two continuous random variables X1 and X2, and assume that they are mapped onto U1 and U2 by the transformation:
u1 = g1(x1, x2) and u2 = g2(x1, x2).
Suppose this transformation is one-to-one, so that we can invert it to get:
x1 = h1(u1, u2) and x2 = h2(u1, u2), where h(u1, u2) = g^{−1}(u1, u2).
Note: this is the multivariate case of the CDF technique. See Section 6.6 of W+ (7ed).

The Jacobian of this transformation is the determinant:
J(u1, u2) = det [ ∂h1(u1,u2)/∂u1   ∂h1(u1,u2)/∂u2 ;  ∂h2(u1,u2)/∂u1   ∂h2(u1,u2)/∂u2 ]
          = ∂h1(u1,u2)/∂u1 · ∂h2(u1,u2)/∂u2 − ∂h1(u1,u2)/∂u2 · ∂h2(u1,u2)/∂u1,
provided this is not zero.

Suppose the joint density of X1 and X2 is denoted by fX1,X2(x1, x2). Then, using the Jacobian transformation technique, the joint density of U1 and U2 is given by:
fU1,U2(u1, u2) = fX1,X2( h1(u1, u2), h2(u1, u2) ) · | J(u1, u2) |,
i.e., the joint density of X1 and X2 evaluated at h1(u1, u2) and h2(u1, u2), multiplied by the absolute value of the Jacobian. The technique extends readily to n > 2 variables.

Jacobian transformation technique: procedure
Procedure to find the joint density of U1 = g1(X1, X2) and U2 = g2(X1, X2):
1. Find u1 = g1(x1, x2) and u2 = g2(x1, x2).
2. Determine h(u1, u2) = g^{−1}(u1, u2).
3. Find the absolute value of the Jacobian of the transformation.
4. Multiply that with the joint density of X1, X2 evaluated at h1(u1, u2), h2(u1, u2).
Note: if interested in the marginal density of U1, integrate over all possible values of U2 (see last week).


Example Jacobian transformation technique
Let {X1, X2} be the uncertainty in the claim sizes of home insurance and unemployment insurance, with joint p.d.f.:
fX1,X2(x1, x2) = exp(−(x1 + x2)), if x1 ≥ 0 and x2 ≥ 0, and 0 otherwise.
Question: Find the covariance between the aggregate claim size and the proportion due to home insurance.
Solution: Find the joint density of Y1 = X1 + X2 and Y2 = X1/(X1 + X2):
1. We have the transformations Y1 = X1 + X2 and Y2 = X1/(X1 + X2). Thus: X1 = Y2·(X1 + X2) = Y1·Y2 and X2 = Y1 − X1 = Y1·(1 − Y2).
2. So h1(y1, y2) = y1·y2 and h2(y1, y2) = y1·(1 − y2).

Example Jacobian transformation technique
3. The Jacobian is:
J(y1, y2) = det [ y2   y1 ;  1 − y2   −y1 ] = y2·(−y1) − y1·(1 − y2) = −y1, so |J(y1, y2)| = y1.
4. For y1 ≥ 0 and 0 ≤ y2 ≤ 1 we have:
fY1,Y2(y1, y2) = exp( −(y1·y2 + y1·(1 − y2)) ) · y1 = exp(−y1) · y1.
Hence:
fY1,Y2(y1, y2) = exp(−y1) · y1, for y1 ≥ 0 and 0 ≤ y2 ≤ 1, and 0 otherwise.
The joint density factorises into a function of y1 alone and a constant in y2, so Y1 and Y2 are independent; hence the covariance equals zero.
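A simulation sketch consistent with this result: for independent Exp(1) claims the total Y1 and the proportion Y2 are independent, with Y1 having the Gamma(2, 1) density y·e^{−y} and Y2 being Uniform(0, 1). The sample size and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
x1 = rng.exponential(size=500_000)
x2 = rng.exponential(size=500_000)
y1, y2 = x1 + x2, x1 / (x1 + x2)

print(np.cov(y1, y2)[0, 1])      # ≈ 0: aggregate claim and proportion uncorrelated
print(y2.mean(), y2.var())       # ≈ 0.5 and ≈ 1/12: consistent with Uniform(0, 1)
print(y1.mean(), y1.var())       # ≈ 2 and ≈ 2: consistent with Gamma(2, 1)
```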



Moment generating function
Recall from week 1: the moment generating function is defined as:
MX(t) = E[e^{t·X}] = ∫_{−∞}^{∞} fX(x) · exp(t·x) dx.
Important properties (for m, b ∈ R and X, Y independent):
M_{m·X+b}(t) = E[e^{t·(m·X+b)}] = E[exp(t·m·X) · exp(t·b)] = E[exp(t·m·X)] · exp(t·b) = MX(m·t) · exp(t·b);
M_{X+Y}(t) = E[e^{t·(X+Y)}] = E[exp(t·X) · exp(t·Y)] = E[exp(t·X)] · E[exp(t·Y)] = MX(t) · MY(t).

The MGF technique
This method can be effective when we recognize the m.g.f., because the m.g.f., when it exists, is unique and uniquely determines the distribution.
Suppose we are interested in the distribution of U = g(X1, ..., Xn), where X1, ..., Xn have a joint density fX1,...,Xn(x1, ..., xn). The MGF technique determines the distribution of U by finding the m.g.f. of U:
MU(t) = E[e^{U·t}] = ∫_{−∞}^{∞} ... ∫_{−∞}^{∞} e^{g(x1,...,xn)·t} · fX1,...,Xn(x1, ..., xn) dx1 ... dxn,
and then identifying the distribution by comparing with known m.g.f.'s.

In the special case where U is a sum of the random variables, U = b1·X1 + ... + bn·Xn, and X1, ..., Xn are independent, we have:
MU(t) = E[e^{(b1·X1+...+bn·Xn)·t}] = E[e^{X1·b1·t}] · ... · E[e^{Xn·bn·t}] = MX1(b1·t) · ... · MXn(bn·t),
i.e., the m.g.f. of U is the product of the m.g.f.'s of the bi·Xi. We have also seen this in the week 1 lecture; recall:
M_{a·X+b}(t) = MX(a·t) · e^{b·t};
M_{X+Y}(t) = MX(t) · MY(t), if X and Y are independent.


Application: summing Poisson processes
Let Xi be the i.i.d. number of claims arriving from male motor vehicle insureds, with rate λ1, and let Yi be the i.i.d. number of claims arriving from female motor vehicle insureds, with rate λ2. There are n males insured and m females insured.
Question: Find the distribution of the total number of claims.
Solution: Let Xi ∼ Poisson(λ1) and Yi ∼ Poisson(λ2). The m.g.f. of U = Σ_{i=1}^{n} Xi + Σ_{i=1}^{m} Yi is given by:
MU(t) = Π_{i=1}^{n} MXi(t) · Π_{i=1}^{m} MYi(t) = ( e^{λ1·(e^t − 1)} )^n · ( e^{λ2·(e^t − 1)} )^m = e^{(n·λ1 + m·λ2)·(e^t − 1)},
which is the m.g.f. of a Poisson distribution with parameter n·λ1 + m·λ2 (using independence of all the Xi and Yi).
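A sketch verifying this by simulation; the values of n, m, λ1 and λ2 are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, lam1, lam2 = 50, 80, 0.3, 0.2

total = rng.poisson(lam1, size=(200_000, n)).sum(axis=1) \
      + rng.poisson(lam2, size=(200_000, m)).sum(axis=1)

print(n * lam1 + m * lam2)          # 31: Poisson parameter of the total
print(total.mean(), total.var())    # both ≈ 31, as for a Poisson(31) variable
```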

Application: summing independent Normals
Let X1 be the yearly return on Australian government bonds and X2 be the yearly return on American government bonds. Assume that the returns on government bonds are normally distributed and that the yearly returns on Australian and American government bonds are independent.
Question: Find the distribution of the asset return when an insurer invests half its wealth in Australian and half in American government bonds.

Application: summing independent Normals
Solution: Let X1 ∼ N(µ1, σ1²) and X2 ∼ N(µ2, σ2²), where X1 and X2 are independent. The m.g.f. of U = (X1 + X2)/2 is given by:
MU(t) = MX1(t/2) · MX2(t/2)
      = exp( (µ1/2)·t + (1/2)·(σ1²/2²)·t² ) · exp( (µ2/2)·t + (1/2)·(σ2²/2²)·t² )
      = exp( ((µ1 + µ2)/2)·t + (1/2)·( (σ1² + σ2²)/2² )·t² ),
which is the m.g.f. of another Normal with mean (µ1 + µ2)/2 and variance (σ1/2)² + (σ2/2)². Hence:
U ∼ N( (µ1 + µ2)/2, (σ1/2)² + (σ2/2)² ).

Exercise: dependent Normal
Often independence is not a good assumption. (In week 9 we will see a hypothesis test for independence.) Now consider the asset value when X is bivariate normally distributed, i.e.:
X = (X1, X2)ᵀ ∼ N( (µ1, µ2)ᵀ, [ σ1²  ρσ1σ2 ; ρσ1σ2  σ2² ] ).
Question: What would be a logical value for ρ?
Question: What is the distribution of U = (X1 + X2)/2? Can we use the MGF technique?

f. where Z1 and Z2 are independent. However. (Important result!) . of another Normal with mean (µ1 + µ2 ) /2  and variance σ12 + σ22 + 2ρσ1 σ2 /4.g. Then we have: MU (t) =M(X1 +X2 )/2 (t) = Mµ +σ Z +µ +ρσ Z +√1−ρ2 σ Z (t/2) 1 =Mµ √ 1 +µ2 +(σ1 +ρσ2 )Z1 + 1 1 2 1−ρ2 σ2 Z2 2 1 2 2 (t/2)  t 2  1 t 2 =exp (µ1 + µ2 ) · · exp (σ1 + ρσ2 ) · · 2 2 2    t 2 1 (1 − ρ2 )σ22 · exp 2 2    2 2 1 2 2 =exp ((µ1 + µ2 ) /2) t + σ + σ2 + 2ρσ1 σ2 /2 · t . recall from last week: X1 =µ1 + σ1 Z1 X2 = µ2 + ρσ2 Z1 + p 1 − ρ 2 σ2 Z2 . 2 1  747/754  which is the m.ACTL2002/ACTL5101 Probability and Statistics: Week 4 The MGF technique Applications & exercise of the MGF technique Solution: No (due to dependency).


Sums (convolutions): the discrete case
Discrete case: let Z = X + Y, i.e., Y = Z − X. Then:
pZ(z) = Σ_{all x} pX,Y(x, z − x).
If X and Y are independent, then p(x, y) = pX(x) · pY(y), and
pZ(z) = Σ_{all x} pX(x) · pY(z − x).
This is called the convolution of pX and pY.

The continuous case
FZ(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} fX,Y(x, y) dy dx
      = ∫_{−∞}^{∞} ∫_{−∞}^{z} fX,Y(x, v − x) dv dx   (change of variables: y = v − x)
      = ∫_{−∞}^{z} ∫_{−∞}^{∞} fX,Y(x, v − x) dx dv.
Differentiate (under the integral) to get:
fZ(z) = ∫_{−∞}^{∞} fX,Y(x, z − x) dx.
If X and Y are independent:
fZ(z) = ∫_{−∞}^{∞} fX(x) · fY(z − x) dx.
See W+ (7ed), Section 6.3.


Exercise
Let Xi ∼ EXP(λ) be the size of the semiannual expected discounted value of newly issued long-term disability insurance claims, i.e.:
fXi(xi) = λ·exp(−λ·xi), if xi ≥ 0, and 0 otherwise.
Question: Find the distribution of the annual claim size.
Solution: Let Z = X1 + X2. If z ≥ 0 we have:
fZ(z) = ∫_{−∞}^{∞} fX1(z − x2) · fX2(x2) dx2
      = ∫_{0}^{z} λ·exp(−λ·(z − x2)) · λ·exp(−λ·x2) dx2
      = ∫_{0}^{z} λ²·exp(−λ·z) dx2 = λ²·z·exp(−λ·z).

[Figures: the exponential density fXi(xi) and the convolution density fZ(z), plotted for λ = 1, ..., 6.]
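A sketch checking that the derived density λ²·z·e^{−λz} is the Gamma(2, λ) density and that it matches the simulated annual claim size; λ and the seed are arbitrary:

```python
import numpy as np
from scipy import stats

lam = 2.0
z = np.array([0.2, 0.5, 1.0, 2.0])

convolution = lam**2 * z * np.exp(-lam * z)
print(np.allclose(convolution, stats.gamma.pdf(z, a=2, scale=1 / lam)))   # True

rng = np.random.default_rng(7)
s = rng.exponential(1 / lam, size=400_000) + rng.exponential(1 / lam, size=400_000)
print(s.mean(), 2 / lam)            # simulated annual claim mean ≈ 2/λ
```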


Delta method
Say you know E[X] = µX and Var(X) = σX², and you are interested in the mean and variance of Y = g(X), where g is non-linear. Using a Taylor series:
Y = g(X) ≈ g(µX) + (X − µX) · g′(µX),
which implies:
E[Y] ≈ g(µX)
Var(Y) ≈* (g′(µX))² · σX².
* Using E[Y²] ≈ E[ g(µX)² + (X − µX)²·g′(µX)² + 2·g(µX)·(X − µX)·g′(µX) ].
This is very useful when you do not know the exact distribution of Y = g(X)!
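A sketch of the delta method in action for a non-linear g, here g(X) = e^X with X normal (an arbitrary illustrative choice), comparing the approximation with simulation:

```python
import numpy as np

rng = np.random.default_rng(8)
mu_x, sigma_x = 0.05, 0.02          # small sigma: the Taylor approximation is accurate

x = rng.normal(mu_x, sigma_x, size=500_000)
y = np.exp(x)                       # Y = g(X), g non-linear

mean_approx = np.exp(mu_x)                       # g(mu_X)
var_approx = np.exp(mu_x) ** 2 * sigma_x**2      # (g'(mu_X))^2 * sigma_X^2

print(mean_approx, y.mean())        # ≈ 1.0513 for both
print(var_approx, y.var())          # ≈ 4.4e-4 for both
```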


Consider an insurer offering flood insurance. Some years there are no floods (with probability q = 0.7) and some years there are floods (with probability 0.3). The claim size when there are floods is LogNormally distributed with mean E[X] = 150 and Var(X) = 700. The realizations of the claim sizes since 1950 are given in the Excel file (Σ_{i=1}^{60} xi = 2,700 and Σ_{i=1}^{60} xi² = 400,000).
Questions:
a. What is the variance of the claim sizes since 1950?
b. What is the variance of flood insurance claims when the insurer is representative?
Solutions:
a. Population: σ² = (1/N)·Σ_{i=1}^{N} xi² − ( (1/N)·Σ_{i=1}^{N} xi )² = 4,641.
b. Sample: s² = (1/(n−1))·( Σ_{i=1}^{n} xi² − n·( (1/n)·Σ_{i=1}^{n} xi )² ) = 4,720.

The insurer wants to simulate the 5-year aggregate claim size using only the latest 30 observations, without distributional assumptions (Σ_{i=31}^{60} xi = 1,480 and Σ_{i=31}^{60} xi² = 225,000).
c. What would be the variance of the aggregate claim size if he simulates with replacement?
d. Same as c., but now without replacement.
e. Which one (with/without replacement) would you use to simulate?
Solutions: Note that
Var(X̄) = Var( (1/n)·Σ_{i=1}^{n} xi ) = (1/n²)·Var( Σ_{i=1}^{n} xi )  ⇒  Var( Σ_{i=1}^{n} xi ) = n²·Var(X̄).
Using n = 5, N = 30 and σ² = 225,000/30 − 1,480²/30² = 5,066:
c. n²·(σ²/n) = 25,331.
d. n²·(σ²/n)·( 1 − (n−1)/(N−1) ) = 25,331 · 25/29 = 21,837.