
Mathematical Expectations

Week – 4
MKT3802 Statistical and Experimental Methods for Engineers

March 30, 2023


Outline

Central tendency (mean) and expected values
Variance, standard deviation, and coefficient of variation
Joint probability distributions
Marginal and conditional distributions
Statistical independence
Covariance and the correlation coefficient
Means and variances of linear combinations of random variables
Chebyshev's theorem


Expectations and Random Variables

Summarizing the main features of a random variable's behavior with a few numbers is helpful in solving engineering problems.

These numbers are sometimes called the parameters of the distribution.
Parameters describe the
central tendency (the mean)
magnitude of dispersion (the variance)
skewness
sharpness (peakedness)
of the distribution.



Central Tendency or Mean (µ)

The mean is the most important characteristic of a distribution.

The mean is also called the expected value, E(X), the mean of the random variable X, or the mean of the probability distribution of X.
Definition
Let X be a random variable with probability distribution $p_X(x_i)$ or $f(x)$. The mean, or expected value, of X is

$$\mu_X = E(X) = \sum_i x_i \, p_X(x_i) \quad \text{for discrete } X$$

$$\mu_X = E(X) = \int_{-\infty}^{\infty} x f(x)\, dx \quad \text{for continuous } X$$
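
To make the definition concrete, here is a minimal Python sketch (not part of the original slides) that evaluates both forms numerically; the pmf, the density, and the use of scipy.integrate.quad are assumptions made purely for illustration.

import numpy as np
from scipy.integrate import quad

# Discrete case: E(X) = sum_i x_i * p_X(x_i), for an assumed pmf
x = np.array([0, 1, 2, 3])
p = np.array([0.1, 0.3, 0.4, 0.2])            # assumed probabilities (sum to 1)
mean_discrete = np.sum(x * p)                  # 1.7

# Continuous case: E(X) = integral of x * f(x) dx, for an assumed density on [0, 1]
f = lambda t: 2.0 * t                          # assumed density f(t) = 2t on [0, 1]
mean_continuous, _ = quad(lambda t: t * f(t), 0.0, 1.0)   # 2/3

print(mean_discrete, mean_continuous)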



Example

Example
Let X be the random variable that denotes the life in hours of a certain
electronic device. The probability density function is

$$f(x) = \frac{20000}{x^3}, \quad x > 100$$
$$f(x) = 0, \quad \text{otherwise}$$

Find the expected life of this type of device.
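
One way to evaluate it (a worked step added here, not on the original slide):

$$E(X) = \int_{100}^{\infty} x \,\frac{20000}{x^3}\, dx = 20000 \int_{100}^{\infty} x^{-2}\, dx = 20000 \left[ -\frac{1}{x} \right]_{100}^{\infty} = \frac{20000}{100} = 200 \text{ hours}$$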



Central Tendency or Mean (µ)

Sometimes a new random variable Y = g(X) is defined from the random variable X.

The mean of the random variable Y = g(X) in that case is given by the following definition.

Definition
Let X be a random variable with probability distribution $p_X(x_i)$ or $f(x)$. The mean, or expected value, of the new variable g(X) is

$$\mu_{g(X)} = E[g(X)] = \sum_i g(x_i)\, p_X(x_i) \quad \text{for discrete } X$$

$$\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\, dx \quad \text{for continuous } X$$
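
As a quick illustration (added here), the continuous form can be evaluated numerically in Python; the uniform density on [0, 1] and the choice g(x) = x² are assumptions made only for the example.

from scipy.integrate import quad

# E[g(X)] = integral of g(x) * f(x) dx, with assumed f and g
f = lambda x: 1.0          # assumed uniform density on [0, 1]
g = lambda x: x**2         # assumed function of X
mean_g, _ = quad(lambda x: g(x) * f(x), 0.0, 1.0)
print(mean_g)              # 1/3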



Example

Example
Suppose that the number of cars X that pass through a car wash on any sunny Friday has the following probability distribution:

Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to the attendant by the manager. Find the attendant's expected earnings for this particular time period.
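
The probability table for X is not reproduced above, so the short Python sketch below (added for illustration) uses a hypothetical pmf over x = 4, ..., 9 purely to show how the expected earnings would be computed.

import numpy as np

x = np.array([4, 5, 6, 7, 8, 9])
p = np.array([1/12, 1/12, 1/4, 1/4, 1/6, 1/6])   # hypothetical probabilities (sum to 1)

earnings = 2 * x - 1                              # g(X) = 2X - 1, dollars paid to the attendant
expected_earnings = np.sum(earnings * p)          # E[g(X)] = sum_i g(x_i) p(x_i)
print(expected_earnings)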



Variance

The mean, or expected value, of a random variable X describes where the probability distribution is centered.

The mean does not give an adequate description of the shape of the distribution.
We also need to characterize the variability (dispersion) in the distribution.
This dispersion is characterized by the variance.



Variance

Definition
Let X be a random variable with probability distribution $p_X(x_i)$ or $f(x)$ and mean $\mu_X$. The variance of X is

$$\sigma_X^2 = E[(X - \mu_X)^2] = \sum_i (x_i - \mu_X)^2\, p_X(x_i) \quad \text{for discrete } X$$

$$\sigma_X^2 = E[(X - \mu_X)^2] = \int_{-\infty}^{\infty} (x - \mu_X)^2 f(x)\, dx \quad \text{for continuous } X$$

Alternative way
An alternative and preferred formula for finding the variance is

$$\sigma_X^2 = E[X^2] - \mu_X^2$$
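
A small Python sketch (added here; the pmf is an assumption) that computes the variance both ways, together with the standard deviation and coefficient of variation defined on the next slide:

import numpy as np

x = np.array([0, 1, 2, 3])
p = np.array([0.1, 0.3, 0.4, 0.2])        # assumed pmf

mu = np.sum(x * p)                        # E(X)
var_def = np.sum((x - mu)**2 * p)         # E[(X - mu)^2]
var_alt = np.sum(x**2 * p) - mu**2        # E[X^2] - mu^2 (same value)

std = np.sqrt(var_def)                    # standard deviation
cv = std / mu                             # coefficient of variation
print(var_def, var_alt, std, cv)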



Standard Deviation and Coefficient of Variation

Definition (STD)
The positive square root of the variance, $\sigma_X$, is called the standard deviation of X:

$$\sigma_X = \sqrt{\sigma_X^2}$$

Definition (CV)
The ratio of the standard deviation to the mean is called the coefficient of variation of X:

$$C_{v,X} = \frac{\sigma_X}{\mu_X}$$



Variance

Definition
For a new variable Y = g(X) the variance is defined as follows:

$$\sigma_{g(X)}^2 = E[(g(X) - \mu_{g(X)})^2] = \sum_i (g(x_i) - \mu_{g(X)})^2\, p_X(x_i) \quad \text{for discrete } X$$

$$\sigma_{g(X)}^2 = E[(g(X) - \mu_{g(X)})^2] = \int_{-\infty}^{\infty} (g(x) - \mu_{g(X)})^2 f(x)\, dx \quad \text{for continuous } X$$



Joint Probability Distributions

The previous discussions apply to one-dimensional random variables.

Let X and Y be two random variables.
The probability distribution for their simultaneous occurrence may be of interest.
It is represented by a function with values p(x, y) or f(x, y) for any pair of values (x, y).
p(x, y) or f(x, y) is called the joint probability distribution of X and Y.



Joint Probability Distributions

Definition (Discrete case)
The function p(x_i, y_j) is a joint probability distribution, or probability mass function, of the discrete random variables X and Y if
1. $p(x_i, y_j) \ge 0$ for all $(x_i, y_j)$
2. $\sum_i \sum_j p(x_i, y_j) = 1$
3. $P(X = x_i, Y = y_j) = p(x_i, y_j)$

Definition (Continuous case)
The function f(x, y) is a joint density function of the continuous random variables X and Y if
1. $f(x, y) \ge 0$ for all $(x, y)$
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1$
3. $P(a \le X \le b,\ c \le Y \le d) = \int_c^d \int_a^b f(x, y)\, dx\, dy$
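
To make the discrete conditions concrete, the short Python sketch below (added here; the joint pmf table is an assumption) checks nonnegativity and normalization and sums the pmf over a region:

import numpy as np

# Assumed joint pmf table: rows index x in {0, 1, 2}, columns index y in {0, 1, 2}
p = np.array([[0.10, 0.15, 0.05],
              [0.20, 0.10, 0.10],
              [0.10, 0.15, 0.05]])

assert np.all(p >= 0)                # condition 1: nonnegativity
assert np.isclose(p.sum(), 1.0)      # condition 2: total probability is 1

# P(X + Y <= 1): sum p(x, y) over the region {(x, y) : x + y <= 1}
prob = sum(p[i, j] for i in range(3) for j in range(3) if i + j <= 1)
print(prob)                          # 0.45 for this table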



Example

Example (Discrete Case)


Two ballpoint pens are selected at random from a box that contains 3
blue pens, 2 red pens, and 3 green pens. If X is the number of blue
pens selected and Y is the number of red pens selected, find
(a) the joint probability function p(x_i, y_j)
(b) P[(X, Y) ∈ A], where A is the region {(x, y) : x + y ≤ 1}
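
A sketch of one way to set up the computation (added here; the counting formula follows from choosing x blue, y red, and 2 − x − y green pens out of the 8 pens):

from math import comb

# p(x, y) = C(3, x) C(2, y) C(3, 2 - x - y) / C(8, 2)
def p(x, y):
    if x < 0 or y < 0 or x + y > 2:
        return 0.0
    return comb(3, x) * comb(2, y) * comb(3, 2 - x - y) / comb(8, 2)

# (b) P[(X, Y) in A] with A = {(x, y) : x + y <= 1}
prob_A = sum(p(x, y) for x in range(3) for y in range(3) if x + y <= 1)
print(prob_A)   # 9/14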



Joint Probability Distributions - Expected Value of a Function

Let X and Y be random variables with joint probability distribution p(x_i, y_j). The mean, or expected value, of the random variable g(X, Y) is

$$\mu_{g(X,Y)} = E[g(X, Y)] = \sum_{x_i} \sum_{y_j} g(x_i, y_j)\, p(x_i, y_j)$$

if X and Y are discrete, and

$$\mu_{g(X,Y)} = E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y)\, dx\, dy$$

if X and Y are continuous.



Marginal Distributions

Definition (Marginal Distributions)
The marginal distributions of X alone and of Y alone are

$$g(x_i) = \sum_{y_j} p(x_i, y_j) \quad \text{and} \quad h(y_j) = \sum_{x_i} p(x_i, y_j)$$

for the discrete case, and

$$g(x) = \int_{-\infty}^{\infty} f(x, y)\, dy \quad \text{and} \quad h(y) = \int_{-\infty}^{\infty} f(x, y)\, dx$$

for the continuous case.


The term marginal is used here because, in the discrete case, the values
of g(x) and h(y) are just the marginal totals of the respective columns
and rows when the values of p(xi , yj ) are displayed in a rectangular
table.
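
In code, these marginal totals are just row and column sums of the joint pmf table; a minimal Python sketch (added here; the table is an assumption):

import numpy as np

p = np.array([[0.10, 0.15, 0.05],     # assumed joint pmf, rows = x values, cols = y values
              [0.20, 0.10, 0.10],
              [0.10, 0.15, 0.05]])

g = p.sum(axis=1)    # marginal of X: g(x_i) = sum_j p(x_i, y_j)  (row totals)
h = p.sum(axis=0)    # marginal of Y: h(y_j) = sum_i p(x_i, y_j)  (column totals)
print(g, h)          # each marginal sums to 1
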
Marginal Distributions - Expected Value
In calculating E(X) over a two-dimensional space, one may use either
the joint probability distribution of X and Y or the marginal
distribution of X:
$$E(X) = \sum_{x_i} \sum_{y_j} x_i\, p(x_i, y_j) = \sum_{x_i} x_i\, g(x_i) \quad \text{(discrete)}$$

$$E(X) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x, y)\, dy\, dx = \int_{-\infty}^{\infty} x g(x)\, dx \quad \text{(continuous)}$$

Similarly, we define

$$E(Y) = \sum_{y_j} \sum_{x_i} y_j\, p(x_i, y_j) = \sum_{y_j} y_j\, h(y_j) \quad \text{(discrete)}$$

$$E(Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y f(x, y)\, dx\, dy = \int_{-\infty}^{\infty} y h(y)\, dy \quad \text{(continuous)}$$

where h(y) is the marginal distribution of the random variable Y .


Conditional Probability Distribution

Definition (Conditional Distribution)
Let X and Y be two random variables, discrete or continuous. The conditional distribution of the random variable Y given that X = x is

$$f(y \mid x) = \frac{f(x, y)}{g(x)}, \quad \text{provided } g(x) > 0$$

Similarly, the conditional distribution of X given that Y = y is

$$f(x \mid y) = \frac{f(x, y)}{h(y)}, \quad \text{provided } h(y) > 0$$
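
A short Python sketch of the discrete analogue (added here; the joint pmf table is an assumption): the conditional pmf of Y given X = x is the x-th row of the table divided by the marginal g(x).

import numpy as np

p = np.array([[0.10, 0.15, 0.05],     # assumed joint pmf, rows = x values, cols = y values
              [0.20, 0.10, 0.10],
              [0.10, 0.15, 0.05]])

g = p.sum(axis=1)                          # marginal of X
x_index = 1                                # condition on the second x value
p_y_given_x = p[x_index, :] / g[x_index]   # f(y | x) = f(x, y) / g(x)
print(p_y_given_x, p_y_given_x.sum())      # conditional pmf sums to 1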



Statistical Independence

Definition
Let X and Y be two random variables, discrete or continuous, with
joint probability distribution f (x, y) and marginal distributions g(x)
and h(y), respectively. The random variables X and Y are said to be
statistically independent if and only if

f (x, y) = g(x)h(y)

for all (x, y) within their range
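
A quick numerical check of this condition (added here; the joint pmf is an assumption constructed to be independent): X and Y are independent exactly when the joint table equals the outer product of its marginals.

import numpy as np

p = np.array([[0.12, 0.18, 0.10],      # assumed joint pmf
              [0.18, 0.27, 0.15]])

g = p.sum(axis=1)                       # marginal of X
h = p.sum(axis=0)                       # marginal of Y
print(np.allclose(p, np.outer(g, h)))   # True: f(x, y) = g(x) h(y) for all (x, y)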



Covariance
The covariance between two random variables is a measure of the
nature of the association between the two.

Definition (Discrete Case)
Let X and Y be random variables with joint probability distribution p(x_i, y_j). The covariance of X and Y is

$$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = \sum_i \sum_j (x_i - \mu_X)(y_j - \mu_Y)\, p(x_i, y_j)$$

Definition (Continuous Case)
Let X and Y be random variables with joint density function f(x, y). The covariance of X and Y is

$$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) f(x, y)\, dx\, dy$$
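
A minimal Python sketch of the discrete definition (added here; the joint pmf table and its support are assumptions):

import numpy as np

x = np.array([0, 1, 2])
y = np.array([0, 1, 2])
p = np.array([[0.20, 0.10, 0.00],      # assumed joint pmf, rows = x values, cols = y values
              [0.10, 0.20, 0.10],
              [0.00, 0.10, 0.20]])

mu_x = np.sum(x * p.sum(axis=1))        # E(X) from the marginal of X
mu_y = np.sum(y * p.sum(axis=0))        # E(Y) from the marginal of Y
cov = sum((x[i] - mu_x) * (y[j] - mu_y) * p[i, j]
          for i in range(3) for j in range(3))
print(cov)                              # 0.4 for this table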



Covariance

Alternative way
An alternative and preferred formula for the covariance is

$$\sigma_{XY} = E(XY) - \mu_X \mu_Y$$

The derivation is similar to the one for the variance of a single random variable.


Try yourself!



Correlation coefficient

The covariance provides information about the nature of the relationship between X and Y.
The magnitude of the covariance, however, does not by itself indicate the strength of the relationship:
it is not dimensionless
its magnitude depends on the units chosen for X and Y
The dimensionless version of the covariance is called the correlation coefficient:

$$\rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}$$



Means and Variances of Linear Combinations of
Random Variables



Chebyshew’s Theorem
Variance of a random variable tells us something about the
variability of the observations around the mean
if σ is large, we expect the area under a probability density
function to be more spread out
The Russian mathematician P. L. Chebyshev (1821–1894)
discovered that the fraction of the area between any two values
symmetric about the mean is related to the standard deviation.

Definition (Chebyshev’s Theorem)


The probability that any random variable X will assume a value within
k standard deviations of the mean is at least 1 − 1/k 2 . That is
1
P (µX − kσX < X < µX + kσX ) ≥ 1 −
k2
where k is a strictly positive real number.



Example

For k = 2:
X has a probability of at least 3/4 of falling within two standard deviations of the mean.
Alternatively, 3/4 or more of the observations of any distribution lie in the interval $\mu_X \pm 2\sigma_X$.
For k = 3:
At least eight-ninths of the observations of any distribution fall in the interval $\mu_X \pm 3\sigma_X$.

Example
A random variable X has a mean of 8, a variance of 9, and an unknown probability distribution. Find P(−4 < X < 20) and P(|X − 8| ≥ 6).
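
A worked solution (added here; it follows directly from the theorem with $\mu_X = 8$ and $\sigma_X = \sqrt{9} = 3$):

The interval (−4, 20) is $\mu_X \pm k\sigma_X$ with $k\sigma_X = 12$, i.e. $k = 4$, so

$$P(-4 < X < 20) \ge 1 - \frac{1}{4^2} = \frac{15}{16}$$

The event $|X - 8| \ge 6$ means X falls at least $k = 2$ standard deviations from the mean, so

$$P(|X - 8| \ge 6) \le \frac{1}{2^2} = \frac{1}{4}$$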



End of Lesson

Questions?

