Assignment
Definition:
Let X and Y be two rrvs on probability space (Ω, A, P). A two-dimensional random
variable (X, Y) is a function (X, Y) : Ω → R², such that for any numbers x, y ∈ R:
{ω ∈ Ω | X(ω) ≤ x and Y(ω) ≤ y} ∈ A
A bivariate probability is the probability that a certain event occurs when two random
variables are involved in the scenario.
        Y = 1   Y = 2   Y = 3
X = 0   0.184   0.216   0.238
X = 1   0.133   0.106   0.123
Table No. 2: Joint probability mass function pXY(x, y)
Properties
Property 1.1. Let X and Y be two discrete rrvs on probability space (Ω, A, P) with joint pmf pXY,
then the following holds:
• pXY(x, y) ≥ 0, for x ∈ X(Ω) and y ∈ Y(Ω)
• Σx∈X(Ω) Σy∈Y(Ω) pXY(x, y) = 1
Property 1.2. Let X and Y be two discrete rrvs on probability space (Ω, A, P)
with joint pmf pXY. Then, for any subset S ⊆ X(Ω) × Y(Ω):
P((X, Y) ∈ S) = Σ(x,y)∈S pXY(x, y)
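Properties 1.1 and 1.2 can be checked numerically against the pmf of Table No. 2. This is a small illustrative sketch; the subset S below is an arbitrary choice, not from the text.

```python
# Joint pmf from Table No. 2.
p = {(0, 1): 0.184, (0, 2): 0.216, (0, 3): 0.238,
     (1, 1): 0.133, (1, 2): 0.106, (1, 3): 0.123}

# Property 1.1: nonnegativity and normalization.
assert all(v >= 0 for v in p.values())
assert abs(sum(p.values()) - 1.0) < 1e-12

# Property 1.2: P((X, Y) in S) is the sum of the pmf over S.
S = {(0, 1), (1, 3)}  # an arbitrary subset chosen for illustration
prob_S = sum(p[xy] for xy in S)
print(round(prob_S, 3))  # 0.184 + 0.123 = 0.307
```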
Example 2.
Consider the following joint probability mass function: pXY(x, y) = (xy²/13) 1S(x, y),
with S = {(1, 1), (1, 2), (2, 2)}.
Show that pXY is a valid joint probability mass function.
Answer.
The joint probability mass function of X and Y is given by the following table :
        Y = 1   Y = 2
X = 1   0.077   0.307
X = 2   0       0.615
All values are nonnegative and their sum is 1 (1/13 + 4/13 + 8/13 = 13/13), hence pXY is
indeed a valid joint probability mass function.
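The validity check can be reproduced exactly with fractions rather than rounded decimals; a minimal sketch:

```python
from fractions import Fraction

# pXY(x, y) = x*y^2 / 13 on S = {(1, 1), (1, 2), (2, 2)}, zero elsewhere.
S = [(1, 1), (1, 2), (2, 2)]
p = {(x, y): Fraction(x * y**2, 13) for (x, y) in S}

assert all(v >= 0 for v in p.values())  # nonnegativity
assert sum(p.values()) == 1             # 1/13 + 4/13 + 8/13 = 1, exactly
print({k: str(v) for k, v in p.items()})
```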
Example 3.
What is P(X + Y ≤ 3)?
Answer:
Let us denote by B the set of values such that X + Y ≤ 3:
B = {(1, 1), (1, 2), (2, 1)}
P(X + Y ≤ 3) = pXY(1, 1) + pXY(1, 2) + pXY(2, 1)
= 0.077 + 0.307 + 0
= 0.384
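Working in exact fractions, the event probability of Example 3 comes out as 5/13; a sketch:

```python
from fractions import Fraction

p = {(1, 1): Fraction(1, 13), (1, 2): Fraction(4, 13), (2, 2): Fraction(8, 13)}

# Sum the pmf over B = {(x, y) : x + y <= 3}; pairs outside the support
# (such as (2, 1)) contribute zero.
prob = sum(v for (x, y), v in p.items() if x + y <= 3)
print(prob, float(prob))  # 5/13 ≈ 0.385
```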
Example 4.
Give the marginal probability mass functions of X and Y .
Answer.
The marginal probability mass function of X is given by:
pX(1) = pXY(1, 1) + pXY(1, 2) = 0.077 + 0.307 = 0.384
pX(2) = pXY(2, 1) + pXY(2, 2) = 0 + 0.615 = 0.615
Similarly, the marginal probability mass function of Y is given by:
pY(1) = pXY(1, 1) + pXY(2, 1) = 0.077 + 0 = 0.077
pY(2) = pXY(1, 2) + pXY(2, 2) = 0.307 + 0.615 = 0.922
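The marginals of Example 4 are just row and column sums of the joint pmf; a sketch using exact fractions:

```python
from collections import defaultdict
from fractions import Fraction

p = {(1, 1): Fraction(1, 13), (1, 2): Fraction(4, 13), (2, 2): Fraction(8, 13)}

pX, pY = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), v in p.items():
    pX[x] += v  # sum over y gives the marginal of X
    pY[y] += v  # sum over x gives the marginal of Y

print(dict(pX))  # pX(1) = 5/13, pX(2) = 8/13
print(dict(pY))  # pY(1) = 1/13, pY(2) = 12/13
```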
Example 5.
What are the expected values of X and Y?
Answer.
The expected value of X is given by:
E[X] = 1 · pX(1) + 2 · pX(2) = 1 · (5/13) + 2 · (8/13) = 21/13 ≈ 1.615
Similarly, the expected value of Y is:
E[Y] = 1 · pY(1) + 2 · pY(2) = 1 · (1/13) + 2 · (12/13) = 25/13 ≈ 1.923
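A quick exact check of both expectations, summing x·pXY(x, y) over the joint pmf of Example 2 directly (which equals summing x·pX(x) over the marginal):

```python
from fractions import Fraction

p = {(1, 1): Fraction(1, 13), (1, 2): Fraction(4, 13), (2, 2): Fraction(8, 13)}

EX = sum(x * v for (x, y), v in p.items())
EY = sum(y * v for (x, y), v in p.items())
print(EX, float(EX))  # 21/13 ≈ 1.615
print(EY, float(EY))  # 25/13 ≈ 1.923
```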
Example 6.
Are X and Y independent?
Answer:
It is easy to see that X and Y are not independent, since
pX(2) pY(1) = (8/13)(1/13) = 8/169 ≈ 0.047,
which is not equal to pXY(2, 1) = 0.
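The failing factorization can be exhibited directly; a sketch:

```python
from fractions import Fraction

p = {(1, 1): Fraction(1, 13), (1, 2): Fraction(4, 13), (2, 2): Fraction(8, 13)}
pX = {1: Fraction(5, 13), 2: Fraction(8, 13)}
pY = {1: Fraction(1, 13), 2: Fraction(12, 13)}

# Independence would require pXY(x, y) = pX(x) * pY(y) for every pair.
lhs = p.get((2, 1), Fraction(0))  # pXY(2, 1) = 0
rhs = pX[2] * pY[1]               # 8/169 ≈ 0.047
assert lhs != rhs                 # the factorization fails, so X, Y are dependent
```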
(Joint probability density function). Let X and Y be two continuous rrvs on probability space
(Ω, A, P). X and Y are said to be jointly continuous if there exists a function fXY such that, for
any Borel set B on R²:
P((X, Y) ∈ B) = ∫∫B fXY(x, y) dx dy
Then the function fXY is called the joint probability density function of X and Y.
Let X and Y be two continuous rrvs on probability space (Ω, A, P) with joint pdf fXY, then the
following holds:
• fXY(x, y) ≥ 0, for x, y ∈ R
• ∫−∞..∞ ∫−∞..∞ fXY(x, y) dx dy = 1
(Marginal distributions). Let X and Y be two continuous rrvs on probability space (Ω, A, P) with
joint pdf fXY. Then the pdf of X alone is called the marginal probability density function of X
and is defined by:
fX(x) = ∫−∞..∞ fXY(x, y) dy, for x ∈ R
Similarly, the pdf of Y alone is called the marginal probability density function
of Y and is defined by:
fY(y) = ∫−∞..∞ fXY(x, y) dx, for y ∈ R
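The marginal pdf formulas can be illustrated numerically. The density below, f(x, y) = x + y on the unit square, is a hypothetical example (not from the text); its marginal works out to fX(x) = x + 1/2.

```python
def f(x, y):
    # Hypothetical joint pdf: f(x, y) = x + y on [0, 1]^2 (it integrates to 1).
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def marginal_x(x, n=10_000):
    # Midpoint-rule approximation of the integral of f(x, y) over y.
    h = 1.0 / n
    return sum(f(x, (k + 0.5) * h) for k in range(n)) * h

print(round(marginal_x(0.3), 4))  # fX(0.3) = 0.3 + 0.5 = 0.8
```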
The Correlation Coefficient
Covariance
Definition
(Covariance). Let X and Y be two rrvs on probability space (Ω, A, P). The covariance of
X and Y, denoted by Cov(X, Y), is defined as follows:
Cov(X, Y) = E[(X − E[X])(Y − E[Y])] (15)
upon existence of the above expression.
• If X and Y are discrete rrvs with joint pmf pXY, then the covariance of X
and Y is:
Cov(X, Y) = Σx∈X(Ω) Σy∈Y(Ω) (x − E[X])(y − E[Y]) pXY(x, y) (16)
• If X and Y are continuous rrvs with joint pdf fXY, then the covariance of
X and Y is:
Cov(X, Y) = ∫−∞..∞ ∫−∞..∞ (x − E[X])(y − E[Y]) fXY(x, y) dx dy
Theorem:
Let X and Y be two rrvs on probability space (Ω, A, P). The covariance of X and Y can
be calculated as:
Cov(X, Y ) = E[XY ] − E[X]E[Y ]
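Both the definitional formula (15)/(16) and the shortcut E[XY] − E[X]E[Y] can be evaluated on the pmf of Table No. 2; they agree, as the theorem asserts. A sketch:

```python
p = {(0, 1): 0.184, (0, 2): 0.216, (0, 3): 0.238,
     (1, 1): 0.133, (1, 2): 0.106, (1, 3): 0.123}

EX  = sum(x * v for (x, y), v in p.items())
EY  = sum(y * v for (x, y), v in p.items())
EXY = sum(x * y * v for (x, y), v in p.items())

cov_def   = sum((x - EX) * (y - EY) * v for (x, y), v in p.items())
cov_short = EXY - EX * EY
assert abs(cov_def - cov_short) < 1e-12
print(round(cov_short, 4))  # ≈ -0.0259: a slight negative covariance
```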
Properties of Covariance
Here are some properties of the covariance. For any random variables X and Y , we have :
1. Cov(X, Y) = Cov(Y, X)
2. Cov(X, X) = Var(X)
3. Cov(aX, Y) = a Cov(X, Y) for a ∈ R
Correlation
Definition:
(Correlation). Let X and Y be two rrvs on probability space (Ω, A, P) with
respective standard deviations σX = √Var(X) and σY = √Var(Y). The correlation of X and Y,
denoted by ρXY, is defined as:
ρXY = Cov(X, Y) / (σX σY)
Interpretation of Correlation:
The correlation coefficient of X and Y is interpreted
as follows:
1. If ρXY = 1, then X and Y are perfectly positively linearly correlated.
2. If ρXY = −1, then X and Y are perfectly negatively linearly correlated. X and Y are also said
to be perfectly linearly anti-correlated.
3. If ρXY = 0, then X and Y are completely linearly uncorrelated. That is, X and Y may be
perfectly correlated in some other manner, in a parabolic manner perhaps, but not in a linear
manner.
4. If 0 < ρXY < 1, then X and Y are positively linearly correlated, but not perfectly so.
5. If −1 < ρXY < 0, then X and Y are negatively linearly correlated, but not perfectly so. X and Y
are also said to be linearly anti-correlated.
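For the pmf of Table No. 2, the correlation coefficient can be computed directly; it lands slightly below zero (case 5 above). A sketch:

```python
import math

p = {(0, 1): 0.184, (0, 2): 0.216, (0, 3): 0.238,
     (1, 1): 0.133, (1, 2): 0.106, (1, 3): 0.123}

EX = sum(x * v for (x, y), v in p.items())
EY = sum(y * v for (x, y), v in p.items())
var_x = sum((x - EX) ** 2 * v for (x, y), v in p.items())
var_y = sum((y - EY) ** 2 * v for (x, y), v in p.items())
cov   = sum((x - EX) * (y - EY) * v for (x, y), v in p.items())

rho = cov / (math.sqrt(var_x) * math.sqrt(var_y))
assert -1.0 <= rho <= 1.0
print(round(rho, 3))  # ≈ -0.066: weak negative linear correlation
```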
Theorem:
Let X and Y be two rrvs on probability space (Ω, A, P). If X and Y are independent,
then
Cov(X, Y) = ρXY = 0
Note, however, that the converse of the theorem is not necessarily true! That is, zero correlation
does not imply independence.
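A standard counterexample makes the note concrete: take X uniform on {−1, 0, 1} and Y = X². Y is a deterministic function of X, so they are dependent, yet their covariance (hence correlation) is zero. A sketch in exact arithmetic:

```python
from fractions import Fraction

pX = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}

EX  = sum(x * v for x, v in pX.items())         # E[X]   = 0
EY  = sum(x**2 * v for x, v in pX.items())      # E[X^2] = 2/3
EXY = sum(x * x**2 * v for x, v in pX.items())  # E[X^3] = 0

cov = EXY - EX * EY
assert cov == 0  # zero covariance ...

# ... yet X and Y are not independent:
# pXY(1, 1) = P(X = 1) = 1/3, while pX(1) * pY(1) = (1/3) * (2/3) = 2/9.
assert Fraction(1, 3) != Fraction(1, 3) * Fraction(2, 3)
```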
Conditional Distributions
Let X and Y be two discrete rrvs on probability space (Ω, A, P) with joint pmf pXY and respective
marginal pmfs pX and pY. Then, for y ∈ Y(Ω), the conditional probability mass function of X
given Y = y is defined by:
pX|Y(x|y) = pXY(x, y) / pY(y), for all x ∈ X(Ω)
(Conditional Expectation). Let X and Y be two discrete rrvs on probability space (Ω, A, P). Then,
for y ∈ Y(Ω), the conditional expectation of X given Y = y is defined by:
E[X|Y = y] = Σx∈X(Ω) x pX|Y(x|y)
Properties
(Linearity of conditional expectation). Let X and Y be two discrete rrvs on probability space
(Ω, A, P). If c1, c2 ∈ R and g1 : X(Ω) → R and g2 : X(Ω) → R are piecewise continuous functions,
then, for y ∈ Y(Ω), we have:
E[c1g1(X) + c2g2(X)|Y = y] = c1E[g1(X)|Y = y] + c2E[g2(X)|Y = y]
Let X and Y be two discrete rrvs on probability space (Ω, A, P). If X and Y are independent,
then, for y ∈ Y(Ω), we have:
pX|Y(x|y) = pX(x) for all x ∈ X(Ω) (26)
Similarly, for x ∈ X(Ω), we have:
pY|X(y|x) = pY(y) for all y ∈ Y(Ω)
Let X and Y be two discrete rrvs on probability space (Ω, A, P). If X and Y are independent,
then, for y ∈ Y(Ω), we have:
E[X|Y = y] = E[X] (28)
Similarly, for x ∈ X(Ω), we have:
E[Y|X = x] = E[Y]
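The conditional formulas can be exercised on the pmf of Example 2: conditioning on Y = 2 renormalizes the column y = 2 by pY(2). A sketch:

```python
from fractions import Fraction

p = {(1, 1): Fraction(1, 13), (1, 2): Fraction(4, 13), (2, 2): Fraction(8, 13)}

y = 2
pY_y = sum(v for (x, yy), v in p.items() if yy == y)         # pY(2) = 12/13
cond = {x: v / pY_y for (x, yy), v in p.items() if yy == y}  # pX|Y(x|2)
EX_given_y = sum(x * v for x, v in cond.items())

print(cond)        # pX|Y(1|2) = 1/3, pX|Y(2|2) = 2/3
print(EX_given_y)  # 5/3
```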