
Joint probability distributions:

1) Problem:
Table: Joint probability distribution of X and Y

  X \ Y      30       60       90     P[X=xi]
    0      10/56      0        0       10/56
   30      19/56    18/56      0       37/56
   60        0       6/56     9/56     15/56
   90        0        0       1/56      1/56
 P[Y=yj]   29/56    24/56    10/56

X takes the values 0, 30, 60, 90; Y takes the values 30, 60, 90.

p_ij = P[X=xi, Y=yj] = (i, j)-th cell value

P[X=xi]: Marginal distribution of X

P[Y=yj]: Marginal distribution of Y

Example:
Let xi = 0. Then P[X=0] = P[X=0, Y=30] + P[X=0, Y=60] + P[X=0, Y=90]
= 10/56 + 0 + 0 = 10/56
Q1. Find the following:
P[X=30], P[X=60], P[X=90];
the marginal distribution of Y; the marginal expectation of X; the marginal expectation of Y;
the marginal variance of X; the marginal variance of Y.
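
To check Q1 numerically, here is a minimal Python sketch (not part of the original notes) that computes the marginals, expectations, and variances directly from the joint table, keeping the fractions exact:

```python
from fractions import Fraction as F

# Joint pmf from the table above: {(x, y): P[X=x, Y=y]}
joint = {(0, 30): F(10, 56),
         (30, 30): F(19, 56), (30, 60): F(18, 56),
         (60, 60): F(6, 56),  (60, 90): F(9, 56),
         (90, 90): F(1, 56)}

xs, ys = (0, 30, 60, 90), (30, 60, 90)

# Marginal pmfs: sum the joint pmf over the other variable
fx = {x: sum(joint.get((x, y), F(0)) for y in ys) for x in xs}
fy = {y: sum(joint.get((x, y), F(0)) for x in xs) for y in ys}

# Marginal expectations and variances
EX = sum(x * p for x, p in fx.items())
EY = sum(y * p for y, p in fy.items())
VX = sum(x * x * p for x, p in fx.items()) - EX ** 2
VY = sum(y * y * p for y, p in fy.items()) - EY ** 2
print(fx, fy, EX, EY, VX, VY)
```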

Joint pmf of X and Y:


If the tabular representation of the joint distribution is inadequate (e.g. with an infinite number of values of X or Y or both), then the cell probabilities can be generated from a function f(x,y) = P[X=x, Y=y] such that

$\sum_x \sum_y f(x,y) = 1, \qquad f(x,y) \ge 0$
Example:
1. Is the following function a pmf?

$f(x,y) = \dfrac{xy}{60}, \quad x = 0,1,2,3; \; y = 1,2,3,4$

Solution:
$f(x,y) \ge 0$ for all x, y, and

$\sum_x \sum_y f(x,y) = \frac{1}{60} \sum_x \sum_y xy = \frac{1}{60}(0+1+2+3)(1+2+3+4) = \frac{60}{60} = 1$. Hence it is a pmf.

2. $f(x,y) = p^{x+y} q^{2-x}, \quad x = 0,1; \; y = 0,1,2,\ldots$ (Work yourself. Hint: $p^x q^{1-x}$ is the pmf of the
Bernoulli distribution.)
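
A quick Python check of Example 1 (a sketch, not in the original notes):

```python
from fractions import Fraction as F

# f(x, y) = xy/60 over x = 0..3, y = 1..4
f = {(x, y): F(x * y, 60) for x in range(4) for y in range(1, 5)}

assert all(p >= 0 for p in f.values())   # non-negativity
assert sum(f.values()) == 1              # probabilities sum to one
print("f(x, y) = xy/60 is a valid pmf")
```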
Marginal pmf:
The marginal pmf of Y is obtained from the joint pmf as follows:

$\sum_x f(x,y) = f(y) = P[Y=y]$

Similarly, $f(x)$ can be defined.


Marginal expectation of X:
$E[X] = \sum_x x\, f(x)$, where $f(x)$ is the marginal pmf of X.

Example:
$f(x,y) = p^{x+y} q^{2-x}, \; x = 0,1; \; y = 0,1,2,\ldots$ The marginal distribution of X is given as

$f(x) = \sum_y p^{x+y} q^{2-x} = p^x q^{2-x} \sum_{y=0}^{\infty} p^y = p^x q^{2-x} \cdot \frac{1}{1-p} = \frac{p^x q^{2-x}}{q} = p^x q^{1-x}$,

which is the Bernoulli(p) pmf; hence E[X] = p.
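
A numeric sanity check of this marginal (a sketch with an arbitrary p, truncating the infinite sum over y):

```python
p = 0.3                      # arbitrary illustrative value
q = 1 - p
N = 200                      # truncation point for the geometric tail

for x in (0, 1):
    fx = sum(p ** (x + y) * q ** (2 - x) for y in range(N + 1))
    print(x, fx, p ** x * q ** (1 - x))   # the two values should agree

EX = sum(x * p ** x * q ** (1 - x) for x in (0, 1))
print("E[X] =", EX, "= p")
```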

Conditional Probability:

$P[X=x \mid Y=y] = \dfrac{P[X=x, Y=y]}{P[Y=y]}$

Example: From the above table we observe the following:

P[X=x | Y=30] = 10/29, if x = 0
                19/29, if x = 30
                    0, if x = 60
                    0, if x = 90
Q2. Find the following:
P[X=x|Y=60], P[X=x|Y=90]

(Work out on your own; a numeric check follows the conditional mean and variance definitions below.)

Conditional Mean and Variance:


$E(X \mid Y=y) = m(y) = \sum_x x\, P[X=x \mid Y=y]$

$V(X \mid Y=y) = h(y) = E[X^2 \mid Y=y] - E^2[X \mid Y=y]$
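
To check Q2 and these conditional summaries numerically (a minimal sketch over the joint table, not part of the original notes):

```python
from fractions import Fraction as F

joint = {(0, 30): F(10, 56),
         (30, 30): F(19, 56), (30, 60): F(18, 56),
         (60, 60): F(6, 56),  (60, 90): F(9, 56),
         (90, 90): F(1, 56)}
xs = (0, 30, 60, 90)

for y in (30, 60, 90):
    fy = sum(joint.get((x, y), F(0)) for x in xs)          # P[Y=y]
    cond = {x: joint.get((x, y), F(0)) / fy for x in xs}   # P[X=x | Y=y]
    m = sum(x * p for x, p in cond.items())                # E(X | Y=y)
    h = sum(x * x * p for x, p in cond.items()) - m ** 2   # V(X | Y=y)
    print(y, cond, m, h)
```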

Joint distribution characteristics:

$E(XY) = \sum_x \sum_y x\, y\, P[X=x, Y=y]$

= 0·[30·P[X=0, Y=30] + …] + 30·[30·P[X=30, Y=30] + …] + 60·[60·P[X=60, Y=60] + …] + 90·[90·P[X=90, Y=90] + …]
COVARIANCE:
Motivation: If all the points are distributed over quadrants I and III (i.e. the centred values of X and Y have the same sign), then their product XY > 0. This indicates that X and Y change in the same direction. Similarly, if they are distributed over quadrants II and IV, then they change in opposite directions.
The following figure indicates how X & Y correspond to each other.
Fig: scatter of the centred (X, Y) values across the four quadrants.

If X and Y are centred about their means, then the expectation of the product of the centred variables is known as the covariance.

$\mathrm{Cov}(X, Y) = \sigma_{xy} = \sum_x \sum_y [x - E(X)][y - E(Y)]\, f(x,y) = \sum_x \sum_y xy\, f(x,y) - E(X)E(Y)$

Example:
1. Calculate Cov(X, Y) for the table given above (use EXCEL). (Work on your own; a Python sketch follows the correlation coefficient definition below.)
Correlation Coefficient:
If X and Y are standardized to make them unit free and comparable, then the covariance between the standardized X & Y is called the correlation coefficient ($\rho_{xy}$):

$\rho_{xy} = \dfrac{\mathrm{Cov}(X, Y)}{\sigma_x \sigma_y}$,

where $\sigma_x$ and $\sigma_y$ are the standard deviations of X and Y respectively.
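
A Python sketch for the covariance and correlation of the tabulated X and Y (an alternative to the EXCEL computation suggested above):

```python
from fractions import Fraction as F
from math import sqrt

joint = {(0, 30): F(10, 56),
         (30, 30): F(19, 56), (30, 60): F(18, 56),
         (60, 60): F(6, 56),  (60, 90): F(9, 56),
         (90, 90): F(1, 56)}

EX  = sum(x * p for (x, y), p in joint.items())
EY  = sum(y * p for (x, y), p in joint.items())
EXY = sum(x * y * p for (x, y), p in joint.items())
VX  = sum(x * x * p for (x, y), p in joint.items()) - EX ** 2
VY  = sum(y * y * p for (x, y), p in joint.items()) - EY ** 2

cov = EXY - EX * EY                       # Cov(X, Y) = E(XY) - E(X)E(Y)
rho = float(cov) / sqrt(float(VX * VY))   # correlation coefficient
print("Cov(X, Y) =", cov, " rho_xy =", rho)
```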

Important Results:
a. $\rho_{xy}$ is symmetric

b. $\rho_{xy}$ ranges within [-1, 1]

c. $\rho_{xy}$ measures the strength of the linear relationship

Proof (of c): $\rho_{x,y} = 1$ gives

$\dfrac{X - E[X]}{\sigma_x} = \dfrac{Y - E[Y]}{\sigma_y}$, i.e. $Y = E[Y] + \dfrac{\sigma_y}{\sigma_x}\,(X - E[X])$,

i.e. a linear relationship.
INDEPENDENCE OF RANDOM VARIABLES:
X and Y are independent if
P[X=x, Y=y] = P[X=x] P[Y=y] for all x, y,
i.e. $f(x,y) = f(x)\, f(y)$
Consequences:

$E(XY) = \sum_x \sum_y xy\, P[X=x, Y=y] = \sum_x \sum_y xy\, P[X=x]\, P[Y=y] = \left(\sum_x x\, P[X=x]\right)\left(\sum_y y\, P[Y=y]\right) = E(X)\, E(Y)$

Sum law of expectation:

E(aX + bY) = aE(X) + bE(Y)

Extension to n variables:

$E(C_1 X_1 + C_2 X_2 + C_3 X_3 + \cdots + C_n X_n) = \sum_{i=1}^{n} C_i\, E(X_i)$

H.W

Variance of sum:
V(X+Y) = V(X) + V(Y) + 2 Cov(X, Y)

(H.W)

Extension:

$V(C_1 X_1 + C_2 X_2 + C_3 X_3 + \cdots + C_n X_n) = \sum_{i=1}^{n} C_i^2\, V(X_i) + 2 \sum_{i<j} C_i C_j\, \mathrm{Cov}(X_i, X_j)$

X, Y independent $\implies \mathrm{Cov}(X, Y) = 0$, but not the converse.
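
The variance-of-sum identity can be verified on the joint table (a Python sketch for the H.W., not in the original notes):

```python
from fractions import Fraction as F

joint = {(0, 30): F(10, 56),
         (30, 30): F(19, 56), (30, 60): F(18, 56),
         (60, 60): F(6, 56),  (60, 90): F(9, 56),
         (90, 90): F(1, 56)}

def E(g):
    """Expectation of g(X, Y) under the joint table."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

VX  = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
VY  = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)

# Direct variance of the sum S = X + Y
VS = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2
print(VS == VX + VY + 2 * cov)   # True: V(X+Y) = V(X) + V(Y) + 2 Cov(X, Y)
```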

Continuous case for multiple variables


Joint pdf, marginal pdf and conditional pdfs are defined in a way similar to the joint pmf and its related discrete counterparts. Here the sum is replaced by integration, and the pdf does not itself indicate probability; it is just a function which, integrated over a set A, gives the probability of the event $X \in A$.
Joint pdf of X and Y:
Suppose X & Y are continuous and $A \subseteq \mathbb{R}^2$ is any set. The probability that $(x, y) \in A$ could be obtained by integrating a function f(x,y) such that

$f(x,y) \ge 0 \quad \text{and} \quad \iint_{\mathbb{R}^2} f(x,y)\, dx\, dy = 1$

Similarly other results from discrete distributions can be extended here.
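
For instance, a numerical check that a candidate joint pdf integrates to 1 (a hypothetical example, f(x, y) = x + y on the unit square):

```python
from scipy.integrate import dblquad

# Hypothetical joint pdf: f(x, y) = x + y on [0, 1] x [0, 1], 0 elsewhere
f = lambda y, x: x + y   # dblquad expects the inner variable (y) first

total, err = dblquad(f, 0, 1, lambda x: 0.0, lambda x: 1.0)
print(total)             # ~1.0, and f >= 0 on the square, so f is a valid pdf
```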

Likelihood of independent rvs

Likelihood is the joint pdf/pmf of the variables. For independent $X_1, \ldots, X_n$:

$f(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} f(x_i)$

Function of a Random Variable:

$E[g(X)] = \sum_x g(x)\, P[X=x]$ (discrete case), $\quad E[g(X)] = \int g(x)\, f(x)\, dx$ (continuous case)

$V[g(X)] = E[g^2(X)] - E^2[g(X)]$
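
As an illustration, the likelihood of a hypothetical i.i.d. Bernoulli(p) sample (a sketch, using the pmf $p^x q^{1-x}$ from earlier):

```python
import math

def bernoulli_likelihood(p, data):
    """L(p) = product of p**x * (1-p)**(1-x) over the observed sample."""
    return math.prod(p ** x * (1 - p) ** (1 - x) for x in data)

sample = [1, 0, 1, 1, 0]                  # hypothetical observations
print(bernoulli_likelihood(0.6, sample))  # joint pmf of the sample at p = 0.6
```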
