
Independence of random variables

Let the random variables X and Y have joint p.d.f f(x, y) and marginal
p.d.f.s f(x) and f(y) respectively. The random variables X and Y are said to be
independent if and only if

f(x, y) = f(x) · f(y)

That is, the joint p.d.f is given by the product of the corresponding values
of the marginal distributions.

Example 1
Suppose the joint p.d.f of X and Y is given by

f(x, y) = 12xy(1 − y),  0 < x < 1, 0 < y < 1,
          0,            elsewhere.

Determine whether or not X and Y are independent.

Solution
Find the marginals:

f(x) = ∫_0^1 f(x, y) dy = ∫_0^1 12xy(1 − y) dy = 12 ∫_0^1 (xy − xy^2) dy

     = 12 [xy^2/2 − xy^3/3]_0^1 = 12 [x/2 − x/3] = 12 (x/6) = 2x
f(x) = 2x,  0 < x < 1,
       0,   elsewhere.
Similarly,

f(y) = ∫_0^1 f(x, y) dx = ∫_0^1 12xy(1 − y) dx = 12 ∫_0^1 (xy − xy^2) dx

     = 12 [x^2 y/2 − x^2 y^2/2]_0^1 = 12 [y/2 − y^2/2] = 6y(1 − y)
f(y) = 6y(1 − y),  0 < y < 1,
       0,          elsewhere.

Now, multiplying f(x) and f(y) gives

(2x)(6y(1 − y)) = 12xy(1 − y) = f(x, y).

Therefore, X and Y are independent.
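
The factorisation above can be sanity-checked numerically. The following sketch (plain Python, not part of the derivation; the 9 × 9 grid of interior points is an arbitrary choice) confirms that the joint p.d.f equals the product of the two marginals throughout the unit square:

```python
# Verify f(x, y) = f(x) * f(y) on a grid of interior points of (0,1) x (0,1).

def f_xy(x, y):
    return 12 * x * y * (1 - y)   # joint p.d.f

def f_x(x):
    return 2 * x                  # marginal p.d.f of X

def f_y(y):
    return 6 * y * (1 - y)        # marginal p.d.f of Y

points = [(i / 10, j / 10) for i in range(1, 10) for j in range(1, 10)]
assert all(abs(f_xy(x, y) - f_x(x) * f_y(y)) < 1e-12 for x, y in points)
```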

Bivariate expectation
Let X and Y be jointly distributed random variables with joint p.d.f f(x, y). Let
U(X, Y) be a function of X and Y; then the expected value of U(X, Y) is defined
by:

i) E[U(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} U(x, y) f(x, y) dx dy   (continuous case)

ii) E[U(X, Y)] = Σ_y Σ_x U(x, y) f(x, y)   (discrete case)

iii) E(X^r) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^r f(x, y) dx dy = ∫_{−∞}^{∞} x^r f(x) dx,  the r-th moment of X

iv) E(Y^s) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} y^s f(x, y) dx dy = ∫_{−∞}^{∞} y^s f(y) dy,  the s-th moment of Y

where r and s are non-negative integers.
Therefore the means of X and Y are respectively given by:
μ_x = E(X) = ∫_{−∞}^{∞} x f(x) dx

μ_y = E(Y) = ∫_{−∞}^{∞} y f(y) dy
The joint moments, or raw product moments, of X and Y are defined as

E[X^r Y^s] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^r y^s f(x, y) dx dy.

If r = s = 1, then

E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy.
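
As a concrete illustration of this definition, the raw product moment can be approximated numerically. The sketch below (plain Python; the midpoint Riemann sum and the grid size n = 400 are arbitrary choices) uses the density f(x, y) = 12xy(1 − y) from the earlier example, and also checks that E[XY] = E[X] · E[Y] for that independent pair:

```python
# Approximate E[X^r Y^s] for f(x, y) = 12xy(1 - y) on (0,1) x (0,1)
# with a midpoint Riemann sum.

def joint_moment(r, s, n=400):
    h = 1.0 / n
    mids = [(i + 0.5) * h for i in range(n)]
    return sum((x ** r) * (y ** s) * 12 * x * y * (1 - y) * h * h
               for x in mids for y in mids)

# r = s = 0 recovers the total probability (should be 1).
assert abs(joint_moment(0, 0) - 1.0) < 1e-3
# Since X and Y are independent here, E[XY] = E[X] * E[Y].
assert abs(joint_moment(1, 1) - joint_moment(1, 0) * joint_moment(0, 1)) < 1e-3
```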

Example 1
Let X and Y be jointly distributed random variables with p.d.f

f(x, y) = p^{x+y} (1 − p)^{2−x−y},  x = 0, 1;  y = 0, 1,
          0,                        elsewhere.

Compute: (i) E(X), (ii) E(X + Y), (iii) E(XY).

Solution
(i) Find the marginal distribution of X, i.e. f(x), as

f(x) = Σ_{y=0}^{1} p^{x+y} (1 − p)^{2−x−y}

     = p^x (1 − p)^{2−x} + p^{1+x} (1 − p)^{1−x}

     = p^x (1 − p)^{1−x} [(1 − p) + p] = p^x (1 − p)^{1−x},  x = 0, 1.
Now,

E(X) = Σ_{x=0}^{1} x f(x) = Σ_{x=0}^{1} x p^x (1 − p)^{1−x} = p.

Note also that, by symmetry, E(Y) = p.


(ii) E(X + Y) = Σ_{x=0}^{1} Σ_{y=0}^{1} (x + y) p^{x+y} (1 − p)^{2−x−y}

             = 0 + p(1 − p) + p(1 − p) + 2p^2 = p − p^2 + p − p^2 + 2p^2

             = 2p.

Alternatively, E(X + Y) = E(X) + E(Y) = p + p = 2p.
(iii) E(XY) = Σ_{x=0}^{1} Σ_{y=0}^{1} xy · p^{x+y} (1 − p)^{2−x−y}

            = 0 + 0 + 0 + p^2 = p^2.

Alternatively, since X and Y are independent, E(XY) = E(X) · E(Y) = p × p = p^2.
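
Since X and Y take only the values 0 and 1, all three expectations can be checked by direct enumeration. A short sketch (plain Python; the value p = 0.3 is an arbitrary choice for the check):

```python
# Enumerate the p.m.f f(x, y) = p^(x+y) (1-p)^(2-x-y), x, y in {0, 1},
# and check E(X) = p, E(X + Y) = 2p, E(XY) = p^2.

p = 0.3

def f(x, y):
    return p ** (x + y) * (1 - p) ** (2 - x - y)

E_X   = sum(x * f(x, y) for x in (0, 1) for y in (0, 1))
E_XpY = sum((x + y) * f(x, y) for x in (0, 1) for y in (0, 1))
E_XY  = sum(x * y * f(x, y) for x in (0, 1) for y in (0, 1))

assert abs(E_X - p) < 1e-12          # E(X) = p
assert abs(E_XpY - 2 * p) < 1e-12    # E(X + Y) = 2p
assert abs(E_XY - p * p) < 1e-12     # E(XY) = p^2
```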

Example 2
Suppose the joint p.d.f of X and Y is given as

f(x, y) = 4xy,  0 < x < 1, 0 < y < 1,
          0,    elsewhere.

Compute:(i) E(X) (ii) E(XY ) (iii) E(3X + 2Y )

Solution
(i) E(X) = ∫_0^1 ∫_0^1 x · 4xy dx dy = ∫_0^1 ∫_0^1 4x^2 y dx dy

         = ∫_0^1 [4x^3 y/3]_0^1 dy = ∫_0^1 (4y/3) dy = [4y^2/6]_0^1 = 4/6 = 2/3.
(ii) E(XY) = ∫_0^1 ∫_0^1 xy · 4xy dx dy = ∫_0^1 ∫_0^1 4x^2 y^2 dx dy

           = ∫_0^1 [4x^3 y^2/3]_0^1 dy = ∫_0^1 (4y^2/3) dy = [4y^3/9]_0^1 = 4/9.

(iii) E(3X + 2Y) = ∫_0^1 ∫_0^1 (3x + 2y) · 4xy dx dy = ∫_0^1 ∫_0^1 (12x^2 y + 8xy^2) dx dy

                 = ∫_0^1 [4x^3 y + 4x^2 y^2]_0^1 dy = ∫_0^1 (4y + 4y^2) dy = [2y^2 + 4y^3/3]_0^1

                 = 2 + 4/3 = 10/3.
(Question: Find the marginals of X and Y and show that X and Y are inde-
pendent.)
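
All three expectations in Example 2 can be cross-checked numerically. A sketch (plain Python; midpoint Riemann sums on a 400 × 400 grid, both arbitrary choices, giving accuracy well within 1e-3):

```python
# Approximate E[g(X, Y)] under f(x, y) = 4xy on (0,1) x (0,1)
# with a midpoint Riemann sum, then check the results of Example 2.

n = 400
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]

def E(g):
    """Approximate E[g(X, Y)] for the density f(x, y) = 4xy."""
    return sum(g(x, y) * 4 * x * y * h * h for x in mids for y in mids)

assert abs(E(lambda x, y: x) - 2 / 3) < 1e-3              # E(X) = 2/3
assert abs(E(lambda x, y: x * y) - 4 / 9) < 1e-3          # E(XY) = 4/9
assert abs(E(lambda x, y: 3 * x + 2 * y) - 10 / 3) < 1e-3 # E(3X + 2Y) = 10/3
```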

Exercise
Let X and Y have the bivariate density given by
f(x, y) = 2,  0 < x < y < 1,
          0,  elsewhere.

Compute: (i) f(x), (ii) f(y), (iii) E(X), (iv) E(Y), (v) E(XY).

1 Regression and Correlation


1.1 Calculation of regression and correlation coefficients for bivariate data

2 Distribution of functions of random variables


2.1 Distribution function technique
2.2 Transformation of variables
2.3 Moment generating function technique

3 Bivariate normal distribution
