
PART-A

1. How is interval conditioning different from point conditioning?

The conditional distribution function of a random variable X given an event B is defined as

FX(x | B) = P{X ≤ x | B} = P{(X ≤ x) ∩ B} / P(B)

Point Conditioning: the distribution function of one random variable X conditioned on a second random variable Y at a specific point y. The event B is defined by

B = {y − Δy ≤ Y ≤ y + Δy}

where Δy is a small quantity that tends to 0.

Interval Conditioning: the distribution function of one random variable X conditioned on a second random variable Y over an interval. The event B is defined by

B = {ya ≤ Y ≤ yb}

where ya and yb are real numbers with ya < yb.
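As a quick numerical illustration (a sketch, not part of the original answer), the ratio definition of the conditional CDF under interval conditioning can be checked by Monte Carlo using a hypothetical correlated pair:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical correlated pair: Y depends on X, so conditioning on an
# interval of Y changes the distribution of X.
x = rng.normal(0.0, 1.0, n)
y = x + rng.normal(0.0, 0.5, n)

ya, yb = 0.5, 1.5                      # event B = {ya <= Y <= yb}
in_b = (y >= ya) & (y <= yb)

x0 = 0.0
# F_X(x0 | B) two ways: directly on the conditioned samples, and via the
# defining ratio P({X <= x0} and B) / P(B). They agree by construction.
f_cond_direct = np.mean(x[in_b] <= x0)
f_cond_ratio = np.mean((x <= x0) & in_b) / np.mean(in_b)
f_uncond = np.mean(x <= x0)            # unconditional F_X(x0), about 0.5

print(f_cond_direct, f_cond_ratio, f_uncond)
```

Because Y is positively tied to X here, conditioning on a high-Y interval pulls the conditional CDF of X below its unconditional value at x0 = 0.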

2. Define marginal distribution and marginal density function?

The distribution function of one random variable can be obtained by setting the other variable to infinity in the joint cumulative distribution function FXY(x, y). The functions FX(x) and FY(y) obtained in this manner are called the Marginal Distribution Functions of X and Y respectively.

The marginal distribution function of X is given by

FXY(x, ∞) = P{X ≤ x, Y ≤ ∞} = P{X ≤ x}

FXY(x, ∞) = FX(x)

But the joint CDF is

FXY(x, y) = ∫₋∞^y ∫₋∞^x fXY(x, y) dx dy

The CDF of X alone, independent of Y, can be obtained by setting y = ∞ in the joint CDF:

FX(x) = ∫₋∞^∞ ∫₋∞^x fXY(x, y) dx dy

This is referred to as the Marginal Distribution of X.

Similarly, the marginal distribution function of Y is given by

FXY(∞, y) = P{X ≤ ∞, Y ≤ y} = P{Y ≤ y}

FXY(∞, y) = FY(y)

The CDF of Y alone, independent of X, can be obtained by setting x = ∞ in the joint CDF:

FY(y) = ∫₋∞^y ∫₋∞^∞ fXY(x, y) dx dy

This is referred to as the Marginal Distribution of Y.

The corresponding density of X alone, independent of Y, is

fX(x) = d/dx FX(x) = d/dx [ ∫₋∞^∞ ∫₋∞^x fXY(x, y) dx dy ]

fX(x) = ∫₋∞^∞ fXY(x, y) dy

This is referred to as the Marginal Density of X.

Similarly, the marginal density of Y is given by

fY(y) = ∫₋∞^∞ fXY(x, y) dx

This is referred to as the Marginal Density of Y.
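As a numeric sanity check (a sketch using the joint density f(x, y) = x + y on the unit square, which appears in a Part-B exercise below), the marginal density is obtained by integrating out the other variable:

```python
import numpy as np

def marginal_x(x, n=200_000):
    # f_X(x) = integral over y of f_XY(x, y); here f_XY(x, y) = x + y on
    # 0 <= x, y <= 1, so the integral runs over y in [0, 1].
    ys = (np.arange(n) + 0.5) / n      # midpoint-rule nodes in [0, 1]
    return np.mean(x + ys)             # midpoint-rule integral over y

# Analytically f_X(x) = x + 1/2, so f_X(0.3) = 0.8.
print(marginal_x(0.3))
```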

3. Statistically independent random variables X and Y have moments m10 = 2, m20 = 14, m02 = 12 and m11 = −6. Find µ22.

Ans. Given data:

m10 = E[X] = 2, m20 = E[X²] = 14, m02 = E[Y²] = 12, m11 = E[XY] = −6, µ22 = ?

Since X and Y are statistically independent random variables,

E[XY] = E[X] E[Y]
−6 = 2 E[Y]
⇒ E[Y] = −3

Now,

µ22 = E[(X − mX)² (Y − mY)²]
= E[(X − mX)²] E[(Y − mY)²]    (X and Y are independent RVs)
= (E[X²] − (E[X])²) (E[Y²] − (E[Y])²)
= (14 − 2²) (12 − (−3)²)
= 10 × 3

µ22 = 30
4. State the Central Limit Theorem.

Central Limit Theorem: the central limit theorem states that the probability density of the sum of a large number of independent random variables approaches a Gaussian density function.

If X1, X2, X3, ..., Xn is a sequence of n independent random variables with E[Xi] = mi and Var[Xi] = σi², where i = 1, 2, ..., n, and if Sn = Σᵢ₌₁ⁿ Xi, then as n → ∞, Sn follows a Gaussian distribution with mean m = Σᵢ₌₁ⁿ mi and variance σ² = Σᵢ₌₁ⁿ σi².
5. Write two properties of the joint distribution function of random variables.

(1) The joint CDF is always non-negative and lies between 0 and 1:

0 ≤ FXY(x, y) ≤ 1
FXY(−∞, −∞) = P{X ≤ −∞, Y ≤ −∞} = 0

(2) FXY(x, ∞) = P{X ≤ x, Y ≤ ∞} = P{X ≤ x}

FXY(x, ∞) = FX(x) is called the marginal distribution of X:

FX(x) = ∫₋∞^∞ ∫₋∞^x fXY(x, y) dx dy

FXY(∞, y) = P{X ≤ ∞, Y ≤ y} = P{Y ≤ y}

FXY(∞, y) = FY(y) is called the marginal distribution of Y:

FY(y) = ∫₋∞^y ∫₋∞^∞ fXY(x, y) dx dy

(3) The joint CDF is a non-decreasing function of both x and y: if x1 ≤ x2 and y1 ≤ y2, then

P{x1 < X ≤ x2, y1 < Y ≤ y2} = ∫_y1^y2 ∫_x1^x2 fXY(x, y) dx dy

P{x1 < X ≤ x2, y1 < Y ≤ y2} = FXY(x2, y2) + FXY(x1, y1) − FXY(x1, y2) − FXY(x2, y1) ≥ 0

6. Two statistically independent random variables X and Y have mean values E[X] = 2 and E[Y] = 4, and second moments E[X²] = 8 and E[Y²] = 25. Find the variance of W = 3X − Y.

If X and Y are two statistically independent random variables, then

Var(aX + bY) = a² Var(X) + b² Var(Y)

Var(X) = E[X²] − (E[X])² = 8 − 4 = 4
Var(Y) = E[Y²] − (E[Y])² = 25 − 16 = 9

Var(W) = Var(3X − Y) = 3² Var(X) + (−1)² Var(Y)
= 9 × 4 + 1 × 9
= 45
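The result can be confirmed by simulation (a sketch: any independent X, Y with the stated moments work, and normal samples are one convenient choice):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Independent X, Y with the stated moments: E[X]=2, Var(X)=4; E[Y]=4, Var(Y)=9.
x = rng.normal(2.0, 2.0, n)   # std = 2  ->  variance 4
y = rng.normal(4.0, 3.0, n)   # std = 3  ->  variance 9
w = 3 * x - y

print(w.var())  # should be close to 45
```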

7. Write short notes on Jointly Gaussian random variables.


8. Obtain the expression for the conditional density fX(x | B), where the event B is defined as {ya ≤ Y ≤ yb}.
9. When are N random variables said to be jointly Gaussian?
PART-B

1. Define joint CDF? State and prove its properties?

Let X and Y be two random variables. The joint cumulative distribution function of X and Y is defined as

FXY(x,y) = P(X≤x, Y≤y)

Properties:-

1. FXY(-∞,-∞) = 0

Proof: FXY (x,y) = P(X≤x,Y≤y)

FXY(-∞,-∞) = P(X≤-∞,Y≤-∞)

=0
2. FXY(+∞,+∞) = 1

Proof: FXY(x,y) = P(X≤x,Y≤y)

FXY(+∞,+∞) = P(X≤+∞,Y≤+∞)

=1

3. FXY(x,-∞) = 0

Proof:

The joint CDF of two discrete random variables X and Y is given as

FXY(x,y) = P(X≤x, Y≤y) = Σᵢ₌₁ᵐ Σⱼ₌₁ⁿ P(xi, yj)

where P(xi, yj) is the joint probability, i.e., the probability that X = xi and Y = yj jointly.

Since FY(-∞) = P(Y≤-∞) = 0, i.e., there is no value less than -∞, the joint probability P(X≤x, Y≤-∞) is zero.

Thus, FXY(x,-∞) = 0.

Similarly, FXY(-∞,y) = P(X≤-∞, Y≤y) = 0.

4. The joint CDF FXY(x,y) is a non-decreasing function of both x and y and is also bounded as 0 ≤ FXY(x,y) ≤ 1.

Just as the CDF of a single random variable satisfies 0 ≤ FX(x) ≤ 1, the joint CDF of two random variables FXY(x,y) also lies between 0 and 1.

5. P(x1<X≤x2, y1<Y≤y2) = FXY(x2,y2) + FXY(x1,y1) - FXY(x1,y2) - FXY(x2,y1).

Proof:

In the case of a single random variable, P(x1<X≤x2) = P(X≤x2) - P(X≤x1) = FX(x2) - FX(x1)

Similarly, in the case of two random variables,

P(x1<X≤x2, y1<Y≤y2) = FXY(x2,y2) + FXY(x1,y1) - FXY(x1,y2) - FXY(x2,y1)

Ex: Consider two random variables X and Y, each taking the values 1, 2 and 3. We want to compute P(1<X≤3, 1<Y≤3).

This is equal to P(X=2,Y=2) + P(X=2,Y=3) + P(X=3,Y=2) + P(X=3,Y=3).

Now consider, with (x1,y1) = (1,1) and (x2,y2) = (3,3),

FXY(x2,y2) + FXY(x1,y1) - FXY(x1,y2) - FXY(x2,y1)

= P(X≤3,Y≤3) + P(X≤1,Y≤1) - P(X≤1,Y≤3) - P(X≤3,Y≤1)

= P(1,1) + P(1,2) + P(1,3) + P(2,1) + P(2,2) + P(2,3) + P(3,1) + P(3,2) + P(3,3) + P(1,1) - P(1,1) - P(1,2) - P(1,3) - P(1,1) - P(2,1) - P(3,1)

= P(2,2) + P(2,3) + P(3,2) + P(3,3)

Thus P(1<X≤3, 1<Y≤3) = FXY(3,3) + FXY(1,1) - FXY(1,3) - FXY(3,1)
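The rectangle identity can be verified directly on a small example (a sketch: a hypothetical uniform pmf on {1,2,3} × {1,2,3} is assumed, since the text leaves the pmf abstract):

```python
import numpy as np

# Hypothetical uniform joint pmf on {1, 2, 3} x {1, 2, 3}: P(x, y) = 1/9.
p = np.full((3, 3), 1.0 / 9.0)

def F(x, y):
    # Joint CDF: sum the pmf over all values <= (x, y); values start at 1,
    # so row/column index i corresponds to the value i + 1.
    return p[:x, :y].sum()

lhs = p[1:3, 1:3].sum()                       # P(1 < X <= 3, 1 < Y <= 3)
rhs = F(3, 3) + F(1, 1) - F(1, 3) - F(3, 1)   # CDF rectangle formula

print(lhs, rhs)  # both 4/9
```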

6. Marginal Distribution Functions

FXY(x,∞) = FX(x) and FXY(∞,y) = FY(y)

Proof:

FXY(x,∞) = P(X≤x, Y≤∞)

Since (Y≤∞) is a certain event, P(X≤x, Y≤∞) = P(X≤x) = FX(x).

Similarly, since (X≤∞) is a certain event, FXY(∞,y) = P(X≤∞, Y≤y) = P(Y≤y) = FY(y).

Thus, to find the individual CDF of a random variable from the joint CDF, set the other variable to ∞. The individual CDF thus obtained is referred to as the marginal CDF of that random variable. FX(x), obtained from FXY(x,y) by setting y to ∞, i.e., FXY(x,∞), is called the marginal CDF of X, computed independent of Y; similarly for FY(y).

2. The joint probability density function f(x,y) is given by f(x,y) = 8xy, 0≤x≤1, 0≤y≤x.
(a) Find the marginal densities of X and Y.
(b) Find the conditional density functions of X and Y.
(c) Verify whether X and Y are independent.
3. The joint probability function of two discrete random variables X and Y is given by f(x,y) = cxy for x = 1,2,3 and y = 1,2,3, and equals zero otherwise. Find (a) the constant c (b) P(X=2,Y=3) (c) P(1≤X≤2, Y≤2) (d) P(X≥2) (e) P(Y=3).
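For part (a), a brief sketch of the normalization step: summing c·x·y over the nine support points and forcing the total to 1 gives c, from which part (b) follows immediately.

```python
# f(x, y) = c*x*y on x, y in {1, 2, 3}; the probabilities must sum to 1.
total = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))   # (1+2+3)^2 = 36
c = 1 / total

print(c)           # 1/36
print(c * 2 * 3)   # part (b): P(X = 2, Y = 3) = 6/36 = 1/6
```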

4. Derive an expression for the probability density of the sum of two random variables.


5. Explain the joint moments of random variables?
6. Let X and Y be independent random variables such that
X = 1 with prob. 1/3, X = 0 with prob. 2/3;
Y = 2 with prob. 1/3, Y = -3 with prob. 1/4.
Find (a) E(3X+2Y) (b) E(2X²-Y²) (c) E(XY)
7. A random variable X has X̄ = -3, E[X²] = 11 and σX² = 2. For a new random variable Y = 2X - 3, find: (i) Ȳ (ii) E[Y²] (iii) σY².

Given X̄ = -3, E[X²] = 11, σX² = 2 and Y = 2X - 3.

(i) Ȳ = E[Y] = E[2X - 3] = 2E[X] - 3 = 2(-3) - 3

Ȳ = -9

(ii) E[Y²] = E[(2X - 3)²] = E[4X² - 12X + 9]
= 4E[X²] - 12E[X] + 9
= 4(11) - 12(-3) + 9
= 44 + 36 + 9
= 89

(iii) σY² = E[Y²] - (E[Y])²
= 89 - (-9)²
= 89 - 81

σY² = 8
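A Monte Carlo check of these three answers (a sketch: any X with mean -3 and variance 2, hence E[X²] = 11, will do; a normal sample is one convenient choice):

```python
import numpy as np

rng = np.random.default_rng(3)

# X with mean -3 and variance 2, so E[X^2] = 2 + (-3)^2 = 11.
x = rng.normal(-3.0, np.sqrt(2.0), 1_000_000)
y = 2 * x - 3

print(y.mean())         # close to -9
print((y ** 2).mean())  # close to 89
print(y.var())          # close to 8
```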

8. Two random variables Y1 and Y2 are related to arbitrary random variables X and Y by the co-ordinate rotation Y1 = X cosθ + Y sinθ, Y2 = -X sinθ + Y cosθ.
i) Find the covariance of Y1 and Y2. ii) For what value of θ are the random variables Y1 and Y2 uncorrelated?
9. Define joint characteristic function?
10.Explain jointly Gaussian density function for two random variables X and Y.
11. Let X and Y be two random variables having the joint density function
f(x,y) = x+y for 0≤x≤1, 0≤y≤1
= 0 otherwise
Find (a) Var(X) (b) Var(Y) (c) Cov(X,Y) (d) the correlation coefficient ρ
12. Two random variables X and Y have the joint characteristic function
ΦX,Y(ω1, ω2) = exp(-2ω1² - ω2²). Show that X and Y are zero-mean random variables and are uncorrelated.
13. The joint probability density function f(x,y) is given by f(x,y) = Ae^-(x+y), 0≤x≤y, 0≤y≤∞.
(a) Find the value of A.
(b) Find the marginal densities of X and Y.
(c) Verify whether X and Y are independent.
14. (a) Write about the joint distribution function of two random variables. Discuss its properties.
(b) Explain the conditional distribution function of two random variables.
(c) The joint probability density of two random variables X and Y is given as
f(x,y) = C(2x+y), 0≤x≤1, 0≤y≤2
= 0 elsewhere
i) Find the value of C. ii) Find the marginal density functions of X and Y.
15. a) Explain the conditional density function of two random variables.

b) The characteristic function for a Gaussian random variable X having mean value 0 is Φx(ω) = exp(-σX²ω²/2). Find all the moments of X.

[UNIT-2 QUESTION]
