
# PTSP‐UNIT IV    Questions & Answers

GRIET‐ECE
UNIT 4
1. Define joint distribution and joint probability density function for the two
random variables X and Y.
Ans:
Let F_X(x) and F_Y(y) represent the probability distribution functions of two random variables X and Y respectively:

F_X(x) = P(X ≤ x)   and   F_Y(y) = P(Y ≤ y)

The probability of the joint event {X ≤ x, Y ≤ y} is the joint probability distribution function F_{X,Y}(x, y), defined as

F_{X,Y}(x, y) = P{X ≤ x, Y ≤ y}

The joint probability density function f_{X,Y}(x, y) is defined as the second mixed partial derivative of the joint probability distribution function:

f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / (∂x ∂y)
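This second-derivative relation can be checked numerically. The sketch below is purely illustrative; the choice of two independent unit-rate exponential variables, and all function names, are assumptions, not part of the question:

```python
import math

def F(x, y):
    """Joint CDF of two independent unit-rate exponentials (assumed example)."""
    return (1.0 - math.exp(-x)) * (1.0 - math.exp(-y))

def f(x, y):
    """Corresponding joint pdf: e^{-(x+y)} for x, y > 0."""
    return math.exp(-(x + y))

def mixed_second_difference(F, x, y, h=1e-4):
    """Numerical approximation of d^2 F / (dx dy) by a central mixed difference."""
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)

# the mixed difference of F should be very close to f
print(mixed_second_difference(F, 0.7, 1.3))
print(f(0.7, 1.3))
```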
2. A joint probability density function is
f(x, y) = 1/(ab)  for 0 < x < a, 0 < y < b
        = 0       elsewhere.
Find the joint probability distribution function.
Ans:
f(x, y) = 1/(ab)  for 0 < x < a, 0 < y < b
        = 0       elsewhere

F(x, y) = ∫₀^y ∫₀^x f(x, y) dx dy = ∫₀^y ∫₀^x (1/(ab)) dx dy

        = ∫₀^y (x/(ab)) dy = xy/(ab)

∴ F(x, y) = 0         for x < 0 or y < 0
          = xy/(ab)   for 0 < x < a, 0 < y < b
          = 1         for x ≥ a, y ≥ b
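The closed form F(x, y) = xy/(ab) can be verified with a crude Riemann sum; the values of a, b, x, y below are assumed for illustration only:

```python
a, b = 2.0, 3.0          # assumed rectangle dimensions
x, y = 1.2, 2.1          # point at which to evaluate F(x, y)
n = 400                  # grid resolution for the Riemann sum

# F(x, y) is the integral of the constant density 1/(a*b) over [0, x] x [0, y]
dx, dy = x / n, y / n
riemann = sum(1.0 / (a * b) * dx * dy for i in range(n) for j in range(n))

closed_form = x * y / (a * b)
print(riemann, closed_form)   # both ~0.42
```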
3. (a) Let X and Y be random variables with the joint density function
f(x, y) = 2  for 0 < x < y < 1.
Find the marginal and conditional density functions.
(b) Distinguish between joint distribution and marginal distribution.
Ans:
(a) f_X(x) = ∫ₓ¹ 2 dy = 2(1 − x)   for 0 < x < 1

f_Y(y) = ∫₀^y 2 dx = 2y   for 0 < y < 1

f(x/y) = f(x, y)/f_Y(y) = 2/(2y) = 1/y   for 0 < x < y

f(y/x) = f(x, y)/f_X(x) = 2/(2(1 − x)) = 1/(1 − x)   for x < y < 1

(b) If (X, Y) is a two-dimensional discrete random variable such that P(X = xᵢ, Y = yⱼ) = Pᵢⱼ, then Pᵢⱼ is called the probability mass function (or probability function) of (X, Y), provided the following conditions are satisfied:
1. Pᵢⱼ ≥ 0 for all i and j
2. Σᵢ Σⱼ Pᵢⱼ = 1

The set of triples {(xᵢ, yⱼ, Pᵢⱼ)}, i = 1, 2, …, m, j = 1, 2, …, n is called the joint probability distribution of (X, Y).

P(X = xᵢ) = Σⱼ Pᵢⱼ is called the marginal probability function of X. It is defined for X = x₁, x₂, … and denoted Pᵢ.

The collection of pairs {(xᵢ, Pᵢ)}, i = 1, 2, 3, … is called the marginal probability distribution of X.
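As a numerical sanity check for part (a), each marginal density should integrate to 1, and so should the conditional density over its support. A small sketch (the function names and the midpoint-rule helper are assumptions):

```python
def f_X(x):      # marginal of X: 2(1 - x) on (0, 1)
    return 2.0 * (1.0 - x)

def f_Y(y):      # marginal of Y: 2y on (0, 1)
    return 2.0 * y

def integrate(g, lo, hi, n=100000):
    """Simple midpoint rule."""
    h = (hi - lo) / n
    return sum(g(lo + (k + 0.5) * h) for k in range(n)) * h

print(integrate(f_X, 0, 1))   # ~1.0
print(integrate(f_Y, 0, 1))   # ~1.0

# conditional f(x/y) = 2/(2y) = 1/y should integrate to 1 over 0 < x < y
y = 0.6
print(integrate(lambda x: 1.0 / y, 0, y))   # ~1.0
```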

www.jntuworld.com
www.jntuworld.com
GRIET‐ECE    3

4. Distinguish between point conditioning and interval conditioning.
Ans:
We know that the conditional probability of A given B is written as

P(A/B) = P(A ∩ B) / P(B)

This concept can equally well be applied to the distribution function. Let A be the event (X ≤ x) for the random variable X. If B is given, the conditional distribution function of the random variable X is

F_X(x/B) = P(X ≤ x / B) = P{(X ≤ x) ∩ B} / P(B)

The event B can be defined from some characteristic of the physical experiment. It may be defined in terms of the random variable X or of some random variable other than X. If the event B is defined in terms of a second random variable Y, i.e., if the random variable X is conditioned by Y where (Y ≤ y), this is called point conditioning:

F_{X/Y}(x/y) = P{(X ≤ x), (Y ≤ y)} / P(Y ≤ y),   provided P(Y ≤ y) ≠ 0

If the random variable Y is defined in such a way that its value lies between two constants y_a and y_b, i.e., (y_a ≤ Y ≤ y_b), then this is called interval conditioning:

F_X(x / y_a ≤ Y ≤ y_b) = [F_{X,Y}(x, y_b) − F_{X,Y}(x, y_a)] / [F_Y(y_b) − F_Y(y_a)]

The corresponding conditional density function is given by

f_{X/Y}(x/y) = ∂[F_{X/Y}(x/y)] / ∂x = f_{XY}(x, y) / f_Y(y),   f_Y(y) > 0

If X is given,

f_{Y/X}(y/x) = ∂[F_{Y/X}(y/x)] / ∂y = f_{XY}(x, y) / f_X(x),   f_X(x) > 0
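Interval conditioning can be illustrated numerically. The sketch below assumes, purely as an example, a joint density of two independent unit exponentials; since X and Y are independent there, conditioning on the interval for Y should leave the distribution of X unchanged:

```python
import math

def f_joint(x, y):
    # assumed example joint density: two independent unit exponentials
    return math.exp(-(x + y)) if x > 0 and y > 0 else 0.0

def integrate2(g, x_hi, y_lo, y_hi, n=300):
    """Midpoint-rule double integral over (0, x_hi) x (y_lo, y_hi)."""
    hx, hy = x_hi / n, (y_hi - y_lo) / n
    return sum(g((i + 0.5) * hx, y_lo + (j + 0.5) * hy)
               for i in range(n) for j in range(n)) * hx * hy

y_a, y_b = 0.5, 1.5
p_B = math.exp(-y_a) - math.exp(-y_b)     # P(y_a <= Y <= y_b) for exponential Y

x = 1.0
F_cond = integrate2(f_joint, x, y_a, y_b) / p_B   # F_X(x | y_a <= Y <= y_b)

# independence means conditioning changes nothing: compare to F_X(x) = 1 - e^{-x}
print(F_cond, 1 - math.exp(-x))
```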
5. Let X and Y be jointly continuous random variables with joint density function
f(x, y) = xy exp[−(x² + y²)/2]  for x > 0, y > 0
        = 0 otherwise.
Check whether X and Y are independent. Find P(X ≤ 1, Y ≤ 1).
Ans:
f(x, y) = xy exp[−(x² + y²)/2]  for x > 0, y > 0
        = 0 otherwise

f_X(x) = ∫₀^∞ f(x, y) dy = x e^{−x²/2} ∫₀^∞ y e^{−y²/2} dy

Let y²/2 = t ⇒ y dy = dt

∴ f_X(x) = x e^{−x²/2} ∫₀^∞ e^{−t} dt = x e^{−x²/2}

Similarly, f_Y(y) = ∫₀^∞ f(x, y) dx = y e^{−y²/2}

Since f(x, y) = f_X(x) · f_Y(y), X and Y are independent.

P(X ≤ 1, Y ≤ 1) = ∫₀¹ ∫₀¹ f(x, y) dx dy
                = ∫₀¹ f_X(x) dx · ∫₀¹ f_Y(y) dy
                = ∫₀¹ x e^{−x²/2} dx · ∫₀¹ y e^{−y²/2} dy

Consider ∫₀¹ x e^{−x²/2} dx. Let x²/2 = t ⇒ x dx = dt, so the limits become
t = 0 (lower) and t = 1/2 (upper).

∫₀¹ x e^{−x²/2} dx = ∫₀^{1/2} e^{−t} dt = [−e^{−t}]₀^{1/2} = 1 − e^{−1/2}

Similarly, ∫₀¹ y e^{−y²/2} dy = 1 − e^{−1/2}

∴ P(X ≤ 1, Y ≤ 1) = (1 − e^{−1/2})²
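A numerical check of the result (the helper name is an assumption): the double integral factors because of independence, and each factor should come out to 1 − e^{−1/2}.

```python
import math

def integrate(g, lo, hi, n=2000):
    """Midpoint rule."""
    h = (hi - lo) / n
    return sum(g(lo + (k + 0.5) * h) for k in range(n)) * h

ix = integrate(lambda x: x * math.exp(-x * x / 2.0), 0.0, 1.0)
iy = integrate(lambda y: y * math.exp(-y * y / 2.0), 0.0, 1.0)

numeric = ix * iy                       # independence lets the integral factor
closed = (1.0 - math.exp(-0.5)) ** 2    # (1 - e^{-1/2})^2 ≈ 0.155
print(numeric, closed)
```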

6. If X and Y are two Gaussian random variables and a random variable Z is defined as Z = X + Y, find f_Z(z).
Ans:
Let X and Y be two normalized Gaussian random variables, i.e.,
σ_x² = σ_y² = 1 and m_x = m_y = 0

f(x) = (1/√(2π)) e^{−x²/2}   and   f(y) = (1/√(2π)) e^{−y²/2}

f_Z(z) = f(x) * f(y)
       = ∫_{−∞}^{∞} f_X(z − y) · f_Y(y) dy
       = (1/2π) ∫_{−∞}^{∞} e^{−(z−y)²/2} · e^{−y²/2} dy
       = (1/2π) ∫_{−∞}^{∞} exp[−½(z² + 2y² − 2zy)] dy
       = (1/2π) ∫_{−∞}^{∞} exp{−½[(2y² − 2zy + z²/2) + z²/2]} dy
       = (1/2π) ∫_{−∞}^{∞} exp{−½[z²/2 + (√2 y − z/√2)²]} dy
       = (1/√(2π)) e^{−z²/4} · (1/√(2π)) ∫_{−∞}^{∞} exp[−½(√2 y − z/√2)²] dy

Let P = √2 y − z/√2 ⇒ dP = √2 dy

f_Z(z) = (1/√(2π)) e^{−z²/4} · (1/√(2π)) ∫_{−∞}^{∞} e^{−P²/2} · dP/√2
       = (1/√(2π)) e^{−z²/4} · (1/√2) [(1/√(2π)) ∫_{−∞}^{∞} e^{−P²/2} dP]

From the property of the Gaussian pdf,

(1/√(2π)) ∫_{−∞}^{∞} e^{−P²/2} dP = 1

∴ f_Z(z) = (1/(√(2π) · √2)) e^{−z²/4} = (1/(2√π)) e^{−z²/4}

So Z is also a Gaussian random variable, with zero mean and variance 2.
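A Monte Carlo check of the result: Z should have mean 0, variance 2, and density 1/(2√π) ≈ 0.282 at z = 0. The sample size and seed below are arbitrary choices:

```python
import math
import random

random.seed(0)
N = 200_000
z = [random.gauss(0, 1) + random.gauss(0, 1) for _ in range(N)]

mean = sum(z) / N
var = sum((v - mean) ** 2 for v in z) / N
print(mean, var)   # ~0 and ~2

# estimate the density at 0 by counting samples in a small window
h = 0.1
p0 = sum(abs(v) < h for v in z) / (N * 2 * h)
print(p0, 1.0 / (2.0 * math.sqrt(math.pi)))   # both ~0.282
```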
7. Two independent random variables X and Y have the probability density functions
f_X(x) = x e^{−x},  x > 0
f_Y(y) = 1,  0 ≤ y ≤ 1
       = 0 otherwise.
Calculate the probability distribution and density functions of the random variable Z = X + Y.
Ans:
f_X(x) = x e^{−x} for x ≥ 0, i.e., f_X(x) = x e^{−x} u(x)
f_Y(y) = 1 for 0 ≤ y ≤ 1, i.e., f_Y(y) = u(y) − u(y − 1)

f(z) = f_X(x) * f_Y(y)
     = ∫_{−∞}^{∞} x e^{−x} u(x) [u(z − x) − u(z − x − 1)] dx
     = ∫_{−∞}^{∞} x e^{−x} u(x) u(z − x) dx − ∫_{−∞}^{∞} x e^{−x} u(x) u(z − x − 1) dx

Consider ∫_{−∞}^{∞} x e^{−x} u(x) u(z − x) dx = ∫₀^z x e^{−x} dx

= [−x e^{−x}]₀^z + ∫₀^z e^{−x} dx
= −z e^{−z} − [e^{−z} − 1]
= 1 − e^{−z}(1 + z)   for z > 0

Consider ∫_{−∞}^{∞} x e^{−x} u(x) u(z − x − 1) dx = ∫₀^{z−1} x e^{−x} dx

= [−x e^{−x}]₀^{z−1} + [−e^{−x}]₀^{z−1}
= −(z − 1) e^{−(z−1)} + 1 − e^{−(z−1)}
= 1 − e^{−(z−1)}[1 + (z − 1)]
= 1 − z e^{−(z−1)}   for z − 1 > 0

For 0 ≤ z ≤ 1 only the first integral contributes; for z > 1 the second must be subtracted from the first:

∴ f(z) = 1 − (1 + z) e^{−z}              for 0 ≤ z ≤ 1
       = z e^{−(z−1)} − (1 + z) e^{−z}   for z > 1
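The piecewise density can be sanity-checked numerically: it should integrate to 1, and at any z it should agree with the convolution integral of x e^{−x} over max(0, z − 1) ≤ x ≤ z. A sketch (the function names are assumptions):

```python
import math

def f_Z(z):
    """Density of Z = X + Y for f_X(x) = x e^{-x} (x > 0) and Y ~ U(0, 1)."""
    if z <= 0:
        return 0.0
    if z <= 1:
        return 1.0 - (1.0 + z) * math.exp(-z)
    return z * math.exp(-(z - 1.0)) - (1.0 + z) * math.exp(-z)

def integrate(g, lo, hi, n=20000):
    """Midpoint rule."""
    h = (hi - lo) / n
    return sum(g(lo + (k + 0.5) * h) for k in range(n)) * h

total = integrate(f_Z, 0.0, 60.0)
print(total)   # ~1.0

# spot-check against the convolution integral at z = 2.5
z = 2.5
conv = integrate(lambda x: x * math.exp(-x), max(0.0, z - 1.0), z)
print(conv, f_Z(z))
```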
8. If f_{X,Y}(x, y) = 0.5 e^{−|x|−|y|}, where X and Y are two random variables, and Z = X + Y, find f_Z(z).
Ans:
The p.d.f. of the sum of two independent random variables is the convolution of the p.d.f.'s of the individual random variables:

f_Z(z) = f_X(x) * f_Y(y) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx

Note first that the constant in the given joint density must be 1/4 for it to be a valid density, since ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{−|x|−|y|} dx dy = 2 × 2 = 4. Taking

f_{X,Y}(x, y) = (1/4) e^{−(|x|+|y|)} = [½ e^{−|x|}] · [½ e^{−|y|}] = f_X(x) · f_Y(y)

X and Y are independent, with the Laplace marginals

f_X(x) = ½ e^{−|x|}   and   f_Y(y) = ½ e^{−|y|}

Since Z = X + Y ⇒ y = z − x,

f_Z(z) = ∫_{−∞}^{∞} ½ e^{−|x|} · ½ e^{−|z−x|} dx

For z ≥ 0, split the range of integration at x = 0 and x = z:

∫_{−∞}^{0} ¼ e^{x} e^{−(z−x)} dx = ¼ e^{−z} · ½
∫_{0}^{z} ¼ e^{−x} e^{−(z−x)} dx = ¼ z e^{−z}
∫_{z}^{∞} ¼ e^{−x} e^{−(x−z)} dx = ¼ e^{−z} · ½

Adding the three pieces, and using the symmetry of the integrand in z,

∴ f_Z(z) = ¼ (1 + |z|) e^{−|z|}
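The closed form can be checked by convolving the two Laplace marginals numerically (the function names and grid parameters are assumptions):

```python
import math

def laplace(x):
    """f(x) = 0.5 * e^{-|x|}, the Laplace marginal."""
    return 0.5 * math.exp(-abs(x))

def f_Z_closed(z):
    """Claimed result: (1/4)(1 + |z|) e^{-|z|}."""
    return 0.25 * (1.0 + abs(z)) * math.exp(-abs(z))

def convolve_at(z, lo=-40.0, hi=40.0, n=40000):
    """Evaluate (f_X * f_Y)(z) numerically by the midpoint rule."""
    h = (hi - lo) / n
    return sum(laplace(lo + (k + 0.5) * h) * laplace(z - (lo + (k + 0.5) * h))
               for k in range(n)) * h

for z in (0.0, 1.0, -2.5):
    print(convolve_at(z), f_Z_closed(z))   # each pair should agree
```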
9. Explain how to determine the PDF of the sum of two random variables.
Ans:
Let X and Y be two random variables, and let Z be another random variable defined as their sum, i.e.,
Z = X + Y
Let f_{XY}(x, y) be the joint probability density function of X and Y, and let z be a particular value of the random variable Z. The sample space of Z can be represented on the X–Y plane as shown in fig 4.9.1.

The density function of Z can be determined by fixing one of the random variables, say X; X may take any value between −∞ and +∞.
The CDF of the random variable Z is given by
F_Z(z) = P(Z ≤ z) = P(−∞ < X < ∞, −∞ < Y ≤ z − X)

F_Z(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} f_{XY}(x, y) dy dx

f_Z(z) = dF_Z(z)/dz = d/dz [∫_{−∞}^{∞} ∫_{−∞}^{z−x} f_{XY}(x, y) dy dx]

f_Z(z) = ∫_{−∞}^{∞} f_{XY}(x, z − x) dx

If X and Y are independent,

f_Z(z) = ∫_{−∞}^{∞} f_X(x) · f_Y(z − x) dx

Therefore the density function of Z is given by the convolution of the density functions of X and Y.
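The convolution recipe can be discretized directly. The sketch below (all names assumed) convolves two densities sampled on a uniform grid; for two U(0, 1) variables it reproduces the familiar triangular density of Z = X + Y:

```python
def pdf_of_sum(fx, fy, h):
    """Discretized convolution f_Z = f_X * f_Y.

    fx, fy: density values sampled on uniform grids with common spacing h.
    Returns density values of Z on the combined grid.
    """
    n = len(fx) + len(fy) - 1
    out = [0.0] * n
    for i, a in enumerate(fx):
        for j, b in enumerate(fy):
            out[i + j] += a * b * h   # Riemann-sum approximation of the integral
    return out

# example: X, Y ~ U(0, 1)  ->  Z = X + Y has a triangular density on (0, 2)
h = 0.01
f_uniform = [1.0] * 100               # samples of the U(0, 1) density
f_z = pdf_of_sum(f_uniform, f_uniform, h)

print(max(f_z))        # peak ~1.0, reached at z = 1
print(sum(f_z) * h)    # total probability mass ~1.0
```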

10. Explain central limit theorem. State and prove central limit theorem.
Ans:
The central limit theorem states that the distribution of a random variable X which is the sum of a large number of independent random variables approaches the Gaussian distribution, irrespective of the type of distribution each variable possesses and of the amount each contributes to the sum.

Statement: if X₁, X₂, …, X_n are independent random variables with means m₁, …, m_n and variances σ₁², …, σ_n², then as n → ∞ the normalized sum

(Σᵢ Xᵢ − Σᵢ mᵢ) / √(Σᵢ σᵢ²)

tends to a Gaussian random variable with zero mean and unit variance.

As an illustration, consider two random variables X₁ and X₂ with the rectangular densities shown in figures 4.10.1 (a) and (b). The new random variable
X₃ = X₁ + X₂
has a triangular density. When this triangular density is convolved once more with the rectangular density of figure 4.10.1 (a) to create a further sum, the result already approximately follows the Gaussian distribution, as shown in figure 4.10.1 (d).

f_Z(z) = f_X(x) * f_Y(y)
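A quick demonstration of the theorem: each U(0, 1) variable has mean 1/2 and variance 1/12, so the sum of twelve of them, shifted by 6, should already be close to a standard Gaussian. The sample size and seed are arbitrary choices:

```python
import random

random.seed(1)
N = 100_000
n_terms = 12

# sum of 12 U(0,1) variables has mean 6 and variance 12 * (1/12) = 1,
# so (S - 6) should be approximately standard normal
samples = [sum(random.random() for _ in range(n_terms)) - 6.0 for _ in range(N)]

mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / N
print(mean, var)   # ~0 and ~1

# fraction of samples within one standard deviation, ~0.683 for a Gaussian
frac = sum(abs(s) < 1.0 for s in samples) / N
print(frac)
```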
www.jntuworld.com