
# Question Bank

## PROBABILITY AND RANDOM PROCESSES

UNIT I
RANDOM VARIABLES
PART A
1. Define probability mass function.
2. If Var (X) = 4, find Var (3X+8), where X is a random variable.
3. X and Y are independent random variables with variances 2 and 3. Find the
variance of 3X + 4Y.
4. Let X be a random variable with E(X) = 1 and E[X(X − 1)] = 4. Find Var(X)
and Var(2 − 3X).
5. Define probability density function.
6. A continuous random variable X has probability density function given by
f(x) = 3x², 0 ≤ x ≤ 1. Find K such that P(X > K) = 0.05.
7. A random variable X has the p.d.f. f(x) given by f(x) = Cxe^(−x) if x > 0, and
f(x) = 0 if x ≤ 0. Find the value of C and the c.d.f. of X.

8. The first four moments of a distribution about x = 4 are 1, 4, 10 and 45
respectively. Show that the mean is 5, the variance is 3, μ₃ = 0 and μ₄ = 26.
9. For a binomial distribution the mean is 6 and the S.D. is √2. Find the first two
terms of the distribution.

10. Determine the binomial distribution for which the mean is 4 and the variance
is 3.
11. Define the Poisson distribution.
12. If X is a Poisson variate such that P(X = 2) = 9P(X = 4) + 90P(X = 6), find
the variance.
13. The moment generating function of a random variable X is given by
M_X(t) = e^(3(e^t − 1)). Find P(X = 1).

14. Find the moment generating function of the geometric distribution.

15. Define Negative Binomial distribution.
16. Find the moment generating function of a uniform distribution.
17. If the random variable X is uniformly distributed over (−1, 1), find the density
function of Y = sin(πX/2).

18. If X is uniformly distributed in (−π/2, π/2), find the probability distribution
function of Y = tan X.
19. Define exponential distribution.
20. Define Gamma distribution.
21. The lifetime of a component measured in hours has a Weibull distribution with
parameters α = 0.2, β = 0.5. Find the mean lifetime of the component.
22. Define Normal distribution.
23. If a random variable X has the p.d.f. f(x) = 2x for 0 < x < 1 and f(x) = 0
otherwise, find the p.d.f. of Y = 3X + 1.
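
Several of the Part A questions above reduce to a one-line computation once the c.d.f. is written down. As a quick numerical self-check for question 6 (a sketch, assuming the density f(x) = 3x² on (0, 1)):

```python
# Self-check for Part A Q6: f(x) = 3x^2 on (0, 1), so F(x) = x^3
# and P(X > K) = 1 - K^3 = 0.05  =>  K = 0.95^(1/3) ~ 0.983.
K = 0.95 ** (1.0 / 3.0)

# Verify P(X > K) by integrating the density over (K, 1) (midpoint rule).
n = 100_000
h = (1.0 - K) / n
tail = sum(3 * (K + (i + 0.5) * h) ** 2 for i in range(n)) * h

print(round(K, 4))     # K ~ 0.983
print(round(tail, 4))  # ~ 0.05
```

The numeric tail probability should agree with the closed form 1 − K³ to within the quadrature error.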

PART B
1. Let X be a discrete r.v. whose cumulative distribution function is

F(x) = 0     for x < 3
     = 1/6   for 3 ≤ x < 6
     = 1/2   for 6 ≤ x < 10
     = 1     for x ≥ 10

(a) Find P(X ≤ 4), P(5 < X ≤ 14), P(X = 3), P(X = 4).
(b) Find the probability mass function.
2. A man draws 3 balls from an urn containing 5 white and 7 black balls. He
gets Rs. 10 for each white ball and Rs.5 for each black ball. Find his
expectation.
3. Six dice are thrown 729 times. How many times do you expect at least three
dice to show a 5 or 6?
4. If a random variable X has the p.d.f. f(x) = 2x for 0 < x < 1 and f(x) = 0
elsewhere, find the p.d.f. of Y = 3X + 1.
5. A continuous random variable X has the p.d.f. f(x) = Kx²e^(−x), x ≥ 0. Find the
rth moment of X about the origin. Hence find the mean and variance of X.
6. If X has the probability density f(x) = Ke^(−3x), x > 0, find K, P(0.5 ≤ X ≤ 1)
and the mean of X.

7. Let the r.v. X have the p.d.f. f(x) = (1/2)e^(−x/2) for x > 0, and f(x) = 0
otherwise. Find the moment generating function, mean and variance of X.

8. Define Binomial distribution, obtain its MGF, mean and variance.
9. The probability of a bomb hitting a target is 1/5. Two bombs are enough to
destroy a bridge. If six bombs are aimed at the bridge, find the probability
that the bridge is destroyed.
10. Describe Binomial B(n, p) distribution and obtain the moment-generating
function. Hence compute (i) the first four moments and (ii) the recursion
relation for the central moments.
11. State the conditions under which the Poisson distribution is a limiting case
of the binomial distribution, and show that under these conditions the
binomial distribution is approximated by the Poisson distribution.
12. Obtain the first four moments about the origin of the Poisson distribution.
13. If X and Y are independent Poisson random variables, show that the
conditional distribution of X given X + Y is a binomial distribution.
14. A and B shoot independently until each has hit his own target. The
probabilities of their hitting the target at each shot are 2/5 and 5/7
respectively. Find the probability that B will require more shots than A.
15. If a random variable X has a negative binomial distribution, obtain the
mean and variance of X.
16. Describe the negative binomial distribution X ~ NB(k, p), where X = number of
failures preceding the kth success in a sequence of Bernoulli trials and
p = probability of success. Obtain the MGF of X, and the mean and variance of X.

17. If a Poisson variate X is such that P(X = 1) = 2P(X = 2), find P(X = 0) and
Var(X). If X is a uniform random variable in [−2, 2], find the p.d.f. of Y = |X|
and E[Y].
18. If X is a uniform random variable in the interval (−2, 2), find the p.d.f. of
Y = X².
19. Find the moment generating function of the exponential distribution
f(x) = (1/c)e^(−x/c), 0 ≤ x < ∞, c > 0. Hence find its mean and S.D.

20. Find the moment generating function of an exponential random variable
and hence find its mean and variance.
21. Suppose the duration X in minutes of long distance calls from your home
follows an exponential law with p.d.f. f(x) = (1/5)e^(−x/5) for x > 0, and
f(x) = 0 otherwise. Find P[X > 5], P[3 ≤ X ≤ 6], the mean of X and the
variance of X.

22. The daily consumption of milk in excess of 20,000 gallons is
approximately exponentially distributed with mean θ = 3000. The city has a daily
stock of 35,000 gallons. What is the probability that of two days selected at
random, the stock is insufficient for both days?
23. What is the MGF of a random variable? Derive the MGF, mean, variance and
the first four moments of a Gamma distribution.
24. Suppose that the lifetime of a certain kind of an emergency backup battery
(in hrs) is a random variable X having the Weibull distribution
with parameters α = 0.1 and β = 0.5. Find (a) the mean lifetime of these batteries, (b) the
probability such a battery will last more than 300 hrs. (c) the probability
that such a battery will not last 100 hrs.
25. Find the moment generating function of Normal distribution.

26. If X and Y are independent random variables each following N (0, 2), find
the probability density function of Z = 2X + 3Y.
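
Part B question 26 can be sanity-checked by simulation. A minimal sketch, reading N(0, 2) as mean 0 and variance 2 (an assumption; some texts write the standard deviation in the second slot), under which Z = 2X + 3Y is normal with variance 4·2 + 9·2 = 26:

```python
# Monte-Carlo check for Part B Q26: X, Y independent N(0, 2),
# interpreted here as mean 0, variance 2 (assumption), so
# Var(Z) = 4*Var(X) + 9*Var(Y) = 26 for Z = 2X + 3Y.
import random

random.seed(0)
sd = 2 ** 0.5                      # standard deviation for variance 2
n = 200_000
zs = [2 * random.gauss(0, sd) + 3 * random.gauss(0, sd) for _ in range(n)]

mean_z = sum(zs) / n
var_z = sum((z - mean_z) ** 2 for z in zs) / n
print(round(mean_z, 2), round(var_z, 1))  # close to 0 and 26
```

If N(0, 2) is instead read as standard deviation 2, the same linearity argument gives Var(Z) = 52.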

UNIT II
TWO DIMENSIONAL RANDOM VARIABLES
PART A
1. The joint pdf of two random variables X and Y is given by
f_XY(x, y) = (1/8)x(x − y) for 0 < x < 2, −x < y < x, and 0 otherwise.
Find f_{Y/X}(y/x).

2. The joint pdf of a bivariate R.V. (X, Y) is given by f(x, y) = 4xy for
0 < x < 1, 0 < y < 1, and 0 otherwise. Find P(X + Y < 1).
3. The joint pdf of the R.V. (X, Y) is given by f(x, y) = Kxy e^(−(x² + y²)),
x > 0, y > 0. Find the value of K and also prove that X and Y are independent.
4. State the basic properties of joint distribution of (X,Y) when X and Y are
random variables.
5. If the joint pdf of (X, Y) is given by f(x, y) = e^(−(x + y)), x ≥ 0, y ≥ 0, find E[XY].
6. If two random variables X and Y have the probability density function
f(x, y) = ke^(−(2x + y)) for x, y > 0, evaluate k.

7. If X and Y are random variables having the joint density function
f(x, y) = (1/8)(6 − x − y), 0 < x < 2, 2 < y < 4, find P(X + Y < 3).

8. Give an example of a conditional distribution.

9. Distinguish between correlation and regression.
10. The two regression equations of the variables X and Y are x = 19.13 − 0.87y
and y = 11.64 − 0.50x. Find the correlation coefficient between X and Y.
11. The regression equations of X and Y are given by 3X + Y = 10 and
3X + 4Y = 12. Find the coefficient of correlation between X and Y.
12. Find the acute angle between the two lines of regression.
13. State the equations of the two regression lines. What is the angle between
them?
14. The regression equation of X on Y and Y on X are respectively 5x-y=22
and 64x-45y=24. Find the means of X and Y.
15. X and Y are independent random variables with variances 2 and 3. Find the
variance of 3X + 4Y.
16. State central limit theorem.
17. State the central limit theorem for independent and identically distributed
random variables.
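
Part A question 11 above has a quick numerical check. A minimal sketch, assuming (as is conventional) that the two lines are assigned to the regressions of X on Y and Y on X in the way that makes the product of slopes at most 1:

```python
# Sketch for Part A Q11: lines 3x + y = 10 and 3x + 4y = 12.
# Treating 3x + y = 10 as the regression of X on Y gives b_xy = -1/3;
# treating 3x + 4y = 12 as the regression of Y on X gives b_yx = -3/4.
# Then r^2 = b_xy * b_yx, and r carries the common sign of the slopes.
from fractions import Fraction

b_xy = Fraction(-1, 3)   # x = (10 - y)/3   =>  dx/dy = -1/3
b_yx = Fraction(-3, 4)   # y = (12 - 3x)/4  =>  dy/dx = -3/4

r2 = b_xy * b_yx
assert 0 <= r2 <= 1      # confirms this assignment of the two lines
r = -(r2 ** 0.5)         # both slopes negative => r < 0
print(r)                 # -0.5
```

The assignment test matters: taking the lines the other way round gives a slope product greater than 1, which is impossible for regression lines.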
PART B
1. The joint probability mass function of X and Y is

P(x, y)   y = 0    y = 1    y = 2
x = 0     0.10     0.04     0.02
x = 1     0.08     0.20     0.06
x = 2     0.06     0.14     0.30

Compute the marginal PMF of X and of Y, P[X ≤ 1, Y ≤ 1], and check if X and
Y are independent.
2. The joint probability mass function of (X, Y) is given by
P(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3. Find the marginal probability
distribution of X.
3. If X and Y are random variables having the joint density function
f(x, y) = (1/8)(6 − x − y), 0 < x < 2, 2 < y < 4, and 0 otherwise,
find (i) P(X < 1 ∩ Y < 3), (ii) P(X + Y < 3), (iii) P(X < 1 / Y < 3).

4. Find the marginal density function of X if the joint density function of the two
continuous random variables X and Y is f(x, y) = 2(2 − x − y) for 0 ≤ x ≤ y ≤ 1,
and 0 otherwise.

5. If the joint pdf of the two random variables (X, Y) is given by f(x, y) = 2 for
0 < y < x < 1, and 0 otherwise, find the marginal density functions of X and Y.
6. If the joint density function of the two random variables X and Y is
f(x, y) = e^(−(x + y)), x ≥ 0, y ≥ 0, and 0 otherwise, find (i) P(X < 1) and
(ii) P(X + Y < 1).

7. Find the covariance of the two random variables whose pdf is given by
f(x, y) = 2 for x > 0, y > 0, x + y < 1, and 0 otherwise.

8. Calculate the correlation coefficient for the following heights (in inches) of
fathers X and their sons Y.

X: 65  66  67  67  68  69  70  72
Y: 67  68  65  68  72  72  69  71

9. Suppose that the two-dimensional random variable (X, Y) has the joint p.d.f.
f(x, y) = x + y, 0 < x < 1, 0 < y < 1, and 0 otherwise. Obtain the correlation
coefficient between X and Y.
10. Two independent random variables X and Y are defined by f(x) = 4ax,
0 ≤ x ≤ 1 (0 otherwise) and f(y) = 4by, 0 ≤ y ≤ 1 (0 otherwise). Show that
U = X + Y and V = X − Y are uncorrelated.

11. The joint p.d.f. of random variables X and Y is given by
f(x, y) = 3(x + y) for 0 < x ≤ 1, 0 ≤ y ≤ 1, x + y ≤ 1, and 0 otherwise.
Find (i) the marginal p.d.f. of X, (ii) P[X + Y < 1/2], (iii) Cov(X, Y).

12. The random variable (X, Y) has the joint p.d.f. f(x, y) = (1/8)(x + y) for
0 ≤ x ≤ 2 and 0 ≤ y ≤ 2, and 0 otherwise. (i) Obtain the marginal distribution
of X. (ii) Find E[X] and E[X²]. (iii) Compute Cov(X, Y).

13. Two random variables X and Y are defined as Y=4X + 9. Find the
correlation coefficient between X and Y.

14. The regression equations of two random variables X and Y are
x − (5/4)y + 33/5 = 0 and y − (20/9)x + 107/9 = 0. The standard deviation of
X is 3. Find the standard deviation of Y.

15. Calculate the correlation coefficient and obtain the lines of regression
from the data given below:

X: 62   64   65   69   70   71   72   74
Y: 126  125  139  145  165  152  180  208

16. The following table gives the data on rainfall and discharge in a certain river.
Obtain the line of regression of y on x.

Rainfall (inches) X:    1.53  1.78  2.60  2.95  3.42
Discharge (100 c.c.) Y: 33.5  36.3  40.0  45.8  53.5

17. If the equations of the two lines of regression of y on x and x on y are
respectively 7x − 16y + 9 = 0 and 5y − 4x − 3 = 0, calculate the coefficient of
correlation and the means x̄ and ȳ.
18. The joint probability density function of the two random variables X and Y
is f(x, y) = e^(−(x + y)), x > 0, y > 0. Find the p.d.f. of U = (X + Y)/2.

19. If X and Y are independent random variables each normally distributed
with mean zero and variance σ², find the density functions of
R = √(X² + Y²) and φ = tan⁻¹(Y/X).

20. If X and Y are independent random variables each following N (0,2), find
the probability density function of Z=2X+3Y.
21. The random variables X and Y have joint p.d.f. f(x, y) = x² + xy/3 for
0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and 0 otherwise. (i) Are X and Y independent?
(ii) Find the conditional p.d.f. of X given Y.

22. The joint p.d.f. of a bivariate random variable (X, Y) is given by
f(x, y) = kxy for 0 < x < 1, 0 < y < 1, and 0 otherwise, where k is a constant.
(i) Find the value of k. (ii) Find P[X + Y < 1]. (iii) Are X and Y independent
random variables? Explain.
23. A random sample of size 100 is taken from a population whose mean is 60
and variance is 400. Using the central limit theorem, with what probability
can we assert that the mean of the sample will not differ from μ = 60 by
more than 4?
24. State and prove Central limit theorem.
25. A distribution with unknown mean μ has variance equal to 1.5. Use the
central limit theorem to find how large a sample should be taken from the
distribution in order that the probability will be at least 0.95 that the sample
mean will be within 0.5 of the population mean.
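
Question 25 above works out in a few lines. A sketch of the standard CLT argument (the sample mean is approximately N(μ, 1.5/n), so the half-width condition fixes n):

```python
# Sketch for Part B Q25: variance 1.5, want P(|Xbar - mu| < 0.5) >= 0.95.
# By the CLT, Xbar ~ N(mu, 1.5/n) approximately, so we need
# z_{0.975} * sqrt(1.5/n) <= 0.5,  i.e.  n >= (z * sqrt(1.5) / 0.5)^2.
import math
from statistics import NormalDist

z = NormalDist().inv_cdf(0.975)          # ~ 1.96
n = math.ceil((z * math.sqrt(1.5) / 0.5) ** 2)
print(n)  # 24
```

Using the rounded value z = 1.96 gives the same answer, since (1.96² · 1.5) / 0.25 ≈ 23.05 and n must be an integer.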

UNIT III
CLASSIFICATION OF RANDOM PROCESSES
PART A
1. Give an example of a continuous-time random process.
2. State the four types of stochastic processes.
3. Define a stationary process.
4. Define wide-sense stationary and strict-sense stationary random processes.
5. Give an example of stationary random process and justify your claim.
6. When is a random process said to be ergodic?
7. State the properties of an ergodic process.
8. Give an example of an ergodic process. What is a Markov process? Give an
example.
9. Define Markov chain and one-step transition probability.

10. What is Markov chain? When can you say that a Markov chain is
homogeneous?
11. Find the nature of the states of the Markov chain with the transition
probability matrix

P = [  0    1    0
      1/2   0   1/2
       0    1    0  ]

12. Define an irreducible Markov chain and state the Chapman-Kolmogorov
theorem.
13. Explain any two applications of a binomial process.
14. State any two properties of the Poisson process. What will be the superposition
of n independent Poisson processes with respective average rates
λ₁, λ₂, …, λₙ?

15. Define a Gaussian process.

16. Define sine-wave process.
17. State any two applications for a sine-wave process.
18. Define Birth process.
19. State and prove any one property of a normal process.

PART B
1. Consider the random process X(t) = cos(t + φ), where φ is a random variable
with density function f(φ) = 1/π, −π/2 < φ < π/2. Check whether the process is
stationary or not.

2. Consider the random process X(t) = cos(ω₀t + θ), where θ is uniformly
distributed in the interval (−π, π). Check whether X(t) is stationary or not.
3. Show that the random process X(t) = A cos(ωt + θ) is wide-sense stationary if
A and ω are constants and θ is a uniformly distributed random variable in
(0, 2π).
4. Given a random variable Y with characteristic function Φ(ω) = E[e^(iωY)] and a
random process defined by X(t) = cos(t + Y), show that {X(t)} is stationary
in the wide sense if Φ(1) = Φ(2) = 0.
5. Two random processes X(t) and Y(t) are defined by
X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt. Show that X(t) and
Y(t) are jointly wide-sense stationary if A and B are uncorrelated random
variables with zero means and the same variances, and ω is constant.
6. Consider a random process X(t) = U cos t + V sin t, where U and V are
independent random variables each of which assumes the values −2 and 1
with probabilities 1/3 and 2/3 respectively. Show that X(t) is wide-sense
stationary but not strict-sense stationary.
7. Prove that the random process X(t) = A cos(ωt + θ), where A and ω are
constants and θ is a uniformly distributed random variable in (0, 2π), is
correlation ergodic.
8. Show that the WSS random process X(t) = 10 cos(100t + θ), where θ is
uniformly distributed over (−π, π), is correlation ergodic.
9. Consider a Markov chain with two states and transition probability matrix

P = [ 3/4  1/4
      1/2  1/2 ]

Find the stationary probabilities of the chain.

10. Find P(n) for the homogeneous Markov chain with the following transition
probability matrix, where 0 < a < 1, 0 < b < 1:

P = [ 1 − a     a
        b     1 − b ]

11. Define a Markov chain. How would you classify the states and identify the
different classes of a Markov chain? Give an example of each class.
12. Write a short note on Binomial processes.
13. Show that the interarrival time of a Poisson process (i.e. the interval between
two successive occurrences of a Poisson process with parameter λ) has an
exponential distribution with mean 1/λ.

14. Show that the difference of two independent Poisson processes is not a
Poisson process.
15. Find the mean and autocorrelation of the Poisson process.
16. Derive the distribution of Poisson process and find its mean and variance.
17. Describe Poisson process. State and establish its properties.
18. If {X(t)} is a Gaussian process with μ(t) = 10 and C(t₁, t₂) = 16e^(−|t₁ − t₂|),
find the probability that (i) X(10) ≤ 8, (ii) |X(10) − X(6)| ≤ 4.

19. Write a short note on the sine-wave process. For the sine-wave process
X(t) = Y cos ω₀t, −∞ < t < ∞, ω₀ = constant, the amplitude Y is a random
variable with uniform distribution in the interval 0 to 1. Check whether the
process is stationary or not.
20. Derive the balance equations of the birth and death process.
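
Part B question 9 has a short numerical companion. A minimal sketch: the stationary distribution solves π = πP, and power iteration (repeatedly applying P) converges to it for this chain:

```python
# Sketch for Part B Q9: two-state chain with P = [[3/4, 1/4], [1/2, 1/2]].
# The stationary distribution solves pi = pi P with pi1 + pi2 = 1;
# repeated application of P (power iteration) converges to it.
P = [[0.75, 0.25], [0.5, 0.5]]
pi = [0.5, 0.5]                          # any starting distribution

for _ in range(200):                     # pi <- pi P, repeated
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

print([round(p, 4) for p in pi])  # [0.6667, 0.3333]
```

The exact answer π = (2/3, 1/3) can be checked by hand: (2/3)(3/4) + (1/3)(1/2) = 2/3.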

UNIT IV
CORRELATION AND SPECTRAL DENSITIES
PART A
1. Define the autocorrelation function and prove that for a wide-sense stationary
process {X(t)}, R_XX(−τ) = R_XX(τ).
2. State any two properties of an autocorrelation function.
3. The power spectral density of a random process {X(t)} is given by
S_XX(ω) = π for |ω| < 1, and 0 otherwise. Find its autocorrelation function.

4. Explain the concept of the autocorrelation function.

5. Define the cross-correlation function and state any two of its properties.
6. Explain the concept of the cross-correlation function.
7. If R(τ) = e^(−2|τ|) is the autocorrelation function of a random process X(t),
obtain the spectral density of X(t).
8. Define spectral density.

9. The power spectral density of a random process is given by
S_XX(ω) = 1 for |ω| < ω₀, and 0 otherwise. Find the autocorrelation function
of the process.
10. The autocorrelation function of a wide-sense stationary process is given by
R(τ) = α²e^(−2λ|τ|). Determine the power spectral density of the process.
11. What is meant by spectral analysis?
12. The power spectral density of a wide-sense stationary process is given by
S(ω) = (b/a)(a − |ω|) for |ω| ≤ a, and 0 for |ω| > a. Find the autocorrelation
function of the process.

13. Define the cross spectral density and give an example.

14. Explain cross power spectrum.
15. State the Wiener-Khinchine relation.
PART B
1. Given that the autocorrelation function for a stationary ergodic process
with no periodic components is R(τ) = 25 + 4/(1 + 6τ²), find the mean and
variance of the process {X(t)}.

2. Consider a random process X(t) = B cos(50t + φ), where B and φ are
independent random variables. B is a random variable with mean 0 and
variance 1; φ is uniformly distributed in the interval (−π, π). Find the mean and
autocorrelation of the process.

3. Derive the mean, autocorrelation and autocovariance of the Poisson process.
4. If {X(t)} is a wide-sense stationary process with autocorrelation function
R_XX(τ), and if Y(t) = X(t + a) − X(t − a), show that
R_YY(τ) = 2R_XX(τ) − R_XX(τ + 2a) − R_XX(τ − 2a).

5. The autocorrelation function for a stationary process X(t) is given by
R_XX(τ) = 9 + 2e^(−|τ|). Find the mean of the random variable
Y = ∫₀² X(t) dt and the variance of X(t).
6. Find the mean and autocorrelation of the Poisson process.
7. Consider two random processes X(t) = 3 cos(ωt + θ) and
Y(t) = 2 cos(ωt + θ − π/2), where θ is a random variable uniformly distributed
in (0, 2π). Prove that √(R_XX(0) R_YY(0)) ≥ |R_XY(τ)|.

8. Prove that the random processes X(t) and Y(t) defined by
X(t) = A cos ω₀t + B sin ω₀t and Y(t) = B cos ω₀t − A sin ω₀t are jointly
wide-sense stationary if A and B are uncorrelated zero-mean random variables
with the same variance.
9. Given that a process X(t) has the autocorrelation function
R_XX(τ) = Ae^(−α|τ|) cos(ω₀τ), where A > 0, α > 0 and ω₀ are real constants,
find the power spectrum of X(t).
10. Show that S_YY(ω) = |H(ω)|² S_XX(ω), where S_XX(ω) and S_YY(ω) are the
power spectral density functions of the input X(t) and the output Y(t), and
H(ω) is the system transfer function.

11. The cross power spectrum of real random processes X(t) and Y(t) is given by
S_XY(ω) = a + jbω for |ω| < 1, and 0 elsewhere. Find the cross-correlation
function.

12. Calculate the power spectral density of a stationary random process whose
autocorrelation is R_XX(τ) = e^(−α|τ|).
13. If the cross-correlation of two processes {X(t)} and {Y(t)} is
R_XY(t, t + τ) = (AB/2)[sin(ω₀τ) + cos(ω₀(2t + τ))], where A, B and ω₀ are
constants, find the cross power spectrum.
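
The autocorrelation/PSD pairs asked for in this unit can be verified numerically through the Wiener-Khinchine relation. A sketch for Part A question 7, assuming the density R(τ) = e^(−2|τ|), whose transform is S(ω) = 4/(4 + ω²):

```python
# Numeric check via the Wiener-Khinchine relation: for
# R(tau) = e^(-2|tau|), the power spectral density is
# S(w) = integral of R(tau) e^(-j w tau) dtau = 4 / (4 + w^2).
import math

def psd(w, T=40.0, n=400_000):
    # Even integrand, so S(w) = 2 * int_0^T e^(-2 tau) cos(w tau) dtau
    # (T = 40 truncates a tail of order e^(-80); midpoint rule).
    h = T / n
    taus = (h * (i + 0.5) for i in range(n))
    return 2 * h * sum(math.exp(-2 * t) * math.cos(w * t) for t in taus)

for w in (0.0, 1.0, 2.0):
    print(round(psd(w), 4), round(4 / (4 + w * w), 4))
```

Each printed pair should agree: 1.0 at ω = 0, 0.8 at ω = 1, 0.5 at ω = 2.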

UNIT V
LINEAR SYSTEMS WITH RANDOM INPUTS
PART A
1. Describe a linear system.
2. Define a system. When is it called a linear system?
3. State the properties of a linear filter.
4. Describe a linear system with a random input.
5. Give an example of a linear system.
PART B

6. Show that the interarrival time of a Poisson process with intensity λ obeys
an exponential law.
7. Show that for an input-output system (X(t), Y(t)), S_YY(ω) = S_XX(ω)|H(ω)|²,
where H(ω) is the system transfer function and the input X(t) is wide-sense
stationary.

8. Show that S_YY(ω) = |H(ω)|² S_XX(ω), where S_XX(ω) and S_YY(ω) are the power
spectral density functions of the input X(t) and the output Y(t), and H(ω) is
the system transfer function.
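
The input-output relation asked for in questions 7 and 8 is easy to illustrate numerically. A minimal sketch with a hypothetical first-order RC low-pass filter H(ω) = 1/(1 + jωRC) driven by white noise of flat PSD N₀ (the filter and the values N₀ = 2, RC = 0.5 are chosen for the example, not taken from the question bank):

```python
# Illustration of S_YY(w) = |H(w)|^2 S_XX(w) for a hypothetical
# first-order RC low-pass filter H(w) = 1 / (1 + j w R C), with a
# white-noise input of flat PSD S_XX(w) = N0 (example values).
N0 = 2.0
RC = 0.5

def output_psd(w):
    H = 1 / complex(1, w * RC)          # transfer function at frequency w
    return abs(H) ** 2 * N0             # S_YY(w) = |H(w)|^2 S_XX(w)

for w in (0.0, 2.0, 10.0):
    print(round(output_psd(w), 4))
```

The output PSD falls off as 1/(1 + (ωRC)²): it equals N₀ at ω = 0, half of N₀ at the corner frequency ω = 1/RC, and decays for larger ω, which is the usual low-pass shape.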