
SUBJECT NAME   : Probability & Random Process
SUBJECT CODE   : MA 2261
MATERIAL NAME  : Formula Material
MATERIAL CODE  : JM08AM1007

UNIT-I (RANDOM VARIABLES)


1) Discrete random variable:
A random variable whose set of possible values is either finite or countably
infinite is called a discrete random variable.
Eg: (i) Let X represent the sum of the numbers on the two dice when two
dice are thrown. In this case the random variable X takes the values 2, 3, 4, 5, 6,
7, 8, 9, 10, 11 and 12. So X is a discrete random variable.
(ii) The number of transmitted bits received in error.
2) Continuous random variable:
A random variable X is said to be continuous if it takes all possible values
between certain limits.
Eg: The length of time during which a vacuum tube installed in a circuit
functions is a continuous random variable; so is the proportion of defective
parts among 1000 tested. (Counts such as the number of scratches on a surface
or the number of bits received in error are discrete, not continuous.)
3)
Sl.No. | Discrete random variable                              | Continuous random variable
1      | \sum_i p(x_i) = 1                                     | \int_{-\infty}^{\infty} f(x)\,dx = 1
2      | F(x) = P(X \le x) = \sum_{x_i \le x} p(x_i)           | F(x) = P(X \le x) = \int_{-\infty}^{x} f(x)\,dx
3      | Mean = E[X] = \sum_i x_i p(x_i)                       | Mean = E[X] = \int_{-\infty}^{\infty} x f(x)\,dx
4      | E[X^2] = \sum_i x_i^2 p(x_i)                          | E[X^2] = \int_{-\infty}^{\infty} x^2 f(x)\,dx
5      | Var(X) = E[X^2] - (E[X])^2                            | Var(X) = E[X^2] - (E[X])^2
6      | Moment = E[X^r] = \sum_i x_i^r p_i                    | Moment = E[X^r] = \int_{-\infty}^{\infty} x^r f(x)\,dx
7      | M.G.F: M_X(t) = E[e^{tX}] = \sum_x e^{tx} p(x)        | M.G.F: M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx
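As a quick sanity check of the discrete column in the table above, the following sketch (our own code, using the two-dice sum from Eg. (i)) computes \sum p(x_i), the mean and the variance directly from the pmf:

```python
# Illustrative check of the discrete-column formulas on X = sum of two fair dice.
from fractions import Fraction

# pmf of X: p(x) = (number of ways to roll x) / 36
pmf = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        pmf[d1 + d2] = pmf.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

total = sum(pmf.values())                      # sum p(x_i), should be 1
mean = sum(x * p for x, p in pmf.items())      # E[X] = sum x_i p(x_i)
ex2 = sum(x * x * p for x, p in pmf.items())   # E[X^2] = sum x_i^2 p(x_i)
var = ex2 - mean ** 2                          # Var(X) = E[X^2] - (E[X])^2

print(total, mean, var)   # 1 7 35/6
```

The three sums implement rows 1, 3 and 5 of the table; exact fractions avoid rounding noise.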

4) E(aX + b) = a E(X) + b
5) Var(aX + b) = a^2 Var(X)
6) Var(aX \pm bY) = a^2 Var(X) + b^2 Var(Y)
7) Standard Deviation = \sqrt{Var(X)}
8) f(x) = F'(x)
9) P(X > a) = 1 - P(X \le a)
10) P(A / B) = \frac{P(A \cap B)}{P(B)}, P(B) \ne 0
11) If A and B are independent, then P(A \cap B) = P(A) \cdot P(B).
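Formulas 10) and 11) can be illustrated numerically; the events A and B below are our own example on two fair dice, not taken from the material:

```python
# Illustrative check of conditional probability and independence on two fair dice.
from itertools import product

omega = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
A = {w for w in omega if w[0] % 2 == 0}        # first die even
B = {w for w in omega if w[1] == 6}            # second die shows 6

pA = len(A) / 36
pB = len(B) / 36
pAB = len(A & B) / 36

cond = pAB / pB                                # P(A/B) = P(A ∩ B)/P(B)
independent = abs(pAB - pA * pB) < 1e-12       # formula 11) holds here
print(cond, independent)   # 0.5 True
```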


12) 1st Moment about origin = E X = M X t
(Mean)

t 0
2nd Moment about origin = E X 2 = M X t

t 0
r
t
The co-efficient of
= E X r
(rth Moment about the origin)
r!
13) Limitation of M.G.F:
i) A random variable X may have no moments although its m.g.f exists.
ii) A random variable X can have its m.g.f and some or all moments, yet the
m.g.f does not generate the moments.
iii) A random variable X can have all or some moments, but m.g.f does not
exist except perhaps at one point.
14) Properties of M.G.F:
i) If Y = aX + b, then M_Y(t) = e^{bt} M_X(at).
ii) M_{cX}(t) = M_X(ct), where c is a constant.
iii) If X and Y are two independent random variables, then
M_{X+Y}(t) = M_X(t) \cdot M_Y(t).
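Property iii) can be verified by brute force for discrete variables; the two Bernoulli(p) variables below are an illustrative choice of ours:

```python
# Check M_{X+Y}(t) = M_X(t) M_Y(t) for two independent Bernoulli(p) variables.
import math

p, t = 0.3, 0.7
pmf = {0: 1 - p, 1: p}                         # Bernoulli pmf

def mgf(pm, t):
    # M(t) = E[e^{tX}] = sum e^{tx} p(x)
    return sum(math.exp(t * x) * px for x, px in pm.items())

# pmf of X + Y by convolution (valid because X, Y are independent)
pmf_sum = {}
for x, px in pmf.items():
    for y, py in pmf.items():
        pmf_sum[x + y] = pmf_sum.get(x + y, 0.0) + px * py

lhs = mgf(pmf_sum, t)
rhs = mgf(pmf, t) * mgf(pmf, t)
print(abs(lhs - rhs) < 1e-12)   # True
```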

15) P.D.F, M.G.F, Mean and Variance of all the distributions:


Sl.No. | Distribution | P.D.F ( P(X = x) )                                                | M.G.F                              | Mean              | Variance
1      | Binomial     | nC_x p^x q^{n-x}                                                   | (q + p e^t)^n                      | np                | npq
2      | Poisson      | \frac{e^{-\lambda} \lambda^x}{x!}                                  | e^{\lambda(e^t - 1)}               | \lambda           | \lambda
3      | Geometric    | q^{x-1} p (or) q^x p                                               | \frac{p e^t}{1 - q e^t}            | \frac{1}{p}       | \frac{q}{p^2}
4      | Uniform      | f(x) = \frac{1}{b-a}, a < x < b; 0 otherwise                       | \frac{e^{bt} - e^{at}}{(b-a)t}     | \frac{a+b}{2}     | \frac{(b-a)^2}{12}
5      | Exponential  | f(x) = \lambda e^{-\lambda x}, x > 0, \lambda > 0; 0 otherwise     | \frac{\lambda}{\lambda - t}        | \frac{1}{\lambda} | \frac{1}{\lambda^2}
6      | Gamma        | f(x) = \frac{e^{-x} x^{\lambda - 1}}{\Gamma(\lambda)}, 0 < x < \infty, \lambda > 0 | \frac{1}{(1-t)^{\lambda}} | \lambda   | \lambda
7      | Normal       | f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2} | e^{\mu t + \frac{t^2 \sigma^2}{2}} | \mu | \sigma^2
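The table rows can be spot-checked by computing moments directly from the pmf; the sketch below (n and p are our own example values) does this for the Binomial row:

```python
# Check the Binomial row: mean np and variance npq, computed from the pmf.
from math import comb

n, p = 10, 0.3
q = 1 - p
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

mean = sum(x * px for x, px in enumerate(pmf))                # should be np = 3.0
var = sum(x * x * px for x, px in enumerate(pmf)) - mean**2   # should be npq = 2.1

print(round(mean, 10), round(var, 10))   # 3.0 2.1
```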

16) Memoryless property of exponential distribution


P X S t / X S P X t.
17) Function of random variable: fY ( y ) f X ( x )

dx
dy
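The memoryless property in 16) is easy to confirm numerically from the exponential survival function; the values of lam, s and t below are illustrative choices of ours:

```python
# Numeric check of memorylessness: P(X > s + t | X > s) = P(X > t), X ~ Exp(lam).
import math

lam, s, t = 2.0, 0.5, 1.2
surv = lambda x: math.exp(-lam * x)     # P(X > x) for the exponential law

lhs = surv(s + t) / surv(s)             # P(X > s + t) / P(X > s)
rhs = surv(t)
print(abs(lhs - rhs) < 1e-12)   # True
```

The identity falls out because e^{-λ(s+t)}/e^{-λs} = e^{-λt}; the exponential is the only continuous law with this property.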

UNIT-II (TWO-DIMENSIONAL RANDOM VARIABLES)


1) \sum_i \sum_j p_{ij} = 1 (Discrete random variable)
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1 (Continuous random variable)

2) Conditional probability function of X given Y: P(X = x_i / Y = y_j) = \frac{P(x, y)}{P(y)}
Conditional probability function of Y given X: P(Y = y_j / X = x_i) = \frac{P(x, y)}{P(x)}
P(X \le a / Y \le b) = \frac{P(X \le a, Y \le b)}{P(Y \le b)}

3) Conditional density function of X given Y: f(x / y) = \frac{f(x, y)}{f(y)}.
Conditional density function of Y given X: f(y / x) = \frac{f(x, y)}{f(x)}.

4) If X and Y are independent random variables then


f ( x , y ) f ( x ). f ( y )

(for continuous random variable)

P X x , Y y P X x .P Y y (for discrete random variable)


5) Joint probability density function: P(a \le X \le b, c \le Y \le d) = \int_c^d \int_a^b f(x, y)\,dx\,dy.
P(X \le a, Y \le b) = \int_0^b \int_0^a f(x, y)\,dx\,dy

6) Marginal density function of X: f(x) = f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy
Marginal density function of Y: f(y) = f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx

7) P(X + Y \ge 1) = 1 - P(X + Y < 1)
8) Correlation co-efficient (Discrete): \rho(x, y) = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}, where
Cov(X, Y) = \frac{1}{n}\sum XY - \bar{X}\bar{Y}, \sigma_X = \sqrt{\frac{1}{n}\sum X^2 - \bar{X}^2}, \sigma_Y = \sqrt{\frac{1}{n}\sum Y^2 - \bar{Y}^2}
9) Correlation co-efficient (Continuous): \rho(x, y) = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}, where
Cov(X, Y) = E[XY] - E[X]E[Y], \sigma_X = \sqrt{Var(X)}, \sigma_Y = \sqrt{Var(Y)}
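Formula 8) applied to a small made-up paired data set (the xs and ys below are our own illustration):

```python
# Correlation coefficient of paired data via formula 8).
import math

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
n = len(xs)

xbar = sum(xs) / n
ybar = sum(ys) / n
cov = sum(x * y for x, y in zip(xs, ys)) / n - xbar * ybar    # (1/n)ΣXY − X̄Ȳ
sx = math.sqrt(sum(x * x for x in xs) / n - xbar ** 2)        # σ_X
sy = math.sqrt(sum(y * y for y in ys) / n - ybar ** 2)        # σ_Y

rho = cov / (sx * sy)
print(round(rho, 4))   # 0.7746
```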


10) If X and Y are uncorrelated random variables, then Cov(X, Y) = 0.
11) E[X] = \int_{-\infty}^{\infty} x f(x)\,dx, E[Y] = \int_{-\infty}^{\infty} y f(y)\,dy,
E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy f(x, y)\,dx\,dy.

12) Regression for Discrete random variable:


Regression line X on Y is x - \bar{x} = b_{xy}(y - \bar{y}), b_{xy} = \frac{\sum (x - \bar{x})(y - \bar{y})}{\sum (y - \bar{y})^2}
Regression line Y on X is y - \bar{y} = b_{yx}(x - \bar{x}), b_{yx} = \frac{\sum (x - \bar{x})(y - \bar{y})}{\sum (x - \bar{x})^2}
Correlation through the regression: r = \pm\sqrt{b_{xy} \cdot b_{yx}}
Note: \rho(x, y) = r(x, y)

13) Regression for Continuous random variable:


Regression line X on Y is x - E(X) = b_{xy}(y - E(Y)), b_{xy} = r \frac{\sigma_x}{\sigma_y}
Regression line Y on X is y - E(Y) = b_{yx}(x - E(X)), b_{yx} = r \frac{\sigma_y}{\sigma_x}
Regression curve X on Y is x = E(x / y) = \int_{-\infty}^{\infty} x f(x / y)\,dx
Regression curve Y on X is y = E(y / x) = \int_{-\infty}^{\infty} y f(y / x)\,dy

14) Transformation of Random Variables:
f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right|  (One dimensional random variable)
f_{UV}(u, v) = f_{XY}(x, y) |J|, where J = \begin{vmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{vmatrix}  (Two dimensional random variable)

15) Central limit theorem (Liapounoff's form):
If X_1, X_2, ..., X_n is a sequence of independent R.V.s with E[X_i] = \mu_i and
Var(X_i) = \sigma_i^2, i = 1, 2, ..., n, and if S_n = X_1 + X_2 + ... + X_n, then under
certain general conditions S_n follows a normal distribution with mean
\mu = \sum_{i=1}^{n} \mu_i and variance \sigma^2 = \sum_{i=1}^{n} \sigma_i^2 as n \to \infty.

16) Central limit theorem (Lindeberg-Levy's form):
If X_1, X_2, ..., X_n is a sequence of independent identically distributed R.V.s with
E[X_i] = \mu and Var(X_i) = \sigma^2, i = 1, 2, ..., n, and if S_n = X_1 + X_2 + ... + X_n,
then under certain general conditions S_n follows a normal distribution with mean
n\mu and variance n\sigma^2 as n \to \infty.

Note: z = \frac{S_n - n\mu}{\sigma\sqrt{n}} (for n variables), z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} (for a single variable)
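A simulation sketch of the Lindeberg-Levy form (all parameters are our own choices): the standardized sum z = (S_n - n\mu)/(\sigma\sqrt{n}) of iid Uniform(0,1) variables should look approximately N(0,1), so its sample mean and variance should be near 0 and 1.

```python
# Monte Carlo illustration of the CLT with Uniform(0,1) summands.
import random, math

random.seed(0)
n, reps = 50, 20000
mu, sigma = 0.5, math.sqrt(1 / 12)      # mean and sd of Uniform(0,1)

zs = []
for _ in range(reps):
    s_n = sum(random.random() for _ in range(n))
    zs.append((s_n - n * mu) / (sigma * math.sqrt(n)))   # standardized sum

zmean = sum(zs) / reps
zvar = sum(z * z for z in zs) / reps - zmean ** 2
print(abs(zmean) < 0.05, abs(zvar - 1) < 0.05)
```

With 20000 replications the sampling error of zmean is about 0.007, so the generous 0.05 tolerances pass comfortably.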

UNIT-III (MARKOV PROCESSES AND MARKOV CHAINS)


1)

Random Process:
A random process is a collection of random variables {X(s, t)} that are
functions of a real variable, namely time t, where s \in S and t \in T.

2)

Classification of Random Processes:


We can classify the random process according to the characteristics of time t
and the random variable X. We shall consider only four cases based on t and X
having values in the ranges -\infty < t < \infty and -\infty < x < \infty.
Continuous random process
Continuous random sequence
Discrete random process
Discrete random sequence

Continuous random process:


If X and t are continuous, then we call X(t) a Continuous Random Process.
Example:
If X(t) represents the maximum temperature at a place in the
interval (0,t), {X(t)} is a Continuous Random Process.
Continuous Random Sequence:
A random process for which X is continuous but time takes only discrete values is
called a Continuous Random Sequence.
Example: If X_n represents the temperature at the end of the n-th hour of a day, then
{X_n, 1 \le n \le 24} is a Continuous Random Sequence.
Discrete Random Process:
If X assumes only discrete values and t is continuous, then we call such random
process {X(t)} as Discrete Random Process.
Example: If X(t) represents the number of telephone calls received in the interval
(0, t), then {X(t)} is a Discrete Random Process, since S = {0, 1, 2, 3, ...}.
Discrete Random Sequence:
A random process in which both the random variable and time are discrete is called
Discrete Random Sequence.
Example: If X_n represents the outcome of the n-th toss of a fair die, then {X_n : n \ge 1} is a
Discrete Random Sequence, since T = {1, 2, 3, ...} and S = {1, 2, 3, 4, 5, 6}.

3)

Condition for Stationary Process: E[X(t)] = constant, Var[X(t)] = constant.
If the process is not stationary, then it is called evolutionary.

4)

Wide Sense Stationary (or) Weak Sense Stationary (or) Covariance Stationary:
A random process is said to be WSS or Covariance Stationary if it satisfies the
following conditions.
i) The mean of the process is constant, i.e. E[X(t)] = constant.
ii) The auto correlation function depends only on \tau, i.e.
R_{XX}(\tau) = E[X(t) \cdot X(t + \tau)].

5)

Time average:
The time average of a random process X(t) is defined as \bar{X}_T = \frac{1}{2T} \int_{-T}^{T} X(t)\,dt.
If the interval is (0, T), then the time average is \bar{X}_T = \frac{1}{T} \int_0^T X(t)\,dt.

6) Ergodic Process:
A random process X(t) is called ergodic if all its ensemble averages are
interchangeable with the corresponding time average \bar{X}_T.

7) Mean ergodic:
Let X(t) be a random process with mean E[X(t)] = \mu and time average \bar{X}_T;
then X(t) is said to be mean ergodic if \bar{X}_T \to \mu as T \to \infty, i.e.
E[X(t)] = \lim_{T \to \infty} \bar{X}_T.

Note: \lim_{T \to \infty} Var(\bar{X}_T) = 0 (by the mean ergodic theorem)

8)

Correlation ergodic process:
The stationary process X(t) is said to be correlation ergodic if the process
Y(t) is mean ergodic, where Y(t) = X(t) X(t + \tau), i.e. E[Y(t)] = \lim_{T \to \infty} \bar{Y}_T,
where \bar{Y}_T is the time average of Y(t).

9)


Auto covariance function:
C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)] \cdot E[X(t + \tau)]

10) Mean and variance of time average:
Mean: E[\bar{X}_T] = \frac{1}{T} \int_0^T E[X(t)]\,dt
Variance: Var(\bar{X}_T) = \frac{1}{2T} \int_{-2T}^{2T} \left(1 - \frac{|\tau|}{2T}\right) C_{XX}(\tau)\,d\tau

11) Markov process:


A random process in which the future value depends only on the present value
and not on the past values is called a Markov process. It is symbolically
represented by
P[X(t_{n+1}) \le x_{n+1} / X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0]
= P[X(t_{n+1}) \le x_{n+1} / X(t_n) = x_n],
where t_0 \le t_1 \le t_2 \le ... \le t_n \le t_{n+1}.
12) Markov Chain:
If for all n, P[X_n = a_n / X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0]
= P[X_n = a_n / X_{n-1} = a_{n-1}], then the process {X_n, n = 0, 1, 2, ...} is called a
Markov chain, where a_0, a_1, a_2, ..., a_n, ... are called the states of the Markov chain.
13) Transition Probability Matrix (tpm):
When the Markov chain is homogeneous, the one-step transition probability is
denoted by P_{ij}. The matrix P = {P_{ij}} is called the transition probability matrix.
14) Chapman-Kolmogorov theorem:
If P is the tpm of a homogeneous Markov chain, then the n-step tpm P^{(n)} is
equal to P^n, i.e. P_{ij}^{(n)} = \left[P^n\right]_{ij}.


15) Markov Chain property: If \pi = (\pi_1, \pi_2, \pi_3), then \pi P = \pi and
\pi_1 + \pi_2 + \pi_3 = 1.
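Property 15) can be sketched on a two-state chain; the tpm below is our own example, and for two states \pi P = \pi reduces to the balance relation \pi_1 p_{12} = \pi_2 p_{21}:

```python
# Stationary distribution of a 2-state Markov chain via πP = π, Σπ = 1.
P = [[0.6, 0.4],
     [0.2, 0.8]]

# πP = π for two states gives π1·p12 = π2·p21, hence the closed form below.
p12, p21 = P[0][1], P[1][0]
pi = [p21 / (p12 + p21), p12 / (p12 + p21)]

# verify πP = π componentwise
piP = [pi[0] * P[0][0] + pi[1] * P[1][0],
       pi[0] * P[0][1] + pi[1] * P[1][1]]
print(pi, [round(v, 12) for v in piP])
```

Here pi works out to [1/3, 2/3], and multiplying by P reproduces it, as property 15) requires.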
16) Poisson process:
If X ( t ) represents the number of occurrences of a certain event in (0, t ) ,then

the discrete random process X ( t ) is called the Poisson process, provided the
following postulates are satisfied.
(i) P[1 occurrence in (t, t + \Delta t)] = \lambda \Delta t + O(\Delta t)
(ii) P[0 occurrences in (t, t + \Delta t)] = 1 - \lambda \Delta t + O(\Delta t)
(iii) P[2 or more occurrences in (t, t + \Delta t)] = O(\Delta t)
(iv) X(t) is independent of the number of occurrences of the event in any
interval.

17) Probability law of Poisson process: P[X(t) = x] = \frac{e^{-\lambda t} (\lambda t)^x}{x!}, x = 0, 1, 2, ...
Mean E[X(t)] = \lambda t, E[X^2(t)] = \lambda^2 t^2 + \lambda t, Var[X(t)] = \lambda t.
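The moments in 17) can be checked directly from the pmf; \lambda and t below are illustrative values of ours, and the tail beyond x = 100 is negligible for this \lambda t:

```python
# Check Poisson-process moments: E[X(t)] = λt, E[X²(t)] = λ²t² + λt, Var = λt.
import math

lam, t = 3.0, 2.0
mu = lam * t                                   # λt = 6
pmf = [math.exp(-mu) * mu**x / math.factorial(x) for x in range(100)]

mean = sum(x * p for x, p in enumerate(pmf))
ex2 = sum(x * x * p for x, p in enumerate(pmf))
var = ex2 - mean**2
print(round(mean, 6), round(ex2, 6), round(var, 6))   # 6.0 42.0 6.0
```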

UNIT-IV (CORRELATION AND SPECTRAL DENSITY)


R_{XX}(\tau) - Auto correlation function
S_{XX}(\omega) - Power spectral density (or) Spectral density
R_{XY}(\tau) - Cross correlation function
S_{XY}(\omega) - Cross power spectral density
1) Auto correlation to Power spectral density (spectral density):
S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau) e^{-i\omega\tau}\,d\tau
2) Power spectral density to Auto correlation:
R_{XX}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega) e^{i\omega\tau}\,d\omega
3) Condition for X(t) and X(t + \tau) to be uncorrelated random processes:
C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)] \cdot E[X(t + \tau)] = 0
4) Cross power spectrum to Cross correlation:
R_{XY}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XY}(\omega) e^{i\omega\tau}\,d\omega
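Formula 1) can be sketched numerically for the illustrative autocorrelation R_{XX}(\tau) = e^{-a|\tau|} (our own example), whose transform is known to be 2a/(a^2 + \omega^2); a crude Riemann sum over a truncated range suffices:

```python
# Numeric Wiener-Khinchine check: ∫ e^{-a|τ|} e^{-iωτ} dτ ≈ 2a / (a² + ω²).
import math

a, w = 1.5, 2.0
dt, T = 0.001, 40.0                  # step and truncation of the infinite range
n = int(2 * T / dt)

s_re, s_im = 0.0, 0.0
for k in range(n):
    tau = -T + k * dt
    r = math.exp(-a * abs(tau))      # R_XX(τ)
    s_re += r * math.cos(w * tau) * dt
    s_im -= r * math.sin(w * tau) * dt

closed = 2 * a / (a**2 + w**2)       # closed-form spectral density at ω
print(abs(s_re - closed) < 1e-3, abs(s_im) < 1e-6)
```

The imaginary part vanishes (up to rounding) because R_{XX} is even, so the spectral density is real, as expected of a power spectrum.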

5) General formulas:
i) \int e^{ax} \cos bx\,dx = \frac{e^{ax}}{a^2 + b^2} (a \cos bx + b \sin bx)
ii) \int e^{ax} \sin bx\,dx = \frac{e^{ax}}{a^2 + b^2} (a \sin bx - b \cos bx)
iii) x^2 + ax = \left(x + \frac{a}{2}\right)^2 - \frac{a^2}{4}
iv) \sin\theta = \frac{e^{i\theta} - e^{-i\theta}}{2i}
v) \cos\theta = \frac{e^{i\theta} + e^{-i\theta}}{2}
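Formula i) can be checked by differentiating the stated antiderivative numerically and comparing against the integrand (a, b, x and the step h are our own choices):

```python
# Check formula i): d/dx [e^{ax}(a cos bx + b sin bx)/(a²+b²)] = e^{ax} cos bx.
import math

a, b, x, h = -1.3, 2.0, 0.7, 1e-5

def F(x):
    # the claimed antiderivative of e^{ax} cos bx
    return math.exp(a * x) * (a * math.cos(b * x) + b * math.sin(b * x)) / (a**2 + b**2)

deriv = (F(x + h) - F(x - h)) / (2 * h)          # central-difference F'(x)
integrand = math.exp(a * x) * math.cos(b * x)
print(abs(deriv - integrand) < 1e-8)   # True
```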

UNIT-V (LINEAR SYSTEMS WITH RANDOM INPUTS)

1) Linear system:
f is called a linear system if it satisfies
f[a_1 X_1(t) + a_2 X_2(t)] = a_1 f[X_1(t)] + a_2 f[X_2(t)]
2) Time invariant system:
Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time
invariant system.
3) Relation between input X(t) and output Y(t):
Y(t) = \int_{-\infty}^{\infty} h(u) X(t - u)\,du
where h(u) is the system weighting function.


4) Relation between power spectrum of X(t) and output Y(t):
S_{YY}(\omega) = S_{XX}(\omega) \left| H(\omega) \right|^2
If H(\omega) is not given, use the formula H(\omega) = \int_{-\infty}^{\infty} h(t) e^{-j\omega t}\,dt.

5) Contour integral:
\int_{-\infty}^{\infty} \frac{e^{imx}}{a^2 + x^2}\,dx = \frac{\pi}{a} e^{-ma}  (one of the results)
6) F^{-1}\left[\frac{1}{a^2 + \omega^2}\right] = \frac{1}{2a} e^{-a|\tau|}  (from the Fourier transform)

---- All the Best ----

