Random variables, which describe the events on a given sample space, were introduced earlier; here we discuss their basic operations such as expectation, moments, and variance.
Mathematical Expectation:
For a continuous random variable X:
m_x = X̄ = E[X] = ∫_{−∞}^{∞} x f(x) dx
For a discrete random variable X:
E[X] = Σ_i x_i P(x_i)
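As a quick numerical check of the continuous-case formula, the sketch below discretizes E[X] = ∫ x f(x) dx for an assumed uniform density on [2, 6] (an illustrative choice, not from these notes) and compares the result with the analytic mean (a + b)/2.

```python
import numpy as np

# Minimal sketch (assumed example): estimate E[X] for a uniform density
# f(x) = 1/(b - a) on [a, b] by discretizing E[X] = ∫ x f(x) dx, and
# compare with the analytic mean (a + b)/2.
a, b = 2.0, 6.0
x = np.linspace(a, b, 100_001)          # grid covering the support of f(x)
f = np.full_like(x, 1.0 / (b - a))      # uniform density on [a, b]

mean_numeric = np.trapz(x * f, x)       # ∫ x f(x) dx via the trapezoidal rule
print(mean_numeric, (a + b) / 2)        # both are approximately 4.0
```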
PROPERTIES OF EXPECTATION
1. If a random variable X is a constant, i.e., X = 'a', then E[a] = a, where 'a' is a constant.
We know that
E[X] = ∫_{−∞}^{∞} x f(x) dx
E[a] = ∫_{−∞}^{∞} a f(x) dx
E[a] = a ∫_{−∞}^{∞} f(x) dx
We know that the area under the curve of the probability density function is equal to one, i.e.
∫_{−∞}^{∞} f(x) dx = 1
Then
E[a] = a · 1
E[a] = a
2. If 'a' is a constant, then E[aX] = a E[X]
E[aX] = ∫_{−∞}^{∞} a x f(x) dx
E[aX] = a ∫_{−∞}^{∞} x f(x) dx
E[aX] = a E[X]
3. If 'a' and 'b' are any two constants, then E[aX + b] = a E[X] + b
E[aX + b] = ∫_{−∞}^{∞} (a x + b) f(x) dx
E[aX + b] = a ∫_{−∞}^{∞} x f(x) dx + b ∫_{−∞}^{∞} f(x) dx
where the area under the curve of the probability density function is equal to one, i.e.
∫_{−∞}^{∞} f(x) dx = 1
Then
E[aX + b] = a E[X] + b
MOMENTS:
The n-th moment about the origin is
m_n = E[X^n] = ∫_{−∞}^{∞} x^n f(x) dx
and the n-th moment about the mean (central moment) is
μ_n = E[(X − m_1)^n] = ∫_{−∞}^{∞} (x − m_1)^n f(x) dx,   n = 0, 1, 2, 3, 4, ...
If n = 0:
m_0 = E[X^0] = ∫_{−∞}^{∞} x^0 f(x) dx
m_0 = E[1] = ∫_{−∞}^{∞} f(x) dx = 1
If n = 1, it is called the first moment about the origin, also called the average value or expected value of the random variable.
m_n = E[X^n] = ∫_{−∞}^{∞} x^n f(x) dx
m_1 = E[X^1] = ∫_{−∞}^{∞} x^1 f(x) dx
m_1 = E[X] = ∫_{−∞}^{∞} x f(x) dx
If n = 2, it is called the second moment about the origin, also called the mean square value or average power of the random variable; the square root of the mean square value is called the RMS value.
m_n = E[X^n] = ∫_{−∞}^{∞} x^n f(x) dx
m_2 = E[X²] = ∫_{−∞}^{∞} x² f(x) dx
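The same discretized-integral idea can be used to check m₁ and m₂ numerically; the sketch below assumes a uniform density on [0, 1] as an illustrative example and also reports the RMS value √m₂.

```python
import numpy as np

# Minimal sketch (assumed example): first and second moments of a uniform
# density on [0, 1].  m_n = ∫ x^n f(x) dx; the RMS value is sqrt(m_2).
x = np.linspace(0.0, 1.0, 100_001)
f = np.ones_like(x)                      # f(x) = 1 on [0, 1]

m1 = np.trapz(x * f, x)                  # first moment (mean)         -> 0.5
m2 = np.trapz(x**2 * f, x)               # second moment (mean square) -> 1/3
rms = np.sqrt(m2)                        # RMS value                   -> 0.577...
print(m1, m2, rms)
```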
Higher-order moments are calculated using the moment generating function and the characteristic function of the random variable X.
MOMENT GENERATING FUNCTION:
M_X(v) = E[e^{vX}]
M_X(v) = E[e^{vX}] = ∫_{−∞}^{∞} e^{vx} f(x) dx
CHARACTERISTIC FUNCTION:
Φ_X(ω) = E[e^{jωX}] = ∫_{−∞}^{∞} e^{jωx} f(x) dx
The moments are obtained by differentiating the characteristic function and evaluating the derivatives at ω = 0:
m_n = (−j)^n [d^n Φ_X(ω) / dω^n] evaluated at ω = 0
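As a hedged illustration of the "evaluate at ω = 0" rule, the sketch below uses an assumed exponential density f(x) = a·e^(−ax), x ≥ 0 (an illustrative choice, not from these notes), computes its characteristic function symbolically, and recovers m₁ and m₂ by differentiation; SymPy is an assumed tool choice here.

```python
import sympy as sp

# Minimal sketch (assumed example): moments from the characteristic function
# of an exponential density f(x) = a*exp(-a*x), x >= 0.
# Phi(w) = E[e^{j w X}]; the n-th moment is m_n = (-j)^n d^n Phi/dw^n at w = 0.
x, w = sp.symbols('x omega', real=True)
a = sp.symbols('a', positive=True)
f = a * sp.exp(-a * x)

phi = sp.integrate(sp.exp(sp.I * w * x) * f, (x, 0, sp.oo), conds='none')

m1 = sp.simplify((sp.diff(phi, w, 1) * (-sp.I) ** 1).subs(w, 0))   # -> 1/a
m2 = sp.simplify((sp.diff(phi, w, 2) * (-sp.I) ** 2).subs(w, 0))   # -> 2/a**2
print(m1, m2)
```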
The n-th moment about the mean value (central moment) is
μ_n = ∫_{−∞}^{∞} (x − m_x)^n f(x) dx
If n = 0:
μ_0 = ∫_{−∞}^{∞} f(x) dx = 1
If n = 1, it is the first moment about the mean value (first central moment):
μ_1 = E[(X − m_1)] = E[X] − E[m_1] = m_x − m_x = 0
We know that
m_x = ∫_{−∞}^{∞} x f(x) dx
If n = 2, it is the second moment about the mean value, also called the variance of the random variable, denoted σ_x²:
μ_2 = σ_x² = E[(X − m_x)²] = ∫_{−∞}^{∞} (x − m_x)² f(x) dx
Two density functions f(x₁) and f(x₂) with the same mean value E[X] are shown in the figure. As shown, both functions are symmetric about the mean; the values of f(x₁) deviate less from E[X], whereas the values of f(x₂) deviate more from E[X], so f(x₂) has the larger variance.
If n = 3, it is called the third moment about the mean value (third central moment), or skew; the skew measures the asymmetry of the density about the mean value:
μ_3 = E[(X − m_x)³] = ∫_{−∞}^{∞} (x − m_x)³ f(x) dx
If n = 4, it is called the fourth moment about the mean value (fourth central moment), or kurtosis:
μ_4 = E[(X − m_x)⁴] = ∫_{−∞}^{∞} (x − m_x)⁴ f(x) dx
For a given random variable X with mean m_x and variance σ_x², Chebyshev's inequality states that
P{|X − m_x| ≥ ε} ≤ σ_x² / ε²   for any ε > 0
This section introduces the probability functions for two random variables.
Consider two random variables X and Y, with elements {x} and {y} in the xy plane. The ordered pair of numbers (x, y) is called a random vector in the two-dimensional product space or joint sample space. Let two events be A = {X ≤ x} and B = {Y ≤ y}. Then the joint probability distribution function is
F(x, y) = P{X ≤ x, Y ≤ y} = P(A ∩ B)
The properties of the joint probability distribution function are given below:
1. F(−∞, ∞) = 0
2. F(∞, ∞) = 1
3. F(−∞, −∞) = 0
4. F(−∞, y) = 0
5. F(x, −∞) = 0
6. F(x, y) is a monotonic, non-decreasing function of both 'x' and 'y'
7. F(∞, y) = F(y), the marginal distribution function of Y
8. F(x, ∞) = F(x), the marginal distribution function of X
JOINT PROBABILITY DENSITY FUNCTION
f(x, y) = ∂²F(x, y) / ∂x ∂y, where f(x, y) is referred to as the joint probability density function.
Its properties are:
1. f(x, y) ≥ 0
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
3. F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(x, y) dy dx
4. F(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f(x, y) dy dx
5. F(y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} f(x, y) dx dy
6. f(x) = ∫_{−∞}^{∞} f(x, y) dy, the marginal density of X
7. f(y) = ∫_{−∞}^{∞} f(x, y) dx, the marginal density of Y
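A small numerical sketch of properties 6 and 7, assuming a joint density that equals 1 on the unit square (two independent uniform variables, an illustrative choice): the marginal densities are recovered by integrating out the other variable.

```python
import numpy as np

# Minimal sketch (assumed example): marginal densities from a joint density
# f(x, y) = 1 on the unit square.
x = np.linspace(0.0, 1.0, 501)
y = np.linspace(0.0, 1.0, 501)
X, Y = np.meshgrid(x, y, indexing='ij')
fxy = np.ones_like(X)                        # joint density on the unit square

f_x = np.trapz(fxy, y, axis=1)               # f(x) = ∫ f(x, y) dy -> 1 for all x
f_y = np.trapz(fxy, x, axis=0)               # f(y) = ∫ f(x, y) dx -> 1 for all y
total = np.trapz(f_x, x)                     # ∫∫ f(x, y) dx dy    -> 1
print(f_x[0], f_y[0], total)
```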
STATISTICAL INDEPENDENCE
Two random variables X and Y are statistically independent if
P(A ∩ B) = P(A) P(B)
for the events A = {X ≤ x} and B = {Y ≤ y}, for any two real numbers x and y. Thus X and Y are statistically independent if and only if
F(x, y) = F(x) F(y)
f(x, y) = f(x) f(y)
SUM OF TWO RANDOM VARIABLES
Let 'W' be the random variable equal to the sum of two random variables 'X' and 'Y', where 'X' is the input signal and 'Y' is the noise signal:
W = X + Y
If F(x) and F(y) are the probability distribution functions of 'X' and 'Y' respectively, then the probability distribution function of 'W' is obtained as follows. The joint probability density function of the random variables 'X' and 'Y' is f(x, y); integrating it over the region X + Y ≤ w gives the probability distribution function
F(w) = P{X + Y ≤ w} = ∬_{x+y≤w} f(x, y) dx dy
But w = x + y, so the region x + y ≤ w corresponds to x ≤ w − y, and
F(w) = ∫_{−∞}^{∞} ∫_{−∞}^{w−y} f(x, y) dx dy
Since X and Y are independent random variables, f(x, y) = f(x) f(y), so
F(w) = ∫_{−∞}^{∞} ∫_{−∞}^{w−y} f(x) f(y) dx dy
F(w) = ∫_{−∞}^{∞} f(y) [∫_{−∞}^{w−y} f(x) dx] dy
We know that
f(w) = dF(w)/dw
f(w) = dF(w)/dw = d/dw [∫_{−∞}^{∞} f(y) ∫_{−∞}^{w−y} f(x) dx dy]
Differentiating under the integral sign using Leibniz's rule, the probability density function is
f(w) = dF(w)/dw = ∫_{−∞}^{∞} f(y) d/dw [∫_{−∞}^{w−y} f(x) dx] dy
We know that (Leibniz's rule)
d/dw [∫_{a(w)}^{b(w)} f(x, w) dx] = ∫_{a(w)}^{b(w)} ∂f(x, w)/∂w dx + f(b(w)) db/dw − f(a(w)) da/dw
f(w) = dF(w)/dw = ∫_{−∞}^{∞} f(y) [∫_{−∞}^{w−y} (df(x)/dw) dx + f(w − y) d(w − y)/dw − f(−∞) d(−∞)/dw] dy
Here
d(w − y)/dw = 1, d(−∞)/dw = 0, and df(x)/dw = 0
so
f(w) = dF(w)/dw = ∫_{−∞}^{∞} f(y) [0 + f(w − y) · 1 − f(−∞) · 0] dy
f(w) = dF(w)/dw = ∫_{−∞}^{∞} f(y) f(w − y) dy
𝑓 (𝑤 ) = 𝑓 (𝑥 ) ⊗ 𝑓(𝑦)
Hence, the probability density function of the sum of two statistically independent random variables is equal to the convolution of their individual probability density functions.
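A short numerical sketch of this result, assuming X and Y are both uniform on [0, 1] (an illustrative choice, not from these notes): convolving the two individual densities yields the triangular density of W = X + Y on [0, 2].

```python
import numpy as np

# Minimal sketch (assumed example): density of W = X + Y for independent X, Y
# via numerical convolution of the individual densities.
dx = 0.001
x = np.arange(0.0, 1.0 + dx, dx)
fx = np.ones_like(x)                   # f(x): uniform density on [0, 1]
fy = np.ones_like(x)                   # f(y): uniform density on [0, 1]

fw = np.convolve(fx, fy) * dx          # discrete approximation of f(x) ⊗ f(y)
w = np.arange(fw.size) * dx            # support of W, approximately [0, 2]

print(np.trapz(fw, w))                 # total area, approximately 1
print(fw[np.argmin(np.abs(w - 1.0))])  # peak value near 1 at w = 1 (triangular)
```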
Consider 'N' statistically independent random variables X_n, n = 1, 2, 3, ..., N, and let
Y = X₁ + X₂ + X₃ + X₄ + X₅ + ... + X_N
Then the Probability Density Function of the sum of ‘N’ statistically
independent random variables is equal to the convolution of their Individual
Probability Density Functions.
The central limit theorem states that the probability density function of the sum of 'N' statistically independent random variables approaches the Gaussian density function as N tends to infinity.
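A hedged Monte Carlo sketch of this behaviour, assuming the X_n are uniform on [0, 1] (an illustrative choice): for moderately large N the standardized sum already behaves like a Gaussian, with skewness near 0 and kurtosis near 3.

```python
import numpy as np

# Minimal sketch (assumed example): central-limit behaviour of the sum of N
# independent uniform random variables.
rng = np.random.default_rng(0)
N, trials = 30, 200_000
y = rng.uniform(0.0, 1.0, size=(trials, N)).sum(axis=1)   # Y = X1 + ... + XN

z = (y - y.mean()) / y.std()        # standardized sum
print(np.mean(z**3))                # sample skewness, approximately 0 (Gaussian)
print(np.mean(z**4))                # sample kurtosis, approximately 3 (Gaussian)
```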
JOINT MOMENTS
m_nk = E[X^n Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^n y^k f(x, y) dx dy
m_11 = E[X¹ Y¹] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f(x, y) dx dy
m_11 is called the correlation of X and Y. It is denoted as R_XY.
The joint moments about the mean value (joint central moments) are
μ_nk = E[(X − m_x)^n (Y − m_y)^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − m_x)^n (y − m_y)^k f(x, y) dx dy
where n and k are positive integers; the sum (n + k) is called the order of the moments.
First Order Joint Moments About Mean Value µ01 and µ10 are
μ_10 = E[(X − m_x)¹ (Y − m_y)⁰] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − m_x) f(x, y) dx dy
μ_10 = E[(X − m_x)] = m_x − m_x = 0
μ_01 = E[(X − m_x)⁰ (Y − m_y)¹] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (y − m_y) f(x, y) dx dy
μ_01 = E[(Y − m_y)] = m_y − m_y = 0
The second-order joint moments about the mean value, μ20, μ02, and μ11, are
μ_20 = E[(X − m_x)² (Y − m_y)⁰] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − m_x)² f(x, y) dx dy
μ_20 = E[(X − m_x)²] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − m_x)² f(x, y) dx dy = σ_x²
μ_02 = E[(X − m_x)⁰ (Y − m_y)²] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (y − m_y)² f(x, y) dx dy
μ_02 = E[(Y − m_y)²] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (y − m_y)² f(x, y) dx dy = σ_y²
μ_11 = E[(X − m_x)¹ (Y − m_y)¹] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − m_x)(y − m_y) f(x, y) dx dy
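μ_11 is the covariance of X and Y. A brief sample-based sketch (the data model below is an assumed illustration, not from these notes) estimates the correlation m_11 = E[XY] and the covariance μ_11 and checks the latter against numpy's covariance.

```python
import numpy as np

# Minimal sketch (assumed example): estimating the joint moments from samples.
# m_11 = E[XY] is the correlation R_XY, and mu_11 = E[(X - m_x)(Y - m_y)]
# is the covariance of X and Y.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100_000)
y = 0.5 * x + rng.normal(0.0, 1.0, 100_000)        # Y partly dependent on X

R_xy = np.mean(x * y)                              # correlation m_11 = E[XY]
mu_11 = np.mean((x - x.mean()) * (y - y.mean()))   # covariance mu_11
print(R_xy, mu_11, np.cov(x, y, bias=True)[0, 1])  # the last two values agree
```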