
Measurement Signal Processing
Chapter 3. Expectation and Probability

28-Mar-23
Dr. C. C. Peng (ccpeng@mail.ncku.edu.tw)


Learning Objectives

• Know the meaning of and the differences between a population and a sample.
• Know the definition of expectation and probability.
• Know the calculation of expectation and probability.
• Know the definition of a histogram.
• Know how a histogram can be constructed.
• Know the relationships between the histogram, probability, and the probability density function.



Basic Concepts in Probability

• Each trial results in an outcome. The outcomes in which a specific result occurs make up the set of occurrence for the event of getting that outcome; its probability is the probability of occurrence. The set corresponding to no occurrence is the null set (complement) of that event, denoted Ā or A′; its probability is the probability of non-occurrence.
Population and Sample

• Population: the complete collection of all members relevant to a particular issue.
• Sample: a subset of the population, obtained by a process of random selection with equal probability.

Statistics of the sample provide the sample mean $\bar{x}$ and the sample variance $S_x^2$, which can be used to estimate the true mean $\mu$ and the true variance $\sigma^2$ through statistical inference.



Population and Sample

• For a finite population, the population mean of a property is equal to the arithmetic mean of that property taken over every member of the population.
• The sample mean (local) may differ from the population mean (global), especially for a small sample group (why?).

Usually we do not know the exact population mean, since the population size N is not available.



Population and Sample (ex.)

How many people are there in the class?

How many people are in this picture?



Mean

• The arithmetic mean (or simply the "mean") of a sample, usually denoted by $\bar{x}$, is the sum of the sampled values divided by the number of items in the sample:
  $\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$
• Put simply, this is also referred to as the sample mean (or simply the "average"):
  $\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i = \frac{x_1 + x_2 + \cdots + x_N}{N}$
• For example, the arithmetic mean of the values 1, 2, 3, 4, 5, 6 is
  $\bar{x} = \frac{1}{6}\sum_{i=1}^{6} x_i = \frac{x_1 + x_2 + \cdots + x_6}{6} = \frac{1 + 2 + \cdots + 6}{6} = 3.5$



Mean (ex.)

• Consider a single die:

  Value : 1    2    3    4    5    6
  P(x)  : 1/6  1/6  1/6  1/6  1/6  1/6

• Global mean:
  $\bar{x} = \frac{1}{6}\sum_{i=1}^{6} x_i = \frac{x_1 + x_2 + \cdots + x_6}{6} = \frac{1 + 2 + \cdots + 6}{6} = 3.5$

• Local mean (pick two samples):
  $\bar{x} = \frac{1+2}{2} = 1.5, \quad \bar{x} = \frac{3+4}{2} = 3.5, \quad \bar{x} = \frac{5+6}{2} = 5.5$

• Local mean (pick five samples):
  $\bar{x} = \frac{1+2+3+4+5}{5} = 3.0, \quad \bar{x} = \frac{2+3+4+5+6}{5} = 4.0, \quad \bar{x} = \frac{1+2+3+5+6}{5} = 3.4$
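The gap between local (sample) means and the global (population) mean can also be illustrated numerically. The following MATLAB sketch is an addition to the slide, not part of the original material; it draws random subsets of the die faces and compares their sample means with the population mean of 3.5.

% Added sketch: sample means of random die-face subsets vs. the global mean
faces = 1:6;                       % the finite "population" of die values
mu_global = mean(faces);           % population mean = 3.5
rng(0);                            % fix the random seed for repeatability
for n = [2 5]                      % sample sizes used on this slide
    idx  = randperm(6, n);         % pick n distinct faces at random
    xbar = mean(faces(idx));       % local (sample) mean
    fprintf('n = %d: sample mean = %.2f, population mean = %.2f\n', ...
            n, xbar, mu_global);
end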
Expectation

• In probability and statistics, "mean" and "expected value" are used synonymously to refer to one measure of the central tendency, either of a probability distribution or of the random variable characterized by that distribution:
  $\mu = E[x]$
• In the case of a discrete probability distribution of a random variable X, the mean is equal to the sum over every possible value x weighted by the probability of that value:
  $\mu = E[x] = \sum_{i=1}^{N} x_i \Pr(x_i)$
  where Pr(·) denotes probability and E[·] denotes expectation.
• "μ" is a measure of location or central tendency.
Expectation

• Suppose the random variable X can take value x₁ with probability p₁, value x₂ with probability p₂, and so on, up to value x_k with probability p_k.
• Then the expectation of this random variable X is defined as a weighted sum of the values x_i of X:
  $E(X) = x_1\Pr(x_1) + x_2\Pr(x_2) + \cdots + x_k\Pr(x_k) = \sum_{i=1}^{k} x_i\Pr(x_i)$
• Note that all the probabilities add up to one, that is,
  $\sum_{i=1}^{k}\Pr(x_i) = \Pr(x_1) + \Pr(x_2) + \cdots + \Pr(x_k) = 1$
• Thus the expected value can be viewed as the weighted average
  $E(X) = \sum_{i=1}^{k} x_i\Pr(x_i) = \frac{x_1\Pr(x_1) + x_2\Pr(x_2) + \cdots + x_k\Pr(x_k)}{1} = \frac{x_1\Pr(x_1) + x_2\Pr(x_2) + \cdots + x_k\Pr(x_k)}{\Pr(x_1) + \Pr(x_2) + \cdots + \Pr(x_k)} = \mu$
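As a quick illustration (an added sketch, not part of the original slide), the weighted sum above can be evaluated directly in MATLAB for a single fair die:

% Added sketch: expectation of one fair die as a probability-weighted sum
x  = 1:6;                 % possible values x_i
Pr = ones(1,6)/6;         % probabilities Pr(x_i); they sum to 1
EX = sum(x .* Pr);        % E[X] = sum_i x_i*Pr(x_i) = 3.5
fprintf('sum of probabilities = %g, E[X] = %g\n', sum(Pr), EX);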



What does it MEAN??



Expectation (ex.)

• Probability distribution of the average of two dice:

  Sample  Average   Sample  Average   Sample  Average
  (1,1)   1.0       (3,1)   2.0       (5,1)   3.0
  (1,2)   1.5       (3,2)   2.5       (5,2)   3.5
  (1,3)   2.0       (3,3)   3.0       (5,3)   4.0
  (1,4)   2.5       (3,4)   3.5       (5,4)   4.5
  (1,5)   3.0       (3,5)   4.0       (5,5)   5.0
  (1,6)   3.5       (3,6)   4.5       (5,6)   5.5
  (2,1)   1.5       (4,1)   2.5       (6,1)   3.5
  (2,2)   2.0       (4,2)   3.0       (6,2)   4.0
  (2,3)   2.5       (4,3)   3.5       (6,3)   4.5
  (2,4)   3.0       (4,4)   4.0       (6,4)   5.0
  (2,5)   3.5       (4,5)   4.5       (6,5)   5.5
  (2,6)   4.0       (4,6)   5.0       (6,6)   6.0

  Average x_k : 1.0   1.5   2.0   2.5   3.0   3.5   4.0   4.5   5.0   5.5   6.0
  Pr(x_k)     : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

  $\sum_{k=1}^{11}\Pr(x_k) = 1$

  $\mu = \frac{\sum_{i=1}^{N} x_i}{N} = \frac{\sum_{i=1}^{36} x_i}{36} = \frac{1.0 + 1.5 + 1.5 + \cdots + 5.5 + 6.0}{36} = \frac{126}{36} = 3.5$

  $\mu = \sum_{i=1}^{11} x_i \Pr(x_i) = 1.0\left(\frac{1}{36}\right) + 1.5\left(\frac{2}{36}\right) + \cdots + 6.0\left(\frac{1}{36}\right) = 3.5$

  (Bar chart of Pr(x_k) versus the averages x_k = 1.0, 1.5, ..., 6.0.)
Expectation (ex.)

• Probability distribution of the sum of two dice:

  Sample  Sum   Sample  Sum   Sample  Sum
  (1,1)   2     (3,1)   4     (5,1)   6
  (1,2)   3     (3,2)   5     (5,2)   7
  (1,3)   4     (3,3)   6     (5,3)   8
  (1,4)   5     (3,4)   7     (5,4)   9
  (1,5)   6     (3,5)   8     (5,5)   10
  (1,6)   7     (3,6)   9     (5,6)   11
  (2,1)   3     (4,1)   5     (6,1)   7
  (2,2)   4     (4,2)   6     (6,2)   8
  (2,3)   5     (4,3)   7     (6,3)   9
  (2,4)   6     (4,4)   8     (6,4)   10
  (2,5)   7     (4,5)   9     (6,5)   11
  (2,6)   8     (4,6)   10    (6,6)   12

  Sum x_k : 2     3     4     5     6     7     8     9     10    11    12
  Pr(x_k) : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

  $\sum_{k=1}^{11}\Pr(x_k) = 1$

  $\mu = \frac{\sum_{i=1}^{N} x_i}{N} = \frac{\sum_{i=1}^{36} x_i}{36} = \frac{2 + 3 + 3 + \cdots + 11 + 12}{36} = \frac{252}{36} = 7$

  $\mu = \sum_{i=1}^{11} x_i \Pr(x_i) = 2\cdot\frac{1}{36} + 3\cdot\frac{2}{36} + \cdots + 11\cdot\frac{2}{36} + 12\cdot\frac{1}{36} = 7$

  (Bar chart of Pr(x_k) versus the sums x_k = 2, 3, ..., 12.)
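Both tables can be reproduced by brute-force enumeration. The following MATLAB sketch is an added illustration; it tallies the 36 equally likely outcomes and checks that the expectations are 3.5 for the average and 7 for the sum.

% Added sketch: enumerate the 36 outcomes of two dice and check the expectations
[d1, d2] = meshgrid(1:6, 1:6);       % all ordered pairs (d1, d2)
sums = d1(:) + d2(:);                % the 36 sums
avgs = sums / 2;                     % the 36 averages
% Expectation as a plain arithmetic mean over the equally likely outcomes
fprintf('E[average] = %.2f, E[sum] = %.2f\n', mean(avgs), mean(sums));
% Expectation as a probability-weighted sum over the distinct values x_k
xk = unique(sums);                   % 2, 3, ..., 12
Pr = histc(sums, xk) / numel(sums);  % Pr(x_k) = count/36 -> 1/36, 2/36, ..., 1/36
fprintf('weighted sum for E[sum] = %.2f\n', sum(xk .* Pr));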
Expectation

Discrete case (reminder): $E[x] = \sum_{i=1}^{N} x_i \Pr(x_i)$

• For a continuous variable with a probability density function (PDF), say P(x), the expected (mean) value can be computed as
  $E[X] = \int_{-\infty}^{\infty} x P(x)\,dx$
  where $P(x)\,dx$ represents the probability that X lies in an infinitesimal range of width dx around x.
• For a non-negative variable,
  $E[X] = \int_{0}^{\infty} x P(x)\,dx$
• Under a specific region of interest $[T_1, T_2]$, it can be modified to read
  $E[X] = \int_{T_1}^{T_2} x P(x)\,dx$
• Thus we can interpret the formula for E[X] as a weighted integral of the values x of X, where the weights are the probabilities $P(x)\,dx$.
Expectation (ex.)

• Ex. Let X be uniformly distributed; that is, a random signal X ∈ [0, 1] with probability density function P(x) = 1. Find the expectation E[X].
  $E[X] = \int_0^1 x P(x)\,dx = \int_0^1 x \cdot 1\,dx = \left.\frac{x^2}{2}\right|_0^1 = \frac{1}{2}$

• It can be seen that, for the uniform distribution, the corresponding mean is at the midpoint of the range!

• Ex. Let X have range [0, 2] and probability density function P(x) = 3x²/8. Find E[X].
  $E[X] = \int_0^2 x P(x)\,dx = \int_0^2 x \cdot \frac{3}{8}x^2\,dx = \frac{3}{8}\int_0^2 x^3\,dx = \left.\frac{3x^4}{32}\right|_0^2 = 1.5$
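Both worked results can be cross-checked numerically; this is an added sketch using MATLAB's integral() function, not part of the original slide.

% Added sketch: numerical check of the two expectations above via integral()
E1 = integral(@(x) x .* 1, 0, 1);            % uniform PDF on [0,1]   -> 0.5
E2 = integral(@(x) x .* (3*x.^2/8), 0, 2);   % P(x) = 3x^2/8 on [0,2] -> 1.5
fprintf('E[X] for the uniform PDF on [0,1] = %.4f\n', E1);
fprintf('E[X] for P(x) = 3x^2/8 on [0,2]   = %.4f\n', E2);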



Expectation (ex.)

• Ex. A continuous variable X is exponentially distributed with a given parameter λ, where the probability density function (PDF) is described by
  $P(x) = \lambda e^{-\lambda x} := \lambda \exp(-\lambda x)$
  (The code below plots P(x) for λ = 1, 1.25, ..., 2; the value at x = 0 increases with λ.)
• Find the expected (mean) value. Using integration by parts, $\int u\,dv = uv - \int v\,du$, with $u = x \Rightarrow du = dx$ and $v = e^{-\lambda x} \Rightarrow dv = -\lambda e^{-\lambda x}\,dx$:
  $E[X] = \int_0^{\infty} x P(x)\,dx = \int_0^{\infty} x\,\lambda e^{-\lambda x}\,dx = -\left(uv\Big|_0^{\infty} - \int_0^{\infty} v\,du\right) = -\left(x e^{-\lambda x}\Big|_0^{\infty} - \int_0^{\infty} e^{-\lambda x}\,dx\right) = -\left(x e^{-\lambda x} + \frac{1}{\lambda} e^{-\lambda x}\right)\Big|_0^{\infty} = \frac{1}{\lambda}$
• Thus, the expected value is the reciprocal of the rate λ.

MATLAB (plot the PDF for several values of λ):
figure; plot(0,0); hold on
for Lambda = 1:0.25:2
    x = 0:0.01:5;
    P = Lambda*exp(-Lambda*x);
    plot(x,P)
end
xlabel('x')
ylabel('P(x)')
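As an added check (not on the original slide), the result E[X] = 1/λ can be verified numerically for each plotted λ:

% Added sketch: verify E[X] = 1/lambda for the exponential PDF
for Lambda = 1:0.25:2
    EX = integral(@(x) x .* Lambda .* exp(-Lambda*x), 0, Inf);
    fprintf('lambda = %.2f: E[X] = %.4f, 1/lambda = %.4f\n', Lambda, EX, 1/Lambda);
end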



Histogram

• A histogram (直方圖) is an accurate representation of the distribution of collected numerical data.
• A histogram reveals the central tendency of the signal and the frequency of occurrence of the data.
• Ex. A frequency histogram is a special graph that uses vertical columns to show frequencies (how many times each score occurs).



Histogram

• The normalized frequency distribution can be calculated by considering
  Normalized Freq = counts / total samples → probability Pr(x)
  Σ Normalized Freq = Σ Pr(x) = 1
• It displays the distribution of the probabilities of occurrence.
• The probability density function (PDF), represented by P(x), can be calculated by
  P(x) = Normalized Freq / Interval Size (dx)

  Category        # of students    Normalized Freq           PDF: p(x) = Freq / width
  (score range)   (40 students)    (= quantity / total)      (dx = 20)
  E (1-20)         2               0.05  (= 2/40)            0.0025  (= 0.05/20)
  D (21-40)        8               0.2   (= 8/40)            0.01    (= 0.2/20)
  C (41-60)       16               0.4   (= 16/40)           0.02    (= 0.4/20)
  B (61-80)       10               0.25  (= 10/40)           0.0125  (= 0.25/20)
  A (81-100)       4               0.1   (= 4/40)            0.005   (= 0.1/20)
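The table entries follow mechanically from the counts. The short MATLAB sketch below is an added illustration using exactly the counts in the table:

% Added sketch: from raw counts to normalized frequency and probability density
counts = [2 8 16 10 4];          % students per grade band E, D, C, B, A
N      = sum(counts);            % total number of students (40)
dx     = 20;                     % width of each score interval
freq_norm = counts / N;          % normalized frequency = Pr(x); sums to 1
pdf_est   = freq_norm / dx;      % probability density = Pr(x)/interval width
disp([freq_norm; pdf_est])       % row 1: Pr(x); row 2: density p(x)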
Histogram

• Consider a score vector:
  Score = [0,1,1,2,2,2,3,3,3,3,4,4,4,4,4,5,5,5,5,5,5,6,6,6,6,6,7,7,7,7,8,8,8,9,9,10];
  There are 36 samples in total.
• Please draw the counter-based frequency histogram and the normalized frequency histogram using 11 intervals.



Histogram

• Since there are 11 intervals for the Score vector, the width of each interval can be calculated by
  $dx = \frac{\max(X) - \min(X)}{\text{number of intervals}}$
• The bar areas of the resulting probability-density histogram sum to one: $\sum_{k=1}^{N} \mathrm{bar}_k\ \mathrm{area} = 1$.
• When the number of samples tends to infinity and the resolution of the histogram dx → 0, it forms a continuous PDF.

MATLAB:
close all
Score = [0,1,1,2,2,2,3,3,3,3,4,4,4,4,4,5,5,5,5,5,5, ...
         6,6,6,6,6,7,7,7,7,8,8,8,9,9,10];
Interval = 11;
[Count, Bin] = hist(Score, Interval);
Freq_normalized = Count/sum(Count);
%% Width of interval
dx = (max(Score) - min(Score)) / Interval;
%% Probability density (function) calculation
PDF = Freq_normalized/dx;
%% Draw the frequency histogram
figure
hist(Score, Interval);
grid on;
xlabel('value')
ylabel('Counter or Frequency')
%% Draw the normalized frequency histogram
figure
bar(Bin, Freq_normalized)
grid on;
xlabel('x(t)')
ylabel('Normalized Frequency')
%% Draw the probability density
figure
bar(Bin, PDF)
grid on;
xlabel('x(t)')
ylabel('Probability density (function)')
Probability and its Density Function

• The probability Pr of the random variable X falling within a particular range of values is given by the integral of this variable's density over that range:
  $\Pr[a \le X \le b] = \int_a^b P(x)\,dx$
  where P(x) is a probability density function (PDF).
• Note that the value of the PDF can be greater than 1:
  $P(x) = \lim_{\Delta x \to 0} \frac{\Pr[x \le X \le x + \Delta x]}{\Delta x}$
• This shows that the probability is given by the area under the curve (i.e., the PDF) between the values of interest.
• Note that the total area (i.e., total probability) under the curve must equal 1:
  $\Pr[-\infty \le X \le +\infty] = \int_{-\infty}^{+\infty} P(x)\,dx = 1$



Probability and its Density Function

• The probability that X lies between x₁ and x₂ is the area under the PDF P(x) from x₁ to x₂:
  $\Pr[x_1 \le X \le x_2] = \int_{x_1}^{x_2} P(x)\,dx \approx \sum P(x)\,\Delta x$
• Based on the distribution, it can be seen that the smaller the increment Δx we take, the more accurate the probability approximation becomes.

(Figure: PDF P(x) with the area between x₁ and x₂ approximated by rectangles of width Δx, i.e. a rectangular integration approximation; the total area satisfies $\Pr[-\infty \le X \le +\infty] = \int_{-\infty}^{+\infty} P(x)\,dx = 1$.)
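The rectangular approximation can be illustrated with a short added sketch; the standard normal PDF is used here purely as an assumed example.

% Added sketch: probability as the area under a PDF, approximated by rectangles
P  = @(x) exp(-x.^2/2) / sqrt(2*pi);   % assumed example PDF (standard normal)
x1 = -1;  x2 = 1;
for dx = [0.5 0.1 0.01]
    xl = x1:dx:(x2 - dx);              % left edges of the rectangles
    approx = sum(P(xl) * dx);          % sum of P(x)*dx over [x1, x2]
    fprintf('dx = %.2f: Pr[x1 <= X <= x2] ~ %.4f\n', dx, approx);
end
fprintf('integral() result:            %.4f\n', integral(P, x1, x2));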
Probability and its Density Function (Ex.)

• Example: determine the value of k that makes the given function P(x) a PDF on the interval 0 ≤ x ≤ 2:
  $P(x) = k\,(3x^2 + 1)$

MATLAB (plot of P(x) with k = 1/10):
x = 0:0.01:2;
k = 1/10;
pdf = k*(3*x.^2+1);
figure
plot(x,pdf); grid on;
xlabel('x'); ylabel('PDF p(x)');
set(gca,'FontSize',16)

• By definition, if P(x) is a PDF on the given interval, it must satisfy
  $\Pr[a \le X \le b] = \int_a^b P(x)\,dx = 1$
• Therefore, one has
  $\Pr[0 \le X \le 2] = \int_0^2 k\,(3x^2 + 1)\,dx = k\left(x^3 + x\right)\Big|_0^2 = 10k$
• To guarantee $\Pr[0 \le X \le 2] = 1$, we need $k = \frac{1}{10}$.
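If the Symbolic Math Toolbox is available, the same value of k can be recovered symbolically; this is an added sketch, not part of the original slide.

% Added sketch (requires the Symbolic Math Toolbox): solve for k symbolically
syms x k
k_sol = solve(int(k*(3*x^2 + 1), x, 0, 2) == 1, k);   % integral over [0,2] = 1
disp(k_sol)                                           % returns 1/10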
Probability and its Density Function (Ex.)

• Consider $f(x) = x^3(10 - x)/5000$ for the range 0 ≤ x ≤ 10, and f(x) = 0 for all other values of x. Please answer the following questions:
  (a) Show that f(x) is a PDF.
  (b) Find Pr(1 ≤ x ≤ 4).
  (c) Find Pr(x ≥ 6).

MATLAB (plot of f(x)):
x = 0:0.01:10;
f = x.^3.*(10-x)/5000;
figure
plot(x,f);
grid on;
xlabel('x')
ylabel('PDF p(x)')
set(gca,'FontSize',16)

• (a) $\Pr[0 \le X \le 10] = \frac{1}{5000}\int_0^{10} x^3(10 - x)\,dx = \frac{1}{5000}\left(\frac{10}{4}x^4 - \frac{1}{5}x^5\right)\Big|_0^{10} = 1$
• (b) $\Pr[1 \le X \le 4] = \frac{1}{5000}\left(\frac{10}{4}x^4 - \frac{1}{5}x^5\right)\Big|_1^{4} = 0.0866 = 8.66\%$
• (c) $\Pr[X \ge 6] = \frac{1}{5000}\left(\frac{10}{4}x^4 - \frac{1}{5}x^5\right)\Big|_6^{10} = 0.663 = 66.3\%$
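An added numerical cross-check of (a)-(c), again using integral():

% Added sketch: numerical cross-check of the three probabilities above
f = @(x) x.^3 .* (10 - x) / 5000;                 % the PDF from this example
fprintf('(a) total area      = %.4f\n', integral(f, 0, 10));
fprintf('(b) Pr(1 <= x <= 4) = %.4f\n', integral(f, 1, 4));
fprintf('(c) Pr(x >= 6)      = %.4f\n', integral(f, 6, 10));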
Probability Density Function

• Therefore, for a continuous random variable x defined on the interval [a, b] or on (-∞, +∞), the total probability can be represented respectively by
  $\Pr(a \le x \le b) = \int_a^b P(x)\,dx = 1 \qquad \Pr(-\infty \le x \le \infty) = \int_{-\infty}^{\infty} P(x)\,dx = 1$
  where the left-hand side is a probability (a value) and P(x) is the probability density function.
• Remark: if P(x) is said to be a PDF, then its accumulation (integral) over the given range must be one.
• Apparently, the PDF of a continuous random variable allows us to calculate the probability of a value falling within an interval of interest.
Probability Density Function

• So far, it has been shown that the PDF of a continuous random variable allows us to calculate the probability of a value falling within an interval:
  $\Pr[x_1 \le X \le x_2] = \int_{x_1}^{x_2} P(x)\,dx \approx \sum P(x)\,\Delta x$
• For the discrete case, the PDF is related to the frequency distribution, which is calculated by the following limiting process:
  $P(x) = \lim_{\Delta x \to 0}\frac{1}{\Delta x}\left(\lim_{N \to \infty}\sum_{i=1}^{K}\frac{n_i}{N}\right) = \lim_{N \to \infty,\ \Delta x \to 0}\sum_{i=1}^{K}\frac{f_i}{\Delta x} = \lim_{N \to \infty,\ \Delta x \to 0}\sum_{i=1}^{K} f_i^{*}$
  where the inner sum is the probability Pr(x) within the range of interest, and
  $f_i := n_i / N \ge 0$ represents a normalized frequency;
  $f_i^{*} := f_i / \Delta x \ge 0$ represents a frequency density.



Probability Density Function (ex.)

• Consider the PDF of a sinewave. Note that the PDF value can be greater than 1.

MATLAB:
close all; clear; clc;
t = 0:0.000001:1;
x = sin(2*pi*t);
Interval = 1000;
[Count, Bin] = hist(x, Interval);
Total_N = sum(Count);
dx = 2/Interval;
Frequency = Count/Total_N;
Px = Frequency/dx;
figure
bar(Bin, Px, dx); grid on;
xlabel('x(t)')
ylabel('P(x)')

Sample Count and Bin values (the end bins have the largest counts):
Count: 143567 61266 48350 41984 38166 ....... 38166 41984 48350 61266 143567
Bin:   -0.9500 -0.8500 -0.7500 -0.6500 -0.5500 ........ 0.5500 0.6500 0.7500 0.8500 0.9500



Uniform Distribution

• The uniform distribution (or rectangular distribution) is a family of symmetric probability distributions.
• The uniform distribution has random variable X restricted to a finite interval [a, b], and P(x) has constant density over that interval.

(Figure: rectangular PDF of constant height 1/(b - a) on [a, b]; Area = 1; mean = (a + b)/2.)
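A small added sketch, assuming an arbitrary interval [a, b] = [2, 5], checks the unit area and the midpoint mean numerically:

% Added sketch: uniform distribution on an assumed interval [a, b] = [2, 5]
a = 2;  b = 5;
P = @(x) (x >= a & x <= b) / (b - a);        % constant density 1/(b-a) on [a, b]
area    = integral(P, a, b);                 % should equal 1
mu_num  = integral(@(x) x .* P(x), a, b);    % E[X] by direct integration
mu_theo = (a + b) / 2;                       % midpoint of the interval
fprintf('area = %.4f, mean (numeric) = %.4f, (a+b)/2 = %.4f\n', area, mu_num, mu_theo);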
Uniform Distribution (ex.)

• Histogram of uniform noise (the sample mean muX is approximately 50).

MATLAB:
close all; clear; clc;
%% Generate uniform noise from 0~100
X = round(100*rand(100000,1));
Ind_Num = [];
muX = mean(X);          % sample mean, approximately 50
%% Plot histogram via hist
figure
hist(X,101)
grid on;
xlabel('X values')
ylabel('Counts')
%% Plot histogram (another method: count each value directly)
range = 0:100;
for i = range
    ind = find(X == i);
    ind_num = length(ind);
    Ind_Num = [Ind_Num; ind_num];
end
figure
bar(range, Ind_Num)
grid on;
axis([0 100 0 1200])
xlabel('X values')
ylabel('Counts')


Summary of Types of Distribution

(Figure: example distribution shapes illustrating negative skewness and positive skewness; after David P. Doane and Lori E. Seward.)
Properties of Expectation

• The following properties of E(X) for continuous variables are the same as for discrete ones.
  Continuous version: $\mu = E[X] = \int_a^b x P(x)\,dx$    Discrete version: $\mu = E[X] = \sum_{i=1}^{n} x_i P(x_i)$
• P1. For given constants a and b,
  $E[aX + b] = aE[X] + b = a\mu_X + b$
• P2. If X and Y are random variables on a sample space,
  $E[X + Y] = E[X] + E[Y] = \mu_X + \mu_Y$
• P3. If X and Y are independent random variables, then
  $E[XY] = E[X]\cdot E[Y] = \mu_X\cdot\mu_Y$
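A Monte-Carlo sanity check of P1-P3 can be added in a few lines; the particular distributions and constants below are assumptions chosen purely for illustration.

% Added sketch: Monte-Carlo check of properties P1-P3
rng(1);
N = 1e6;
X = randn(N,1);  Y = rand(N,1);   % two independent random variables
a = 3;  b = 2;                    % arbitrary constants for P1
fprintf('P1: E[aX+b] = %.3f vs a*E[X]+b  = %.3f\n', mean(a*X + b), a*mean(X) + b);
fprintf('P2: E[X+Y]  = %.3f vs E[X]+E[Y] = %.3f\n', mean(X + Y), mean(X) + mean(Y));
fprintf('P3: E[XY]   = %.3f vs E[X]*E[Y] = %.3f\n', mean(X .* Y), mean(X) * mean(Y));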



Proof of Properties of Expectation

• Proof of P1: $E[aX + b] = aE[X] + b = a\mu_X + b$
• By definition, we first consider (for $a \in \mathbb{R}$)
  $E[aX] = \int_{-\infty}^{+\infty} a x P(x)\,dx = a\int_{-\infty}^{+\infty} x P(x)\,dx = a\,E[X]$
• As a result (for $a, b \in \mathbb{R}$),
  $E[aX + b] = \int_{-\infty}^{+\infty} (ax + b)\,P(x)\,dx = \int_{-\infty}^{+\infty} a x P(x)\,dx + \int_{-\infty}^{+\infty} b\,P(x)\,dx = a\int_{-\infty}^{+\infty} x P(x)\,dx + b\int_{-\infty}^{+\infty} P(x)\,dx = a\,E[X] + b\cdot 1 = a\,\mu_X + b$   Q.E.D.
  (The last integral equals 1 by the definition of a PDF.)
Proof of Properties of Expectation

• Proof of P2: $E[X + Y] = E[X] + E[Y] = \mu_X + \mu_Y$
  $E[X + Y] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x + y)\,P(x, y)\,dx\,dy$
  $\qquad = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} \big(xP(x, y) + yP(x, y)\big)\,dx\,dy$
  $\qquad = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} xP(x, y)\,dx\,dy + \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} yP(x, y)\,dx\,dy$
  $\qquad = \int_{-\infty}^{+\infty} x\left(\int_{-\infty}^{+\infty} P(x, y)\,dy\right)dx + \int_{-\infty}^{+\infty} y\left(\int_{-\infty}^{+\infty} P(x, y)\,dx\right)dy$
  $\qquad = \int_{-\infty}^{+\infty} x P_X(x)\,dx + \int_{-\infty}^{+\infty} y P_Y(y)\,dy = E[X] + E[Y]$   Q.E.D.
• Note that the above result uses the marginal PDFs:
  $P_X(x) = \int_{-\infty}^{+\infty} P(x, y)\,dy$  (marginal PDF of X)
  $P_Y(y) = \int_{-\infty}^{+\infty} P(x, y)\,dx$  (marginal PDF of Y)

https://en.wikipedia.org/wiki/Marginal_distribution
Proof of Properties of Expectation

• Proof of P3: $E[XY] = E[X]\,E[Y] = \mu_X\,\mu_Y$
• Again by definition, with $P(x, y)$ the joint probability density,
  $E[XY] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} xy\,P(x, y)\,dx\,dy = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} xy\,P(x)P(y)\,dx\,dy = \int_{-\infty}^{+\infty} x P(x)\,dx \cdot \int_{-\infty}^{+\infty} y P(y)\,dy = E[X]\cdot E[Y]$   Q.E.D.
• Note that the above result uses the fact that if X is independent of Y, then the joint PDF factorizes (independence in probability theory):
  $P(x, y) = P(x)\,P(y)$
https://en.wikipedia.org/wiki/Independence_(probability_theory)
Homework

• Q1. As shown in the figure, please prove that the expectation of a uniform distribution is (5%)
  $E[X] = \frac{b + a}{2}, \qquad P(x) = \frac{1}{b - a}$
  Hint: $E[X] = \int_{-\infty}^{\infty} x P(x)\,dx$
• Also show that the corresponding variance is (5%)
  $\mathrm{Var}[X] = \frac{(b - a)^2}{12}$
  Hint: $\mathrm{Var}(X) = E\!\left[X^2\right] - E[X]^2$ and $E\!\left[X^2\right] = \int_{-\infty}^{\infty} x^2 P(x)\,dx$

(Figure: rectangular PDF of height 1/(b - a) on [a, b].)
Homework

• Q2. Is the following function a PDF or not for x ∈ [0, +∞)? (5%)
  $P(x) = \lambda e^{-\lambda x} := \lambda \exp(-\lambda x)$
• Please give a proof.
• Hint: recall the condition for a function to be said to be a PDF.

The PDF above is not the PDF!



Homework

• Q3. Consider the PDF $P(x) = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}$ for the range (-∞, +∞). Please find E[X]. (10%)
  $E[X] = \int_{-\infty}^{+\infty} x P(x)\,dx = \int_{-\infty}^{+\infty} x\cdot\frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = \;?$
• Hint: use the change of variable $u = \frac{1}{2}x^2$.



Homework

• Q4. Consider a Gaussian distribution, where the PDF is of the form
  $f(x \mid \mu, \sigma^2) = \frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)$
• Please find E(X). (10%)
  $E(X) = \int_{-\infty}^{\infty} x\,f(x \mid \mu, \sigma^2)\,dx = \;?$
• Hint: a change of variable $u = \frac{x - \mu}{\sqrt{2}\,\sigma}$ is needed.



Homework

• Q5. Let X be a continuous random variable with the following density function. (10%)
  $P(x) = \begin{cases} k\,x^{-2} & \text{for } 1 \le x \le 2 \\ 0 & \text{otherwise} \end{cases}$
• (a) If P(x) is said to be a PDF, please find k.
• (b) Based on (a), please find E(X).



Homework

• Q6. Consider a white noise signal, which is generated by

MATLAB:
t = 0:0.001:100;
noise = randn(length(t),1);
figure
plot(t,noise);
hold on; grid on;
xlabel('Time(s)')
ylabel('Noise')

• Please complete the following questions: (30%)
  • Draw the counter-based histogram with 100 intervals.
  • Draw the normalized-frequency histogram with 100 intervals.
  • Draw the PDF histogram with 100 intervals.
  • What are the mean value and the standard deviation?
  • Provide your MATLAB code.

Hint: standard deviation $\sigma = \sqrt{\dfrac{\sum_{i=1}^{N}(x_i - \mu)^2}{N}}$



Homework

• Q7. Consider a probability density function (15%)
  $P(x) = \begin{cases} 30\,x^2 (1 - x)^2 & 0 \le x \le 1 \\ 0 & \text{otherwise} \end{cases}$
• Please prove that
  $\Pr(-\infty \le x \le \infty) = \int_{-\infty}^{\infty} P(x)\,dx = 1$
• Also find the probability of x falling within the following ranges:
  (a) Pr(x ≤ 1/4)
  (b) Pr(x ≤ 1/2)
  (c) Pr(1/4 ≤ x ≤ 1/2)
  (d) Pr(1/2 ≤ x ≤ 1)

(Figure: plot of P(x) on [0, 1] with total Area = 1.)
Homework

• Q8. Given two random signals S₁ and S₂,
  $S_1 = a_1 X + b_1, \quad a_1 \in \mathbb{R},\ b_1 \in \mathbb{R}$
  $S_2 = a_2 Y + b_2, \quad a_2 \in \mathbb{R},\ b_2 \in \mathbb{R}$
  where X and Y are independent.
• Please prove the following result. (10%)
  $E(S_1 S_2) = a_1 a_2\,\mu_X \mu_Y + a_1 b_2\,\mu_X + a_2 b_1\,\mu_Y + b_1 b_2$
• where $\mu_X = E(X)$ and $\mu_Y = E(Y)$.

