RVSP
02/May/2007
1 of 30
ganesh.crl@gmail.com
Part-1: Probability
Part-2: Random Variables with Applications to Signal Processing
Part-3: Random Processes with Applications to Signal Processing
Contents: 1. Probability

Why study Probability?
Four approaches to the definition of Probability
A priori and A posteriori Probabilities
Concepts of Joint, Marginal, Conditional and Total Probabilities
Bayes' Theorem and its applications
Independent events and their properties
Tips and Tricks
Example
Probability: Part 1 of 3
Notes :
1. {R.G.Brown}, pp1
2. {S&W}, pp2
3. {S&W}, pp2
4. {Papoulis}, pp1 [4.1 Add: electron emission, telephone calls, queueing theory, quality control, etc.]
5. Extra: {Peebles}, pp2: How do we characterize random signals? One: how to describe any one of a variety of random phenomena (the contents shown in Random Variables are required). Two: how to bring time into the problem so as to create the random signal of interest (the contents shown in Random Processes are required). ALL these CONCEPTS are based on PROBABILITY theory.
Probability as Intuition

Probability as the ratio of Favorable to Total Outcomes (classical definition):
P[A] = n_E / n

Probability as a measure of Frequency of Occurrence (relative-frequency definition):
P[A] = lim (n -> infinity) n_E / n
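As a quick check of the relative-frequency definition, the sketch below estimates P[A] as n_E / n over many trials and compares it with the classical ratio of favorable to total outcomes. The fair-die event is my own illustrative example, not from the slides.

```python
import random

random.seed(0)

# Event A: a fair die shows an even face.
# Classical definition: 3 favorable outcomes out of 6 equally likely ones.
p_classical = 3 / 6

# Relative-frequency definition: n_E / n for a large number of trials n.
n = 100_000
n_E = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)
p_frequency = n_E / n

print(p_classical, p_frequency)
```

For large n the two definitions agree, which is exactly what the limit above asserts.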
(i) P(A1 + A2 + ... + An) = 1
A posteriori Probability
Reasoning from the observed facts
Notes :
1. {S&W}, pp3
2. Also called Prior Probability and Posterior Probability.
3. Their role appears in Bayes' Theorem. Priors come in two types: informative priors and uninformative (vague/diffuse) priors; refer {Kemp}, pp41-42.
Let A1 + A2 + ... + An be a partition of A, and
B1 + B2 + ... + Bn be a partition of B.
Notes :
1. {R.G.Brown}, pp12-13
2. Conditional probability, in contrast, is usually explained through the relative-frequency interpretation of probability; see, for example, {S&W}, pp16.
Joint and marginal probabilities:

          Event B1    Event B2    ...   Event Bn    Marginal
Event A1  P(A1 B1)    P(A1 B2)    ...   P(A1 Bn)    P(A1)
Event A2  P(A2 B1)    P(A2 B2)    ...   P(A2 Bn)    P(A2)
...
Event An  P(An B1)    P(An B2)    ...   P(An Bn)    P(An)
Marginal  P(B1)       P(B2)      ...   P(Bn)       SUM = 1
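The joint/marginal table can be checked numerically. The sketch below uses a made-up 2x3 joint table (my own numbers, purely illustrative) and recovers the marginals P(Ai) and P(Bj) as row and column sums, with the grand total equal to 1.

```python
# Hypothetical joint probability table P(Ai Bj): rows are events Ai,
# columns are events Bj.  The numbers are illustrative only.
joint = {
    ("A1", "B1"): 0.10, ("A1", "B2"): 0.20, ("A1", "B3"): 0.10,
    ("A2", "B1"): 0.15, ("A2", "B2"): 0.25, ("A2", "B3"): 0.20,
}

P_A = {}
P_B = {}
for (a, b), p in joint.items():
    P_A[a] = P_A.get(a, 0.0) + p   # marginal P(Ai) = sum over j of P(Ai Bj)
    P_B[b] = P_B.get(b, 0.0) + p   # marginal P(Bj) = sum over i of P(Ai Bj)

total = sum(joint.values())        # grand total over all cells must be 1
print(P_A, P_B, total)
```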
P(A | B) = P(AB) / P(B),  provided P(B) != 0.
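A minimal worked instance of the definition, using a two-dice event of my own choosing (not from the slides): enumerate the equally likely outcomes and apply P(A|B) = P(AB)/P(B) directly.

```python
from itertools import product

# Two fair dice: A = "the sum is at least 9", B = "the first die shows 6".
outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
P = lambda event: sum(1 for w in outcomes if event(w)) / len(outcomes)

A = lambda w: w[0] + w[1] >= 9
B = lambda w: w[0] == 6
AB = lambda w: A(w) and B(w)

# P(AB) = 4/36 (second die in {3,4,5,6}), P(B) = 6/36, so P(A|B) = 2/3.
p_A_given_B = P(AB) / P(B)        # valid since P(B) != 0
print(p_A_given_B)
```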
P(B) = P(B|A1)P(A1) + P(B|A2)P(A2) + ... + P(B|An)P(An)
Notes :
1. {S&W}, pp20
2. "Average" because the expression looks like averaging; "Total" because P(B) is a sum of parts.
3. Shaded in the slide is the Total Probability Theorem.
One form:

P(A | B) = P(AB) / P(B)   and   P(B | A) = P(AB) / P(A),

hence

P(A | B) = P(B | A) P(A) / P(B).

Other form:

P(Ai | B) = P(B | Ai) P(Ai) / [ sum from i = 1 to n of P(B | Ai) P(Ai) ]
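The second form combines the Total Probability Theorem (denominator) with Bayes' Theorem. The sketch below applies it to a hypothetical three-machine factory example; the machines, shares, and defect rates are my own numbers, chosen only to exercise the formulas.

```python
# Hypothetical partition: machines A1..A3 produce 50/30/20 % of all items
# (the priors P(Ai)); their defect rates are the likelihoods P(B|Ai),
# where B = "the item is defective".
prior = {"A1": 0.50, "A2": 0.30, "A3": 0.20}
defect_rate = {"A1": 0.01, "A2": 0.02, "A3": 0.03}

# Total probability: P(B) = sum_i P(B|Ai) P(Ai)
P_B = sum(defect_rate[a] * prior[a] for a in prior)

# Bayes: P(Ai|B) = P(B|Ai) P(Ai) / P(B)
posterior = {a: defect_rate[a] * prior[a] / P_B for a in prior}
print(P_B, posterior)
```

The posteriors sum to 1, as they must for a partition.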
Notes :
1. {Peebles}, pp16
2. What about P(A) and P(B): should both be non-zero, or only P(B)?
3. The Ai form a partition of the sample space; B is any event on the same space.
BSC (Binary Symmetric Channel)

Transmit 0 or 1 --> Receive 0 or 1
Given: P(0r | 1t) = 0.1; the channel is symmetric, so 0t is no different: P(1r | 0t) = 0.1.
Find: P(0r) = ? and P(1r) = ?
System errors (BER): out of 100 zeros/ones received, how many are in error?
Notes :
1. {Peebles} pp17
2. BSC Transition Probabilities
P(0t | 0r) = 0.931    P(1t | 0r) = 0.069
P(0t | 1r) = 0.143    P(1t | 1r) = 0.857
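These posteriors follow from Bayes' Theorem with the priors P(0t) = 0.6 and P(1t) = 0.4 used in the notes and the crossover probability 0.1. A sketch reproducing them:

```python
# BSC parameters: priors from the notes, crossover probability from the slide.
P_0t, P_1t = 0.6, 0.4
eps = 0.1                                  # P(0r|1t) = P(1r|0t) = 0.1

# Received-symbol probabilities via the Total Probability Theorem
P_0r = (1 - eps) * P_0t + eps * P_1t       # = 0.58
P_1r = eps * P_0t + (1 - eps) * P_1t       # = 0.42

# A posteriori (transmit-given-receive) probabilities via Bayes' Theorem
P_0t_0r = (1 - eps) * P_0t / P_0r
P_1t_0r = eps * P_1t / P_0r
P_0t_1r = eps * P_0t / P_1r
P_1t_1r = (1 - eps) * P_1t / P_1r
print(round(P_0t_0r, 3), round(P_1t_0r, 3),
      round(P_0t_1r, 3), round(P_1t_1r, 3))
```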
Notes :
1. {Peebles}
2. Average BER of the system is [(14 x 60%) + (6.9 x 40%)] = 11.16% > 10% (the channel error rate). This is due to the unequal probabilities of 0t and 1t.
3. What happens if 0t and 1t are equiprobable? Then P(1t|0r) = 10% = P(0t|1r), and the average BER of the system is [(10 x 50%) + (10 x 50%)] = 10% = the channel error rate.
4. Add: Bayesian methods of inference involve the systematic formulation and use of Bayes' Theorem. These approaches are distinguished from other statistical approaches in that, prior to obtaining the data, the statistician formulates degrees of belief concerning the possible models that may give rise to the data. These degrees of belief are regarded as probabilities. {Kemp}, pp41. Posterior odds are equal to the likelihood ratio times the prior odds. [Note: Odds on A = P(A)/P(Ac); Ac = A complement.]
Test for Independence: P(AB) = P(A) P(B)
Notes :
1. {Peebles}, pp19
2. Can two independent events be mutually exclusive? Never (see the first point in the slide: when both P(A) and P(B) are non-zero, how can P(AB) be zero?).
Three events A, B and C are independent only when all of the following hold:

P(AB) = P(A) P(B)
P(BC) = P(B) P(C)
P(AC) = P(A) P(C)
P(ABC) = P(A) P(B) P(C)
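The triple-product condition is not implied by the pairwise ones. The classic two-coin counterexample (my own rendering, not from the slides) satisfies all three pairwise equations yet fails P(ABC) = P(A)P(B)P(C):

```python
from itertools import product

# Two fair coins: A = first coin heads, B = second coin heads,
# C = both coins show the same face.
outcomes = list(product("HT", repeat=2))          # 4 equally likely outcomes
P = lambda ev: sum(1 for w in outcomes if ev(w)) / len(outcomes)

A = lambda w: w[0] == "H"
B = lambda w: w[1] == "H"
C = lambda w: w[0] == w[1]

# All three pairwise products match: each pair intersects in probability 1/4.
pairwise = (P(lambda w: A(w) and B(w)) == P(A) * P(B)
            and P(lambda w: A(w) and C(w)) == P(A) * P(C)
            and P(lambda w: B(w) and C(w)) == P(B) * P(C))

# But P(ABC) = P(HH) = 1/4, while P(A)P(B)P(C) = 1/8.
mutual = P(lambda w: A(w) and B(w) and C(w)) == P(A) * P(B) * P(C)
print(pairwise, mutual)
```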
Notes :
1. {Peebles}, pp19-20
2. How many equations are needed for N events to be independent? 2^N - 1 - N, i.e., NC2 + NC3 + ... + NCN (the full binomial sum 2^N includes the 1 + N terms NC0 and NC1, which impose no conditions; subtract them).
Notes :
1. {Peebles} pp20
Example: from simple textbook examples to practical problems of interest.

Day-trading strategy: A box contains n randomly numbered balls (not 1 through n, but arbitrary numbers, including numbers greater than n). Suppose a fraction of those balls, say m = np with p < 1, is initially drawn one by one with replacement, while noting the numbers on those balls. The drawing is then allowed to continue until a ball is drawn with a number larger than all of the first m numbers.
Notes :
1. This example and all notes relating to it are taken, with humble gratitude, from S. Unnikrishnan Pillai's web support for the book: A. Papoulis and S. Unnikrishnan Pillai, Probability, Random Variables and Stochastic Processes, 4th Ed.: McGraw Hill, 2002.
Example (continued)

Let Xk = { the (k+1)st drawn ball has the largest number among all n balls, and the largest number among the first k balls is in the group of the first m balls }, k > m. Write Xk as the joint occurrence of these two events, A and B.
Example (continued)

Notice that A and B are independent events, hence

P(Xk) = P(A) P(B) = (1/n)(m/k) = (1/n)(np/k) = p/k,

and summing over all valid stopping points,

P = sum from k = m to n-1 of P(Xk) = sum from k = np to n-1 of p/k ≈ p ln(n/np) = p ln(1/p).
Example (continued)

Maximization of the desired probability with respect to p gives

d/dp [ p ln(1/p) ] = -(1 + ln p) = 0,

or

p = e^(-1) ≈ 0.3679.

The maximum value for the desired probability of drawing the largest number also equals e^(-1) ≈ 0.3679.
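A Monte Carlo sketch of the strategy agrees with the e^(-1) ≈ 0.37 result. It assumes n distinct arbitrary numbers presented in random order (n = 100 and the trial count are my own parameter choices): skip the first m ≈ n/e draws, then stop at the first number that beats all of them, and count how often that stop is the overall maximum.

```python
import random

random.seed(1)

def trial(n, m):
    """One run of the strategy; returns True if it picks the overall maximum."""
    balls = random.sample(range(10 * n), n)   # n distinct arbitrary numbers,
                                              # already in random draw order
    best_seen = max(balls[:m])                # best among the first m draws
    for x in balls[m:]:
        if x > best_seen:                     # first number beating the first m
            return x == max(balls)            # did we stop on the true maximum?
    return False                              # never stopped: strategy fails

n, trials = 100, 20_000
m = 37                                        # round(n * e**-1)
wins = sum(trial(n, m) for _ in range(trials))
print(wins / trials)
```

The empirical success rate hovers near 0.37, matching the p ln(1/p) analysis at p = 1/e.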
Notes :
1. Interestingly, the above strategy can be used to play the stock market.
2. Suppose one gets into the market and decides to stay up to 100 days. The stock values fluctuate day by day, and the important question is when to get out.
3. According to the above strategy, one should get out at the first opportunity after 37 days, when the stock value exceeds the maximum among the first 37 days. In that case the probability of hitting the top value over 100 days for the stock is also about 37%. Of course, the above argument assumes that the stock values over the period of interest are randomly fluctuating without exhibiting any other trend.
4. Interestingly, such is the case if we consider shorter time frames such as intra-day trading. In summary, if one must day-trade, then a possible strategy might be to get in at 9.30 AM and get out any time after 12 noon (9.30 AM + 0.3679 x 6.5 hrs = 11.54 AM, to be precise) at the first peak that exceeds the peak value between 9.30 AM and 12 noon. In that case the chances are about 37% that one hits the absolute top value for that day! (Disclaimer: trade at your own risk.)
5. Author's note: The same example can be found in other contexts, e.g., Puzzle No. 34, "The Game of Googol", from {M.Gardner}, and the ancient Indian concept of Swayamvara, to name a few.
What Next? Part-2 and Part-3.
References

1. A. Papoulis and S. Unnikrishnan Pillai, Probability, Random Variables and Stochastic Processes, 4th Ed.: McGraw Hill, 2002. {Papoulis}
2. Henry Stark and John W. Woods, Probability and Random Processes with Applications to Signal Processing, 3rd Ed.: Pearson Education, 2002. {S&W}
3. Peyton Z. Peebles, Jr., Probability, Random Variables and Random Signal Principles, 2nd Ed.: McGraw Hill, 1987. {Peebles}
4. Norman L. Johnson, Adrienne W. Kemp and Samuel Kotz, Univariate Discrete Distributions, 3rd Ed.: Wiley, 2005. {Kemp}
5. A. N. Kolmogorov, Foundations of the Theory of Probability: Chelsea, 1950. {Kolmogorov}
6. Robert Grover Brown, Introduction to Random Signal Analysis and Kalman Filtering: John Wiley, 1983. {R.G.Brown}
7. J. L. Doob, Stochastic Processes: John Wiley, 1953. {Doob}
8. Martin Gardner, My Best Mathematical and Logic Puzzles: Dover Publications, Inc., New York, 1994. {M.Gardner}
Notes :
1. The codes shown in { } brackets are used to annotate the references in the notes area.