Notes :
1. Distribution and Density functions/ Discrete, Continuous and Mixed Random variables/ Specific Random variables: Discrete and Continuous/ Conditional and Joint Distribution and Density functions
2. Why functions of Random variables are important to signal processing/ Transformations of Continuous and Discrete Random variables/ Expectation/ Moments: Moments about the origin, Central Moments, Variance and Skew/ Functions that give Moments:
Characteristic Function and Moment Generating Functions/ Chebyshev and Schwarz Inequalities/ Chernoff Bound
1. Functions of two Random variables/ Joint distribution and density functions/ Joint Moments (Covariance, Correlation Coefficient, Orthogonality) and Joint Characteristic Functions/ Conditional distribution and density functions
2. Expectation Vectors and Covariance Matrices/ Linear Estimator, MMSE Estimator/ ML Estimators {S&W}*
3. Random Sequences and Linear Systems/ WSS Random Sequences /Markov Random Sequences {S&W} / Stochastic Convergence and Limit Theorems/ Central Limit Theorem {Papoulis} {S&W}/ Laws of Large Numbers {S&W}
*Note: Shown inside the brackets { } are codes for Reference Books.
See page 30 of 30 of this document for references.
1. Correlation functions of Random Processes and their properties – {Peebles}; {S&W}; {Papoulis}
2. Power Spectral Density and its properties, relationship with autocorrelation ; White and Colored Noise concepts and definitions- {Peebles}
3. Spectral Characteristics of LTI System response; Noise Bandwidth- {Peebles};{S&W}
4. Matched Filter for Colored Noise/White Noise; Wiener Filters- {Peebles}
1. {Peebles}; Consider the ‘Code Acquisition’ scenario in GPS applications as one example of finding the false alarm rate
2. Kalman Filtering; Applications of HMM (Hidden Markov Model)s to Speech Processing –{S&W}
Notes :
1. {R.G.Brown},pp1
2. {S&W},pp2
3. {S&W},pp2
4. {Papoulis}, pp1 [Add: Electron Emission, telephone calls, queueing theory, quality control, etc.]
5. Extra: {Peebles} pp2: [How do we characterize random signals? One: how to describe any one of a variety of random phenomena – the contents shown in Random Variables are required; Two: how to bring time into the problem so as to create the random signal of interest – the contents shown in Random Processes are required.] – ALL these CONCEPTS are based on PROBABILITY Theory.
1. Refer their failures from {Papoulis} pp6-7
2. {S&W} pp2-4
3. Slide not required!? Only of Historical Importance?
4. The Classical Theory, or ratio-of-favorable-to-total-outcomes approach, cannot deal with outcomes that are not equally likely, and it cannot handle uncountably infinite outcomes without ambiguity.
5. The problem with the relative frequency approach is that we can never perform the experiment an infinite number of times, so we can only estimate P(A) from a finite number of trials. Despite this, the approach is essential in applying probability theory to the real world.
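The finite-trials estimate of P(A) described in point 5 is easy to demonstrate numerically. A minimal sketch (the die example and the function names are illustrative, not from the notes):

```python
import random

def relative_frequency(event, experiment, trials=100_000, seed=1):
    """Estimate P(A) as n_A / n over a finite number of trials."""
    rng = random.Random(seed)
    hits = sum(event(experiment(rng)) for _ in range(trials))
    return hits / trials

# Example: probability that a fair die shows an even face (true value 1/2)
p_even = relative_frequency(lambda x: x % 2 == 0,
                            lambda rng: rng.randint(1, 6))
```

With 100,000 trials the estimate is close to 1/2, but no finite run ever pins down P(A) exactly, which is precisely the limitation point 5 raises.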
1. Experiment, Sample Space, Elementary Event (Outcome), Event; discuss the equations – why are they so? Refer {Peebles}, pp10
2. Axiomatic Theory Uses- Refer {Kolmogorov}
3. Consider a simple resistor R = V(t) / I(t) – is this true under all conditions? Fully accurate? (What about inductance and capacitance? Clearly specified terminals?) Refer {Papoulis}, pp5
4. Mutually Exclusive/Disjoint Events? [(refer point (iii) above) when P(AB) = 0]. When is a set of Events called a Partition/Decomposition/Exhaustive (refer the last point in the above slide), and what is its use? (Ans: refer the Tips and Tricks page of this document)
1. {S&W}, pp3
2. Also called ‘Prior Probability’ and ‘Posterior Probability’
3. Their role; Bayes’ Theorem; Prior: two types: Informative Prior and Uninformative (Vague/Diffuse) Prior; Refer {Kemp}, pp41-42
1. {R.G.Brown} pp12-13
2. Conditional probability, in contrast, is usually explained through the relative frequency interpretation of probability; see for example {S&W} pp16
Notes :
1. From {R.G.Brown} pp12-13
2. Joint Probability?
3. What happens if Events A1, A2, …, An are not a partition but just some disjoint/Mutually Exclusive Events? Similarly for the Events Bs?
4. Summing across a row, for example, gives the probability of an event A of Experiment A irrespective of the outcomes of Experiment B
5. Why they are called marginal? (because they used to be written in margins)
6. Sums of the Shaded Rows and Columns
Notes :
1. From {R.G.Brown} pp12-13
2. This table also contains information about the relative frequency of occurrence of various events in one experiment given a particular event in the other experiment .
3. Look at the Column with the Red Box outline. Since no other entries of the table involve B2, the list of these entries gives the relative distribution of events A1, A2, …, An given B2 has occurred.
4. However, the probabilities shown in the Red Box are not legitimate probabilities! (Because their sum is not unity; it is P(B2).) So, imagine renormalizing all the entries in the column by dividing by P(B2). The new set of numbers is then P(A1.B2)/P(B2), P(A2.B2)/P(B2), …, P(An.B2)/P(B2), and their sum is unity. This relative distribution corresponds to the relative frequency of occurrence of events A1, A2, …, An given B2 has occurred.
5. This heuristic reasoning leads us to the formal definition of ‘Conditional Probability’.
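The renormalization argument in points 3–5 can be checked with a small joint-probability table; the numbers below are made up for illustration:

```python
# Joint probability table P(Ai, Bj): rows = events A1..A3, cols = B1, B2
joint = [
    [0.10, 0.15],
    [0.20, 0.05],
    [0.30, 0.20],
]

# Marginal P(B2): sum of the second column (the "Red Box" column)
p_b2 = sum(row[1] for row in joint)

# Renormalize the column by dividing by P(B2) to get P(Ai | B2);
# the renormalized entries now sum to unity
cond_a_given_b2 = [row[1] / p_b2 for row in joint]
```

The raw column sums to P(B2) = 0.40, not 1; after dividing each entry by P(B2) the column becomes a legitimate conditional distribution, which is exactly the heuristic behind the formal definition P(A|B) = P(AB)/P(B).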
1. {S&W} pp20
2. ‘Average’ because the expression looks like averaging; ‘Total’ because P(B) is a sum of parts
3. In shade is ‘Total Probability Theorem’
Notes :
1. {Peebles} pp16
2. What about P(A) and P(B): should both be non-zero, or only P(B)?
3. The Ai’s form a partition of the Sample Space; B is any event on the same space
1. {Peebles} pp17
2. BSC Transition Probabilities
1. {Peebles}
2. Average BER of the system is [(14% × 60%) + (6.9% × 40%)] = 11.16% > the 10% erroneous-channel effect. This is due to the unequal probabilities of 0t and 1t.
3. What happens if 0t and 1t are equiprobable? P(1t|0r) = 10% = P(0t|1r), and the average BER of the system is [(10% × 50%) + (10% × 50%)] = 10% = the erroneous-channel effect
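The weighted-average arithmetic in points 2 and 3 can be reproduced directly; the helper function below is illustrative, with the slide's per-symbol error rates and source probabilities plugged in:

```python
def average_ber(ber_given_0t, ber_given_1t, p_0t):
    """Average BER as a total-probability weighting of the
    per-symbol error rates by the source probabilities."""
    return ber_given_0t * p_0t + ber_given_1t * (1 - p_0t)

unequal = average_ber(0.14, 0.069, 0.60)   # unequal priors: 11.16%
equal   = average_ber(0.10, 0.10, 0.50)    # equiprobable case: 10%
```

With unequal source probabilities the average BER (11.16%) deviates from the 10% channel figure; with equiprobable symbols the weighting collapses back to 10%, as the slide states.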
4. Add: Bayesian methods of inference involve the systematic formulation and use of Bayes’ Theorem. These approaches are distinguished from other statistical approaches in that, prior to obtaining the data, the statistician formulates degrees of belief concerning the possible models that may give rise to the data. These degrees of belief are regarded as probabilities. {Kemp} pp41
“Posterior odds are equal to the likelihood ratio times the prior odds.”
[Note: Odds on A = P(A)/P(Ac); Ac = A complement]
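The odds form of Bayes’ theorem quoted from {Kemp} can be cross-checked against the standard form; the numerical values below are assumed purely for illustration:

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Posterior odds on A = likelihood ratio * prior odds,
    where odds on A = P(A) / P(Ac)."""
    return likelihood_ratio * prior_odds

def odds_to_prob(odds):
    return odds / (1 + odds)

# Assumed numbers: P(A), P(B|A), P(B|Ac)
p_a, p_b_given_a, p_b_given_ac = 0.3, 0.8, 0.2

prior = p_a / (1 - p_a)                 # odds on A
lr = p_b_given_a / p_b_given_ac         # likelihood ratio
post = odds_to_prob(posterior_odds(prior, lr))

# Standard form of Bayes' theorem for comparison
direct = (p_a * p_b_given_a) / (p_a * p_b_given_a
                                + (1 - p_a) * p_b_given_ac)
```

Both routes give the same posterior P(A|B), confirming that multiplying the prior odds by the likelihood ratio is just Bayes’ theorem with the normalizing denominator cancelled out.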
1. {Peebles} pp19
2. Can two independent events be mutually exclusive? Never (see the first point in the slide: when both P(A) and P(B) are non-zero, how can P(AB) be zero?).
1. {Peebles} pp19-20
2. How many equations are needed for N Events to be independent? 2^N − N − 1 (the conditions number NC2 + NC3 + … + NCN; since NC0 + NC1 + … + NCN = 2^N, subtracting NC0 + NC1 = 1 + N gives 2^N − 1 − N)
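The count 2^N − N − 1 can be verified by enumerating the subsets of size ≥ 2 directly (a sketch; the function name is illustrative):

```python
from itertools import combinations

def num_independence_conditions(n):
    """One product condition P(Ai1...Aik) = P(Ai1)...P(Aik)
    per subset of the N events with at least two members."""
    return sum(1 for k in range(2, n + 1)
                 for _ in combinations(range(n), k))
```

For example, three events need 4 conditions (three pairwise plus one triple), matching 2^3 − 3 − 1 = 4.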
1. {Peebles} pp20
1. {Peebles} and {Papoulis}
1. Example and all notes relating to this example are taken with humble gratitude in mind from S. Unnikrishnan Pillai’s Web support for the book “A. Papoulis, S. Unnikrishnan Pillai, Probability, Random Variables and Stochastic Processes, 4th Ed: McGraw Hill, 2002”
1. P(A) = m/k and P(B) = 1/n
Notes :
1. Interestingly the above strategy can be used to “play the stock market”.
2. Suppose one gets into the market and decides to stay up to 100 days. The stock values fluctuate day by day, and the important question is when to get out?
3. According to the above strategy, one should get out at the first opportunity after 37 days, when the stock value exceeds the maximum among the first 37 days. In that case the probability of hitting the top value over 100 days for the stock is also about 37%. Of course, the above argument assumes that the stock values over the period of interest are randomly fluctuating without exhibiting any other trend.
4. Interestingly, such is the case if we consider shorter time frames such as intra-day trading. In summary, if one must day-trade, then a possible strategy might be to get in at 9.30 AM, and get out any time after 12 noon (9.30 AM + 0.3679 × 6.5 hrs = 11.54 AM to be precise) at the first peak that exceeds the peak value between 9.30 AM and 12 noon. In that case chances are about 37% that one hits the absolute top value for that day! (Disclaimer: trade at your own risk)
5. Author’s note: The same example can be found in many other contexts, e.g., Puzzle No. 34 “The Game of Googol” from {M.Gardner} and the ancient Indian concept of ‘Swayamvara’, to name a few.
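The 37% stopping rule described in points 2–4 is easy to simulate; this sketch assumes i.i.d. uniform values standing in for the fluctuating stock prices (function name and parameters are illustrative):

```python
import random

def simulate_37_rule(n=100, trials=20_000, seed=0):
    """Estimate the success rate of the 1/e stopping rule:
    observe the first ~n/e values, then pick the first later value
    that beats them all. Success = picking the overall maximum."""
    rng = random.Random(seed)
    k = round(n / 2.718281828459045)   # observation window, ~37 for n = 100
    wins = 0
    for _ in range(trials):
        values = [rng.random() for _ in range(n)]
        best_seen = max(values[:k])
        pick = None
        for v in values[k:]:
            if v > best_seen:
                pick = v
                break
        if pick is not None and pick == max(values):
            wins += 1
    return wins / trials

success_rate = simulate_37_rule()
```

The estimated success rate comes out near 1/e ≈ 0.37, matching the "about 37%" figure in point 3, and it drops when the observation window k is moved away from n/e.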
1. Shown in the { } brackets are the codes used to annotate them in the notes area.