
Communications and Knowledge Engineering

Institute of Engineering
Paschimanchal Campus
Lamachour, Pokhara

Er. Santosh Kumar Baral


Er. Suraj Basant Tulachan
 Probability Theory
◦ Joint events and joint probabilities
◦ Conditional probability
◦ Statistical Independence
 Random Variables, Probability Distributions
and Probability Densities
◦ Multiple Random Variables
◦ Joint Probability Distributions
◦ Joint Probability Densities
 Random Experiment
◦ Experiment depending on chance or probability
 Sample Space
◦ All possible distinct outcomes of a random experiment
◦ Sample space is not unique; its construction depends
upon the point of view as well as the questions to be
answered
◦ A given construction can be classified as acceptable
or non-acceptable accordingly
 Sample Point
◦ Each possible outcome of random experiment
 Event
◦ A subset of the sample space having one or more
sample points as its elements
 Given a random experiment, a finite
probability measure P(A) is assigned to every
event A in the sample space S of all possible
outcomes
 Axiom 1: Non-negativity: 0 ≤ P(A) ≤ 1
 Axiom 2: Normed: P(S) = 1
 Axiom 3: Additivity
◦ For a countable collection of mutually exclusive
events A1, A2, A3, … in S,
P(A1 ∪ A2 ∪ A3 ∪ …) = P(A1) + P(A2) + P(A3) + … …Eqn 2.1
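A minimal added sketch (not from the slides) checking the three axioms on the sample space of a fair six-sided die; exact fractions avoid floating-point round-off:

from fractions import Fraction

# Sample space of a fair six-sided die; each sample point gets probability 1/6.
S = {1, 2, 3, 4, 5, 6}
P = {s: Fraction(1, 6) for s in S}

def prob(event):
    # Probability measure: sum of the probabilities of the sample points in the event.
    return sum(P[s] for s in event)

# Axiom 1 (non-negativity) and Axiom 2 (normed: P(S) = 1).
assert all(0 <= p <= 1 for p in P.values())
assert prob(S) == 1

# Axiom 3 (additivity) for the mutually exclusive events {1, 2} and {4, 6}.
A, B = {1, 2}, {4, 6}
assert prob(A | B) == prob(A) + prob(B)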
 When two experiments are performed
simultaneously, or a single experiment is
repeated, we call the combined outcomes joint
events, and the probabilities of these
combined outcomes are the joint probabilities
 Example:
◦ Two tosses of a single coin, or a single toss of two coins
◦ Sample space: 4 outcome tuples, each with probability
1/4
◦ Throwing a single die twice, or two dice at a time
◦ Sample space: 36 outcome tuples, each with probability
1/36 (enumerated in the sketch below)
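An added sketch enumerating the 36-tuple sample space of the two-dice experiment; the joint event evaluated at the end is an illustrative choice:

from itertools import product
from fractions import Fraction

# Joint sample space of two dice: 36 equally likely ordered pairs.
S = list(product(range(1, 7), repeat=2))
assert len(S) == 36
p = Fraction(1, 36)                       # probability of each sample point

# Joint event: first die shows 3 and second die shows an even face.
event = [(d1, d2) for d1, d2 in S if d1 == 3 and d2 % 2 == 0]
print(len(event) * p)                     # 3/36 = 1/12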
 For mutually exclusive, exhaustive joint
outcomes (Ai, Bj):
∑i ∑j P(Ai, Bj) = 1 …Eqn 2.2
 Conditional probability:
P(A|B) = P(A, B) / P(B) …Eqn 2.3
 Bayes' rule:
P(A|B) = P(B|A) P(A) / P(B) …Eqn 2.4
 Statistical independence:
P(A, B) = P(A) P(B) …Eqn 2.5
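Continuing the dice sketch (again an added illustration), Eqns 2.3 to 2.5 can be checked numerically; the events A and B are hypothetical choices that happen to be independent:

from itertools import product
from fractions import Fraction

S = list(product(range(1, 7), repeat=2))

def prob(ev):
    # Classical probability: favourable sample points over total.
    return Fraction(sum(1 for s in S if ev(s)), len(S))

A = lambda s: s[0] + s[1] == 7            # the faces sum to 7
B = lambda s: s[0] == 2                   # first die shows 2
AB = lambda s: A(s) and B(s)

P_A_given_B = prob(AB) / prob(B)          # Eqn 2.3
P_B_given_A = prob(AB) / prob(A)
# Bayes' rule (Eqn 2.4): P(A|B) = P(B|A) P(A) / P(B)
assert P_A_given_B == P_B_given_A * prob(A) / prob(B)
# Statistical independence (Eqn 2.5) holds for this particular pair.
assert prob(AB) == prob(A) * prob(B)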
 Multiple random variables arise in combined
experiments or in repeated trials of a single
experiment
 Multiple random variables are basically
multidimensional functions defined on the
sample space of a combined experiment
 Let us consider two random variables, X1 and
X2, each of which may be continuous,
discrete, or of mixed type
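For reference, the standard definitions behind the joint distributions and densities named in the outline (added here; notation is the usual one):

\[
F_{X_1 X_2}(x_1, x_2) = P(X_1 \le x_1, X_2 \le x_2), \qquad
p_{X_1 X_2}(x_1, x_2) = \frac{\partial^2 F_{X_1 X_2}(x_1, x_2)}{\partial x_1 \, \partial x_2}
\]
\[
p_{X_1}(x_1) = \int_{-\infty}^{\infty} p_{X_1 X_2}(x_1, x_2) \, dx_2
\quad \text{(marginal density)}
\]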
 HW: Transformation: Y = aX³ + b, a > 0
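A sketch of the usual change-of-variable route for this homework, assuming the pdf of X is known; Y = aX³ + b is monotone for a > 0, so:

\[
x = \left( \frac{y - b}{a} \right)^{1/3}, \qquad
p_Y(y) = p_X\!\left( \left( \frac{y - b}{a} \right)^{1/3} \right)
\left| \frac{dx}{dy} \right|
= \frac{p_X\!\left( \left( \frac{y - b}{a} \right)^{1/3} \right)}
       {3a \left( \frac{y - b}{a} \right)^{2/3}}
\]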
 Variance: σX² = E(X²) − [E(X)]²
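As a quick worked check (added), for a single fair die:

\[
E(X) = \frac{7}{2}, \qquad
E(X^2) = \frac{1^2 + 2^2 + \cdots + 6^2}{6} = \frac{91}{6}, \qquad
\sigma_X^2 = \frac{91}{6} - \left( \frac{7}{2} \right)^2 = \frac{35}{12}
\]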
 1. When two random variables are statistically
independent, E(X1X2) = E(X1) E(X2) and the
covariance µij = 0
 2. Covariance µij = 0 does not necessarily
mean that the random variables are
statistically independent
 3. When E(X1X2) = 0, the random variables
are orthogonal
◦ Uncorrelated random variables are also orthogonal
when the statistical mean of either variable is zero
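An added numerical sketch of points 2 and 3: with X symmetric about zero and Y = X², the pair is uncorrelated yet clearly dependent, and since E(X) = 0 it is also orthogonal:

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)   # zero mean, symmetric about 0
y = x**2                             # fully determined by x, hence dependent

# Covariance E(XY) - E(X)E(Y) = E(X^3), which vanishes by symmetry.
print(np.mean(x * y) - np.mean(x) * np.mean(y))   # ~0: uncorrelated

# With E(X) = 0, uncorrelated also implies orthogonal: E(XY) ~ 0.
print(np.mean(x * y))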
 1. We can find the nth moment of a random
variable from the nth derivative of its
characteristic function, evaluated at the origin
 2. We can find the pdf of a random variable
expressed as a sum of other statistically
independent random variables, as sketched below
◦ First multiply the characteristic functions (each the
Fourier transform of the corresponding pdf) of the
independent variables
◦ Then take the inverse Fourier transform of the product
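An added numerical sketch of the second use: the pdf of the sum of two independent Uniform(0, 1) variables, obtained by multiplying the transforms and inverting the product (np.fft's sign convention stands in for the characteristic function; the grid and its width are arbitrary choices):

import numpy as np

n = 4096
dx = 4.0 / n                              # grid over [0, 4): wide enough for the sum
x = np.arange(n) * dx
pdf = np.where(x < 1.0, 1.0, 0.0)         # Uniform(0, 1) density sampled on the grid

phi = np.fft.fft(pdf) * dx                # discrete stand-in for the transform of the pdf
pdf_sum = np.real(np.fft.ifft(phi * phi)) / dx

# The result matches the triangular density on [0, 2]:
# peak value 1 at x = 1, and unit total area.
print(pdf_sum[n // 4])                    # ~1.0 (value at x = 1)
print(pdf_sum.sum() * dx)                 # ~1.0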
 1. Binomial Distribution
 2. Uniform Distribution
 3. Gaussian Distribution
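Their standard forms, added for reference since the slide lists only the names:

\[
\text{Binomial: } P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}, \quad k = 0, 1, \ldots, n
\]
\[
\text{Uniform: } p(x) = \frac{1}{b - a}, \quad a \le x \le b \quad \text{(zero elsewhere)}
\]
\[
\text{Gaussian: } p(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} \, e^{-(x - m)^2 / 2\sigma^2}
\]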
 1. John G. Proakis, Digital Communications, Fifth Edition, Tata
McGraw-Hill, 2008
 2. John G. Proakis and Masoud Salehi, Communication Systems
Engineering, Second Edition, Prentice Hall, 2002
