EEN330 Topic 2
Semester 2021-2022
A variable is a quantity that can take a value that may change. A random variable is a variable whose value is the
outcome of a random experiment.
The expectation of a random variable is the average of the values that the variable takes when the corresponding
experiment is repeated many times. When the random variable is discrete, it is the sum of the products of the possible
values of the variable and the corresponding probabilities; when the variable is continuous, it is the integral of the
product of the variable and the probability density function from negative infinity to positive infinity.
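As a concrete sketch of the discrete case (the fair six-sided die used here is an illustrative assumption, not part of the notes), the expectation is just the probability-weighted sum of the values:

```python
# Expectation of a discrete random variable: E(X) = sum of x * P(X = x).
# Illustrative example (assumed): a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}

expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # close to 3.5, the mean of a fair die
```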
The variance of a random variable is 𝐸(𝑋²) − (𝐸(𝑋))². If the variable is discrete, it is the sum of the products of the
squares of the values of the variable and the corresponding probabilities minus the square of the expectation; if the
variable is continuous, it is the integral of the product of the square of the variable and the probability density function
from negative infinity to positive infinity minus the square of the expectation. The variance is a measure of the
deviation of the values, on average, from the mean of the distribution.
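The discrete formula 𝐸(𝑋²) − (𝐸(𝑋))² can be sketched the same way (again assuming a fair die for illustration):

```python
# Variance of a discrete random variable: Var(X) = E(X^2) - (E(X))^2.
# Illustrative example (assumed): a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())
second_moment = sum(x ** 2 * p for x, p in pmf.items())
variance = second_moment - mean ** 2
print(variance)  # close to 35/12, about 2.92
```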
If a linear operation 𝑌 = 𝑎𝑋 + 𝑏 is applied to a variable 𝑋 with mean 𝜇 and standard deviation 𝜎, the variable 𝑌 has
mean 𝑎𝜇 + 𝑏 and standard deviation |𝑎|𝜎.
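This can be checked numerically by pushing a distribution through the linear map and recomputing the moments of 𝑌 directly (the die and the values 𝑎 = 2, 𝑏 = 3 are illustrative assumptions):

```python
# Check: for Y = a*X + b, mean(Y) = a*mean(X) + b and std(Y) = |a|*std(X).
import math

pmf = {x: 1 / 6 for x in range(1, 7)}  # fair die (illustrative assumption)
a, b = 2, 3

mean_x = sum(x * p for x, p in pmf.items())
std_x = math.sqrt(sum(x ** 2 * p for x, p in pmf.items()) - mean_x ** 2)

# Apply the linear operation to every value of X, keeping the probabilities.
pmf_y = {a * x + b: p for x, p in pmf.items()}
mean_y = sum(y * p for y, p in pmf_y.items())
std_y = math.sqrt(sum(y ** 2 * p for y, p in pmf_y.items()) - mean_y ** 2)

print(mean_y, a * mean_x + b)  # the two means agree
print(std_y, abs(a) * std_x)   # the two standard deviations agree
```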
Sometimes two random variables are studied together to find the relationship between the corresponding events; such
events are called joint events. The concepts of probability mass functions and probability density functions still apply,
but now they are functions of two variables, so they have to be summed or integrated over both variables when
evaluating probabilities. These distributions are joint probability models.
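A sketch of evaluating a probability from a discrete joint model by summing over both variables (the two-fair-dice joint pmf is an illustrative assumption):

```python
# A joint pmf is a function of two variables; probabilities of joint events
# are evaluated by summing it over both variables.
# Illustrative example (assumed): two independent fair dice.
joint_pmf = {(x, y): 1 / 36 for x in range(1, 7) for y in range(1, 7)}

# P(X + Y = 7): sum the joint pmf over every (x, y) pair in the event.
p_sum_is_7 = sum(p for (x, y), p in joint_pmf.items() if x + y == 7)
print(p_sum_is_7)  # close to 1/6
```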
Another important concept is that of the marginal probability. A marginal probability of a certain variable is the
probability of a variable taking a certain value irrespective of the values taken by other variables; in other words, this
function is obtained after adding or integrating the probability distribution function over all variables except the variable
of interest.
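In the discrete case, "integrating out" the other variable is just a sum; a minimal sketch (the 2×2 joint table below is an illustrative assumption, not from the notes):

```python
# Marginal pmf: sum the joint pmf over every variable except the one of interest.
# Illustrative joint table (assumed).
joint_pmf = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

marginal_x = {}
marginal_y = {}
for (x, y), p in joint_pmf.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p  # sum over y
    marginal_y[y] = marginal_y.get(y, 0.0) + p  # sum over x

print(marginal_x)  # P(X = 0) and P(X = 1)
print(marginal_y)  # P(Y = 0) and P(Y = 1)
```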
To check whether two events are independent of each other, first take the joint probability distribution for the events.
Then evaluate the marginal distribution for each event on its own. If the joint probability distribution is equal to the
product of the two marginal distributions, the pair of events is said to be independent.
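The check above can be sketched as a function over a discrete joint table (both example tables are illustrative assumptions):

```python
# Independence check: X and Y are independent iff, for every pair (x, y),
# the joint probability equals the product of the two marginals.
def is_independent(joint_pmf, tol=1e-9):
    marginal_x, marginal_y = {}, {}
    for (x, y), p in joint_pmf.items():
        marginal_x[x] = marginal_x.get(x, 0.0) + p
        marginal_y[y] = marginal_y.get(y, 0.0) + p
    return all(
        abs(p - marginal_x[x] * marginal_y[y]) < tol
        for (x, y), p in joint_pmf.items()
    )

# Illustrative joint tables (assumed, not from the notes):
independent = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.3}
dependent = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
print(is_independent(independent), is_independent(dependent))
```

A small tolerance is used in the comparison because the marginals are accumulated in floating point.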
As independent variables have statistical behavior independent of each other, 𝐸(𝑋𝑌) = 𝐸(𝑋)𝐸(𝑌) for independent 𝑋
and 𝑌. Using this fact, together with the linearity of the expectation operator and the fact that the expectation of a
random variable is a constant, the covariance 𝐸((𝑋 − 𝐸(𝑋))(𝑌 − 𝐸(𝑌))) of two independent variables is
𝐸(𝑋𝑌 − 𝑋𝐸(𝑌) − 𝑌𝐸(𝑋) + 𝐸(𝑋)𝐸(𝑌)) = 𝐸(𝑋)𝐸(𝑌) − 𝐸(𝑋)𝐸(𝑌) − 𝐸(𝑋)𝐸(𝑌) + 𝐸(𝑋)𝐸(𝑌) = 0
So the covariance, and hence the correlation, is 0.
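This derivation can be verified numerically on a discrete joint model: for an independent joint table, 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) comes out as 0. The table below is an illustrative assumption:

```python
# For independent X and Y, E(XY) = E(X)E(Y), so the covariance
# E(XY) - E(X)E(Y) vanishes. Illustrative independent joint table (assumed):
joint_pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.3}

mean_x = sum(x * p for (x, y), p in joint_pmf.items())
mean_y = sum(y * p for (x, y), p in joint_pmf.items())
mean_xy = sum(x * y * p for (x, y), p in joint_pmf.items())
covariance = mean_xy - mean_x * mean_y
print(covariance)  # close to 0
```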
Reference
H. Pishro-Nik, Introduction to Probability, Statistics, and Random Processes.