
Random variables

Semester 2021-2022

Course name: Random signals and noise
Instructor's name: Dr. Montasir Qasymeh
Worksheet created by: Mohammad Tahmid Hassan, ID: 1076344
There are two types of probability models. One type is the discrete model, where the outcomes can be listed or counted individually; the other is the continuous model, where an outcome is a value within a certain range or set of numbers. For an event with discrete outcomes, the probability of the event is the sum of the probabilities of the individual outcomes that belong to the event. For a continuous probability model, a probability is defined as the chance that a random value falls within a certain range. Discrete models are described by probability mass functions, which assign a probability to each possible value, while continuous models are described by probability density functions, which are continuous functions over the range of values.

The conditions for a valid discrete probability model are:

1. Each probability value should be a number between 0 and 1.


2. The sum of the probabilities should be 1.

The conditions for a valid continuous probability model are:

1. The probability density function should be non-negative everywhere.


2. The integral of the probability density function from minus infinity to positive infinity must equal 1.
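Both sets of conditions can be checked numerically. The following is a minimal Python sketch, where the fair die and the uniform density on [0, 1] are assumed example distributions (not from the worksheet) and SciPy's quad routine is used for the integral:

```python
from scipy.integrate import quad

# Discrete model: a fair six-sided die (assumed example).
pmf = {k: 1/6 for k in range(1, 7)}
assert all(0 <= p <= 1 for p in pmf.values())   # condition 1
assert abs(sum(pmf.values()) - 1) < 1e-12       # condition 2

# Continuous model: uniform density on [0, 1], zero elsewhere.
def pdf(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

area, _ = quad(pdf, 0.0, 1.0)   # the density vanishes outside [0, 1]
assert area > 0 and abs(area - 1) < 1e-9        # conditions 1 and 2
print("Both models satisfy the validity conditions.")
```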

A variable is something that can take a value that may change. A random variable is a variable whose value is the outcome of a random experiment.

The expectation of a random variable is the average of the values the variable takes when the corresponding experiment is repeated many times. For a discrete random variable, it is the sum of the products of the possible values and their probabilities; for a continuous random variable, it is the integral of the product of the variable and the probability density function from negative infinity to positive infinity.
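As a minimal Python sketch of the two formulas, reusing the fair die and the uniform density on [0, 1] as assumed examples:

```python
from scipy.integrate import quad

# Discrete: E[X] = sum of value * probability (fair die).
pmf = {k: 1/6 for k in range(1, 7)}
mean_discrete = sum(x * p for x, p in pmf.items())        # 3.5

# Continuous: E[X] = integral of x * f(x) dx (uniform on [0, 1]).
mean_continuous, _ = quad(lambda x: x * 1.0, 0.0, 1.0)    # 0.5

print(mean_discrete, mean_continuous)
```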

The variance of a random variable is the expectation of the square of the variable minus the square of the expectation. For a discrete variable, it is the sum of the products of the squares of the possible values and the corresponding probabilities minus the square of the expectation; for a continuous variable, it is the integral of the product of the square of the variable and the probability density function from negative infinity to positive infinity minus the square of the expectation. The variance is a measure of how far the values deviate, on average, from the mean of the distribution.

The standard deviation of a variable is the square root of the variance.
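A minimal Python sketch of 𝑉𝑎𝑟(𝑋) = 𝐸(𝑋²) − (𝐸(𝑋))² and the standard deviation, for the same assumed example distributions:

```python
from math import sqrt
from scipy.integrate import quad

# Discrete: Var(X) = E[X^2] - (E[X])^2 for a fair die.
pmf = {k: 1/6 for k in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())
var_discrete = sum(x**2 * p for x, p in pmf.items()) - mean**2   # 35/12

# Continuous: same formula for the uniform density on [0, 1].
second_moment, _ = quad(lambda x: x**2, 0.0, 1.0)
var_continuous = second_moment - 0.5**2                          # 1/12

print(var_discrete, sqrt(var_discrete))    # variance, standard deviation
print(var_continuous, sqrt(var_continuous))
```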

If a linear operation 𝑌 = 𝑎𝑋 + 𝑏 is applied to a variable 𝑋 with mean 𝜇 and standard deviation 𝜎, the variable 𝑌 has mean 𝑎𝜇 + 𝑏 and standard deviation |𝑎|𝜎.
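This property can be checked by simulation; here is a minimal sketch in which the normal distribution and the values of 𝑎 and 𝑏 are assumed choices:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0
X = rng.normal(loc=mu, scale=sigma, size=1_000_000)

a, b = -4.0, 5.0
Y = a * X + b

print(Y.mean(), a * mu + b)        # both close to -3
print(Y.std(), abs(a) * sigma)     # both close to 12
```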

Sometimes two random variables are studied together to find the relationship between two events; such events are called joint events. The concepts of probability mass functions and probability density functions still apply, but now they are functions of two variables, so they have to be summed or integrated over both variables to evaluate probabilities. These distributions are called joint probability models.

Another important concept is that of marginal probability. The marginal probability of a variable is the probability that it takes a certain value irrespective of the values taken by the other variables; in other words, the marginal distribution is obtained by summing or integrating the joint distribution over all variables except the variable of interest.
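As a minimal Python sketch, using an assumed 2×2 joint probability mass function, the marginals are obtained by summing over the other variable:

```python
import numpy as np

# Rows index the values of X, columns the values of Y (assumed table).
joint = np.array([[0.1, 0.3],
                  [0.2, 0.4]])

p_X = joint.sum(axis=1)   # sum over Y: marginal of X = [0.4, 0.6]
p_Y = joint.sum(axis=0)   # sum over X: marginal of Y = [0.3, 0.7]
print(p_X, p_Y)
```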

To check whether two random variables are independent, first take their joint probability distribution. Then evaluate the marginal distribution of each variable on its own. If the joint distribution is equal to the product of the two marginal distributions, the pair is said to be independent.
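A minimal sketch of this test, reusing the assumed joint table from above:

```python
import numpy as np

joint = np.array([[0.1, 0.3],
                  [0.2, 0.4]])
p_X = joint.sum(axis=1)
p_Y = joint.sum(axis=0)

# Independent iff every joint entry equals the product of its marginals.
print(np.allclose(joint, np.outer(p_X, p_Y)))   # False: 0.1 != 0.4 * 0.3
```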

The covariance of two variables X and Y is

𝐶𝑜𝑣(𝑋, 𝑌) = 𝐸((𝑋 − 𝐸(𝑋))(𝑌 − 𝐸(𝑌)))


The correlation of two variables X and Y is
𝐶𝑜𝑟(𝑋, 𝑌) = 𝐶𝑜𝑣(𝑋, 𝑌) / (𝜎𝑋 𝜎𝑌)
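A minimal Python sketch of both formulas on simulated data; the two correlated data sets are assumed examples:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=100_000)
Y = 2.0 * X + rng.normal(size=100_000)   # Y depends on X

cov = np.mean((X - X.mean()) * (Y - Y.mean()))
cor = cov / (X.std() * Y.std())
print(cov, cor)   # roughly 2 and 0.89
```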
If two random variables are independent, then

𝐶𝑜𝑣(𝑋, 𝑌) = 𝐸((𝑋 − 𝐸(𝑋))(𝑌 − 𝐸(𝑌))) = 𝐸(𝑋𝑌 − 𝑋𝐸(𝑌) − 𝑌𝐸(𝑋) + 𝐸(𝑋)𝐸(𝑌))

Because the variables are independent, the expectation of their product factors as 𝐸(𝑋𝑌) = 𝐸(𝑋)𝐸(𝑌). In addition, the expectation is a linear operator, and the expectation of a random variable is a constant, so it can be taken out of the outer expectation. Therefore,

𝐸(𝑋𝑌 − 𝑋𝐸(𝑌) − 𝑌𝐸(𝑋) + 𝐸(𝑋)𝐸(𝑌)) = 𝐸(𝑋)𝐸(𝑌) − 𝐸(𝑋)𝐸(𝑌) − 𝐸(𝑋)𝐸(𝑌) + 𝐸(𝑋)𝐸(𝑌) = 0

Hence the covariance is 0, and the correlation is 0 as well.
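A quick numerical illustration of this result, using two independently drawn samples as an assumed example:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=1_000_000)
Y = rng.normal(size=1_000_000)   # drawn independently of X

cov = np.mean((X - X.mean()) * (Y - Y.mean()))
print(cov)   # close to 0 (not exactly 0 for a finite sample)
```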

For two random variables X and Y


𝑉𝑎𝑟(𝑋 + 𝑌) = 𝐸((𝑋 + 𝑌)²) − (𝐸(𝑋 + 𝑌))² = 𝐸(𝑋²) + 2𝐸(𝑋𝑌) + 𝐸(𝑌²) − (𝐸(𝑋))² − 2𝐸(𝑋)𝐸(𝑌) − (𝐸(𝑌))²

If the two variables are independent, then

2𝐸(𝑋𝑌) = 2𝐸(𝑋)𝐸(𝑌), and the two cross terms cancel.


Therefore, for independent variables

𝑉𝑎𝑟(𝑋 + 𝑌) = 𝑉𝑎𝑟(𝑋) + 𝑉𝑎𝑟(𝑌)
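A minimal sketch verifying this identity by simulation; the two distributions are assumed choices:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(loc=0.0, scale=2.0, size=1_000_000)   # Var(X) = 4
Y = rng.uniform(0.0, 1.0, size=1_000_000)            # Var(Y) = 1/12

print(np.var(X + Y), np.var(X) + np.var(Y))          # both close to 4.083
```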

Reference
H. Pishro-Nik, Introduction to Probability, Statistics, and Random Processes.
