Random Variables
Introduction
Consider the choice of a student at random from a population
of college students. We wish to know his/her height, weight,
blood pressure, pulse rate, etc.
The mappings from the sample space of students to the measurements
of height and weight would be H(s_i) = h_i and W(s_i) = w_i for the
student selected.
Two random variables that are defined on the same sample space S
are said to be jointly distributed.
Jointly Distributed RVs
There are four vectors that comprise the sample space
S_{X,Y} = { [0 0]^T, [0 1]^T, [1 0]^T, [1 1]^T }
The values of the random vector (multiple random variables)
are denoted either by (x, y), an ordered pair/point in
the plane, or by [x y]^T, a 2D vector.
The size of the sample space for discrete RVs can be finite or
countably infinite.
If X can take on 2 values (N_X = 2) and Y can take on 2 values
(N_Y = 2), the total number of elements in S_{X,Y} is N_X N_Y = 4.
Generally, if S_X = {x_1, x_2, ..., x_{N_X}} and S_Y = {y_1, y_2, ..., y_{N_Y}}, then the
random vector can take on values in
S_{X,Y} = S_X × S_Y = {(x_i, y_j) : i = 1, 2, ..., N_X; j = 1, 2, ..., N_Y}
The notation A × B, where A and B are sets, denotes a Cartesian
product set.
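The Cartesian-product construction of S_{X,Y} can be sketched in a few lines of Python; the binary alphabets {0, 1} are taken from the coin example in the text.

```python
from itertools import product

# Alphabets of X and Y (the binary values from the text's example)
S_X = [0, 1]
S_Y = [0, 1]

# The joint sample space S_{X,Y} is the Cartesian product S_X x S_Y
S_XY = list(product(S_X, S_Y))
print(S_XY)  # [(0, 0), (0, 1), (1, 0), (1, 1)]

# Its size is N_X * N_Y, as stated above
assert len(S_XY) == len(S_X) * len(S_Y)
```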
A valid joint PMF must sum to one:
Σ_{i=1}^{N_X} Σ_{j=1}^{N_Y} p_{X,Y}[x_i, y_j] = 1
Similarly for a countably infinite sample space.
For two fair coins that do not interact
as they are tossed we might assign
p_{X,Y}[i, j] = 1/4,  i = 0, 1; j = 0, 1.
Each value satisfies
0 ≤ p_{X,Y}[0,0] ≤ 1,  0 ≤ p_{X,Y}[0,1] ≤ 1,
0 ≤ p_{X,Y}[1,0] ≤ 1,  0 ≤ p_{X,Y}[1,1] ≤ 1,
and the values sum to one:
Σ_{i=0}^{1} Σ_{j=0}^{1} p_{X,Y}[i, j] = 1
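The two PMF axioms can be checked directly; a minimal sketch, assuming the fair-coin assignment p_{X,Y}[i, j] = 1/4 for every (i, j):

```python
# Joint PMF for two fair, non-interacting coins: equal mass on all 4 outcomes
p_XY = {(i, j): 0.25 for i in (0, 1) for j in (0, 1)}

# Axiom 1: every PMF value lies in [0, 1]
assert all(0 <= p <= 1 for p in p_XY.values())

# Axiom 2: the values sum to one over the whole sample space
assert abs(sum(p_XY.values()) - 1.0) < 1e-12
```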
The procedure to determine the joint
PMF from the probabilities defined on S
The procedure depends on whether the RV mapping is one-to-
one or many-to-one.
For a one-to-one mapping from S to SX,Y we have
p_X[i] = Σ_{j=0}^{1} p_{X,Y}[i, j] = { 1/8 + 1/8 = 1/4,  i = 0
                                       1/4 + 1/2 = 3/4,  i = 1

p_Y[j] = Σ_{i=0}^{1} p_{X,Y}[i, j] = { 1/8 + 1/4 = 3/8,  j = 0
                                       1/8 + 1/2 = 5/8,  j = 1
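The marginal sums above can be reproduced programmatically. A minimal sketch, with the underlying joint PMF inferred from the fractions in the example (p_{X,Y}[0,0] = 1/8, p_{X,Y}[0,1] = 1/8, p_{X,Y}[1,0] = 1/4, p_{X,Y}[1,1] = 1/2):

```python
# Joint PMF inferred from the marginal sums in the worked example
p_XY = {(0, 0): 1/8, (0, 1): 1/8, (1, 0): 1/4, (1, 1): 1/2}

def marginals(p_xy):
    """Sum the joint PMF over the other variable to get each marginal PMF."""
    p_x, p_y = {}, {}
    for (i, j), p in p_xy.items():
        p_x[i] = p_x.get(i, 0.0) + p  # p_X[i] = sum over j
        p_y[j] = p_y.get(j, 0.0) + p  # p_Y[j] = sum over i
    return p_x, p_y

p_X, p_Y = marginals(p_XY)
print(p_X)  # {0: 0.25, 1: 0.75}
print(p_Y)  # {0: 0.375, 1: 0.625}
```

Each marginal sums to one on its own, as any valid PMF must.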
Joint PMF cannot be determined
from marginal PMFs
It is not possible in general to obtain the joint PMF from the
marginal PMFs.
Consider the transformed random variables W = g(X, Y) and Z = h(X, Y).
Their joint PMF is

p_{W,Z}[w_i, z_j] = Σ_{(k,l): g(x_k, y_l) = w_i, h(x_k, y_l) = z_j} p_{X,Y}[x_k, y_l],
i = 1, 2, ..., N_W;  j = 1, 2, ..., N_Z
Sometimes we wish to determine the PMF of Z = h(X, Y) only.
Then we can use the auxiliary RV W = X, so that p_Z is the marginal
PMF and can be found from the formula above as

p_Z[z_j] = Σ_{i: w_i ∈ S_W} p_{W,Z}[w_i, z_j]
Direct computation of PMF for
transformed RV
Consider the transformation of the RV (X, Y) into the scalar RV
Z = X^2 + Y^2. The joint PMF is given by

p_{X,Y}[i, j] = { 3/8,  i = 0, j = 0
                  1/8,  i = 1, j = 0
                  1/8,  i = 0, j = 1
                  3/8,  i = 1, j = 1
To find the PMF for Z, first note that (X, Y) takes on the values
(i, j) = (0,0), (1,0), (0,1), (1,1). Therefore, Z must take on the
values z_k = i^2 + j^2 = 0, 1, 2. Then

p_Z[0] = Σ_{(i,j): i^2 + j^2 = 0} p_{X,Y}[i, j] = p_{X,Y}[0, 0] = 3/8
p_Z[1] = p_{X,Y}[0, 1] + p_{X,Y}[1, 0] = 2/8
p_Z[2] = p_{X,Y}[1, 1] = 3/8
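The direct computation above amounts to accumulating joint-PMF mass over all (i, j) pairs that map to the same z. A minimal sketch, using the joint PMF values from this example:

```python
# Joint PMF from the example: (i, j) -> probability
p_XY = {(0, 0): 3/8, (1, 0): 1/8, (0, 1): 1/8, (1, 1): 3/8}

# PMF of Z = X^2 + Y^2: sum the mass of every (i, j) mapping to each z
p_Z = {}
for (i, j), p in p_XY.items():
    z = i**2 + j**2
    p_Z[z] = p_Z.get(z, 0.0) + p

print(p_Z)  # {0: 0.375, 1: 0.25, 2: 0.375}
```

Note that z = 1 receives mass from two points, (0, 1) and (1, 0), which is exactly the many-to-one case the text describes.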
Expected Values
If Z = g(X, Y), then by definition its expected value is

E[Z] = Σ_i z_i p_Z[z_i]
Or, using a more direct approach, for Z = aX + bY:

E[aX + bY] = a Σ_i Σ_j x_i p_{X,Y}[x_i, y_j] + b Σ_i Σ_j y_j p_{X,Y}[x_i, y_j]
           = a E_X[X] + b E_Y[Y]
More generally, the covariance of X and Y is

cov(X, Y) = E_{X,Y}[(X - E_X[X])(Y - E_Y[Y])]

or alternatively

cov(X, Y) = E_{X,Y}[XY] - E_X[X] E_Y[Y]
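The two covariance expressions agree, which can be checked numerically. A minimal sketch, assuming the fair-coin joint PMF p_{X,Y}[i, j] = 1/4 from earlier as the input:

```python
# Fair-coin joint PMF (assumed here as a concrete test case)
p_XY = {(i, j): 0.25 for i in (0, 1) for j in (0, 1)}

EX  = sum(i * p for (i, j), p in p_XY.items())
EY  = sum(j * p for (i, j), p in p_XY.items())
EXY = sum(i * j * p for (i, j), p in p_XY.items())

# Definition: cov = E[(X - E[X])(Y - E[Y])]
cov_def = sum((i - EX) * (j - EY) * p for (i, j), p in p_XY.items())

# Alternative form: cov = E[XY] - E[X]E[Y]
cov_alt = EXY - EX * EY

assert abs(cov_def - cov_alt) < 1e-12
print(cov_def)  # 0.0 -- the non-interacting coins are uncorrelated
```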
Joint moments
For the joint PMF above, the joint moment is E_{X,Y}[XY] = 2,
which produces the desired value of +1.
Independence implies zero covariance but
zero covariance does not imply independence
Consider the joint PMF which assigns equal probability to each
of the four points.
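A standard instance of this counterexample (the four points are not listed in the text, so the choice below is an assumption) places mass 1/4 on (1, 0), (0, 1), (-1, 0), (0, -1): the covariance is zero, yet X and Y are dependent.

```python
# Assumed example: equal mass 1/4 on four points of the unit "diamond"
p_XY = {(1, 0): 0.25, (0, 1): 0.25, (-1, 0): 0.25, (0, -1): 0.25}

EX  = sum(x * p for (x, y), p in p_XY.items())      # 0
EY  = sum(y * p for (x, y), p in p_XY.items())      # 0
EXY = sum(x * y * p for (x, y), p in p_XY.items())  # 0: x*y = 0 at every point
cov = EXY - EX * EY
assert cov == 0.0  # zero covariance

# Yet X and Y are dependent: p_{X,Y}[0,0] = 0, but p_X[0] * p_Y[0] = 1/4
p_X0 = sum(p for (x, y), p in p_XY.items() if x == 0)  # 0.5
p_Y0 = sum(p for (x, y), p in p_XY.items() if y == 0)  # 0.5
assert p_XY.get((0, 0), 0.0) != p_X0 * p_Y0
```

Knowing X = 0 forces Y to be ±1, so the variables clearly interact even though their covariance vanishes.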
2) The values of a joint PMF are given below. Determine the marginal PMFs
3)