BIVARIATE PROBABILITY DISTRIBUTIONS

Introduction

In the preceding courses (STA 142 & STA 241) the study of random variables and their probability distributions was restricted to one-dimensional sample spaces, i.e. to a single random variable. In many situations, however, we may find it desirable to record the simultaneous outcomes of several random variables. For example, we might measure the mass M of a precipitate and the volume V of gas released from a controlled chemical experiment, giving rise to a two-dimensional sample space of points (m, v). In a study to determine the likelihood of success in college based on high-school data, one might use a three-dimensional sample space and record, for each individual, his or her aptitude test score, high-school rank in class, and grade-point average at the end of the freshman year in college.

Bivariate distributions

Let X and Y be random variables defined on the same sample space. Then the pair (X, Y) is called a two-dimensional random variable. The values of the random vector (X, Y) are denoted by (x, y), where x is the value assumed by X and y is the value assumed by Y. The random vector (X, Y) takes values in the two-dimensional real space, i.e. (x, y) is a point in R².

As in the case of one-dimensional random variables, we also classify two-dimensional random variables as discrete or continuous.

a) Discrete joint distribution

A two-dimensional random variable (X, Y) is said to be discrete if each of its components X and Y is a discrete r.v.; thus (X, Y) is a discrete r.v. if it can assume values only at a finite or countably infinite number of points (x, y) in R². If both X and Y have discrete distributions, then (X, Y) is said to have a discrete bivariate distribution, or a discrete joint distribution. The joint probability function f(x, y) of X and Y assigns a probability to each point (x, y) in R²; the value f(x, y) gives the probability that the outcomes x and y occur at the same time.
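In code, a discrete joint probability function is just a finite table mapping each point (x, y) to its probability. The following minimal Python sketch is an illustration, not part of the notes; it happens to use the table f(x, y) = xy/36 that appears in Example 1 below, and shows how any region probability is obtained by summing f over the points of the region.

```python
from fractions import Fraction

# A discrete joint pmf stored as {(x, y): probability}.
# Here f(x, y) = xy/36 for x, y in {1, 2, 3} (the table of Example 1);
# any finite table of probabilities works the same way.
f = {(x, y): Fraction(x * y, 36) for x in (1, 2, 3) for y in (1, 2, 3)}

# The defining properties of a joint probability function:
assert all(p >= 0 for p in f.values())   # f(x, y) >= 0 everywhere
assert sum(f.values()) == 1              # probabilities sum to 1

# P[(X, Y) in A] = sum of f over the region A, e.g. A = {X = Y}:
p_diag = sum(p for (x, y), p in f.items() if x == y)
print(p_diag)  # 7/18
```

Exact fractions are used so the arithmetic matches the hand computations in the examples, with no floating-point rounding.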
If (x, y) is not one of the possible values of the pair (X, Y), then f(x, y) = 0.

The joint probability function f(x, y) satisfies the following conditions:

i)   f(x, y) ≥ 0 for all (x, y);
ii)  Σ_x Σ_y f(x, y) = 1;
iii) f(x, y) = P(X = x, Y = y);
iv)  P[(X, Y) ∈ A] = Σ Σ_{(x,y)∈A} f(x, y) for any region A in the xy-plane.

Example 1
Determine the value of c so that the function f(x, y) = cxy, for x = 1, 2, 3 and y = 1, 2, 3, represents a joint probability distribution of the r.v.s X and Y.

Solution
We require Σ_x Σ_y f(x, y) = 1.
f(1, 1) = c,  f(1, 2) = 2c, f(1, 3) = 3c
f(2, 1) = 2c, f(2, 2) = 4c, f(2, 3) = 6c
f(3, 1) = 3c, f(3, 2) = 6c, f(3, 3) = 9c
c + 2c + 3c + 2c + 4c + 6c + 3c + 6c + 9c = 36c = 1, so c = 1/36.
Hence
f(x, y) = xy/36, x = 1, 2, 3; y = 1, 2, 3
        = 0, elsewhere.

Example 2
If the j.p.d.f of X and Y is given by f(x, y) = (x + y)/30, x = 0, 1, 2, 3 and y = 0, 1, 2,
i)   verify that f(x, y) is a p.d.f;
ii)  find P(X > Y);
iii) find P(X + Y = 4).

Solution
i) We require Σ_x Σ_y f(x, y) = 1.
f(0, 0) = 0,    f(1, 0) = 1/30, f(2, 0) = 2/30, f(3, 0) = 3/30
f(0, 1) = 1/30, f(1, 1) = 2/30, f(2, 1) = 3/30, f(3, 1) = 4/30
f(0, 2) = 2/30, f(1, 2) = 3/30, f(2, 2) = 4/30, f(3, 2) = 5/30

These probabilities, with their row and column totals, are:

  y\x  |   0      1      2      3    | total
   0   |   0     1/30   2/30   3/30  |  6/30
   1   |  1/30   2/30   3/30   4/30  | 10/30
   2   |  2/30   3/30   4/30   5/30  | 14/30
 total |  3/30   6/30   9/30  12/30  |   1

The grand total is 30/30 = 1; hence f(x, y) is a p.d.f.

ii) P(X > Y) = f(1, 0) + f(2, 0) + f(3, 0) + f(2, 1) + f(3, 1) + f(3, 2)
             = 1/30 + 2/30 + 3/30 + 3/30 + 4/30 + 5/30 = 18/30 = 3/5.

iii) P(X + Y = 4) = f(2, 2) + f(3, 1) = 4/30 + 4/30 = 8/30 = 4/15.

b) Continuous joint distribution

Consider two random variables X and Y defined on the same sample space. The two-dimensional r.v. (X, Y) is said to be continuous if each of its components X and Y is a continuous r.v. If both X and Y have continuous density functions, then the j.p.d.f f(x, y) is a surface lying above the xy-plane.
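Before turning to the continuous case, the discrete computations of Example 2 can be checked numerically. This is an illustrative Python sketch (not part of the notes) using exact fractions; the probabilities come directly from f(x, y) = (x + y)/30:

```python
from fractions import Fraction

# Joint pmf of Example 2: f(x, y) = (x + y)/30, x = 0..3, y = 0..2.
f = {(x, y): Fraction(x + y, 30) for x in range(4) for y in range(3)}

total = sum(f.values())                                  # part i)
p_x_gt_y = sum(p for (x, y), p in f.items() if x > y)    # part ii)
p_sum_4 = sum(p for (x, y), p in f.items() if x + y == 4)  # part iii)

print(total)     # 1
print(p_x_gt_y)  # 3/5
print(p_sum_4)   # 4/15
```

The region probabilities are obtained exactly as in the solution: enumerate the points of the region and sum f over them.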
The function f(x, y) is a j.p.d.f of the continuous r.v.s X and Y iff it satisfies

i)   f(x, y) ≥ 0 for all (x, y);
ii)  ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1;
iii) P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy for any region A in the xy-plane.

Example 3
Suppose that the j.p.d.f of X and Y is given by
f(x, y) = k(2x + 3y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
        = 0, elsewhere.
a) Find the value of k.
b) Find P[(X, Y) ∈ A], where A is the region {(x, y) | 0 < x < …}.

Solution
a) We require ∫₀¹ ∫₀¹ k(2x + 3y) dx dy = 1. Now
∫₀¹ ∫₀¹ k(2x + 3y) dx dy = k ∫₀¹ (1 + 3y) dy = k(1 + 3/2) = 5k/2,
so 5k/2 = 1 and k = 2/5.

Marginal distributions

Given the j.p.d.f f(x, y) of the r.v.s X and Y, the marginal distribution of X alone is
g(x) = Σ_y f(x, y) in the discrete case,  g(x) = ∫_{-∞}^{∞} f(x, y) dy in the continuous case,
and the marginal distribution of Y alone is
h(y) = Σ_x f(x, y) in the discrete case,  h(y) = ∫_{-∞}^{∞} f(x, y) dx in the continuous case.

Example 5
Suppose that X and Y have a discrete j.p.d.f given by
f(x, y) = (x + y)/30, x = 0, 1, 2; y = 0, 1, 2, 3
        = 0, elsewhere.
Determine the marginal probability functions of X and Y.

Solution
g(x) = Σ_{y=0}^{3} f(x, y) = (1/30)[x + (x + 1) + (x + 2) + (x + 3)]
     = (4x + 6)/30 = (2x + 3)/15, x = 0, 1, 2.

   x  |  0     1     2
 g(x) | 3/15  5/15  7/15

h(y) = Σ_{x=0}^{2} f(x, y) = (1/30)[y + (y + 1) + (y + 2)]
     = (3y + 3)/30 = (y + 1)/10, y = 0, 1, 2, 3.

   y  |  0     1     2     3
 h(y) | 1/10  2/10  3/10  4/10

We can verify that g(x) and h(y) are probability distributions of X and Y, i.e. they satisfy the conditions Σ_x g(x) = 1 and Σ_y h(y) = 1.

Example 6
Suppose the j.p.d.f of X and Y is as follows:
f(x, y) = …, 0 ≤ x ≤ …, 0 ≤ y ≤ …

Statistical independence

Two r.v.s X and Y with j.p.d.f f(x, y) and marginal distributions g(x) and h(y) are said to be statistically independent iff f(x, y) = g(x) × h(y) for all (x, y). If the equality fails at even a single point, e.g. g(0) × h(0) ≠ f(0, 0), then X and Y are not statistically independent.

Exercise 3
Suppose that the j.p.d.f of X and Y is given by the table

  y\x |   1      2      3      4
   1  |  0.04   0.03   0.08   0.05
   2  |  0.06   0.045  0.12   0.075
   3  |  0.10   0.075  0.20   0.125

Determine whether or not X and Y are independent.

Theorem
Let the r.v.s X and Y have the j.p.d.f f(x, y). Then X and Y are statistically independent iff f(x, y) can be written as a product of a non-negative function of x alone and a non-negative function of y alone, i.e.
f(x, y) = g(x) × h(y), where g(x) > 0 for x ∈ A₁ and h(y) > 0 for y ∈ A₂.

Example 13
f(x, y) = 8xy, 0 < …

Correlation coefficient

The correlation coefficient of X and Y is ρ_xy = Cov(X, Y)/(σ_x σ_y). If ρ_xy > 0, then X and Y are said to be positively correlated, while if ρ_xy < 0 then they are said to be negatively correlated; ρ_xy = 0 iff Cov(X, Y) = 0.

Example 21
The j.p.d.f of X and Y is given by the table

  y\x |   0      1      2    | h(y)
   0  |  3/28   9/28   3/28  | 15/28
   1  |  6/28   6/28    0    | 12/28
   2  |  1/28    0      0    |  1/28
 g(x) | 10/28  15/28   3/28  |  1.0

Find the correlation coefficient between X and Y.

Solution
ρ_xy = Cov(X, Y)/(σ_x σ_y).

E(X) = Σ_x x g(x) = 0 × 10/28 + 1 × 15/28 + 2 × 3/28 = 21/28 = 3/4.
E(X²) = Σ_x x² g(x) = 0 × 10/28 + 1 × 15/28 + 4 × 3/28 = 27/28.
var(X) = E(X²) − [E(X)]² = 27/28 − (3/4)² = 45/112.

E(Y) = Σ_y y h(y) = 0 × 15/28 + 1 × 12/28 + 2 × 1/28 = 14/28 = 1/2.
E(Y²) = Σ_y y² h(y) = 0 × 15/28 + 1 × 12/28 + 4 × 1/28 = 16/28 = 4/7.
var(Y) = E(Y²) − [E(Y)]² = 4/7 − (1/2)² = 9/28.
Cov(X, Y) = E(XY) − E(X)E(Y) = (1 × 1 × 6/28) − (3/4)(1/2) = 3/14 − 3/8 = −9/56.

ρ_xy = Cov(X, Y)/(σ_x σ_y) = (−9/56)/√(45/112 × 9/28) = (−9/56)/(9√5/56) = −1/√5 ≈ −0.447.

Exercise 4
Let the r.v.s X and Y have the j.p.f given by …

Variance of a linear combination

Let Z = (X, Y)′ and C = (a  b), so that CZ = aX + bY. Then var(CZ) = C Σ C′, where

Σ = [ var(X)     cov(X, Y) ]
    [ cov(X, Y)  var(Y)    ]

is the variance-covariance matrix of X and Y. Expanding,

var(CZ) = (a  b) [ var(X)     cov(X, Y) ] [ a ]
                 [ cov(X, Y)  var(Y)    ] [ b ]
        = (a var(X) + b cov(X, Y),  a cov(X, Y) + b var(Y)) [ a ]
                                                            [ b ]
        = a² var(X) + ab cov(X, Y) + ab cov(X, Y) + b² var(Y)
        = a² var(X) + b² var(Y) + 2ab cov(X, Y)
        = a² var(X) + b² var(Y), if X and Y are independent, since then cov(X, Y) = 0.

In particular, with a′ = (1  1),

var(X + Y) = var(a′Z) = (1  1) [ var(X)     cov(X, Y) ] [ 1 ]
                               [ cov(X, Y)  var(Y)    ] [ 1 ]
           = var(X) + var(Y) + 2 cov(X, Y).

Example 22
Given the variance-covariance matrix of X and Y to be

Σ = [ 3    1/3 ]
    [ 1/3   2  ],

compute var(3X + 4Y − 5).

Solution
The constant −5 does not affect the variance, so
var(3X + 4Y − 5) = 3² var(X) + 4² var(Y) + 2 × 3 × 4 × cov(X, Y)
                 = 9 × 3 + 16 × 2 + 24 × 1/3
                 = 27 + 32 + 8 = 67.
Or, with c′ = (3  4),
var(3X + 4Y − 5) = var(c′Z) = c′ Σ c = (3  4) [ 3    1/3 ] [ 3 ]
                                              [ 1/3   2  ] [ 4 ]
                 = (31/3  9) [ 3 ]
                             [ 4 ]
                 = 31 + 36 = 67.

Theorem
If X_1, X_2, …, X_n are independent, then E(X_1 X_2 ⋯ X_n) = E(X_1) E(X_2) ⋯ E(X_n).

Theorem
If X_1, X_2, …, X_n are r.v.s and Y = Σ_{i=1}^{n} a_i X_i, where a_1, a_2, …, a_n are constants, then

E(Y) = Σ_{i=1}^{n} a_i E(X_i)

and

var(Y) = Σ_{i=1}^{n} a_i² var(X_i) + 2 ΣΣ_{i<j} a_i a_j cov(X_i, X_j).

Proof
Writing μ_i = E(X_i),
var(Y) = E[Y − E(Y)]²
       = E[Σ_i a_i X_i − Σ_i a_i μ_i]²
       = E[Σ_i a_i (X_i − μ_i)]²
       = Σ_i a_i² E(X_i − μ_i)² + 2 ΣΣ_{i<j} a_i a_j E[(X_i − μ_i)(X_j − μ_j)]
       = Σ_i a_i² var(X_i) + 2 ΣΣ_{i<j} a_i a_j cov(X_i, X_j).

If the r.v.s X_1, X_2, …, X_n are independent, then var(Y) = Σ_i a_i² var(X_i), because all the covariance terms cov(X_i, X_j) are zero.

CONDITIONAL EXPECTATION

Suppose that X and Y are r.v.s having a continuous j.p.d.f. Let f(y|x) denote the conditional p.d.f of Y given X = x, and let f(x) denote the marginal p.d.f of X. The conditional expectation of Y given X = x is

E(Y | X = x) = ∫_{-∞}^{∞} y f(y|x) dy,

and similarly the conditional expectation of X given Y = y is

E(X | Y = y) = ∫_{-∞}^{∞} x f(x|y) dx.

For the discrete case,

E(Y | X = x) = Σ_y y f(y|x)   and   E(X | Y = y) = Σ_x x f(x|y).

More generally, suppose that g(x, y) is any function of X and Y. Then the conditional expectation of g(X, Y) given X = x is

E[g(X, Y) | X = x] = ∫_{-∞}^{∞} g(x, y) f(y|x) dy, if X and Y are jointly continuous, and
E[g(X, Y) | X = x] = Σ_y g(x, y) f(y|x), if X and Y are discrete.

Example 23
Suppose that X and Y are continuous r.v.s with j.p.d.f
f(x, y) = 8xy, 0 < …
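The moment calculations of Example 21, and the linear-combination identity used in Example 22, lend themselves to a direct numerical check. The sketch below is illustrative Python, not part of the notes: the helper E is my own, the joint table is that of Example 21, and for Example 22 the entries var(X) = 3, var(Y) = 2, cov(X, Y) = 1/3 are taken as given.

```python
from fractions import Fraction

# Joint pmf of Example 21, stored as {(x, y): probability}; zero cells omitted.
f = {(0, 0): Fraction(3, 28), (1, 0): Fraction(9, 28), (2, 0): Fraction(3, 28),
     (0, 1): Fraction(6, 28), (1, 1): Fraction(6, 28),
     (0, 2): Fraction(1, 28)}

def E(h):
    """Expectation of h(X, Y) under the joint pmf f (hypothetical helper)."""
    return sum(p * h(x, y) for (x, y), p in f.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: x * x) - ex ** 2        # 45/112
var_y = E(lambda x, y: y * y) - ey ** 2        # 9/28
cov = E(lambda x, y: x * y) - ex * ey          # -9/56

rho = float(cov) / (float(var_x) * float(var_y)) ** 0.5
print(cov)            # -9/56
print(round(rho, 4))  # -0.4472, i.e. -1/sqrt(5)

# Example 22's identity var(aX + bY + c) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y),
# with var(X) = 3, var(Y) = 2, cov(X, Y) = 1/3 and (a, b) = (3, 4):
a, b = 3, 4
var_comb = a * a * Fraction(3) + b * b * Fraction(2) + 2 * a * b * Fraction(1, 3)
print(var_comb)  # 67
```

Any joint moment is the same one-line sum over the table, which is why the single helper E covers means, variances, and the covariance alike.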
