
Solutions to Problem Assignment #9

Math 501–1, Spring 2006
University of Utah

Problems:
1. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice the value that appears on the die; if it lands tails, then she wins one-half of that value. Determine her expected winnings.
Solution: Let H = {heads}, and define N to be the number of dots on the rolled die. Then N and H are independent. Let W denote the amount won. We know that
$$P(W = 2N \mid H) = 1 \qquad\text{and}\qquad P(W = N/2 \mid H^c) = 1.$$
Therefore,
$$\begin{aligned}
E(W) &= E(W \mid H)\,P(H) + E(W \mid H^c)\,P(H^c) \\
     &= E(2N \mid H)\,P(H) + E\!\left(\frac{N}{2} \,\Big|\, H^c\right)P(H^c) \\
     &= E(2N)\,P(H) + E\!\left(\frac{N}{2}\right)P(H^c) \qquad \text{by independence} \\
     &= 2E(N)\,P(H) + \tfrac{1}{2}E(N)\,P(H^c).
\end{aligned}$$

But P(H) = P(H^c) = 1/2, and E(N) = (1 + ··· + 6)/6 = 7/2. Consequently,

$$E(W) = 2\cdot\frac{7}{2}\cdot\frac{1}{2} + \frac{1}{2}\cdot\frac{7}{2}\cdot\frac{1}{2} = \frac{7}{2} + \frac{7}{8} = \frac{35}{8} = 4.375.$$
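As a quick numerical sanity check on the value 35/8 = 4.375 (not part of the original solution), here is a minimal Monte Carlo sketch in Python; the name `simulate_winnings` and the trial count are illustrative choices.

```python
import random

def simulate_winnings(n_trials=1_000_000, seed=0):
    """Estimate E(W) for the die-and-coin game by simulation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        n = rng.randint(1, 6)          # fair die: N
        heads = rng.random() < 0.5     # fair coin: H
        total += 2 * n if heads else n / 2
    return total / n_trials

print(simulate_winnings())  # should be close to 35/8 = 4.375
```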

2. If X and Y are independent uniform-(0, 1) random variables, then prove that
$$E\left(|X - Y|^{\alpha}\right) = \frac{2}{(\alpha + 1)(\alpha + 2)} \qquad \text{for all } \alpha > 0.$$
Solution: The density of (X, Y) is f(x, y) = 1 if 0 ≤ x, y ≤ 1, and f(x, y) = 0 otherwise. Therefore,
$$\begin{aligned}
E\left(|X - Y|^{\alpha}\right) &= \int_0^1\!\!\int_0^1 |x - y|^{\alpha}\, dx\, dy \\
&= \int_0^1\!\int_x^1 (y - x)^{\alpha}\, dy\, dx + \int_0^1\!\int_0^x (x - y)^{\alpha}\, dy\, dx \\
&= \int_0^1\!\int_0^{1 - x} z^{\alpha}\, dz\, dx + \int_0^1\!\int_0^x w^{\alpha}\, dw\, dx \\
&= \frac{1}{1 + \alpha}\int_0^1 (1 - x)^{\alpha + 1}\, dx + \frac{1}{1 + \alpha}\int_0^1 x^{\alpha + 1}\, dx \\
&= \frac{2}{(1 + \alpha)(2 + \alpha)}.
\end{aligned}$$
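The closed form can likewise be checked by simulation. The sketch below is not part of the original solution; `check_abs_diff_moment` and the sampled values of α are illustrative. It compares a Monte Carlo estimate of E(|X − Y|^α) with 2/((α + 1)(α + 2)).

```python
import random

def check_abs_diff_moment(alpha, n_trials=1_000_000, seed=0):
    """Compare a Monte Carlo estimate of E(|X-Y|^alpha) with 2/((alpha+1)(alpha+2))."""
    rng = random.Random(seed)
    est = sum(abs(rng.random() - rng.random()) ** alpha for _ in range(n_trials)) / n_trials
    exact = 2 / ((alpha + 1) * (alpha + 2))
    return est, exact

for a in (0.5, 1.0, 2.0):
    est, exact = check_abs_diff_moment(a)
    print(f"alpha={a}: simulated {est:.4f}, exact {exact:.4f}")
```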

3. A group of n men and n women are lined up at random.
(a) Find the expected number of men who have a woman next to them.
(b) Repeat part (a), but now assume that the group is randomly seated at a round table.
Solution: (a) All (2n)! possible permutations are equally likely. Let Wi denote the number of women who are neighboring person i, and consider the events Mi = {person i is a man}. Clearly, P(Mi) = 1/2 [produce the requisite combinatorial argument]. Let Ii = 1 if Mi occurs and Wi ≥ 1, so that the number of men who have a woman next to them is $\sum_{i=1}^{2n} I_i$. Note that E(Ii) = P(Wi ≥ 1, Mi). If i = 2, ..., 2n − 1, then
$$P(W_i = 0 \mid M_i) = \frac{P(W_i = 0,\, M_i)}{P(M_i)} = \frac{P(M_{i-1}\cap M_i\cap M_{i+1})}{1/2}
= 2\,P(M_{i-1}\cap M_i\cap M_{i+1}) = 2\,\frac{n(n-1)(n-2)\,(2n-3)!}{(2n)!}.$$
Consequently, for i = 2, ..., 2n − 1,
$$P(W_i \ge 1,\, M_i) = P(W_i \ge 1 \mid M_i)\,P(M_i)
= \left(1 - 2\,\frac{n(n-1)(n-2)\,(2n-3)!}{(2n)!}\right)\times\frac{1}{2} := A.$$
For the person at position 1, whose only neighbor is person 2,
$$P(W_1 \ge 1,\, M_1) = P(M_1) - P(M_1 \cap M_2) = \frac{1}{2} - \frac{n(n-1)\,(2n-2)!}{(2n)!} := B.$$
Similarly, P(W_{2n} ≥ 1, M_{2n}) = B. Therefore,
$$E\left(\sum_{i=1}^{2n} I_i\right) = \sum_{i=1}^{2n} E(I_i)
= P(W_1 \ge 1,\, M_1) + \sum_{i=2}^{2n-1} P(W_i \ge 1,\, M_i) + P(W_{2n} \ge 1,\, M_{2n})
= 2B + (2n - 2)A.$$
(b) The difference now is that P(Wi ≥ 1, Mi) = A for all i = 1, ..., 2n, so that
$$E\left(\sum_{i=1}^{2n} I_i\right) = 2nA.$$
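The two answers 2B + (2n − 2)A and 2nA can be checked against a direct simulation. The sketch below is not part of the original solution; `expected_formulas`, `simulate`, and the choice n = 4 are illustrative. It shuffles n men and n women, then counts the men with at least one woman neighbor, both in a line and around a table.

```python
import math
import random

def expected_formulas(n):
    """Closed-form answers from the solution: line = 2B + (2n-2)A, round table = 2nA."""
    fact = math.factorial
    A = 0.5 * (1 - 2 * n * (n - 1) * (n - 2) * fact(2 * n - 3) / fact(2 * n))
    B = 0.5 - n * (n - 1) * fact(2 * n - 2) / fact(2 * n)
    return 2 * B + (2 * n - 2) * A, 2 * n * A

def simulate(n, circular, n_trials=100_000, seed=0):
    """Average number of men with at least one woman neighbor."""
    rng = random.Random(seed)
    people = ['M'] * n + ['W'] * n
    total = 0
    for _ in range(n_trials):
        rng.shuffle(people)
        m = len(people)
        for i, p in enumerate(people):
            if p != 'M':
                continue
            if circular:
                nbrs = [people[(i - 1) % m], people[(i + 1) % m]]
            else:
                nbrs = [people[j] for j in (i - 1, i + 1) if 0 <= j < m]
            total += 'W' in nbrs
    return total / n_trials

n = 4
line_exact, circle_exact = expected_formulas(n)
print("line:  ", line_exact, simulate(n, circular=False))
print("circle:", circle_exact, simulate(n, circular=True))
```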

4. Let X1, X2, ... be independent with common mean µ and common variance σ². Set Yn = Xn + Xn+1 + Xn+2 for all n ≥ 1. Compute Cov(Yn, Yn+j) for all n ≥ 1 and j ≥ 0.
Solution: Note that E(Yn) = E(Xn) + E(Xn+1) + E(Xn+2) = 3µ; likewise, E(Yn+j) = 3µ. Therefore, E(Yn) · E(Yn+j) = 9µ². Also, if n ≠ m, then by independence E(Xn Xm) = E(Xn)E(Xm) = µ², whereas
$$E(X_n^2) = E(X_{n+1}^2) = E(X_{n+2}^2) = \mathrm{Var}(X_n) + (EX_n)^2 = \sigma^2 + \mu^2.$$
Because Yn = Xn + Xn+1 + Xn+2 and Yn+j = Xn+j + Xn+j+1 + Xn+j+2,
$$\begin{aligned}
Y_n Y_{n+j} = {}& X_n X_{n+j} + X_n X_{n+j+1} + X_n X_{n+j+2} \\
&+ X_{n+1} X_{n+j} + X_{n+1} X_{n+j+1} + X_{n+1} X_{n+j+2} \\
&+ X_{n+2} X_{n+j} + X_{n+2} X_{n+j+1} + X_{n+2} X_{n+j+2}.
\end{aligned}$$
Take expectations to find that
$$\begin{aligned}
E(Y_n Y_{n+j}) = {}& E(X_n X_{n+j}) + E(X_n X_{n+j+1}) + E(X_n X_{n+j+2}) \\
&+ E(X_{n+1} X_{n+j}) + E(X_{n+1} X_{n+j+1}) + E(X_{n+1} X_{n+j+2}) \\
&+ E(X_{n+2} X_{n+j}) + E(X_{n+2} X_{n+j+1}) + E(X_{n+2} X_{n+j+2}).
\end{aligned}$$
Let us work this out in separate cases, depending on the value of j ≥ 0.
First, consider the case that j = 0. In this case,
$$\begin{aligned}
E(Y_n^2) = {}& E(X_n^2) + E(X_n X_{n+1}) + E(X_n X_{n+2}) \\
&+ E(X_{n+1} X_n) + E(X_{n+1}^2) + E(X_{n+1} X_{n+2}) \\
&+ E(X_{n+2} X_n) + E(X_{n+2} X_{n+1}) + E(X_{n+2}^2).
\end{aligned}$$
Thus,
$$E(Y_n^2) = 3(\sigma^2 + \mu^2) + 6\mu^2 = 3\sigma^2 + 9\mu^2. \qquad (j = 0)$$
Next, consider the case that j = 1. Then,
$$\begin{aligned}
E(Y_n Y_{n+1}) = {}& E(X_n X_{n+1}) + E(X_n X_{n+2}) + E(X_n X_{n+3}) \\
&+ E(X_{n+1}^2) + E(X_{n+1} X_{n+2}) + E(X_{n+1} X_{n+3}) \\
&+ E(X_{n+2} X_{n+1}) + E(X_{n+2}^2) + E(X_{n+2} X_{n+3}) \\
= {}& 7\mu^2 + 2(\sigma^2 + \mu^2) = 2\sigma^2 + 9\mu^2. \qquad (j = 1)
\end{aligned}$$
Next we consider the case j = 2. In this case,
$$\begin{aligned}
E(Y_n Y_{n+2}) = {}& E(X_n X_{n+2}) + E(X_n X_{n+3}) + E(X_n X_{n+4}) \\
&+ E(X_{n+1} X_{n+2}) + E(X_{n+1} X_{n+3}) + E(X_{n+1} X_{n+4}) \\
&+ E(X_{n+2}^2) + E(X_{n+2} X_{n+3}) + E(X_{n+2} X_{n+4}) \\
= {}& 8\mu^2 + (\sigma^2 + \mu^2) = \sigma^2 + 9\mu^2. \qquad (j = 2)
\end{aligned}$$
Finally, if j ≥ 3, then every product involves two distinct indices, so
$$E(Y_n Y_{n+j}) = 9\mu^2. \qquad (j \ge 3)$$
Since Cov(Yn, Yn+j) = E(Yn Yn+j) − E(Yn)E(Yn+j) = E(Yn Yn+j) − 9µ², we conclude that Cov(Yn, Yn+j) = 3σ² when j = 0, 2σ² when j = 1, σ² when j = 2, and 0 when j ≥ 3.
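A quick way to check the case analysis is to estimate Cov(Yn, Yn+j) by simulation and compare it with (3 − j)σ² for j ≤ 3 and 0 for j ≥ 3. The sketch below is not part of the original solution; it uses Gaussian Xi with illustrative values µ = 2 and σ = 1.5, but any common distribution with finite variance would do.

```python
import random

def sample_cov(a, b):
    """Unbiased sample covariance of two equal-length lists."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def cov_estimate(j, mu=2.0, sigma=1.5, n_trials=200_000, seed=0):
    """Monte Carlo estimate of Cov(Y_n, Y_{n+j}) where Y_n = X_n + X_{n+1} + X_{n+2}."""
    rng = random.Random(seed)
    ys, yjs = [], []
    for _ in range(n_trials):
        x = [rng.gauss(mu, sigma) for _ in range(j + 3)]  # X_n, ..., X_{n+j+2}
        ys.append(sum(x[:3]))          # Y_n
        yjs.append(sum(x[j:j + 3]))    # Y_{n+j}
    return sample_cov(ys, yjs)

sigma2 = 1.5 ** 2
for j in range(5):
    print(j, round(cov_estimate(j), 3), max(3 - j, 0) * sigma2)
```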

Theoretical Problems:
1. Suppose X is a nonnegative random variable with density function f. Prove that
$$E(X) = \int_0^\infty P\{X > t\}\, dt. \qquad (1)$$
Is this still true when P{X < 0} > 0? If "yes," then prove it. If "no," then construct an example.
Solution: The trick is to start with the right-hand side:
$$\int_0^\infty P\{X > t\}\, dt = \int_0^\infty\!\int_t^\infty f(x)\, dx\, dt = \int_0^\infty\!\int_0^x f(x)\, dt\, dx = \int_0^\infty x f(x)\, dx = E(X).$$
This cannot be true when P{X < 0} > 0. For instance, suppose f(x) = 1/2 if −1 ≤ x ≤ 1, and f(x) = 0 otherwise. [f is the uniform-(−1, 1) density.] Then E(X) = 0, whereas the preceding shows that
$$\int_0^\infty P\{X > t\}\, dt = \int_0^\infty x f(x)\, dx = \frac{1}{2}\int_0^1 x\, dx = \frac{1}{4}.$$
2. (Hard) Suppose X1, ..., Xn are independent and have the same distribution. Compute φ(x) for all x, where φ(x) := E[X1 | X1 + ··· + Xn = x].
Solution: Note that the distribution of (X1, X2, ..., Xn) is the same as that of (X2, X1, X3, ..., Xn). Thus, φ(x) = E[X2 | X1 + ··· + Xn = x] as well. Similarly, φ(x) = E[X3 | X1 + ··· + Xn = x] = ··· = E[Xn | X1 + ··· + Xn = x]. Add the preceding equations to find that
$$n\varphi(x) = E\left[\,X_1 + \cdots + X_n \,\middle|\, X_1 + \cdots + X_n = x\,\right] = x.$$
Therefore, φ(x) = x/n.
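Because the answer φ(x) = x/n holds for any common distribution, it can be verified exactly in a small discrete case. The sketch below is not part of the original solution; it enumerates all outcomes of n = 3 fair dice, an illustrative stand-in for the i.i.d. Xi, and compares the conditional mean of X1 given the sum s with s/n.

```python
from itertools import product
from collections import defaultdict

n = 3  # three fair dice as a concrete i.i.d. example
stats = defaultdict(lambda: [0, 0])  # sum s -> [number of outcomes, total of X1 over them]

for outcome in product(range(1, 7), repeat=n):
    s = sum(outcome)
    stats[s][0] += 1
    stats[s][1] += outcome[0]

for s in sorted(stats):
    count, total_x1 = stats[s]
    # E[X1 | X1 + X2 + X3 = s] versus s/n: the two columns agree for every s
    print(s, total_x1 / count, s / n)
```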