
Problem 31

According to the U.S. National Center for Health Statistics, 25.2 percent of males and 23.6 percent of females never eat breakfast. Suppose that random samples of 200 men and 200 women are chosen. Approximate the probability that

(a) at least 110 of these 400 people never eat breakfast.

Let M denote the number of men who never eat breakfast and W the number of women who never eat breakfast. Note that M is a binomial random variable with p = .252 and n = 200, and W is a binomial random variable with p = .236 and n = 200. We compute

E[M] = 200(.252) = 50.4, Var(M) = 200(.252)(1 − .252) ≈ 37.7,
E[W] = 200(.236) = 47.2, Var(W) = 200(.236)(1 − .236) ≈ 36.1,

so we may approximate M by a normal random variable with µ = 50.4 and σ² ≈ 37.7, and W by a normal random variable with µ = 47.2 and σ² ≈ 36.1.

We want to compute P{110 ≤ M + W}. We may approximate M + W by a normal distribution with µ = 50.4 + 47.2 = 97.6 and σ² ≈ 37.7 + 36.1 = 73.8. Hence

P{110 ≤ M + W} = P{(110 − 97.6)/√73.8 ≤ (M + W − 97.6)/√73.8} ≈ 1 − Φ((110 − 97.6)/√73.8) ≈ 1 − Φ(1.44) = 1 − .9251 = .0749.

(b) the number of women who never eat breakfast is at least as large as the number of men who never eat breakfast.

We want to compute P {M ≤ W } = P {M − W ≤ 0}. We may approximate M − W

by a normal distribution with µ = 50.4 − 47.2 = 3.2 and σ 2 = 37.7 + 36.1 = 73.8.

Hence

P{M ≤ W} = P{M − W ≤ 0} = P{(M − W − 3.2)/√73.8 ≤ (0 − 3.2)/√73.8} ≈ Φ(−3.2/√73.8) ≈ 1 − Φ(.37) = 1 − .6443 = .3557.
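As a sanity check on the normal approximations, both probabilities can be computed exactly from the two binomial distributions. A small script (the function name is mine; only the problem's parameters are assumed):

```python
from math import comb

def binom_pmf(n, p, k):
    # P{Bin(n, p) = k}
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n = 200
pm = [binom_pmf(n, 0.252, k) for k in range(n + 1)]  # pmf of M
pw = [binom_pmf(n, 0.236, k) for k in range(n + 1)]  # pmf of W

# Part (a): P{M + W >= 110}, conditioning on the value of M.
p_a = sum(pm[m] * sum(pw[max(0, 110 - m):]) for m in range(n + 1))

# Part (b): P{M <= W}, conditioning on the value of M.
p_b = sum(pm[m] * sum(pw[m:]) for m in range(n + 1))

print(p_a, p_b)
```

The exact values land close to the approximations above; the small discrepancy is mostly the missing continuity correction.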

Problem 44

If X1 , X2 , X3 are independent random variables that are uniformly distributed over (0, 1),

compute the probability that the largest of the three is greater than the sum of the other

two.


Note that if, for example, X1 ≥ X2 + X3, then X1 is automatically the largest of the three, and (since the Xi are continuous) with probability 1 at most one of the three events {X1 ≥ X2 + X3}, {X2 ≥ X1 + X3}, {X3 ≥ X1 + X2} occurs. We want to compute

P{X1 ≥ X2 + X3} + P{X2 ≥ X1 + X3} + P{X3 ≥ X1 + X2}.

By symmetry, these three terms are all equal, so it suffices to compute the first term, P{X1 ≥ X2 + X3}. Recall from Example 3a on page 252 that

fX2+X3(y) = y if 0 ≤ y ≤ 1, 2 − y if 1 < y ≤ 2, and 0 otherwise.

Hence

P{X1 ≥ X2 + X3} = ∫_0^1 ∫_0^x fX1(x) fX2+X3(y) dy dx = ∫_0^1 ∫_0^x 1 · y dy dx = ∫_0^1 (1/2)x^2 dx = 1/6.

So the probability that the largest of the three is greater than the sum of the other

two is

P{X1 ≥ X2 + X3} + P{X2 ≥ X1 + X3} + P{X3 ≥ X1 + X2} = 1/6 + 1/6 + 1/6 = 1/2.
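The 1/2 is easy to spot-check by simulation (a minimal Monte Carlo sketch; the seed and sample size are arbitrary choices):

```python
import random

random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    x = [random.random() for _ in range(3)]
    # The largest exceeds the sum of the other two iff 2*max > total sum.
    if 2 * max(x) > sum(x):
        hits += 1
estimate = hits / trials  # should be near 1/2
print(estimate)
```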

Problem 19

Let X1 , X2 , X3 be independent and identically distributed continuous random variables.

We consider 7 mutually exclusive cases: the first six are X1 > X2 > X3 , X1 >

X3 > X2 , X2 > X1 > X3 , X2 > X3 > X1 , X3 > X1 > X2 , X3 > X2 > X1 , and

the seventh case is where at least two of X1 , X2 , X3 are equal. Since the Xi are

continuous random variables, the probability that at least two of the Xi are equal

is zero. Since X1 , X2 , X3 , are independent and identically distributed, the first six

cases are all equally likely. This means that P (X1 > X2 > X3 ) = 1/6, and similarly

for the other orderings. Hence

P{X1 > X2 | X1 > X3} = (P{X1 > X2 > X3} + P{X1 > X3 > X2}) / (P{X1 > X2 > X3} + P{X1 > X3 > X2} + P{X2 > X1 > X3}) = (2/6)/(3/6) = 2/3.

(b) Compute P{X1 > X2 | X1 < X3}.

P{X1 > X2 | X1 < X3} = P{X3 > X1 > X2} / (P{X2 > X3 > X1} + P{X3 > X1 > X2} + P{X3 > X2 > X1}) = (1/6)/(3/6) = 1/3.


(c) Compute P {X1 > X2 |X2 > X3 }.

P{X1 > X2 | X2 > X3} = P{X1 > X2 > X3} / (P{X1 > X2 > X3} + P{X2 > X1 > X3} + P{X2 > X3 > X1}) = (1/6)/(3/6) = 1/3.

(d) Compute P{X1 > X2 | X2 < X3}.

P{X1 > X2 | X2 < X3} = (P{X1 > X3 > X2} + P{X3 > X1 > X2}) / (P{X1 > X3 > X2} + P{X3 > X1 > X2} + P{X3 > X2 > X1}) = (2/6)/(3/6) = 2/3.
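All four conditional probabilities can be verified numerically (a quick simulation sketch; the dictionary keys and sample size are my choices):

```python
import random

random.seed(1)
trials = 300_000
counts = {k: [0, 0] for k in "abcd"}  # [joint count, conditioning count]
for _ in range(trials):
    x1, x2, x3 = (random.random() for _ in range(3))
    for key, cond in (("a", x1 > x3), ("b", x1 < x3),
                      ("c", x2 > x3), ("d", x2 < x3)):
        if cond:
            counts[key][1] += 1
            if x1 > x2:
                counts[key][0] += 1
est = {k: joint / cond for k, (joint, cond) in counts.items()}
print(est)  # expect roughly a: 2/3, b: 1/3, c: 1/3, d: 2/3
```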

Problem 28

Show that the median of a sample of size 2n + 1 from a uniform distribution on (0, 1) has

a beta distribution with parameters (n + 1, n + 1).

Let X denote the median of the independent and identically distributed random

variables X1 , . . . , X2n+1 . Consider equation 6.2 on page 272. By replacing n with 2n + 1

and by choosing j = n + 1, we get that the probability density function of X is

fX(x) = ((2n + 1)!/(n! n!)) (F(x))^n (1 − F(x))^n f(x),

where f is the common probability density function and F is the common cumulative

distribution function of the Xi ’s. Since X1 , . . . , X2n+1 are uniformly distributed on (0, 1),

this means that f (x) = 1 for 0 ≤ x ≤ 1 and F (x) = x for 0 ≤ x ≤ 1. Hence

fX(x) = ((2n + 1)!/(n! n!)) x^n (1 − x)^n for 0 ≤ x ≤ 1, and fX(x) = 0 otherwise.

Now, consider the Beta distribution on page 218. Note that

B(n + 1, n + 1) = ∫_0^1 x^n (1 − x)^n dx
= (n/(n + 1)) ∫_0^1 x^{n+1} (1 − x)^{n−1} dx      (after integrating by parts)
= . . .
= (n!/((2n) · . . . · (n + 1))) ∫_0^1 x^{2n} dx      (after repeatedly integrating by parts)
= n!/((2n + 1)(2n) · . . . · (n + 1))
= n! n!/(2n + 1)!.

Hence

fX(x) = ((2n + 1)!/(n! n!)) x^n (1 − x)^n = (1/B(n + 1, n + 1)) x^n (1 − x)^n for 0 ≤ x ≤ 1, and fX(x) = 0 otherwise.


So X has a Beta distribution with parameters (n + 1, n + 1).
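The result can be spot-checked by comparing simulated medians against the Beta(n + 1, n + 1) moments; for n = 2 the median of 5 uniforms should have mean 1/2 and variance 3 · 3/(6^2 · 7) = 1/28 (a sketch; n, seed, and sample size are arbitrary choices):

```python
import random

random.seed(2)
n = 2                        # sample size is 2n + 1 = 5
trials = 100_000
medians = [sorted(random.random() for _ in range(2 * n + 1))[n]
           for _ in range(trials)]
mean = sum(medians) / trials
var = sum((m - mean) ** 2 for m in medians) / trials
print(mean, var)  # expect roughly 0.5 and 1/28 ≈ 0.0357
```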

Problem 1

A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, then one-half of the value that appears on the die. Determine her expected winnings.

Let X denote the value on the die, let Y be 1 if the coin lands heads and 0 if the coin

lands tails, and let g(X, Y ) denote the winnings. Then her expected winnings are

E[winnings] = Σ_{x=1}^{6} Σ_{y=0}^{1} g(x, y) p(x, y) = Σ_{x=1}^{6} (2x · p(x, 1) + (1/2)x · p(x, 0))
= Σ_{x=1}^{6} (2x · (1/12) + (1/2)x · (1/12)) = (5/24) Σ_{x=1}^{6} x = (5/24) · 21 = 4.375.
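The 4.375 can be confirmed by enumerating the 12 equally likely (die, coin) outcomes in exact arithmetic (a brute-force sketch):

```python
from fractions import Fraction

expected = Fraction(0)
for die in range(1, 7):
    for heads in (True, False):
        # heads: twice the die value; tails: half the die value
        winnings = Fraction(2 * die) if heads else Fraction(die, 2)
        expected += Fraction(1, 12) * winnings
print(expected)  # 35/8 = 4.375
```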

Problem 12

A group of n men and n women is lined up at random.

(a) Find the expected number of men who have a woman next to them.

Label the people in order 1 through 2n and let Xi = 1 if the i-th person is a man standing next to a woman and Xi = 0 otherwise. We want to compute

" 2n # 2n 2n

X X X

E Xi = E[Xi ] = P {Xi = 1}.

i=1 i=1 i=1

Note X1 = 1 only if the first person is male and the second is female. There are 2n(2n − 1) ways to choose the first two people. There are n ways to choose the first person to be male and n ways to choose the second person to be female, and hence n^2 ways we can have X1 = 1. Hence

P{X1 = 1} = n^2/(2n(2n − 1)) = n/(4n − 2).

Similarly P{X2n = 1} = n/(4n − 2).

Now let’s find P {Xi = 1} for 1 < i < 2n. There are 2n(2n − 1)(2n − 2) ways to

choose the (i − 1)-th, i-th, and (i + 1)-th person. We have Xi = 1 if the three people

chosen are female male female, female male male, or male male female. Hence there

are

(n)(n)(n − 1) + (n)(n)(n − 1) + (n)(n − 1)(n) = 3n^2(n − 1)

ways we can have Xi = 1. Hence

P{Xi = 1} = 3n^2(n − 1)/(2n(2n − 1)(2n − 2)) = 3n/(8n − 4).


Hence

E[Σ_{i=1}^{2n} Xi] = Σ_{i=1}^{2n} P{Xi = 1} = 2 · n/(4n − 2) + (2n − 2) · 3n/(8n − 4) = (3n^2 − n)/(4n − 2).

(b) Repeat part (a), but now assuming that the group is randomly seated at a round

table.

Label the people in order 1 through 2n and let Xi = 1 if the i-th person is a man sitting next to a woman and Xi = 0 otherwise. We want to compute

" 2n # 2n 2n

X X X

E Xi = E[Xi ] = P {Xi = 1}.

i=1 i=1 i=1

P{Xi = 1} = 3n/(8n − 4).

This computation is the same as the one that we used in part (a) for 1 < i < 2n.

Hence

E[Σ_{i=1}^{2n} Xi] = Σ_{i=1}^{2n} P{Xi = 1} = 2n · 3n/(8n − 4) = 3n^2/(4n − 2).
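Both formulas can be checked exactly for small n by enumerating every equally likely placement of the n men among the 2n positions (an enumeration sketch; the helper name is mine):

```python
from fractions import Fraction
from itertools import combinations

def expected_men_next_to_woman(n, circular):
    """Average number of men with at least one woman neighbor,
    over all equally likely placements of n men in 2n seats."""
    seats = 2 * n
    placements = list(combinations(range(seats), n))
    total = Fraction(0)
    for men in placements:
        men_set = set(men)
        count = 0
        for i in men_set:
            if circular:
                nbrs = ((i - 1) % seats, (i + 1) % seats)
            else:
                nbrs = tuple(j for j in (i - 1, i + 1) if 0 <= j < seats)
            if any(j not in men_set for j in nbrs):
                count += 1
        total += count
    return total / len(placements)

# line: (3n^2 - n)/(4n - 2); circle: 3n^2/(4n - 2) (the circle formula
# needs 2n >= 3 so that the three neighboring seats are distinct)
line_vals = {n: expected_men_next_to_woman(n, False) for n in range(1, 5)}
circle_vals = {n: expected_men_next_to_woman(n, True) for n in range(2, 5)}
print(line_vals, circle_vals)
```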

Problem 19

A certain region is inhabited by r distinct types of a certain species of insect. Each insect caught will, independently of the types of the previous catches, be of type i with probability

Pi, i = 1, . . . , r, where Σ_{i=1}^{r} Pi = 1.

(a) Compute the mean number of insects that are caught before the first type 1 catch.

Let X denote the number of insects caught before the first type 1 catch. Then

P {X = x} = (1 − P1 )x P1 . Hence

E[X] = Σ_{x=0}^{∞} x(1 − P1)^x P1
= P1 Σ_{x=0}^{∞} x(1 − P1)^x
= P1 · (1 − P1)/P1^2      (using the formula Σ_{n=1}^{∞} n z^n = z/(1 − z)^2 for |z| < 1)
= (1 − P1)/P1.


(b) Compute the mean number of types of insects that are caught before the first type 1

catch.

Let Xi denote the number of insects of type i caught before the first type 1 catch. Let g(0) = 0 and let g(x) = 1 for integers x ≥ 1. We want to compute

" r # r r r

X X X X

E g(Xi ) = E[g(Xi )] = P {g(Xi ) = 1} = P {Xi ≥ 1}.

i=2 i=2 i=2 i=2

Let X denote the number of insects caught before the first type 1 catch as in part

(a). Then

P{Xi ≥ 1} = Σ_{x=0}^{∞} P{Xi ≥ 1, X = x}      (since the events {Xi ≥ 1, X = x} are mutually exclusive and have union {Xi ≥ 1})
= Σ_{x=0}^{∞} (P{X = x} − P{Xi = 0, X = x})
= Σ_{x=0}^{∞} ((1 − P1)^x P1 − (1 − P1 − Pi)^x P1)
= P1/P1 − P1/(P1 + Pi)      (using the formula for a geometric series, twice)
= Pi/(P1 + Pi).

Hence

E[Σ_{i=2}^{r} g(Xi)] = Σ_{i=2}^{r} P{Xi ≥ 1} = Σ_{i=2}^{r} Pi/(P1 + Pi).
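Both parts are easy to check by simulation for a concrete choice of the Pi, say r = 3 with (P1, P2, P3) = (.5, .3, .2), where part (a) gives (1 − .5)/.5 = 1 and part (b) gives .3/.8 + .2/.7 ≈ 0.661 (a sketch; the probabilities, seed, and sample size are arbitrary choices):

```python
import random

random.seed(3)
P = [0.5, 0.3, 0.2]            # P1, P2, P3
trials = 200_000
total_caught = 0
total_types = 0
for _ in range(trials):
    types_seen = set()
    caught = 0
    while True:
        u = random.random()
        if u < P[0]:           # a type 1 insect: stop
            break
        types_seen.add(1 if u < P[0] + P[1] else 2)
        caught += 1
    total_caught += caught
    total_types += len(types_seen)
mean_caught = total_caught / trials   # expect (1 - P1)/P1 = 1
mean_types = total_types / trials     # expect P2/(P1+P2) + P3/(P1+P3)
print(mean_caught, mean_types)
```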

Problem 24

A bottle initially contains m large pills and n small pills. Each day, a patient randomly

chooses one of the pills. If a small pill is chosen, then that pill is eaten. If a large pill

is chosen, then the pill is broken in two; one part is returned to the bottle (and is now

considered a small pill) and the other part is then eaten.

(a) Let X denote the number of small pills in the bottle after the last large pill has been

chosen and its smaller half returned. Find E[X].

Label the small pills initially present 1 through n and label the small pills created by splitting a large one n + 1 through n + m. Let Ii = 1 if the i-th pill remains after the last large pill is chosen and let Ii = 0 otherwise. Then X = Σ_{i=1}^{n+m} Ii, so

E[X] = Σ_{i=1}^{n+m} E[Ii] = Σ_{i=1}^{n+m} P{Ii = 1}.


We will calculate P {Ii = 1} by considering the two cases 1 ≤ i ≤ n and n + 1 ≤

i ≤ n + m separately.

If 1 ≤ i ≤ n, then the i-th small pill is initially present. Pretend we keep choosing

pills until all of them are gone. It suffices to consider the order in which the i-th pill

and the m large pills are chosen. There are m + 1 of these pills, so the probability

that the i-th pill is chosen last among them is P {Ii = 1} = 1/(m + 1).

If n + 1 ≤ i ≤ n + m, then the i-th small pill is formed by breaking a large pill

in two. It suffices to consider the order in which the large pills and the i-th small

pill are chosen. Label the large pills 1 through m in the order in which they are

initially chosen. Let Ji denote this label for the large pill corresponding to the i-th

pill, that is, let Ji denote when the large pill is broken forming the i-th small pill.

By conditioning on the value of Ji , we get

P{Ii = 1} = Σ_{j=1}^{m} P({Ii = 1}|{Ji = j}) P{Ji = j}.

The probability that the large pill corresponding to the i-th small pill is labeled j

out of the m large pills is P {Ji = j} = 1/m. Once the j-th large pill is broken

to form the i-th small pill, m − j large pills and the i-th small pill remain, so the

probability that the i-th small pill is chosen last among these m − j + 1 pills is

P({Ii = 1}|{Ji = j}) = 1/(m − j + 1).

Hence

P{Ii = 1} = Σ_{j=1}^{m} P({Ii = 1}|{Ji = j}) P{Ji = j}
= Σ_{j=1}^{m} (1/(m − j + 1)) · (1/m)
= Σ_{k=1}^{m} 1/(km),

by letting k = m − j + 1. Therefore

E[X] = Σ_{i=1}^{n+m} P{Ii = 1}
= Σ_{i=1}^{n} P{Ii = 1} + Σ_{i=n+1}^{n+m} P{Ii = 1}
= n · 1/(m + 1) + m · Σ_{k=1}^{m} 1/(km)
= n/(m + 1) + Σ_{k=1}^{m} 1/k.


(b) Let Y denote the day on which the last large pill is chosen. Find E[Y ].

There are a total of n + 2m days. On the Y-th day the last large pill is chosen, and for the remaining X days small pills are chosen. Here X is the number of small pills in the bottle after the last large pill has been chosen, as in part (a). Thus X + Y = n + 2m, so

E[Y] = E[n + 2m − X] = n + 2m − E[X] = n + 2m − n/(m + 1) − Σ_{k=1}^{m} 1/k.
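Both expectations can be spot-checked by simulating the bottle directly, e.g. with m = 2 large and n = 2 small pills, where E[X] = 2/3 + 1 + 1/2 = 13/6 and E[Y] = 6 − 13/6 = 23/6 (a sketch; seed and sample size are arbitrary choices):

```python
import random

random.seed(4)
m, n = 2, 2                    # large and small pills initially
trials = 200_000
sum_x = sum_y = 0
for _ in range(trials):
    large, small = m, n
    day = 0
    while large > 0:
        day += 1
        if random.random() < large / (large + small):
            large -= 1         # large pill: half eaten, half returned as small
            small += 1
        else:
            small -= 1         # small pill eaten
    sum_x += small             # X: small pills left after the last large pill
    sum_y += day               # Y: the day the last large pill was chosen
mean_x, mean_y = sum_x / trials, sum_y / trials
print(mean_x, mean_y)  # expect roughly 13/6 ≈ 2.167 and 23/6 ≈ 3.833
```

Note that small + day = n + 2m at the end of every run, which is exactly the identity X + Y = n + 2m used in part (b).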

Problem 26

If X1, X2, . . . , Xn are independent and identically distributed random variables having uniform distributions over (0, 1), find (a) E[max(X1, . . . , Xn)] and (b) E[min(X1, . . . , Xn)].

(a) We want to compute

E[max(X1, . . . , Xn)] = ∫_0^1 · · · ∫_0^1 max(x1, . . . , xn) dx1 · · · dxn.

We break the region of integration into the n! regions corresponding to each of the n! possible orderings of x1, . . . , xn, plus a negligible region of zero volume where at least two of the xi's are equal. Since the X1, . . . , Xn are independent and identically distributed, the integrals over each of the n! different regions will have the same value. Hence we'll only consider the region where x1 < x2 < . . . < xn. We have


E[max(X1, . . . , Xn)] = n! ∫_{x1<...<xn} max(x1, . . . , xn) dx1 · · · dxn      (since the integral is the same over all n! such regions)
= n! ∫_{x1<...<xn} xn dx1 · · · dxn      (since the max in this region is xn)
= n! ∫_0^1 ∫_0^{xn} · · · ∫_0^{x2} xn dx1 · · · dxn
= n! ∫_0^1 ∫_0^{xn} · · · ∫_0^{x3} xn x2 dx2 · · · dxn
= n! ∫_0^1 ∫_0^{xn} · · · ∫_0^{x4} xn (1/2)x3^2 dx3 · · · dxn
= . . .
= n! ∫_0^1 xn · (xn^{n−1}/(n − 1)!) dxn
= n ∫_0^1 xn^n dxn
= n/(n + 1).

(b) The clever way to do this problem is to note that

min(X1, . . . , Xn) = 1 − max(1 − X1, . . . , 1 − Xn).

Hence

E[min(X1, . . . , Xn)] = 1 − E[max(1 − X1, . . . , 1 − Xn)] = 1 − n/(n + 1) = 1/(n + 1),

by part (a), since each Xi has the same distribution as 1 − Xi.

Alternatively, we can write E[min(X1, . . . , Xn)] as an n-fold integral and then break this integral up into a sum of n! integrals with equal values, corresponding to each of the n! orderings of x1, . . . , xn. We obtain


E[min(X1, . . . , Xn)] = ∫_0^1 · · · ∫_0^1 min(x1, . . . , xn) dx1 · · · dxn
= n! ∫_{x1<...<xn} x1 dx1 · · · dxn      (since the integral is the same over all n! such regions, and the min in this region is x1)
= n! ∫_0^1 ∫_0^{xn} · · · ∫_0^{x2} x1 dx1 · · · dxn
= n! ∫_0^1 ∫_0^{xn} · · · ∫_0^{x3} (1/2)x2^2 dx2 · · · dxn
= . . .
= n! ∫_0^1 (xn^n/n!) dxn
= ∫_0^1 xn^n dxn
= 1/(n + 1).
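Both answers are easy to verify by simulation, e.g. with n = 4, where E[max] = 4/5 and E[min] = 1/5 (a sketch; seed and sample size are arbitrary choices):

```python
import random

random.seed(5)
n = 4
trials = 200_000
sum_max = sum_min = 0.0
for _ in range(trials):
    x = [random.random() for _ in range(n)]
    sum_max += max(x)
    sum_min += min(x)
mean_max = sum_max / trials   # expect n/(n+1) = 0.8
mean_min = sum_min / trials   # expect 1/(n+1) = 0.2
print(mean_max, mean_min)
```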

Problem 14

For Example 2i, show that the variance of the number of coupons needed to amass a full

set is equal to

Σ_{i=1}^{N−1} iN/(N − i)^2.

When N is large, this can be shown to be approximately equal (in the sense that their ratio approaches 1 as N → ∞) to N^2 π^2/6.

As in Example 2i, write X = X0 + X1 + · · · + XN−1, where Xi is the number of additional coupons collected, after i distinct types have been obtained, until a new type appears (so X0 = 1). Since the Xi are independent, we have

Var(X) = Σ_{i=1}^{N−1} Var(Xi).

Recall that

P{Xi = k} = (i/N)^{k−1} (N − i)/N for k ≥ 1,

and E[Xi] = N/(N − i). We compute

E[Xi^2] = ((N − i)/N) Σ_{k=1}^{∞} k^2 (i/N)^{k−1} = ((N − i)/N) · (1 + i/N)/(1 − i/N)^3 = N(N + i)/(N − i)^2,


where the second equality follows since

Σ_{k=1}^{∞} k^2 x^{k−1} = (1 + x)/(1 − x)^3 for |x| < 1.

Hence

Var(Xi) = E[Xi^2] − E[Xi]^2 = N(N + i)/(N − i)^2 − N^2/(N − i)^2 = iN/(N − i)^2,

and

Var(X) = Σ_{i=1}^{N−1} Var(Xi) = Σ_{i=1}^{N−1} iN/(N − i)^2.
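The formula can be checked by simulating the coupon-collecting process, e.g. for N = 5, where Σ iN/(N − i)^2 = 5/16 + 10/9 + 15/4 + 20 ≈ 25.17 (a sketch; seed and sample size are arbitrary choices):

```python
import random

random.seed(6)
N = 5
trials = 100_000
draws = []
for _ in range(trials):
    seen = set()
    count = 0
    while len(seen) < N:
        seen.add(random.randrange(N))  # each coupon type equally likely,
        count += 1                     # matching the pmf of Xi above
    draws.append(count)
mean = sum(draws) / trials
var = sum((d - mean) ** 2 for d in draws) / trials
target = sum(i * N / (N - i) ** 2 for i in range(1, N))
print(mean, var, target)  # var should land close to target ≈ 25.17
```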

