
For any Homework related queries, call us at :- +1 678 648 4277

You can mail us at :-  support@statisticshomeworksolver.com or


reach us at :- https://www.statisticshomeworksolver.com/

Statistics Homework Help


Problem 1: Multiple Choice Questions: There is only one correct answer
for each question listed below; please clearly indicate the correct answer.
There will be no partial credit given for multiple choice questions, so
any explanations will not be graded.

(I) Alice and Bob each choose a number independently and uniformly at
random from the interval [0, 2]. Consider the following events:

A : The absolute difference between the two numbers is greater than 1/4.

B : Alice’s number is greater than 1/4.


(II) There are m red balls and n white balls in an urn. We draw two balls
simultaneously and at random. What is the probability that the balls are
of different color?

(III) 20 blackpebbles are arranged in 4 rows of 5 pebbles each. We


choose 4 of these pebbles at random and color them red. What is the
probability that all the red pebbles lie in different rows?
(IV) We have two light bulbs, A and B. Bulb A has an exponentially
distributed lifetime with mean lifetime 4 days. Bulb B has an
exponentially distributed lifetime with mean lifetime 6 days. We select
one of the two bulbs at random; each bulb is equally likely to be
chosen. Given that the bulb we selected is still working after 12 hours,
what is the probability that we selected bulb A?

(V) A test for some rare disease is assumed to be correct 95% of the
time: if a person has the disease, the test results are positive with
probability 0.95, and if the person does not have the disease, the test
results are negative with probability 0.95. A random person drawn from
a certain population has probability 0.001 of having the disease. Given
that the person just tested positive, what is the probability of having the
disease?
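As a quick check, Bayes' rule evaluates this to

\[
P(\text{disease} \mid \text{positive}) = \frac{0.001 \times 0.95}{0.001 \times 0.95 + 0.999 \times 0.05} = \frac{0.00095}{0.0509} \approx 0.019 .
\]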
(VI) Let X be a random variable with

(VII) X and Y are independent random variables, with


(VIII) A number p is drawn from the interval [0, 1] according to the
uniform distribution, and then a sequence of independent Bernoulli
trials is performed, each with success probability p. What is the variance
of the number of successes in k trials? Note k is a deterministic number.

(IX) A police radar always over-estimates the speed of incoming cars by


an amount that is uniformly distributed between 0 and 5 mph. Assume
that car speeds are uniformly distributed between 60 and 75 mph and
are independent of the radar over-estimate. If the radar measures a
speed of 76 mph, what is the least squares estimate of the actual car
speed?

(a) 72.5 mph

(b) 73.0 mph

(c) 73.5 mph

(d) 72.9 mph


(X) You are visiting the Killian rainforest when your insect repellent runs
out. This forest is infested with lethal mosquitoes, whose second bite will
kill any human instantaneously. Assume mosquitoes land on your back and
deliver a vicious bite according to a Bernoulli process. Also assume the
expected time until the first bite is 10 seconds. If you arrive at the forest at
t = 0, what is the probability that you die at exactly t = 10 seconds?

(XI) In order to estimate p, the fraction of people who will vote for
George Bush in the next election, you conduct a poll of n people drawn
randomly and independently from the population. Your estimator Mn is
obtained by dividing Sn, the number of people who vote for Bush in your
sample, by n, i.e., Mn = Sn/n. Find the smallest value of n, the number of
people you must poll, for which the Chebyshev inequality yields a
guarantee that
(XII) On each day of the year, it rains with probability 0.1, independent of
every other day. Use the Central Limit Theorem (CLT) (with no half step)
to approximate the probability that, out of the 365 days in the year, it
will rain on at least 100 days. In the answers below, Φ denotes the
standard normal CDF.

Problem 2: Please write all work for Problem 2 in your first blue book. No
work recorded below will be graded.

You take a safari trip to the Porobilati game reserve. A highlight of the
game reserve is the Poseni river where one can watch deer and
elephants coming to drink water. Deer come to the river according to a
Poisson process with arrival rate λd = 8 per hour; elephants come to the
river according to an independent Poisson process with arrival rate λe = 2
per hour. On the first day of your safari, you reach the Poseni river early
in the morning hoping to see some elephants. Assume that deer and
elephants are the only kinds of animals that visit this river.

(a) Let N be the number of animals you see during the first 3 hours. Find
E[N] and var(N).
(b) What’s the probability of seeing your 3rd elephant before your 9th
deer?
(c) At the end of 3 hours, you have seen 24 deer but no elephants. What
is the probability that you will see an elephant in the next hour?
(d) It is now 6 hours since you started watching. You have seen 53 deer
but there is still no sight of an elephant. How many more deer do you
expect to see before you see your first elephant?

(e) Unfortunately, you forgot your camera in your lodge the first day. You
come back with your camera the next day and plan to stay at the river
until you’ve clicked a picture of both a deer and an elephant. Assume
you’re always able to click a picture as soon as an animal arrives. How long
do you expect to stay?

(f) Your friend, a wildlife photographer, is fascinated by the stories of your


trip, and decides to take the safari herself. All excited, she arrives at the
Poseni river, and is prepared to stay as long as required in order to get lots
of beautiful pictures. Unfortunately, she dropped her camera in a bush on
the way, as a result of which the camera is not functioning properly; each
time she clicks, there is a probability 0.1 that the camera will produce a
faulty picture, independently of everything else. Assume she clicks one
picture each time an elephant or deer arrives at the river, as soon as it
arrives. Let X be the time (in hours) until she gets her third successful
picture of an elephant. Find the PDF of X.

Problem 3: Please write all work for Problem 3 in your second blue book.
No work recorded below will be graded. Each question is equally weighted
at 4 points.
The MIT football team’s performance in any given game is correlated with
its morale. If the team has won the past two games, then it has a 0.7
probability of winning the next game. If it lost the last game but won the
one before that, it has a 0.4 probability of winning. If it won its last game
but lost the one before that, it has a 0.6 probability of winning. Finally, if it
lost the last two games, it has only a 0.3 probability of winning the next
game. No game can end in a draw. Consider a starting time when the
team has won its preceding two games.
(a) Graph the minimum state Markov chain model for the MIT football
team’s performance. Be sure to label all transition probabilities, and
clearly indicate what state of the world each Markov state represents.
You may find it convenient for the below questions to label each state
with an integer.

(b) Find the probability that the first future loss will be followed by
another loss.

(c) Let X be the number of games played up to, but not including the first
loss. Find the PMF of X.

(d) Either evaluate the steady-state probabilities or explain why they do
not exist.

(e) Find a good approximation to the probability that the team will win
its 1000th game, given that the outcomes of games 1000 and 1001 are
the same. Clearly state any assumptions you use in the approximation.

(f) Let T be the number of games up to and including the team’s 2nd
consecutive loss (i.e. 2 losses in a row). Write a system of equations that
can be used to find E[T].

(g) The coach decides that he will withdraw the team from the
competition immediately after the team’s 3rd consecutive loss (i.e. three
losses in a row). Let N be the number of games that the team will play in
the competition. Write a system of equations that can be used to find
E[N].
Solutions
Problem 1
Multiple Choice Questions: There is only one correct answer for each
question listed below; please clearly indicate the correct answer.

(1) D
Alice’s and Bob’s choices of number can be illustrated in the following
figure. Event A (the absolute difference of the two numbers is greater than
1/4) corresponds to the area shaded by vertical lines. Event B (Alice’s
number is greater than 1/4) corresponds to the area shaded by horizontal
lines. Therefore, event A ∩ B corresponds to the doubly shaded area. Since
both choices are uniformly distributed between 0 and 2, we have that
(2) B
Let B1 be the first ball and B2 be the second ball (you may do this by
drawing two balls with both hands, then first look at the ball in left hand
and then look at the ball in right hand).
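One way to carry this two-draw argument through to the answer:

\[
P(\text{different colors}) = \frac{m}{m+n}\cdot\frac{n}{m+n-1} + \frac{n}{m+n}\cdot\frac{m}{m+n-1} = \frac{2mn}{(m+n)(m+n-1)} .
\]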

(3) C
Without any constraint, there are a total of 20 × 19 × 18 × 17 possible
ways to pick 4 pebbles out of 20 in order. Now we consider the case under
the constraint that the 4 picked pebbles are in different rows. As illustrated
by the following figure, there are 20 possible positions for the first
pebble, 15 for the second, 10 for the third and finally 5 for the last
pebble. Therefore, there are a total of 20 × 15 × 10 × 5 possible ways.
With the uniform discrete probability law, the probability of picking 4
pebbles from different rows is
(20 × 15 × 10 × 5) / (20 × 19 × 18 × 17) = 15000/116280 ≈ 0.129.
(4) A
Let XA and XB denote the lifetimes of bulb A and bulb B, respectively.
Clearly, XA has an exponential distribution with parameter λA = 1/4 and
XB has an exponential distribution with parameter λB = 1/6. Let E be
the event that we have selected bulb A. We use X to denote the lifetime
of the selected bulb. Using Bayes’ rule, we have that
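\[
P(E \mid X > \tfrac{1}{2}) = \frac{\tfrac{1}{2}\,e^{-\frac{1}{2}\cdot\frac{1}{4}}}{\tfrac{1}{2}\,e^{-\frac{1}{2}\cdot\frac{1}{4}} + \tfrac{1}{2}\,e^{-\frac{1}{2}\cdot\frac{1}{6}}} = \frac{e^{-1/8}}{e^{-1/8} + e^{-1/12}} \approx 0.49 ,
\]

where 12 hours is expressed as half a day; the numerical value is a worked
evaluation of this Bayes’ rule step.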
(7) B
Clearly, X is exponentially distributed with parameter λ = 3. Therefore,
its transform is M_X(s) = 3/(3 − s). The transform of Y can be easily
calculated from its PMF as M_Y(s) = (1/2)e^s + 1/2. Since X and Y are
independent, the transform of X + Y is just the product of the transform
of X and the transform of Y. Thus,
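the transform of the sum is

\[
M_{X+Y}(s) = \frac{3}{3-s}\left(\frac{1}{2}e^{s} + \frac{1}{2}\right) .
\]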

(8) C
Let P be the random variable for the value drawn according to the
uniform distribution on the interval [0, 1] and let X be the number of
successes in k trials. Given P = p, X is a binomial random variable with
parameters k and p.
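One way to finish is via the law of total variance, using E[P] = 1/2,
E[P²] = 1/3 and var(P) = 1/12 for P uniform on [0, 1]:

\[
\operatorname{var}(X) = E[\operatorname{var}(X \mid P)] + \operatorname{var}(E[X \mid P])
= E[kP(1-P)] + \operatorname{var}(kP)
= k\left(\tfrac{1}{2} - \tfrac{1}{3}\right) + \frac{k^{2}}{12}
= \frac{k}{6} + \frac{k^{2}}{12} .
\]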
(9) B
Let X be the car speed and let Y be the radar’s measurement. Then, the
joint PDF of X and Y is uniform over the set of pairs (x, y) such that x ∈
[60, 75] and x ≤ y ≤ x + 5. Therefore, the least squares estimate of X given
Y = y is the conditional expectation E[X | Y = y].
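For the measured value y = 76, the constraints x ∈ [60, 75] and
71 ≤ x ≤ 76 leave X conditionally uniform on [71, 75], so

\[
E[X \mid Y = 76] = \frac{71 + 75}{2} = 73 \text{ mph},
\]

which is choice (b).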
(11) B
Let Xi be the indicator random variable for the i-th person’s vote as
follows:

(12) D
Let Xi be the indicator random variable for the i-th day being rainy
as follows:
We have that µ = E[Xi] = 0.1 and σ2 = var(Xi) = 0.1 · (1 − 0.1) = 0.09. Then
the number of rainy days in a year is S365 = X1 + . . . + X365, which is
distributed as a binomial(n = 365, p = 0.1). Using the central limit
theorem, we have that
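\[
P(S_{365} \ge 100) \approx 1 - \Phi\!\left(\frac{100 - 365 \times 0.1}{\sqrt{365 \times 0.09}}\right) = 1 - \Phi\!\left(\frac{63.5}{5.73}\right) = 1 - \Phi(11.08),
\]

which is essentially zero; the numerical evaluation is a worked completion
of the CLT step above.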

Problem 2
(Each question is equally weighted at 6 points.) The deer process is a
Poisson process with arrival rate 8 (λD = 8). The elephant process is a
Poisson process with arrival rate 2 (λE = 2).

(a) 30 Since deer and elephants are the only animals that visit the river,
the animal process is the merged process of the deer process and
the elephant process. Since the deer process and the elephant process
are independent Poisson processes with λD = 8 and λE = 2 respectively,
the animal process is also a Poisson process with λA = λD + λE = 10.
Therefore, the number of animals arriving in the first three hours is
distributed according to a Poisson distribution with parameter λA · 3 = 30.
Therefore

E[number of animals arriving in three hours] = λA · 3 = 30,

var(number of animals arriving in three hours) = λA · 3 = 30.
(b)
Let’s now focus on the animal arrivals themselves and ignore the inter-
arrival times. Each animal arrival can be either a deer arrival or an
elephant arrival. Let Ek denote the event that the kth animal arrival is an
elephant. It is explained in the book (Example 5.14, Page 295) that
P(Ek) = λE / (λD + λE) = 2/10 = 1/5,
and the events E1, E2, . . . are independent. Regarding the arrival of an
elephant as a “success”, this forms a Bernoulli process with parameter
p = 1/5. Clearly, observing the 3rd elephant before the 9th deer is
equivalent to the 3rd elephant arriving before time 12 (i.e., at or before
the 11th trial) in the above Bernoulli process. Let T3 denote the 3rd
elephant arrival time in the Bernoulli process.
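In terms of T3, the desired probability can be written out as follows (the
numerical value is approximate):

\[
P(T_3 \le 11) = \sum_{k=3}^{11} \binom{k-1}{2}\left(\frac{1}{5}\right)^{3}\left(\frac{4}{5}\right)^{k-3}
= 1 - \sum_{j=0}^{2} \binom{11}{j}\left(\frac{1}{5}\right)^{j}\left(\frac{4}{5}\right)^{11-j} \approx 0.38 .
\]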
(c)
Since the deer arrival process is independent of the elephant arrival
process, the event that “24 deer arrive by the end of 3 hours” is
independent of any event defined on the elephant arrival process. Let
T1 denote the first elephant arrival time. Then, by the memoryless
property of the Poisson process, the desired probability can be
expressed as
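\[
P(T_1 \le 4 \mid T_1 > 3) = P(T_1 \le 1) = 1 - e^{-\lambda_E \cdot 1} = 1 - e^{-2} \approx 0.865 ,
\]

where T1 is measured in hours from the start of watching; the numerical
value is a worked evaluation.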
(d) 4
As in part (b), regarding the arrival of an elephant as a “success”, the
arrivals form a Bernoulli process with parameter p = 1/5. Owing to the
memoryless property, given that there was no “success” in the first 53
trials (we have seen 53 deer but no elephant), the number of trials, X1, up
to and including the first “success” is still a geometric random variable
with parameter p = 1/5. Since the number of “failures” (deer) before the
first “success” (elephant) is X1 − 1, then
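the expected number of additional deer is

\[
E[X_1 - 1] = \frac{1}{p} - 1 = 5 - 1 = 4 .
\]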

(e) 21/40 of an hour
Let TD (resp. TE) denote the first arrival time of the deer process (resp.
elephant process). The time S until you have clicked a picture of both a
deer and an elephant is equal to max{TD, TE}. We split S into two parts,
S = S1 + S2,
where S1 = min{TD, TE} is the first arrival time of an animal and S2 =
max{TD, TE} − min{TD, TE} is the additional time until both kinds of
animals have registered an arrival. Since the animal arrival process is
Poisson with rate 10, E[S1] = 1/10.

Concerning S2, there are two cases.

(1) The first arrival is a deer, which happens with probability λD/(λD + λE).
Then we wait for an elephant arrival, which takes 1/λE time on average.

(2) The first arrival is an elephant, which happens with probability
λE/(λD + λE). Then we wait for a deer arrival, which takes 1/λD time on
average.
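Putting the pieces together:

\[
E[S] = E[S_1] + E[S_2] = \frac{1}{10} + \frac{8}{10}\cdot\frac{1}{2} + \frac{2}{10}\cdot\frac{1}{8} = 0.1 + 0.4 + 0.025 = 0.525 \text{ hours} = \frac{21}{40} \text{ of an hour (31.5 minutes)}.
\]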
(f)
X denotes the time until the third successful picture of an elephant. Since
the deer arrivals are independent of the elephant arrivals and X only
depends on the elephant arrivals, we can focus on the elephant arrival
process. We split the elephant arrival process into two processes, one
composed of the successful elephant pictures (we call this process the
“successful elephant picture arrival process”) and the other composed of
unsuccessful elephant pictures (the “unsuccessful elephant picture arrival
process”). Since the quality of each picture is independent of everything
else, the successful elephant picture arrival process is a Poisson process
with parameter λSE = 0.9 · λE = 1.8. Therefore, X is the 3rd arrival time of
the successful elephant picture arrival process. Thus X has the following
Erlang distribution:
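\[
f_X(x) = \frac{\lambda_{SE}^{3}\, x^{2}\, e^{-\lambda_{SE} x}}{2!} = \frac{(1.8)^{3}\, x^{2}\, e^{-1.8 x}}{2}, \qquad x \ge 0 ,
\]

i.e., the Erlang PDF of order 3 with rate λSE = 1.8.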

Problem 3
(a) The performance of the MIT football team is described by a Markov
chain, where the state is taken to be
(result of the game before the last game, result of the last game).

There are four possible states: {WW, WL, LW, LL}. The corresponding
transition probability graph is shown in the figure. We use the following
labels for the four states throughout this problem: 1 = WW, 2 = WL,
3 = LW, 4 = LL.
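Consistent with the probabilities given in the problem statement, the
corresponding transition matrix (rows indexed by the current state) is

\[
P = \begin{pmatrix} 0.7 & 0.3 & 0 & 0 \\ 0 & 0 & 0.4 & 0.6 \\ 0.6 & 0.4 & 0 & 0 \\ 0 & 0 & 0.3 & 0.7 \end{pmatrix}.
\]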
(b) 0.6 When the winning streak of the team gets interrupted, the chain
is in state 2. The probability of losing the next game is 0.6.

An alternative, computation-intensive method: Denote the starting time
to be n = 1 and let Xn denote the result of the nth game. Since the team
starts having won its previous two games, we set X0 = X−1 = W. Then,

P[the first future loss is followed by another loss | X0 = X−1 = W]
= Σ_{n=1}^{+∞} P[the first future loss occurs at time n and another loss
occurs at time n + 1 | X0 = X−1 = W].
(c)
Clearly, X, the number of games played before the first loss, takes values
in {0, 1, . . .}. Using the same notation as in (b), the PMF of X can be
calculated as follows.
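Every game before the first loss is played from state 1 (WW), where the
probability of winning is 0.7, so the PMF works out to

\[
p_X(k) = (0.7)^{k}(0.3), \qquad k = 0, 1, 2, \ldots
\]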
(d)
The Markov chain consists of a single aperiodic recurrent class, so the
steady-state distribution exists; denote it by π = (π1, π2, π3, π4). Solving
the equilibrium system πP = π together with Σ_{i=1}^{4} πi = 1, we get the
desired result.
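Solving that system with the transition matrix above gives (a worked
sketch)

\[
\pi_1 = \tfrac{1}{3}, \quad \pi_2 = \tfrac{1}{6}, \quad \pi_3 = \tfrac{1}{6}, \quad \pi_4 = \tfrac{1}{3} .
\]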

(e) 1/2
Recall that Xn denotes the outcome of the nth game. The desired
probability is
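P(X1000 = W | X1000 = X1001). Assuming the chain has reached steady
state by game 1000 (the approximation the problem invites), a worked
sketch gives

\[
\frac{P(\text{games 1000 and 1001 are both won})}{P(\text{both won}) + P(\text{both lost})} = \frac{1/3}{1/3 + 1/3} = \frac{1}{2},
\]

since, by the symmetry of this chain under exchanging wins and losses,
P(both won) = P(both lost) = 1/3.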

(f)
T is the first passage time from state 1 to state 4. Let µ1, µ2, µ3, µ4 be
the average first passage times from each state to state 4. We have that
E[T] = µ1. Since we are concerned with the first passage time to state 4,
the Markov chain’s behavior out of state 4 is irrelevant. Therefore, we
focus on the modified Markov chain in which state 4 is converted to
an absorbing state. The average first passage time to state 4 in the
original Markov chain is equal to the average absorption time to state 4
in the modified Markov chain. The required linear equation system is
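as follows (a sketch, using the transition probabilities above):

\[
\mu_4 = 0, \qquad
\mu_1 = 1 + 0.7\mu_1 + 0.3\mu_2, \qquad
\mu_2 = 1 + 0.4\mu_3 + 0.6\mu_4, \qquad
\mu_3 = 1 + 0.6\mu_1 + 0.4\mu_2,
\]

with E[T] = µ1.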
(g)
Three losses in a row can only be reached from state LL. Following this
observation, we create a fifth state LLL (labeled 5) and construct the
corresponding Markov transition graph. The probability transition matrix
of this Markov chain is
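as follows (filled in from the transition probabilities stated in the problem,
with state 5 = LLL absorbing):

\[
P = \begin{pmatrix} 0.7 & 0.3 & 0 & 0 & 0 \\ 0 & 0 & 0.4 & 0.6 & 0 \\ 0.6 & 0.4 & 0 & 0 & 0 \\ 0 & 0 & 0.3 & 0 & 0.7 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}.
\]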

The number of games played by the MIT football team corresponds to the
absorption time from state 1 to state 5. Denote the average absorption
times to state 5 by t1, t2, t3, t4, t5. The linear equation system is as
follows.
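A sketch, using the transition probabilities above:

\[
t_5 = 0, \qquad
t_1 = 1 + 0.7t_1 + 0.3t_2, \qquad
t_2 = 1 + 0.4t_3 + 0.6t_4, \qquad
t_3 = 1 + 0.6t_1 + 0.4t_2, \qquad
t_4 = 1 + 0.3t_3 + 0.7t_5 ,
\]

with E[N] = t1.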
