
6.041/6.431: Wednesday, April 16, 7:30 - 9:30 PM.

SOLUTIONS

Score breakdown (out of 100):

  Question 0 (all): 3
  Question 1 (all): 36
  Question 2: a 4, b 5, c 5, d 8, e 5, f 6
  Question 3: a 4, b 6, c 6, d 6, e 6
  Total: 100

TA:

Write your solutions in this quiz packet; only solutions in the quiz packet will be graded. Question 1, multiple choice, will receive no partial credit. Partial credit for Questions 2 and 3 will be awarded. You are allowed two two-sided 8.5 x 11 formula sheets plus a calculator. You have 120 minutes to complete the quiz. Be neat! You will not get credit if we can't read it. We will send out an email with more information on how to obtain your quiz before drop date. Good Luck!

Department of Electrical Engineering & Computer Science 6.041/6.431: Probabilistic Systems Analysis (Spring 2008)

Question 1

Multiple choice questions. CLEARLY circle the best answer for each question below. Each question is worth 4 points, with no partial credit given.

a. Let X1, X2, and X3 be independent random variables with the continuous uniform distribution over [0, 1]. Then P(X1 < X2 < X3) =

(i) 1/6 (ii) 1/3 (iii) 1/2 (iv) 1/4

Solution: (i). To understand the principle, first consider a simpler problem with X1 and X2 as given above. Note that P(X1 < X2) + P(X2 < X1) + P(X1 = X2) = 1, since the corresponding events are disjoint and exhaust all the possibilities. But P(X1 < X2) = P(X2 < X1) by symmetry, and P(X1 = X2) = 0 since the random variables are continuous. Therefore, P(X1 < X2) = 1/2. Analogously, omitting the events with zero probability but making sure to exhaust all other possibilities, we have

P(X1 < X2 < X3) + P(X1 < X3 < X2) + P(X2 < X1 < X3) + P(X2 < X3 < X1) + P(X3 < X1 < X2) + P(X3 < X2 < X1) = 1,

and, by symmetry, all six probabilities are equal. Thus, P(X1 < X2 < X3) = 1/6.

b. Let X and Y be two continuous random variables. Then

(i) E[XY] = E[X]E[Y] (ii) E[X^2 + Y^2] = E[X^2] + E[Y^2] (iii) f_{X+Y}(x + y) = f_X(x) f_Y(y) (iv) var(X + Y) = var(X) + var(Y)

Solution: (ii). Since X^2 and Y^2 are random variables, the result follows by the linearity of expectation.

c. Suppose X is uniformly distributed over [0, 4] and Y is uniformly distributed over [0, 1]. Assume X and Y are independent. Let Z = X + Y. Then

(i) f_Z(4.5) = 0 (ii) f_Z(4.5) = 1/8 (iii) f_Z(4.5) = 1/4 (iv) f_Z(4.5) = 1/2

Solution: (ii). Since X and Y are independent, the result follows by convolution:

f_Z(4.5) = ∫ f_X(α) f_Y(4.5 - α) dα = ∫_3.5^4 (1/4) dα = 1/8.
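Parts (a) and (c) lend themselves to a quick Monte Carlo sanity check. The sketch below (illustrative Python, not part of the original solutions) estimates the ordering probability directly and estimates f_Z(4.5) as the probability of a small window around 4.5 divided by its width:

```python
import random

random.seed(0)

# Part (a): for three iid Uniform[0,1] draws, every ordering is equally
# likely, so the fraction of trials with X1 < X2 < X3 should approach 1/6.
# Part (c): Z = X + Y with X ~ Uniform[0,4], Y ~ Uniform[0,1]; the density
# f_Z(4.5) is estimated as P(4.4 < Z < 4.6) / 0.2 and should approach 1/8.
n_trials = 200_000
order_hits = 0
z_hits = 0
for _ in range(n_trials):
    x1, x2, x3 = random.random(), random.random(), random.random()
    if x1 < x2 < x3:
        order_hits += 1
    z = 4.0 * random.random() + random.random()
    if 4.4 < z < 4.6:
        z_hits += 1

estimate = order_hits / n_trials
density_est = (z_hits / n_trials) / 0.2
print(estimate, density_est)  # near 1/6 and 1/8
```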


d. For the random variables defined in part (c), P(max(X, Y) > 3) is equal to

(i) 0 (ii) 9/4 (iii) 3/4 (iv) 1/4

Solution: (iv). Note that P(max(X, Y) > 3) = 1 - P(max(X, Y) ≤ 3) = 1 - P({X ≤ 3} ∩ {Y ≤ 3}). But X and Y are independent, so P({X ≤ 3} ∩ {Y ≤ 3}) = P(X ≤ 3)P(Y ≤ 3). Computing the probabilities, we have P(X ≤ 3) = 3/4 and P(Y ≤ 3) = 1. Thus, P(max(X, Y) > 3) = 1 - 3/4 = 1/4.

e. Recall the hat problem from lecture: N people put their hats in a closet at the start of a party, where each hat is uniquely identified. At the end of the party each person randomly selects a hat from the closet. Suppose N is a Poisson random variable with parameter λ. If X is the number of people who pick their own hats, then E[X] is equal to

(i) λ (ii) 1/2 (iii) 1/λ (iv) 1

Solution: (iv). Let X = X1 + ... + XN, where Xi is the indicator such that Xi = 1 if the ith person picks their own hat and Xi = 0 otherwise. By the linearity of expectation, E[X | N = n] = E[X1 | N = n] + ... + E[Xn | N = n]. But E[Xi | N = n] = 1/n for all i = 1, ..., n. Thus, E[X | N = n] = n · (1/n) = 1. Finally, E[X] = E[E[X | N]] = 1.

f. Suppose X and Y are Poisson random variables with parameters λ1 and λ2 respectively, where X and Y are independent. Define W = X + Y. Then

(i) W is Poisson with parameter min(λ1, λ2) (ii) W is Poisson with parameter λ1 + λ2 (iii) W may not be Poisson but has mean equal to min(λ1, λ2) (iv) W may not be Poisson but has mean equal to λ1 + λ2

Solution: (ii). The quickest way to obtain the answer is through transforms: M_X(s) = e^{λ1(e^s - 1)} and M_Y(s) = e^{λ2(e^s - 1)}. Since X and Y are independent,

M_W(s) = e^{λ1(e^s - 1)} e^{λ2(e^s - 1)} = e^{(λ1 + λ2)(e^s - 1)},

which is the transform of a Poisson random variable with mean λ1 + λ2.
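The E[X] = 1 claim in part (e) can be checked numerically. The sketch below samples N with a simple Poisson generator (Knuth's multiplication method), simulates the random hat assignment as a permutation, and averages the number of matches; the rate lam = 3.0 is an arbitrary choice for the check:

```python
import math
import random

random.seed(1)

def sample_poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}.
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# N ~ Poisson(lam) people pick hats uniformly at random (a random permutation);
# X = number of people who get their own hat.  The claim: E[X] = 1 for any lam.
lam = 3.0
n_trials = 100_000
total_matches = 0
for _ in range(n_trials):
    n = sample_poisson(lam)
    hats = list(range(n))
    random.shuffle(hats)
    total_matches += sum(1 for i, h in enumerate(hats) if i == h)

mean_matches = total_matches / n_trials
print(mean_matches)  # near 1
```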


g. Let X be a random variable whose transform is given by M_X(s) = (0.4 + 0.6 e^s)^50. Then

(i) P(X = 0) = P(X = 50) (ii) P(X = 51) > 0 (iii) P(X = 0) = (0.4)^50 (iv) P(X = 50) = 0.6

Solution: (iii). Note that M_X(s) is the transform of a binomial random variable X with n = 50 trials and probability of success p = 0.6. Thus, P(X = 0) = (0.4)^50.

h. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = x/8 for 0 ≤ x ≤ 4. Let S = (1/100)(X1 + ... + X100). Then P(S > 3) is approximately equal to

(i) 1 - Φ(5) (ii) Φ(5) (iii) 1 - Φ(5/√2) (iv) Φ(5/√2)

Solution: (iii). Write S = Y1 + ... + Y100, where Yi is the random variable given by Yi = Xi/100. Since the Yi are iid, the distribution of S is approximately normal with mean E[S] and variance var(S), so

P(S > 3) = 1 - P(S ≤ 3) ≈ 1 - Φ((3 - E[S]) / √var(S)).

Now,

E[S] = (1/100)(E[X1] + ... + E[X100]) = E[X1] = 8/3,

since E[Xi] = ∫_0^4 (x^2/8) dx = 8/3, and

var(S) = (1/100^2)(var(X1) + ... + var(X100)) = (1/100) var(X1) = (1/100)(8/9),

since E[Xi^2] = ∫_0^4 (x^3/8) dx = 8 and var(Xi) = 8 - (8/3)^2 = 8/9. Therefore,

P(S > 3) ≈ 1 - Φ((3 - 8/3) / √((8/9)/100)) = 1 - Φ(5/√2).
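Part (h) can be verified by simulation. The density f_X(x) = x/8 has CDF x^2/16, so inverse-CDF sampling gives X = 4√U for U ~ Uniform[0,1]; the sketch below (illustrative, with a modest trial count since the tail event is rare) compares the empirical tail with the CLT value:

```python
import math
import random

random.seed(2)

def phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# CLT approximation from the solution: P(S > 3) ~ 1 - Phi(5/sqrt(2)) ~ 2e-4.
clt_tail = 1.0 - phi(5.0 / math.sqrt(2.0))

# Monte Carlo: S is the average of 100 draws of X = 4*sqrt(U).
n_trials = 50_000
exceed = 0
for _ in range(n_trials):
    s = sum(4.0 * math.sqrt(random.random()) for _ in range(100)) / 100.0
    if s > 3.0:
        exceed += 1
mc_tail = exceed / n_trials
print(mc_tail, clt_tail)
```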


i. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = 1, 0 ≤ x ≤ 1. Define Yn = X1 X2 X3 ... Xn, for some integer n. Then var(Yn) is equal to

(i) n/12 (ii) (1/3)^n - (1/4)^n (iii) (1/12)^n (iv) 1/12

Solution: (ii). Since X1, ..., Xn are independent, we have E[Yn] = E[X1] ... E[Xn]. Similarly, E[Yn^2] = E[X1^2] ... E[Xn^2]. Since E[Xi] = 1/2 and E[Xi^2] = 1/3 for i = 1, ..., n, it follows that

var(Yn) = E[Yn^2] - (E[Yn])^2 = (1/3)^n - (1/4)^n.
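The product-of-uniforms variance in part (i) is easy to confirm empirically. A short sketch (illustrative Python; n = 5 is an arbitrary choice for the check):

```python
import random

random.seed(3)

# Y_n = X_1 * ... * X_n with iid Uniform[0,1] factors.
# Claim: E[Y_n] = (1/2)^n and var(Y_n) = (1/3)^n - (1/4)^n.
n = 5
n_trials = 400_000
s1 = s2 = 0.0
for _ in range(n_trials):
    y = 1.0
    for _ in range(n):
        y *= random.random()
    s1 += y
    s2 += y * y

mean_y = s1 / n_trials
var_y = s2 / n_trials - mean_y * mean_y
exact_var = (1.0 / 3.0) ** n - (1.0 / 4.0) ** n
print(mean_y, var_y, exact_var)
```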

Question 2

Each Mac book has a lifetime that is exponentially distributed with parameter λ. The lifetimes of the Mac books are independent of each other. Suppose you have two Mac books, which you begin using at the same time. Define T1 as the time of the first laptop failure and T2 as the time of the second laptop failure.

a. Compute f_T1(t1).

Solution: Let M1 be the lifetime of Mac book 1 and M2 the lifetime of Mac book 2, where M1 and M2 are iid exponential random variables with CDF F_M(m) = 1 - e^{-λm}. T1, the time of the first Mac book failure, is the minimum of M1 and M2. To derive the distribution of T1, we first find the CDF F_T1(t), and then differentiate to find the PDF f_T1(t). For t ≥ 0,

F_T1(t) = P(min(M1, M2) ≤ t) = 1 - P(min(M1, M2) > t) = 1 - P(M1 > t)P(M2 > t) = 1 - (1 - F_M(t))^2 = 1 - e^{-2λt}.

Differentiating F_T1(t) with respect to t yields

f_T1(t) = 2λ e^{-2λt}, t ≥ 0.
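The minimum-of-exponentials result says T1 is exponential with rate 2λ, so E[T1] = 1/(2λ). A quick check (illustrative Python; the rate lam = 2.0 is an arbitrary choice):

```python
import random

random.seed(4)

# T1 = min(M1, M2) for iid Exponential(lam) lifetimes should be
# Exponential(2*lam), hence have mean 1/(2*lam).
lam = 2.0
n_trials = 200_000
total = 0.0
for _ in range(n_trials):
    m1 = random.expovariate(lam)
    m2 = random.expovariate(lam)
    total += min(m1, m2)

mean_t1 = total / n_trials
print(mean_t1)  # near 1/(2*lam) = 0.25
```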


b. Let X = T2 - T1. Compute f_{X|T1}(x|t1).

Solution: Conditioned on the time of the first Mac book failure, the time until the other Mac book fails is an exponential random variable by the memoryless property. The memoryless property tells us that regardless of the elapsed lifetime of the Mac book, the time until failure has the same exponential CDF. Consequently,

f_{X|T1}(x|t1) = λ e^{-λx}, x ≥ 0.

c. Is X independent of T1? Give a mathematical justification for your answer.

Solution: Since we have shown in 2(b) that f_{X|T1}(x|t1) does not depend on t1, X and T1 are independent.

d. Compute f_T2(t2) and E[T2].

Solution: The time of the second laptop failure T2 is equal to T1 + X. Since X and T1 were shown to be independent in 2(c), we convolve the densities found in 2(a) and 2(b) to determine f_T2(t). For t ≥ 0,

f_T2(t) = ∫_0^t f_T1(τ) f_X(t - τ) dτ
        = ∫_0^t 2λ e^{-2λτ} · λ e^{-λ(t - τ)} dτ
        = 2λ e^{-λt} ∫_0^t λ e^{-λτ} dτ
        = 2λ e^{-λt} (1 - e^{-λt}).

An equivalent method is to note that T2 is the maximum of M1 and M2, and to derive the distribution of T2 by our standard CDF-to-PDF method:

F_T2(t) = P(max(M1, M2) ≤ t) = P(M1 ≤ t)P(M2 ≤ t) = F_M(t)^2 = 1 - 2e^{-λt} + e^{-2λt}, t ≥ 0.

Differentiating F_T2(t) with respect to t yields

f_T2(t) = 2λ e^{-λt} - 2λ e^{-2λt}, t ≥ 0,

which is equivalent to our solution by convolution above. Finally, from the above density we obtain

E[T2] = 2/λ - 1/(2λ) = 3/(2λ),

which agrees with E[T2] = E[T1] + E[X] = 1/(2λ) + 1/λ = 3/(2λ).
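The E[T2] = 3/(2λ) conclusion in part (d) can be checked by simulating the maximum of two exponential lifetimes directly (illustrative Python; lam = 2.0 is an arbitrary rate chosen for the check):

```python
import random

random.seed(5)

# T2 = max(M1, M2) for iid Exponential(lam) lifetimes.
# Claim: E[T2] = 3/(2*lam).
lam = 2.0
n_trials = 200_000
total = 0.0
for _ in range(n_trials):
    total += max(random.expovariate(lam), random.expovariate(lam))

mean_t2 = total / n_trials
print(mean_t2)  # near 3/(2*lam) = 0.75
```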


e. Now suppose you have 100 Mac books, and let Y be the time of the first laptop failure. Find the best answer for P(Y < 0.01).

Solution: Y is equal to the minimum of 100 independent exponential random variables. Following the derivation in (a), we determine by analogy that Y is exponential with parameter 100λ:

f_Y(y) = 100λ e^{-100λy}, y ≥ 0.

Hence P(Y < 0.01) = 1 - e^{-100λ(0.01)} = 1 - e^{-λ}.

Your friend, Charlie, loves Mac books so much he buys S new Mac books every day! On any given day S is equally likely to be 4 or 8, and all days are independent of each other. Let S100 be the number of Mac books Charlie buys over the next 100 days.

f. (6 pts) Find the best approximation for P(S100 ≤ 608). Express your final answer in terms of Φ(·), the CDF of the standard normal.

Solution: E[S100] = 100 · 6 = 600 and var(S100) = 100 · var(S) = 100 · 4, since var(S) = (4^2 + 8^2)/2 - 6^2 = 4; the standard deviation of S100 is therefore 20. Using the De Moivre-Laplace approximation, and noting that the step size between values that S100 can take on is 4 (so the half-step correction is 2),

P(S100 ≤ 608) ≈ Φ((608 + 2 - 600)/20) = Φ(10/20) = Φ(0.5).
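The continuity-corrected normal approximation in part (f) can be compared against a direct simulation of S100 (illustrative Python):

```python
import math
import random

random.seed(6)

def phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Each day Charlie buys 4 or 8 Mac books with probability 1/2 each;
# S100 is the 100-day total.  Claim: P(S100 <= 608) ~ Phi(0.5) ~ 0.6915.
n_trials = 50_000
hits = 0
for _ in range(n_trials):
    s100 = sum(8 if random.random() < 0.5 else 4 for _ in range(100))
    if s100 <= 608:
        hits += 1

mc = hits / n_trials
approx = phi(0.5)
print(mc, approx)  # both near 0.69
```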

Question 3

Saif is a well-intentioned though slightly indecisive fellow. Every morning he flips a coin to decide where to go. If the coin comes up heads he drives to the mall; if it comes up tails he volunteers at the local shelter. Saif's coin is not necessarily fair; rather, it possesses a probability of heads equal to q. We do not know q, but we do know it is well-modeled by a random variable Q whose density is

f_Q(q) = 2q for 0 ≤ q ≤ 1, and 0 otherwise.

Assume that, conditioned on Q, the coin flips are independent. Note: parts a, b, c, and {d, e} may be answered independently of each other.


a. (4 pts) What's the probability that Saif goes to the local shelter if he flips the coin once?

Solution: Let Xi be the outcome of the coin toss on the ith trial, where Xi = 1 if the coin lands heads and Xi = 0 if the coin lands tails. By the total probability theorem,

P(X1 = 0) = ∫_0^1 P(X1 = 0 | Q = q) f_Q(q) dq = ∫_0^1 (1 - q) · 2q dq = 1/3.

In an attempt to promote virtuous behavior, Saif's father offers to pay him $4 every day he volunteers at the local shelter. Define X as Saif's payout if he flips the coin every morning for the next 30 days.

b. Find var(X).

Solution: Let Yi be a Bernoulli random variable describing the outcome of the coin tossed on morning i: Yi = 1 corresponds to the event that on morning i Saif goes to the local shelter, and Yi = 0 to the event that he goes to the mall. Conditioned on Q = q, we have P(Yi = 1) = 1 - q and P(Yi = 0) = q for i = 1, ..., 30. Saif's payout for the next 30 days is described by the random variable X = 4(Y1 + Y2 + ... + Y30). By the law of total variance,

var(X) = 16 var(Y1 + ... + Y30) = 16 [var(E[Y1 + ... + Y30 | Q]) + E[var(Y1 + ... + Y30 | Q)]].

Now note that, conditioned on Q = q, the Yi are independent, so var(Y1 + ... + Y30 | Q) = var(Y1 | Q) + ... + var(Y30 | Q). Since E[Y1 + ... + Y30 | Q] = 30(1 - Q) and var(Yi | Q) = Q(1 - Q),

var(X) = 16 var(30(1 - Q)) + 16 E[var(Y1 | Q) + ... + var(Y30 | Q)]
       = 16 · 30^2 var(Q) + 16 · 30 E[Q(1 - Q)]
       = 16 · 30^2 (E[Q^2] - (E[Q])^2) + 16 · 30 (E[Q] - E[Q^2])
       = 16 · 30^2 (1/2 - 4/9) + 16 · 30 (2/3 - 1/2)
       = 880,

since E[Q] = ∫_0^1 2q^2 dq = 2/3 and E[Q^2] = ∫_0^1 2q^3 dq = 1/2.
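The var(X) = 880 answer can be reproduced by simulation. Since Q has CDF q^2 on [0,1], inverse-CDF sampling gives Q = √U; each morning is then an independent Bernoulli(1 - Q) shelter visit worth $4 (illustrative Python sketch):

```python
import math
import random

random.seed(7)

# Sample Q = sqrt(U) (density 2q on [0,1]), then 30 mornings where the
# shelter happens with probability 1 - Q; X = 4 dollars per shelter day.
# Claim: E[X] = 40 and var(X) = 880.
n_trials = 100_000
s1 = s2 = 0.0
for _ in range(n_trials):
    q = math.sqrt(random.random())
    shelter_days = sum(1 for _ in range(30) if random.random() > q)
    x = 4 * shelter_days
    s1 += x
    s2 += x * x

mean_x = s1 / n_trials
var_x = s2 / n_trials - mean_x * mean_x
print(mean_x, var_x)  # near 40 and 880
```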


Let event B be that Saif goes to the local shelter at least once in k days.

c. Find the conditional density of Q given B, f_{Q|B}(q).

Solution: Given Q = q, the probability that Saif goes to the shelter at least once in k days is P(B | Q = q) = 1 - q^k. By Bayes' rule, for 0 ≤ q ≤ 1,

f_{Q|B}(q) = P(B | Q = q) f_Q(q) / ∫_0^1 P(B | Q = q) f_Q(q) dq
           = (1 - q^k) 2q / ∫_0^1 (1 - q^k) 2q dq
           = 2q(1 - q^k) / (1 - 2/(k + 2)).

While shopping at the mall, Saif gets a call from his sister Mais. They agree to meet at the Coco Cabana courtyard at exactly 1:30 PM. Unfortunately Mais arrives Z minutes late, where Z is a continuous uniform random variable from zero to 10 minutes. Saif is furious that Mais has kept him waiting, and demands Mais pay him R dollars, where R = exp(Z + 2).

e. Find Saif's expected payout, E[R].

Solution:

E[R] = ∫_0^10 e^{z+2} f_Z(z) dz = (e^2 / 10) ∫_0^10 e^z dz = (e^{12} - e^2) / 10.

The distribution of R can also be found explicitly. For e^2 ≤ r ≤ e^{12},

F_R(r) = P(e^{Z+2} ≤ r) = P(Z ≤ ln(r) - 2) = (1/10) ∫_0^{ln(r)-2} dz = (ln(r) - 2) / 10,

and differentiating gives

f_R(r) = 1/(10r), e^2 ≤ r ≤ e^{12}.
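The closed form E[R] = (e^12 - e^2)/10 is easy to confirm against a direct average of exp(Z + 2) over uniform draws of Z (illustrative Python sketch):

```python
import math
import random

random.seed(8)

# Z ~ Uniform[0, 10], R = exp(Z + 2).  Claim: E[R] = (e^12 - e^2)/10.
n_trials = 500_000
total = 0.0
for _ in range(n_trials):
    total += math.exp(random.uniform(0.0, 10.0) + 2.0)

mc = total / n_trials
exact = (math.exp(12.0) - math.exp(2.0)) / 10.0
print(mc, exact)  # exact is about 16274.74
```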


Fall 2010

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
