
MTL 106 (Introduction to Probability and Stochastic Processes)

1. Let X(t) = A0 + A1 t + A2 t^2, where the Ai's are uncorrelated random variables with mean 0 and variance 1. Find the mean function and covariance function of X(t).
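A quick Monte Carlo sketch of what the answer should look like. Uncorrelatedness gives mean function m(t) = 0 and covariance C(s, t) = 1 + st + (st)^2; the standard normal choice for the Ai below is an assumption made only for simulation (the problem requires just "uncorrelated, mean 0, variance 1"):

```python
import random

def cov_X(s, t, n=200_000, seed=0):
    # Monte Carlo estimate of E[X(s) X(t)] for X(t) = A0 + A1*t + A2*t**2.
    # Standard normal A_i are an assumption made only for the simulation.
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        a0, a1, a2 = rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1)
        acc += (a0 + a1 * s + a2 * s * s) * (a0 + a1 * t + a2 * t * t)
    return acc / n  # the mean function is 0, so this estimates Cov(X(s), X(t))

# Uncorrelatedness gives m(t) = 0 and C(s, t) = 1 + s*t + (s*t)**2
```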

2. In a communication system, the carrier signal at the receiver is modeled by Y (t) = X(t) cos(2πwt + Θ)

where {X(t), t ≥ 0} is a zero-mean, wide sense stationary process, Θ is a uniformly distributed random

variable on the interval (−π, π), and w is a positive constant. Assume that Θ is independent of the process

{X(t), t ≥ 0}. Is {Y (t), t ≥ 0} wide sense stationary? Justify your answer.

3. Consider the random telegraph signal X(t), which jumps between two states, 0 and 1, according to the following rules. At time t = 0 the signal starts in either state with equal probability, i.e., P (X(0) = 0) = P (X(0) = 1) = 1/2, and let the switching times be governed by a Poisson process {Y (t), t ≥ 0} with parameter λ, independent of X(0). At time t, the signal is

    X(t) = (1 − (−1)^(X(0)+Y(t))) / 2,   t > 0.

Is {X(t), t ≥ 0} covariance/wide sense stationary?
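A Monte Carlo check of what the question asks for. The closed form in the final comment is the standard telegraph-signal autocovariance; Knuth's Poisson sampler is used only for simulation:

```python
import math
import random

def poisson(rng, mu):
    # Knuth's product method for a Poisson(mu) variate
    L, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def telegraph_cov(s, t, lam=1.0, n=200_000, seed=1):
    # Monte Carlo estimate of Cov(X(s), X(t)) for the telegraph signal
    # X(t) = (1 - (-1)**(X(0) + Y(t))) / 2, Y a Poisson process of rate lam
    # (requires s <= t; Y(t) = Y(s) + an independent Poisson increment).
    rng = random.Random(seed)
    sxy = sx = sy = 0.0
    for _ in range(n):
        x0 = rng.randint(0, 1)
        ys = poisson(rng, lam * s)
        yt = ys + poisson(rng, lam * (t - s))
        xs = (1 - (-1) ** (x0 + ys)) / 2
        xt = (1 - (-1) ** (x0 + yt)) / 2
        sxy += xs * xt; sx += xs; sy += xt
    return sxy / n - (sx / n) * (sy / n)

# The estimate should be close to exp(-2*lam*(t - s)) / 4, a function of
# t - s only, with constant mean 1/2 -- consistent with WSS.
```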


4. Let A be a positive random variable that is independent of a strictly stationary random process {X(t), t ≥ 0}.

Show that Y (t) = AX(t) is also a strictly stationary random process.


5. The owner of a local one-chair barber shop is thinking of expanding because there seem to be too many people waiting. Observations indicate that, in the time required to cut one person's hair, there may be 0, 1 or 2 arrivals with probability 0.3, 0.4 and 0.3 respectively. The shop has a fixed capacity of six people. Let X(t) be the number of people in the shop at any time t and let Xn = X(t_n^+) be the number of people in the shop just after the time instant of completion of the nth person's haircut. Prove that {Xn, n = 1, 2, . . .} is a Markov chain, assuming i.i.d. arrivals. Find its one-step transition probability matrix.
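One possible construction of the one-step transition matrix, as a sketch. How the empty shop is handled is an assumption here: the server simply waits for the next arrival, so row 0 behaves like row 1. Arrivals that would push the shop past six people are turned away:

```python
# States i = 0..5: number left in the shop just after a haircut ends
# (capacity 6, so at most 5 remain after a departure). During one haircut
# 0, 1 or 2 customers arrive with probabilities 0.3, 0.4, 0.3.
ARRIVALS = {0: 0.3, 1: 0.4, 2: 0.3}
CAPACITY = 6

def transition_matrix():
    P = [[0.0] * 6 for _ in range(6)]
    for i in range(6):
        start = max(i, 1)  # an empty shop waits for one arrival before cutting
        for a, pa in ARRIVALS.items():
            j = min(start + a, CAPACITY) - 1  # cap at capacity, one departs
            P[i][j] += pa
    return P

P = transition_matrix()
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)  # rows are distributions
```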

6. Let X0 be an integer-valued random variable with P (X0 = 0) = 1 that is independent of the i.i.d. sequence Z1, Z2, . . ., where P (Zn = 1) = p, P (Zn = −1) = q, and P (Zn = 0) = 1 − (p + q). Let Xn = max(0, Xn−1 + Zn), n = 1, 2, . . .. Prove that {Xn, n = 0, 1, . . .} is a discrete time Markov chain. Write the one-step transition probability matrix or draw the state transition diagram for this Markov chain.
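The transition probabilities of this reflected walk follow directly from the definition; a sketch with illustrative values p = 0.3, q = 0.5 (the problem keeps them general):

```python
# Transition probabilities of the reflected walk X_n = max(0, X_{n-1} + Z_n);
# p = 0.3, q = 0.5 are illustrative values only.
p, q = 0.3, 0.5

def step_prob(i, j):
    # P(X_n = j | X_{n-1} = i)
    if i == 0:
        # Z = -1 is reflected back to 0, so P(0 -> 0) = q + (1 - p - q) = 1 - p
        return {0: 1 - p, 1: p}.get(j, 0.0)
    return {i - 1: q, i: 1 - (p + q), i + 1: p}.get(j, 0.0)

# each row is a probability distribution
assert all(abs(sum(step_prob(i, j) for j in range(i + 2)) - 1.0) < 1e-12
           for i in range(10))
```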

7. Suppose that a machine can be in one of two states on any day: 0 = working and 1 = out of order. The probability that the machine is working on a particular day depends on its state during the two previous days. Specifically, assume that P (X(n + 1) = 0 | X(n − 1) = j, X(n) = k) = qjk, j, k = 0, 1, where X(n) is the state of the machine on day n.

(b) Define a new state space for the problem by taking the pairs (j, k), where j and k are 0 or 1. We say that the machine is in state (j, k) on day n if the machine is in state j on day (n − 1) and in state k on day n. Show that with this changed state space the system is a discrete time Markov chain.

(c) Suppose the machine was working on Monday and Tuesday. What is the probability that it will be

working on Thursday?

8. The transition probability matrix of a discrete time Markov chain {Xn, n = 0, 1, . . .} having three states 1, 2 and 3 is

        | 0.3  0.4  0.3 |
    P = | 0.6  0.2  0.2 |
        | 0.5  0.4  0.1 |

and the initial distribution is π = (0.7, 0.2, 0.1).

(a) Compute P (X2 = 3). (b) Compute P (X3 = 2, X2 = 3, X1 = 3, X0 = 2).
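Both parts reduce to direct computation with the given P and π; a sketch in plain Python:

```python
# Numerical check for parts (a) and (b), using the matrix and initial
# distribution given above (states 1, 2, 3 are indices 0, 1, 2).
P = [[0.3, 0.4, 0.3],
     [0.6, 0.2, 0.2],
     [0.5, 0.4, 0.1]]
pi0 = [0.7, 0.2, 0.1]  # distribution of X0

def propagate(dist, P):
    # one step: dist -> dist . P
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P[0]))]

# (a) P(X2 = 3): push pi0 through two steps and read off state 3 (index 2)
pi2 = propagate(propagate(pi0, P), P)
print(round(pi2[2], 4))   # -> 0.212

# (b) P(X3=2, X2=3, X1=3, X0=2) = pi0(2) * P(2,3) * P(3,3) * P(3,2)
path = pi0[1] * P[1][2] * P[2][2] * P[2][1]
print(round(path, 4))     # -> 0.0016
```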


9. Consider a time-homogeneous discrete time Markov chain {Xn, n = 0, 1, . . .} with state space S = {0, 1, 2, 3, 4} and one-step transition probability matrix

        |  1    0    0    0    0  |
        | 0.5   0   0.5   0    0  |
    P = |  0   0.5   0   0.5   0  |
        |  0    0   0.5   0   0.5 |
        |  0    0    0    0    1  |

(a) Classify the states of the chain as transient, positive recurrent or null recurrent.

(b) When P (X0 = 2) = 1, find the expected number of times the Markov chain visits state 1 before being absorbed.

(c) When P (X0 = 1) = 1, find the probability that the Markov chain is absorbed in state 0.
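One way to check answers for (b) and (c) numerically is to iterate the first-step equations to their fixed point (states 1, 2, 3 are transient; 0 and 4 absorb); the iteration count below is an arbitrary but more than sufficient choice:

```python
def solve(update, n_iter=10_000):
    # Fixed-point iteration over the transient states {1, 2, 3}
    v = {1: 0.0, 2: 0.0, 3: 0.0}
    for _ in range(n_iter):
        v = update(v)
    return v

# (b) expected visits to state 1 before absorption:
#     m(i) = 1{i=1} + sum_j Q[i][j] m(j), Q restricted to transient states
visits = solve(lambda m: {
    1: 1 + 0.5 * m[2],       # from 1: to 0 (stop) or to 2
    2: 0.5 * m[1] + 0.5 * m[3],
    3: 0.5 * m[2],           # from 3: to 2 or to 4 (stop)
})
print(round(visits[2], 6))   # expected visits starting from 2 -> 1.0

# (c) absorption probability at 0: h(i) = 0.5 h(i-1) + 0.5 h(i+1),
#     with boundary values h(0) = 1, h(4) = 0
hit = solve(lambda h: {
    1: 0.5 * 1.0 + 0.5 * h[2],
    2: 0.5 * h[1] + 0.5 * h[3],
    3: 0.5 * h[2] + 0.5 * 0.0,
})
print(round(hit[1], 6))      # starting from 1 -> 0.75
```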

10. Two gamblers, A and B, bet on successive independent tosses of a coin that lands heads up with probability p. If the coin turns up heads, gambler A wins a rupee from gambler B, and if the coin turns up tails, gambler B wins a rupee from gambler A. Thus the total number of rupees between the two gamblers stays fixed, say N. The game stops as soon as either gambler is ruined, i.e., is left with no money. Assume the initial fortune of gambler A is i. Let Xn be the amount of money gambler A has after the nth toss. If Xn = 0, then gambler A is ruined and the game stops. If Xn = N, then gambler B is ruined and the game stops. Otherwise the game continues. Prove that {Xn, n = 0, 1, . . .} is a discrete time Markov chain. Write the one-step transition probability matrix or draw the state transition diagram for this Markov chain.
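The transition structure can be written down directly; a sketch under illustrative values N = 5 and p = 0.4 (the problem keeps both general):

```python
# Gambler's-ruin transition matrix on states 0..N; 0 and N absorb.
def ruin_matrix(N, p):
    P = [[0.0] * (N + 1) for _ in range(N + 1)]
    P[0][0] = P[N][N] = 1.0          # ruin states absorb
    for i in range(1, N):
        P[i][i + 1] = p              # heads: A wins a rupee
        P[i][i - 1] = 1 - p          # tails: A loses a rupee
    return P

P = ruin_matrix(5, 0.4)              # N = 5, p = 0.4 are illustrative only
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)
```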


11. For j = 0, 1, . . ., let Pj,j+2 = vj and Pj,0 = 1 − vj define the transition probability matrix of a Markov chain. Discuss the character of the states of this chain.

12. Show that if a Markov chain is irreducible and Pii > 0 for some state i then the chain is aperiodic.

13. Let 0 be an absorbing state and, for j > 0, let Pj,j = p and Pj,j−1 = q, where p + q = 1. Find f_j0^(n), the probability that absorption takes place exactly at the nth step given that the initial state is j.

14. Consider a branching process, known as a Galton–Watson process, that models a population in which each individual in generation n produces some random number of individuals in generation n + 1 according, in the simplest case, to a fixed probability distribution that does not vary from individual to individual. That is, the first generation of individuals is the collection of offspring of a given individual; the next generation is formed by the offspring of these individuals. Let Xn denote the number of individuals of the nth generation, starting with X0 = 1 individual (the size of the zeroth generation). Let Yi (or Yi,n) be the number of offspring of the ith individual of the nth generation. Suppose that {Yi, i = 1, 2, . . .} are non-negative integer-valued i.i.d. random variables with probability mass function pj = P (Yi = j), j = 0, 1, . . ., independent of the size of the generation. Then

    Xn = Σ_{i=1}^{Xn−1} Yi ,   n = 1, 2, . . .

and {Xn, n = 0, 1, . . .} is a discrete time Markov chain. Classify the states of the chain.
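The recursion above is straightforward to simulate; a sketch under an assumed offspring distribution (p0, p1, p2) = (0.4, 0.3, 0.3), chosen only for illustration:

```python
import random

def simulate_generations(pmf, n_gen, seed=2):
    # One sample path of the Galton-Watson chain: X_n is the sum of X_{n-1}
    # i.i.d. offspring counts drawn from `pmf`, starting from X_0 = 1.
    rng = random.Random(seed)
    counts, probs = zip(*pmf.items())
    path = [1]
    for _ in range(n_gen):
        path.append(sum(rng.choices(counts, probs)[0] for _ in range(path[-1])))
        if path[-1] == 0:           # 0 is absorbing: an extinct line stays extinct
            path.extend([0] * (n_gen - len(path) + 1))
            break
    return path

# subcritical offspring distribution (mean 0.9), so extinction is certain
path = simulate_generations({0: 0.4, 1: 0.3, 2: 0.3}, n_gen=50)
```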

15. Consider a DTMC on the non-negative integers such that, starting from i, the chain goes to state i + 1 with

probability p, 0 < p < 1 and goes to state 0 with probability 1 − p. Show that this DTMC has a unique

steady state distribution π and then find π.
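A numerical sanity check that a candidate geometric distribution satisfies the balance equations of this chain, under an assumed p = 0.6 and with the infinite state space truncated at 200 states:

```python
# Candidate: pi_j = (1 - p) * p**j. Balance equations of the chain:
#   pi_0     = (1 - p) * sum_j pi_j   (every state feeds 0 with prob 1 - p)
#   pi_{j+1} = p * pi_j               (only j feeds j + 1, with prob p)
p = 0.6
pi = [(1 - p) * p**j for j in range(200)]   # truncated tail is ~ p**200

assert abs(pi[0] - (1 - p) * sum(pi)) < 1e-12
assert all(abs(pi[j + 1] - p * pi[j]) < 1e-15 for j in range(199))
```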

16. (a) Consider an aperiodic irreducible finite state space DTMC {Xn, n = 0, 1, . . .} with one-step transition probability matrix P = [Pij], i, j ∈ S, satisfying Σj Pij = Σi Pij = 1 (that is, P is doubly stochastic). Find the steady state distribution for this DTMC, if it exists.

(b) Consider the simple random walk on a circle. Assume that an odd number K of points, labeled 0, 1, . . . , K − 1, are arranged clockwise on a circle. From i, the walker moves to i + 1 (with K identified with 0) with probability p (0 < p < 1) and to i − 1 (with −1 identified with K − 1) with probability 1 − p. Find the steady state distribution for this random walk, if it exists.
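Since each column of the circle-walk matrix also sums to 1, part (a) applies; a numerical check under assumed values K = 5 and p = 0.3:

```python
# Circle-walk transition matrix; the uniform distribution should be stationary
# because the matrix is doubly stochastic. K = 5, p = 0.3 are illustrative.
K, p = 5, 0.3
P = [[0.0] * K for _ in range(K)]
for i in range(K):
    P[i][(i + 1) % K] = p       # clockwise step
    P[i][(i - 1) % K] = 1 - p   # counter-clockwise step

uniform = [1 / K] * K
after = [sum(uniform[i] * P[i][j] for i in range(K)) for j in range(K)]
assert all(abs(x - 1 / K) < 1e-12 for x in after)   # uniform . P == uniform
# double stochasticity: every column sums to 1 as well
assert all(abs(sum(P[i][j] for i in range(K)) - 1.0) < 1e-12 for j in range(K))
```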
