
Definition

A Markov chain is usually a discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system, or on the immediately preceding state, and not on the path by which the present state was achieved.
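The random walk mentioned above can be sketched in a few lines of Python. This is a minimal illustration (not from the source): the next state is computed from the current state alone, which is exactly the Markov property.

```python
import random

def random_walk_step(state):
    """One step of a simple random walk: move +1 or -1 with equal
    probability. The next state depends only on the current state,
    not on the path taken to reach it."""
    return state + random.choice([-1, 1])

state = 0
path = [state]
for _ in range(10):
    state = random_walk_step(state)
    path.append(state)
```

Each call to `random_walk_step` receives only the current position; the history in `path` is recorded for inspection but plays no role in the dynamics.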

Application
Markov chains are applied in many different fields. Research has reported their application in a wide range of topics such as physics, chemistry, medicine, music, game theory and sports.

Mathematical biology
Markov chains also have many applications in biological modelling, particularly population processes, which are useful in modelling processes that are (at least) analogous to biological populations. The Leslie matrix is one such example, though some of its entries are not probabilities (they may be greater than 1). Another example is the modelling of cell shape in dividing sheets of epithelial cells.[19] Yet another example is the state of ion channels in cell membranes.
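An ion channel flipping between "closed" and "open" can be treated as a two-state Markov chain. The sketch below uses hypothetical transition probabilities chosen purely for illustration; propagating an initial distribution through the transition matrix shows it settling toward a stationary distribution.

```python
def step_distribution(dist, P):
    """Propagate a probability distribution one step through
    transition matrix P (each row of P sums to 1)."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical closed/open transition probabilities (illustrative only).
P = [[0.9, 0.1],   # closed -> closed, closed -> open
     [0.4, 0.6]]   # open   -> closed, open   -> open

dist = [1.0, 0.0]  # start in the closed state
for _ in range(100):
    dist = step_distribution(dist, P)
```

For these particular numbers the distribution converges to [0.8, 0.2], the stationary distribution satisfying pi = pi * P.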

Games
Markov chains can be used to model many games of chance. The children's games Snakes and Ladders and "Hi Ho! Cherry-O", for example, are represented exactly by Markov chains. At each turn, the player starts in a given state (on a given square) and from there has fixed odds of moving to certain other states (squares).
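A game of this kind can be simulated directly, since each turn depends only on the current square. The board below is a made-up 10-square toy (not the real Snakes and Ladders layout), with one ladder and one snake, to show the fixed per-state odds.

```python
import random

# Toy 10-square board (hypothetical layout, for illustration only):
# square 0 is the start, square 9 is the finish.
LADDERS = {2: 6}   # landing on square 2 climbs to square 6
SNAKES = {8: 3}    # landing on square 8 slides to square 3

def take_turn(square):
    """Roll a die and move; the outcome depends only on the
    current square, so the game is a Markov chain."""
    roll = random.randint(1, 6)
    nxt = min(square + roll, 9)   # overshooting stops at the finish
    nxt = LADDERS.get(nxt, nxt)
    nxt = SNAKES.get(nxt, nxt)
    return nxt

random.seed(1)  # fixed seed so the run is reproducible
square = 0
turns = 0
while square != 9:
    square = take_turn(square)
    turns += 1
```

The transition probabilities from each square never change during play, which is what makes the representation by a Markov chain exact.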

Finance and economics
Markov chains are used in finance and economics to model a variety of different phenomena, including asset prices and market crashes. The first financial model to use a Markov chain was from Prasad et al. in 1974.[14] Another was the regime-switching model of James D. Hamilton (1989), in which a Markov chain is used to model switches between periods of high volatility and low volatility of asset returns.[15] A more recent example is the Markov Switching Multifractal asset pricing model, which builds upon the convenience of earlier regime-switching models.[16] It uses an arbitrarily large Markov chain to drive the level of volatility of asset returns.

Definition
Often, the term "Markov chain" is used to mean a Markov process which has a discrete (finite or countable) state-space. Usually a Markov chain is defined for a discrete set of times (i.e., a discrete-time Markov chain),[1] although some authors use the same terminology where "time" can take continuous values.[2][3] The use of the term in Markov chain Monte Carlo methodology covers cases where the process is in discrete time (discrete algorithm steps) with a continuous state space. The following concentrates on the discrete-time, discrete-state-space case.

A discrete-time random process involves a system which is in a certain state at each step, with the state changing randomly between steps. The steps are often thought of as moments in time, but they can equally well refer to physical distance or any other discrete measurement; formally, the steps are the integers or natural numbers, and the random process is a mapping of these to states. The Markov property states that the conditional probability distribution for the system at the next step (and in fact at all future steps) depends only on the current state of the system, and not additionally on the state of the system at previous steps.

Since the system changes randomly, it is generally impossible to predict with certainty the state of a Markov chain at a given point in the future. However, the statistical properties of the system's future can be predicted, and in many applications it is these statistical properties that are important.

The changes of state of the system are called transitions, and the probabilities associated with various state-changes are called transition probabilities. The set of all states and transition probabilities completely characterizes a Markov chain. By convention, we assume all possible states and transitions have been included in the definition of the process, so there is always a next state and the process goes on forever.
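The convention that every state has a full set of outgoing transitions can be checked mechanically: each row of the transition matrix must be a probability distribution. A small sketch (illustrative, not from the source):

```python
def is_valid_transition_matrix(P, tol=1e-9):
    """Check that every row of P is a probability distribution:
    all entries non-negative and each row summing to 1, so that
    from any state there is always a next state and the process
    goes on forever."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )
```

A matrix with a row summing to less than 1 would correspond to a chance of having no next state at all, which the convention above rules out.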