A basic implementation of first order markov chain in C++ using Eigen


Pi19404

February 23, 2014

Contents

0.1 Introduction
    0.1.1 Sequence Classification
    0.1.2 Generating a Sequence
    0.1.3 Code

Markov Chains


0.1 Introduction

In this article we will look at Markov models and their application to the classification of discrete sequential data.

Markov processes are examples of stochastic processes that generate random sequences of outcomes or states according to certain probabilities. Markov chains can be considered mathematical descriptions of Markov models with a discrete set of states. A Markov chain is an integer-time process {X_n; n >= 0} for which each random variable X_n is integer valued and depends on the past only through the most recent random variable X_{n-1}, for all integers n >= 1. Each X_n takes values in {1, ..., M}.

At each time instant t, the system changes state, making a transition. Markov chains satisfy the Markov and stationarity properties. For a first-order Markov chain, the Markov property states that the state of the system at time t+1 depends only on the state of the system at time t. The Markov chain is also said to be memoryless due to this property.

Pr(X_{t+1} = x_{t+1} | X_t = x_t, X_{t-1} = x_{t-1}, ..., X_1 = x_1) = Pr(X_{t+1} = x_{t+1} | X_t = x_t)

A stationarity assumption is also made, which implies that the transition probabilities are independent of time.

Pr(X_{t+1} = x_j | X_t = x_i) = P_ij   for all t and for all i, j in {1, ..., M}

Thus we are looking at processes whose sample functions are sequences of integers between 1 and M.

Markov chains can be represented by directed graphs, where each state is represented by a node and a directed arc represents a non-zero transition probability. If a Markov chain has M states, the transition probabilities can be represented by an M x M matrix T, whose entry T_ij = Pr(X_{t+1} = x_j | X_t = x_i).

The matrix T is a stochastic matrix: the elements in each row sum to 1, since a transition must occur from the present state to one of the M states. The probability of a sequence being generated by the Markov chain is given by

P(X) = pi(x_0) * prod_{t=1}^{T} p(x_t | x_{t-1})

where pi is the initial probability distribution and p(x_t | x_{t-1}) is the probability of observing state x_t at time instant t given that the state at time t-1 is x_{t-1}.

Let us consider two models, with initial probability vectors pi_1 and pi_2 and the following transition probability matrices:

T1 = | 0.6  0.4  0.0 |
     | 0.3  0.4  0.3 |
     | 0.1  0.5  0.4 |

T2 = | 0.9  0.05 0.05 |
     | 0.3  0.1  0.6  |
     | 0.3  0.5  0.2  |

0.1.1 Sequence Classification

Consider the following two sequences, one generated by each model:

S1 = 1 2 1 2 3 3 3 2 1 1
S2 = 3 2 1 1 1 1 1 1 1 1

In sequence 1, since P13 = 0 there is no transition from state 1 to state 3, and the dominant transition is expected to be 1 -> 1. In sequence 2, since the dominant transition is 1 -> 1, we observe a long run of 1s.

We can also compute the probability that a sequence was generated by a given Markov process. Sequence 1 has probability 8.6400e-05 under the first model and 2.4300e-07 under the second model. Sequence 2 has probability 0 under the first model and 0.00287 under the second model. Thus, if we have a sequence and know that it was generated by one of the two models, we can predict which model generated it by choosing the model that assigns the maximum probability. Thus Markov chains can be used for sequence modelling and classification.

/**
 * @brief The markovChain class : markovChain is a class
 * which encapsulates the representation of a discrete markov
 * chain. A markov chain is composed of a transition matrix
 * and an initial probability matrix.
 */
class markovChain
{
public:
    markovChain(){};

    /* _transition holds the transition probability matrix
     * _initial holds the initial probability matrix
     */
    MatrixXf _transition;
    MatrixXf _initial;

    /**
     * @brief setModel : function to set the parameters
     * of the model
     * @param transition : NxN transition matrix
     * @param initial : 1xN initial probability matrix
     */
    void setModel(Mat transition, Mat initial)
    {
        _transition = EigenUtils::setData(transition);
        _initial = EigenUtils::setData(initial);
    }

    /**
     * @brief computeProbability : compute the probability
     * that the sequence is generated by the markov chain
     * @param sequence : a vector of integer states
     * starting from 0
     * @return : the probability
     */
    float computeProbability(vector<int> sequence)
    {
        float res = _initial(0, sequence[0]);
        for (int i = 0; i < (int)sequence.size() - 1; i++)
        {
            res = res * _transition(sequence[i], sequence[i + 1]);
        }
        return res;
    }
};

0.1.2 Generating a Sequence

The idea behind generating a sequence from a Markov process is to use a uniform random number generator. For each row of the initial probability or transition matrix, a state is selected by sampling according to the row's probabilities. For example, if the row contains the values

[0.6, 0.4, 0]

then if the uniform random generator produces a value between 0 and 0.6, state 0 is returned; if it produces a value between 0.6 and 1, state 1 is returned.

/**
 * @brief initialRand : function to generate a random state
 * @param matrix : input probability matrix
 * @param index : row of the matrix to consider
 * @return : the sampled state index
 */
int initialRand(MatrixXf matrix, int index)
{
    float u = ((float)rand()) / RAND_MAX;
    float s = matrix(index, 0);
    int i = 0;
    //advance until the cumulative probability exceeds u
    //or all the cols of the matrix have been traversed
    while (u > s && i < matrix.cols() - 1)
    {
        i = i + 1;
        s = s + matrix(index, i);
    }
    return i;
}

The first step is to select an initial state by passing the initial probability matrix to the above method. Each subsequent random state is then selected by passing the transition probability matrix as input, using the current state as the row index.


/**
 * @brief generateSequence : generate a random sequence from the markov chain
 * @param n : is the length of the sequence
 * @return : is a vector of integers representing
 * the generated sequence
 */
vector<int> generateSequence(int n)
{
    vector<int> result;
    result.resize(n);
    //select a random initial state of the sequence
    int index = initialRand(_initial, 0);
    result[0] = index;
    for (int i = 1; i < n; i++)
    {
        //select a random transition to the next sequence state
        index = initialRand(_transition, index);
        result[i] = index;
    }
    return result;
}

0.1.3 Code

The code can be found at https://github.com/pi19404/OpenVision in the files ImgML/markovchain.cpp and ImgML/markovchain.hpp.

