
Markovian Segmentation

Meryem Ameur

Faculty of Sciences and Technics
Sultan Moulay Slimane University
Beni Mellal, Morocco

30 March 2017

Meryem Ameur, Markovian Segmentation (Laboratory of Information Processing and Decision), 30 March 2017

Outline

1 Introduction
2 Hidden Markov Chain with Independent Noise (HMC-IN)
3 Iterative Estimators
4 Experimental Results
5 Conclusion


Introduction 1/3

Hidden Markov Models (HMMs) involve two processes:
1 An observed process, denoted Y.
2 A hidden process, denoted X.

In Markovian segmentation we consider:
1 The observed process Y is the image to be segmented.
2 The hidden process X is the resulting segmented image.


Example

Figure: the observation Y (left) and the segmentation result X (right).


Introduction 2/3

Each hidden variable X_n takes its value in a set of K classes, where K is the number of membership classes; it is initialized by the user. The observed image is Y = (Y_n) ∈ R^N, n ∈ [1, N], where N is the number of pixels composing the image. HMMs (field, chain, tree) use Bayes' theorem:

P(X|Y) = P(X) P(Y|X) / P(Y)    (1)

where:
P(X|Y) is the a posteriori probability of X given Y.
P(X) is the a priori probability.
P(Y|X) is the probability of the data-attachment law.
P(Y) is a normalization constant.


Introduction 3/3

In this work we study two iterative estimators: the EM algorithm and the ICE algorithm. We use these estimators to estimate the parameters of the Hidden Markov Chain with Independent Noise model (HMC-IN) in order to segment brain tumor MRI images.


Hidden Markov Chain with Independent Noise (HMC-IN)

The model is called Hidden Markov Chain with Independent Noise (HMC-IN) because it ignores the noise information contained in the image to be segmented, Y.

Let Z = (X, Y) be the pairwise process, where X = (X_n)_{n=1}^N and Y = (Y_n)_{n=1}^N ∈ R^N.


Hidden Markov Chain with Independent Noise (HMC-IN)

1 The process X is a Markov chain; it is homogeneous and stationary, and its law is:

P(x) = P(X_1 = x_1) ∏_{n=1}^{N−1} P(X_{n+1} = x_{n+1} | X_n = x_n)    (2)

2 The observations (Y_n) are independent conditionally on X.

3 Each observation y_n, n ∈ [1, N], depends only on its hidden class x_n:

P(Y_n = y_n | X) = P(Y_n = y_n | X_n = x_n)    (3)
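The two assumptions above can be sketched as a sampler: x follows the Markov chain law (2), and each y_n adds Gaussian noise that depends only on x_n, as in (3). The function name and the 2-class toy parameters are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def sample_hmc_in(pi0, A, mu, sigma2, N, rng):
    """Draw one realization (x, y) of an HMC-IN model (illustrative sketch)."""
    K = len(pi0)
    x = np.empty(N, dtype=int)
    x[0] = rng.choice(K, p=pi0)                # initial law PI(i)
    for n in range(1, N):
        x[n] = rng.choice(K, p=A[x[n - 1]])    # transition matrix A(i, j)
    y = rng.normal(mu[x], np.sqrt(sigma2[x]))  # class-wise Gaussian noise
    return x, y

rng = np.random.default_rng(0)
pi0 = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
x, y = sample_hmc_in(pi0, A, np.array([0.0, 3.0]), np.array([1.0, 1.0]), 500, rng)
```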


The laws of HMC-IN

1 The law of the Markov chain X.
2 The law of the attached data Y|X (the observations).


The parameters of HMC-IN

1 Markov chain parameters θ_x:
- The initial law PI(i) = p(x_1 = i), ∀i.
- The transition matrix A(i, j) = p(x_{n+1} = j | x_n = i), ∀i, j.

2 Attached data parameters θ_{y|x} (Gaussian law):
- The mean μ_i for each class i.
- The variance σ_i² for each class i.
- The density f_i of the distribution of Y conditionally on X:

f_i(y_n) = 1/√(2π σ_i²) exp[−(y_n − μ_i)² / (2σ_i²)]    (4)


Estimation phases

The method proceeds in three phases:
1 The initialization phase.
2 The iterative parameter-estimation phase.
3 The final decision phase.


Initialization phase

Initial configuration X^0 obtained by:
- K-means.
- FCM, etc.

Initialization of the parameters of the X process, θ_x^0:
- PI(i)^0.
- A(i, j)^0.

Initialization of the parameters of the attached data law, θ_{y|x}^0:
- μ(i)^0.
- (σ(i)²)^0.
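The initialization phase can be sketched for a 1-D signal: a tiny K-means produces X^0, and the initial parameters are read off that configuration. The function name, the add-one smoothing of A^0, and the use of empirical class frequencies as a stand-in for PI(i)^0 are all assumptions for illustration.

```python
import numpy as np

def init_hmc_in(y, K, n_iter=20):
    """Sketch: K-means configuration X^0 plus initial HMC-IN parameters."""
    # crude 1-D K-means started from evenly spaced quantiles
    centers = np.quantile(y, (np.arange(K) + 0.5) / K)
    for _ in range(n_iter):
        x0 = np.argmin(np.abs(y[:, None] - centers), axis=1)
        for i in range(K):
            if np.any(x0 == i):
                centers[i] = y[x0 == i].mean()
    # parameters of the X process (frequencies stand in for PI(i)^0)
    pi0 = np.bincount(x0, minlength=K) / len(x0)
    A0 = np.ones((K, K))                 # add-one smoothed transition counts
    for a, b in zip(x0[:-1], x0[1:]):
        A0[a, b] += 1
    A0 /= A0.sum(axis=1, keepdims=True)
    # parameters of the attached data law, read off X^0
    mu0 = np.array([y[x0 == i].mean() for i in range(K)])
    sigma2_0 = np.array([y[x0 == i].var() for i in range(K)])
    return x0, pi0, A0, mu0, sigma2_0

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0.0, 0.3, 50), rng.normal(5.0, 0.3, 50)])
x0, pi0, A0, mu0, sigma2_0 = init_hmc_in(y, 2)
```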


Iterative Estimation phase

The parameters are re-estimated over a number of iterations q ∈ [1, Q] by one of three algorithms:
1 EM: Expectation-Maximization.
2 SEM: Stochastic Expectation-Maximization.
3 ICE: Iterative Conditional Estimation.

These algorithms rely on the Baum-Welch algorithm to compute the required probabilities.


Baum-Welch algorithm

Compute the normalized forward probabilities α_n(i) ∝ p(y_1, ..., y_n, x_n = i).

Forward Algorithm
1 Initialization (n = 1):

α_1(i) = PI(i) f_i(y_1) / Σ_j PI(j) f_j(y_1), ∀i    (5)

2 Induction (1 ≤ n < N):

α_{n+1}(i) = f_i(y_{n+1}) Σ_j α_n(j) A(j, i) / [Σ_k f_k(y_{n+1}) Σ_j α_n(j) A(j, k)], ∀i    (6)
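The forward recursion of eqs. (5)-(6) can be sketched as follows, using the Gaussian densities of eq. (4); function names and the toy parameters are illustrative assumptions.

```python
import numpy as np

def densities(y, mu, sigma2):
    # f[n, i] = f_i(y_n), the Gaussian density of eq. (4)
    return np.exp(-(y[:, None] - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

def forward(pi0, A, f):
    """Normalized forward pass: each row of alpha sums to one."""
    N, K = f.shape
    alpha = np.empty((N, K))
    alpha[0] = pi0 * f[0]         # numerator of eq. (5)
    alpha[0] /= alpha[0].sum()    # its normalizing denominator
    for n in range(1, N):
        alpha[n] = f[n] * (alpha[n - 1] @ A)  # f_i(y_n) sum_j alpha_{n-1}(j) A(j, i)
        alpha[n] /= alpha[n].sum()            # denominator of eq. (6)
    return alpha

y = np.array([0.1, 0.2, 3.1, 2.9])
f = densities(y, np.array([0.0, 3.0]), np.array([1.0, 1.0]))
alpha = forward(np.array([0.5, 0.5]), np.array([[0.8, 0.2], [0.2, 0.8]]), f)
```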


Baum-Welch Algorithm

Backward Algorithm
1 Initialization (n = N):

β_N(i) = 1, ∀i    (7)

2 Induction (n < N):

β_n(i) = Σ_j A(i, j) f_j(y_{n+1}) β_{n+1}(j) / [Σ_k f_k(y_{n+1}) Σ_j α_n(j) A(j, k)], ∀i    (8)
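The backward recursion of eqs. (7)-(8) can be sketched as below. Rescaling each β_n to sum to one is an implementation assumption: it is enough here because γ_n in eq. (9) is only needed up to a normalizing constant. Names and toy inputs are illustrative.

```python
import numpy as np

def densities(y, mu, sigma2):
    # f[n, i] = f_i(y_n), Gaussian density of eq. (4)
    return np.exp(-(y[:, None] - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

def backward(A, f):
    """Backward pass with per-step rescaling against numerical underflow."""
    N, K = f.shape
    beta = np.empty((N, K))
    beta[-1] = 1.0                              # eq. (7)
    for n in range(N - 2, -1, -1):
        beta[n] = A @ (f[n + 1] * beta[n + 1])  # sum_j A(i, j) f_j(y_{n+1}) beta_{n+1}(j)
        beta[n] /= beta[n].sum()                # rescale in place of eq. (8)'s denominator
    return beta

f = densities(np.array([0.1, 3.0, 2.9]), np.array([0.0, 3.0]), np.array([1.0, 1.0]))
beta = backward(np.array([[0.8, 0.2], [0.2, 0.8]]), f)
```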


Baum-Welch Algorithm

Marginal a posteriori law γ_n(i) = p(x_n = i | y):

γ_n(i) = α_n(i) β_n(i), ∀i    (9)

Compute the joint a posteriori probabilities ξ_n(i, j) = p(x_n = i, x_{n+1} = j | y):

ξ_n(i, j) = α_n(i) A(i, j) f_j(y_{n+1}) β_{n+1}(j) / [Σ_k f_k(y_{n+1}) Σ_l α_n(l) A(l, k)], ∀i, j    (10)
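Eqs. (9)-(10) can be sketched as a small function that turns already-computed forward/backward tables into the marginal posterior γ_n(i) and the joint posterior ξ_n(i, j), normalizing both explicitly. The random toy inputs stand in for real Baum-Welch output and are assumptions.

```python
import numpy as np

def posteriors(alpha, beta, A, f):
    """gamma_n(i) and xi_n(i, j), each normalized per position n."""
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)      # eq. (9), normalized
    # xi_n(i, j) proportional to alpha_n(i) A(i, j) f_j(y_{n+1}) beta_{n+1}(j)
    xi = alpha[:-1, :, None] * A[None, :, :] * (f[1:] * beta[1:])[:, None, :]
    xi /= xi.sum(axis=(1, 2), keepdims=True)       # eq. (10), normalized
    return gamma, xi

rng = np.random.default_rng(0)
N, K = 5, 2
alpha = rng.random((N, K))
beta = rng.random((N, K))
f = rng.random((N, K))
A = np.array([[0.7, 0.3], [0.4, 0.6]])
gamma, xi = posteriors(alpha, beta, A, f)
```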


A. EM Algorithm

EM Algorithm
For each iteration q ∈ [1, Q]:

Step (E):
- Compute α_n(i), β_n(i), ξ_n(i, j) and γ_n(i) using Baum-Welch.

Step (M):
- Estimate the parameters of the Markov chain θ_x^[q] using the deterministic strategy:

PI(i)^[q] = γ_1(i), ∀i    (11)

A(i, j)^[q] = Σ_{n=1}^{N−1} ξ_n(i, j) / Σ_{n=1}^{N−1} γ_n(i), ∀i, j    (12)


EM Algorithm

- Estimate the parameters of the attached data law θ_{y|x}^[q] for each class i, also by the deterministic strategy:

μ_i^[q] = Σ_{n=1}^N y_n γ_n(i) / Σ_{n=1}^N γ_n(i), ∀i    (13)

(σ_i²)^[q] = Σ_{n=1}^N (y_n − μ_i^[q])² γ_n(i) / Σ_{n=1}^N γ_n(i), ∀i    (14)
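The EM M-step can be sketched as below. The toy gamma/xi inputs are assumptions, constructed so that the marginals are mutually consistent (as real Baum-Welch output would be); in practice they come from the E-step.

```python
import numpy as np

def em_m_step(gamma, xi, y):
    """Deterministic M-step: initial law, transitions, Gaussian parameters."""
    pi0 = gamma[0]                                              # PI(i)^[q] = gamma_1(i)
    A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]        # eq. (12)
    w = gamma.sum(axis=0)
    mu = (y[:, None] * gamma).sum(axis=0) / w                   # eq. (13)
    sigma2 = ((y[:, None] - mu) ** 2 * gamma).sum(axis=0) / w   # eq. (14)
    return pi0, A, mu, sigma2

rng = np.random.default_rng(0)
N, K = 6, 2
xi = rng.random((N - 1, K, K))
xi /= xi.sum(axis=(1, 2), keepdims=True)
gamma = np.empty((N, K))
gamma[:-1] = xi.sum(axis=2)     # gamma_n(i) = sum_j xi_n(i, j)
gamma[-1] = xi[-1].sum(axis=0)  # gamma_N(j) = sum_i xi_{N-1}(i, j)
y = rng.normal(size=N)
pi0, A, mu, sigma2 = em_m_step(gamma, xi, y)
```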


B. ICE Algorithm

ICE Algorithm
For each iteration q ∈ [1, Q]:

Compute α_n(i), β_n(i), ξ_n(i, j) and γ_n(i) using Baum-Welch.

Simulate one realization x^[q] of X by one random simulation using the parameters of the iteration q.

Estimate the parameters of the Markov chain using the deterministic strategy:

PI(i)^[q] = γ_1(i), ∀i    (15)


ICE Algorithm

A(i, j)^[q] = Σ_{n=1}^{N−1} ξ_n(i, j) / Σ_{n=1}^{N−1} γ_n(i), ∀i, j    (16)

Estimate the parameters of the attached data law for each class i by the stochastic strategy, using the simulated realization x^[q]:

μ_i^[q] = Σ_{n=1}^N y_n 1[x_n = i] / Σ_{n=1}^N 1[x_n = i], ∀i    (17)

(σ_i²)^[q] = Σ_{n=1}^N (y_n − μ_i^[q])² 1[x_n = i] / Σ_{n=1}^N 1[x_n = i], ∀i    (18)
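The stochastic step of eqs. (17)-(18) can be sketched as below: one realization x is simulated, then μ_i and σ_i² are empirical statistics over the pixels assigned to class i. Sampling each x_n independently from its marginal γ_n is a simplification (an exact posterior draw samples the chain sequentially); all names and toy inputs are assumptions.

```python
import numpy as np

def ice_stochastic_step(gamma, y, rng):
    """Simulate X^[q], then estimate mu and sigma^2 empirically per class."""
    N, K = gamma.shape
    x = np.array([rng.choice(K, p=g) for g in gamma])  # simulated realization
    mu = np.empty(K)
    sigma2 = np.empty(K)
    for i in range(K):
        mask = x == i                                  # indicator 1[x_n = i]
        mu[i] = y[mask].mean() if mask.any() else y.mean()       # eq. (17)
        sigma2[i] = y[mask].var() if mask.any() else y.var()     # eq. (18)
    return x, mu, sigma2

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(5.0, 1.0, 100)])
gamma = np.zeros((200, 2))
gamma[:100] = [0.9, 0.1]
gamma[100:] = [0.1, 0.9]
x, mu, sigma2 = ice_stochastic_step(gamma, y, rng)
```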


Complexity of ICE and EM (Comparison)

Task           ICE       EM
Calculate α    O(K²N)    O(K²N)
Calculate β    O(K²N)    O(K²N)
Calculate ξ    O(K²N)    O(K²N)
Calculate γ    O(KN)     O(KN)
Calculate PI   O(K)      O(K)
Calculate A    O(K²N)    O(K²N)
Calculate μ    O(KN)     O(KN)
Calculate σ²   O(KN)     O(KN)
Simulate X     O(KN)     not applicable

Table: Complexity of the EM and ICE algorithms

From this table, we notice that the complexity of ICE is higher than that of EM, because ICE simulates the hidden process X once in each iteration q ∈ [1, Q]. This extra task makes ICE more costly than EM.


Final Decision phase

The final decision produces the result image X using a Bayesian estimator:
MPM: Maximum Posterior Mode.

MPM Estimator
The solution in image segmentation by Hidden Markov Chain according to MPM maximizes the marginal a posteriori probability γ_n(i):

x_n^mpm = argmax_i (α_n(i) β_n(i)) = argmax_i γ_n(i)    (19)

The complexity of this estimator is O(KN).
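The MPM rule of eq. (19) can be sketched in one line: each pixel takes the class maximizing its marginal a posteriori probability γ_n(i). The toy gamma table is an assumption.

```python
import numpy as np

def mpm_decision(gamma):
    # x_n^mpm = argmax_i gamma_n(i), eq. (19); one pass over N pixels, O(KN)
    return gamma.argmax(axis=1)

gamma = np.array([[0.7, 0.3], [0.2, 0.8], [0.55, 0.45]])
x_mpm = mpm_decision(gamma)   # -> [0, 1, 0]
```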


Experiments

We use the K-means algorithm to initialize the configuration X^0, with K = 3.

The initial law PI^0 is:

PI^0 = [0.33, 0.33, 0.33]

The transition matrix A^0(i, j) is:

A^0 =
[0.50 0.25 0.25]
[0.25 0.50 0.25]
[0.25 0.25 0.50]

The mean μ^0 and the variance (σ²)^0 are computed from the configuration X^0.

The number of iterations is Q = 30.

We have used this type of parameter initialization in all the experiments presented in this work.

We have used a thresholding technique to extract the region of interest.
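The experimental initialization above can be written out directly as arrays (a transcription of the slide; the variable names are assumptions, and the slide's 0.33 entries are taken as 1/3 so the law sums to one).

```python
import numpy as np

K = 3
PI0 = np.full(K, 1.0 / 3.0)          # uniform initial law (0.33 on the slide)
A0 = np.array([[0.50, 0.25, 0.25],   # diagonal-dominant transition matrix
               [0.25, 0.50, 0.25],
               [0.25, 0.25, 0.50]])
Q = 30                               # number of iterations
```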


33

Experiment 1

Figure: Experiment 1


Experiment 2

Figure: Experiment 2


Experiment 3

Figure: Experiment 3


Results 1/2: PSNR index, SSIM index and error rate

We have compared these estimators in terms of the PSNR index, the SSIM index, the error rate and the convergence.

Experiments     PSNR ICE   SSIM ICE   PSNR EM   SSIM EM
Experiment 1    24.0672    0.5710     24.0672   0.5697
Experiment 2    18.4713    0.4773     18.4713   0.4784
Experiment 3    21.8050    0.5157     21.8058   0.5150

Table: PSNR index and SSIM index

Experiments     ICE        EM
Experiment 1    8.1357     8.1357
Experiment 2    9.6075     9.6075
Experiment 3    10.0450    10.0450

Table: Error rate


Results 2/2 : Convergence

Experiments ICE EM

Experiment 1 5 iterations 7 iterations

Experiment 2 6 iterations 8 iterations

Experiment 3 7 iterations 8 iterations

Table: The convergence of ICE and EM algorithms


Conclusion

We have compared two iterative estimators:
- EM.
- ICE.

under the final Bayesian criterion:
- MPM.

in the segmentation of:
- Brain tumor MRI images.

To extract the brain tumor position:
- Thresholding technique.


Conclusion

The comparison criteria were:
1 The complexity.
2 The PSNR index.
3 The SSIM index.
4 The error rate.
5 The convergence.


Conclusion

1 In terms of complexity, ICE is more complex than EM.
2 In terms of segmentation quality, the difference is not significant.
3 In terms of convergence, ICE converges faster than EM.


Perspectives

Using these estimation algorithms to:
1 Segment color textured images using a pairwise Markov chain model.
2 Segment MRI images using the triplet Markov chain, considering that X is non-stationary.


