
ECE 2030 - Physiological Signal Processing
Module 3: Adaptive Segmentation
Adaptive segmentation
• The limitation of short-time analysis lies in the use of a fixed
window duration
– A signal may remain stationary for a duration much longer than the
chosen window, and yet it would be broken into many segments
– Conversely, a signal may change its characteristics within the duration of
the fixed window
– Short-time analysis therefore cannot guarantee stationarity of the signal over
even the relatively short duration of the analysis window used
• It would be desirable to adapt the analysis window to changes
in the given signal, allowing the window to be as long as
possible while the signal remains stationary
• A new window should be started at the exact instant when the signal or
the related system changes its characteristics
– Measures used to detect such a change: spectral error measure (SEM),
ACF distance, generalized likelihood ratio (GLR)
Adaptive segmentation
• An all-pole linear prediction (LP) or autoregressive (AR) model is used
for adaptive segmentation of EEG signals into quasi-stationary
segments
• Observations:
– Time domain: the present value of the prediction error indicates the
instantaneous degree of “unexpectedness” in the signal
– Autocorrelation domain: the prediction error is decorrelated (approximately white)
– Spectral domain: the prediction error being white noise, the AR model
yields an all-pole representation of the signal spectrum
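
To make these observations concrete, here is a minimal numerical sketch (Python with NumPy; the helper names and the synthetic AR(2) test signal are illustrative assumptions, not taken from the slides). It computes the signal ACF, derives the AR coefficients with the Levinson-Durbin recursion, and confirms that the resulting prediction error is approximately decorrelated:

```python
import numpy as np

def autocorr(x, max_lag):
    """Biased autocorrelation estimate of x for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([np.dot(x[:n - m], x[m:]) / n for m in range(max_lag + 1)])

def levinson_durbin(r, order):
    """Derive AR coefficients a[0..order] (with a[0] = 1) from the
    autocorrelation sequence r[0..order] by the Levinson-Durbin recursion."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for k in range(1, order + 1):
        acc = r[k] + np.dot(a[1:k], r[k - 1:0:-1])
        refl = -acc / err
        a_prev = a.copy()
        for j in range(1, k):
            a[j] = a_prev[j] + refl * a_prev[k - j]
        a[k] = refl
        err *= (1.0 - refl ** 2)
    return a, err

def prediction_error(y, a):
    """e(n) = sum_j a[j] * y(n - j): close to white noise when the AR
    model matches the (quasi-stationary) segment."""
    return np.convolve(y, a, mode="valid")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic AR(2) signal standing in for a quasi-stationary EEG segment
    w = rng.standard_normal(2000)
    y = np.zeros_like(w)
    for n in range(2, len(w)):
        y[n] = 1.5 * y[n - 1] - 0.8 * y[n - 2] + w[n]
    P = 2
    a, err_power = levinson_durbin(autocorr(y, P), P)
    e = prediction_error(y, a)
    phi_e = autocorr(e, 3)
    print("AR coefficients (expect about [1, -1.5, 0.8]):", np.round(a, 2))
    print("normalized error ACF at lags 1..3 (about 0 if whitened):",
          np.round(phi_e[1:] / phi_e[0], 2))
```

The same ACF-to-AR-model derivation is what Steps 1 and 2 of the segmentation algorithm below rely on.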

• Algorithm for adaptive segmentation (a code sketch follows the numbered steps)
– Let n = 0 be the starting point of analysis, where the first reference (fixed)
analysis window is placed for each adaptive segment
– (N + P) samples of the signal y(n) should be available prior to the
arbitrarily designated origin at n = 0
– (2N + 1) is the size of the analysis window
– P is the order of the AR model to be used
1. Using the signal samples y(−N) to y(N), compute the signal ACF
up to lag P
2. Derive the corresponding AR model of order P
3. Using the signal values y(−N − P) to y(n + N), compute the
prediction error e(−N) to e(n + N), and compute the running short-
time ACF φ_e(n, m) of the prediction error
– the first index n indicates the position of the short-time analysis window;
the second index m indicates the lag for which the ACF is computed
4. Calculate φ_e(0, m) for m = 0, 1, . . . , M (this represents the fixed, reference window)
5. Compute φ_e(n, m) for the moving window
6. Compute the SEM
7. Test whether SEM(n) > Th1, where Th1 is a threshold (No: increase n by 1 and
repeat from Step 5; Yes: mark a segment boundary)
8. Shift the time axis by substituting (n + k) with (k − N) and start
the procedure again with Step 1
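
Below is a simplified, self-contained sketch of the segmentation loop in Steps 1-8. It is a sketch under stated assumptions rather than the exact published method: the AR model is fitted by least squares for brevity, the SEM is taken as an assumed two-term form (prediction-error power deviation plus spectral-shape deviation), and the window size, model order, and threshold Th1 are illustrative values.

```python
import numpy as np

def acf(x, max_lag):
    """Biased autocorrelation of x for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([np.dot(x[:n - m], x[m:]) / n for m in range(max_lag + 1)])

def fit_ar(seg, P):
    """Least-squares AR(P) fit (used here for brevity in place of the
    ACF/Levinson-Durbin derivation); returns a with a[0] = 1 so that
    e(n) = sum_j a[j] * seg(n - j) is the prediction error."""
    X = np.column_stack([seg[P - j - 1:len(seg) - j - 1] for j in range(P)])
    c, *_ = np.linalg.lstsq(X, seg[P:], rcond=None)
    return np.concatenate(([1.0], -c))

def sem_segment(y, N=128, P=8, M=3, th=1.5):
    """Sketch of the segmentation loop in Steps 1-8; returns the
    detected segment boundary indices (in samples)."""
    boundaries = []
    origin = N + P                      # first reference window needs N + P prior samples
    while origin + N + 1 <= len(y):
        # Steps 1-2: AR model from the fixed (reference) window y(-N)..y(N)
        ref = y[origin - N:origin + N + 1]
        a = fit_ar(ref, P)
        phi_ref = acf(np.convolve(ref, a, mode="valid"), M)   # Step 4
        n, found = 1, False
        while origin + n + N < len(y):                        # Steps 3, 5: moving window
            test = y[origin + n - N:origin + n + N + 1]
            phi = acf(np.convolve(test, a, mode="valid"), M)  # same AR model, new data
            # Step 6: assumed two-term SEM = power deviation + spectral-shape deviation
            sem = ((phi[0] - phi_ref[0]) / phi_ref[0]) ** 2 \
                  + 2.0 * np.sum((phi[1:] / phi[0]) ** 2)
            if sem > th:                                      # Step 7
                boundaries.append(origin + n)
                origin = origin + n + N                       # Step 8: shift the time axis
                found = True
                break
            n += 1
        if not found:
            break
    return boundaries

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Piecewise-stationary test signal: AR characteristics change at sample 1500
    w = rng.standard_normal(3000)
    y = np.zeros(3000)
    for n in range(2, 3000):
        c1, c2 = (1.5, -0.8) if n < 1500 else (0.2, -0.5)
        y[n] = c1 * y[n - 1] + c2 * y[n - 2] + w[n]
    print("detected boundaries (expect one near 1500):", sem_segment(y))
```

In practice the threshold Th1 and the window half-length N control the trade-off between over-segmentation and missed boundaries.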
Illustration of the SEM computation (figure):
a. Original EEG signal and the AR-model-optimized signal window
b. Prediction error for the fixed window and for the moving window
c. Segmentation threshold
d. SEM; the vertical lines represent the segmentation boundaries
SEM to segment an EEG signal (figure):
a. Original EEG signal of a child in sleep stage I with
superimposed 14 Hz spindles
b. Segmentation threshold
c. SEM
d. Deviation in prediction-error power
e. Deviation in prediction-error spectral shape; the vertical lines
represent the segmentation boundaries
Generalized likelihood ratio
• GLR: uses a reference window that is continuously grown as long
as no new boundary is marked
• The test window is a sliding window of constant duration, as in the case
of the SEM and ACF methods
• The advantage of the growing reference window is that it contains the maximum
amount of information available, from the beginning of the new segment to the
current instant
• Three data sets are used: the growing reference window, the sliding test window,
and a pooled window formed by concatenating the two (see the GLR sketch at the
end of this section)
• RLS (recursive least-squares): an adaptive filter is used to predict the current
signal sample from the available knowledge of the previous samples stored in the
filter’s memory units
• The adaptive filter’s tap-weight vector is modified by the RLS algorithm
• The RLS method computes a new filter tap-weight vector at each sample of
the incoming signal, and can therefore track both gradual changes and sudden
variations (e.g., in the VAG signal); see the RLS sketch at the end of this section
• The RLSL method uses a lattice filter, and is based upon forward and backward
prediction
• The recursive least-squares lattice (RLSL) algorithm is suitable for solving the
least-squares problem recursively and rapidly
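
As a rough illustration of the three-window comparison, the sketch below computes a GLR-style distance between a reference window and a test window from the AR prediction-error variances of the reference, test, and pooled windows. The log-likelihood-ratio form used here assumes Gaussian prediction errors, and the AR order, window lengths, and function names are assumptions for illustration only:

```python
import numpy as np

def ar_residual_variance(seg, P):
    """Fit an AR(P) model to seg by least squares and return the
    prediction-error (residual) variance."""
    X = np.column_stack([seg[P - j - 1:len(seg) - j - 1] for j in range(P)])
    c, *_ = np.linalg.lstsq(X, seg[P:], rcond=None)
    resid = seg[P:] - X @ c
    return float(np.mean(resid ** 2))

def glr_distance(reference, test, P=6):
    """GLR distance between the (growing) reference window and the sliding
    test window, computed from a pooled window that concatenates the two.
    Large values suggest the two windows do not share one AR model,
    i.e. a possible segment boundary."""
    pooled = np.concatenate([reference, test])
    v_r = ar_residual_variance(reference, P)
    v_t = ar_residual_variance(test, P)
    v_p = ar_residual_variance(pooled, P)
    # Log-likelihood-ratio form for Gaussian prediction errors (assumed form)
    return (len(pooled) * np.log(v_p)
            - len(reference) * np.log(v_r)
            - len(test) * np.log(v_t))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    same = rng.standard_normal(400)                  # one stationary process
    other = np.convolve(rng.standard_normal(200),    # a differently coloured process
                        np.ones(5) / 5, mode="valid")[:120]
    print("same process   :", round(glr_distance(same[:280], same[280:]), 1))
    print("changed process:", round(glr_distance(same[:280], other), 1))
```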
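
And here is a minimal sketch of the sample-by-sample RLS update used as an adaptive linear predictor, as referenced in the bullets above; the forgetting factor, model order, and the synthetic change-point signal are illustrative assumptions:

```python
import numpy as np

def rls_prediction_error(y, order=6, lam=0.98, delta=100.0):
    """Exponentially weighted RLS linear predictor: the tap-weight vector is
    updated at every sample, and the a-priori prediction error is returned.
    Bursts of large error flag sudden changes in the signal."""
    w = np.zeros(order)              # tap-weight vector
    P = np.eye(order) * delta        # inverse-correlation matrix estimate
    err = np.zeros(len(y))
    for n in range(order, len(y)):
        u = y[n - order:n][::-1]     # most recent samples first
        e = y[n] - w @ u             # a-priori prediction error
        k = P @ u / (lam + u @ P @ u)
        w = w + k * e                # RLS tap-weight update
        P = (P - np.outer(k, u) @ P) / lam
        err[n] = e
    return err

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Test signal whose dynamics change abruptly at sample 1000
    noise = rng.standard_normal(2000)
    y = np.zeros(2000)
    for n in range(2, 2000):
        c1, c2 = (1.6, -0.9) if n < 1000 else (0.1, -0.3)
        y[n] = c1 * y[n - 1] + c2 * y[n - 2] + noise[n]
    e = rls_prediction_error(y)
    print("mean squared error just before the change:", round(np.mean(e[900:1000] ** 2), 2))
    print("mean squared error just after the change :", round(np.mean(e[1000:1100] ** 2), 2))
```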
