During the embargo period (the 12 month period from the publication of the Version of Record of this article), the Accepted Manuscript is fully
protected by copyright and cannot be reused or reposted elsewhere.
As the Version of Record of this article is going to be / has been published on a subscription basis, this Accepted Manuscript will be available for
reuse under a CC BY-NC-ND 3.0 licence after the 12 month embargo period.
After the embargo period, everyone is permitted to use, copy and redistribute this article for non-commercial purposes only, provided that they adhere to all the terms of the licence https://creativecommons.org/licences/by-nc-nd/3.0
Although reasonable endeavours have been taken to obtain all necessary permissions from third parties to include their copyrighted content
within this article, their full citation and copyright line may not be present in this Accepted Manuscript version. Before using any content from this
article, please refer to the Version of Record on IOPscience once published for full citation and copyright details, as permissions may be required.
All third party content is fully copyright protected, unless specifically stated otherwise in the figure caption in the Version of Record.
network dynamics
Maryam M. Shanechi 1,4
1 Ming Hsieh Department of Electrical and Computer Engineering, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, USA
2 Department of Neurological Surgery, University of California, San Francisco, California, USA
3 Kavli Institute for Fundamental Neuroscience, University of California, San Francisco, San Francisco, California, USA
4 Neuroscience Graduate Program, University of Southern California, Los Angeles, CA, USA
† Equal contribution
E-mail: shanechi@usc.edu
Abstract.
Objective. Extracting and modeling the low-dimensional dynamics of multi-site electrocorticogram (ECoG) network activity is important in studying brain functions and dysfunctions and for developing translational neurotechnologies. Dynamic latent state models can be used to describe the ECoG network dynamics with low-dimensional latent states. But so far, non-stationarity of ECoG network dynamics has largely not been addressed in these latent state models. Such non-stationarity can happen due to a change in brain state or recording instability over time. A critical question is whether adaptive tracking of ECoG network dynamics can lead to further dimensionality reduction and more parsimonious and precise modeling. This question is largely unaddressed. Approach. We investigate this question by employing an adaptive linear state-space model for ECoG network activity constructed from ECoG power feature time-series over tens of hours from 10 human subjects with epilepsy. We study how adaptive modeling affects the prediction and dimensionality reduction for ECoG network dynamics compared with prior non-adaptive models, which do not track non-stationarity. Main results. Across the 10 subjects, adaptive modeling significantly improved the prediction of ECoG network dynamics compared with non-adaptive modeling, especially for lower latent state dimensions. Also, compared with non-adaptive modeling, adaptive modeling allowed for additional dimensionality reduction without degrading prediction performance. Finally, these results suggested that ECoG network dynamics over our recording periods exhibit non-stationarity, which can be tracked with adaptive modeling. Significance. These results have important implications for studying low-dimensional neural representations using ECoG, and for developing future adaptive neurotechnologies for more precise decoding and modulation of brain states in neurological and neuropsychiatric disorders.
P Ahmadipour et al 2
as a neural signal modality to study brain functions and dysfunctions and to develop clinically-feasible neurotechnologies [1–6]. Compared with other invasive neural signal modalities, ECoG can provide advantages in terms of recording longevity [1, 7–9] and the spatial extent of brain networks that can be recorded from [8], while having a better signal-to-noise ratio than non-invasive electroencephalogram (EEG) [10] (see Discussion). Thus ECoG is widely used in many applications. For example, multi-site ECoG has been used to investigate brain functions such as movement [11, 12], speech [13–16], attention [17–19], memory [20, 21] and mood [22], and also brain dysfunctions such as epilepsy [23], Parkinson's disease (PD) [24–27], dystonia [28, 29] and essential tremor [30, 31]. Furthermore, multi-site ECoG, which we refer to as ECoG network activity, can provide long-term measurements of the brain for developing real-time neurotechnologies such as closed-loop brain-machine interfaces (BMIs) [6, 11, 32–34], and holds promise in developing future closed-loop deep brain stimulation (DBS) systems for treating neurological and neuropsychiatric disorders [6, 35, 36].

The importance of ECoG network activity in developing future neurotechnologies makes it critical to accurately model its dynamics, i.e., how it evolves over time. The structured spatiotemporal dynamics observed in neural data across various modalities have motivated the development of dynamic latent state models that describe the temporal dynamics of neural activity and simultaneously achieve dimensionality reduction [6, 37]. For example, dynamic latent state models have been developed for population spiking activity [37–45] and used successfully to decode movement using spikes [39, 42, 43]. More recently, dynamic latent state models have been developed for ECoG network activity to describe its dynamics in terms of a low-dimensional latent state that summarizes the state of the network [22, 46]. These models have taken the form of a low-dimensional linear state-space model (LSSM), which can extract the low-dimensional latent dynamics of ECoG network activity [46], can successfully decode mood in human subjects from ECoG [22], and holds promise in developing closed-loop controllers of brain states using electrical stimulation [47, 48].

An important application of ECoG network activity is in translational neurotechnologies that operate over long time-periods such as ECoG-based closed-loop BMIs [6, 11, 32, 33] and closed-loop DBS systems [6, 35, 36, 49]. In these applications, it is likely that the ECoG network dynamics will be non-stationary over time. For example, recording instability can happen over time [50, 51], leading to non-stationary dynamics. Further, neural dynamics could change due to a change in the brain state, for example during the process of learning in a BMI [52–57]. […]

Given these potential non-stationarities, a critical question in modeling low-dimensional dynamics in ECoG network activity is whether adaptive tracking of these dynamics can lead to further dimensionality reduction and more parsimonious and precise dynamic latent state models. This question has largely been unexplored to date, partly because prior dynamic latent state models have been time-invariant and thus have not aimed to track non-stationarity [46] (see Discussion). Here we address this question by employing an adaptive algorithm that can track non-stationarity in low-dimensional latent dynamics of neural activity [61, 62]. This algorithm can be applied to latent LSSMs to achieve both adaptive tracking and dimensionality reduction at the same time, and we thus refer to it as the adaptive LSSM. Using this algorithm, we adaptively model ECoG network activity constructed from multi-site ECoG power feature time-series at multiple frequency bands over a span of tens of hours from 10 human subjects with epilepsy. We chose to model the ECoG power feature time-series because powers have been shown to contain useful information for decoding various brain states such as movement intention [11, 33, 63, 64], speech [14], covert attention [18] and mood [22], and we henceforth refer to these multi-site ECoG power feature time-series as ECoG network activity in short. We also directly compare with a prior non-adaptive LSSM that does not track non-stationarity [46]. We evaluate how well our models can track the dynamics of ECoG network activity constructed from ECoG power feature time-series, henceforth termed ECoG network dynamics in short, by quantifying how well they can predict the future value of ECoG network activity from its current and past values. This quantification provides a prediction performance in each case.

Across 10 subjects, we find that adaptively tracking ECoG network dynamics leads to significantly better prediction performance, especially for lower latent state dimensions. Further, compared with non-adaptive modeling, adaptive modeling can achieve a given prediction performance using a markedly lower latent state dimension, suggesting that it can extract lower-dimensional latent dynamics, enable further dimensionality reduction, and lead to more parsimonious models. Finally, we use numerical simulations to confirm that the improvement in prediction performance achieved because of adaptation in real ECoG network dynamics is significantly higher than what would be expected for stationary ECoG network dynamics. This result suggests the existence of non-stationarity in ECoG network dynamics over the recording periods in our data, which can be tracked with our adaptive modeling. These findings can lead to more parsimonious and precise models of ECoG network dynamics through adaptation. Studying the extracted lower-dimensional dynamics can help investigate the low-dimensional latent representations of ECoG […]

[…] We denote the ECoG network activity at time step t by y_t ∈ R^{n_y}, and define the ECoG network dynamics over time as the time-series {y_t, t = 1, 2, ..., T}, where T is the total recording time ({·} denotes a time-series or a set depending on the context). In this work, we take the network activity y_t as ECoG power features at multiple frequency bands in multiple brain regions (Fig. 1(a); section 2.5.2). We emphasize that the adaptive LSSM is a general network model and can be used to model features from different frequency bands and regions either together or individually. We chose to model multiple frequency bands and regions together because this choice provides a more general case where we also consider and model interrelations across different frequency bands and brain regions. Modeling all bands and regions together is especially important in studying brain functions and dysfunctions (e.g. mental disorders such as depression) that simultaneously involve multiple frequency bands and/or brain regions [22, 26, 58, 65–70]. Also, we chose to model the ECoG power features because powers have been shown to contain useful information for decoding various brain states such as movement intention [11, 33, 63, 64], speech [14], covert attention [18] and mood [22]. To enable adaptation, we use a time-varying LSSM structure: […]

[…] improved with adaptation. To do so, we fit the LSSM parameters M_t by employing our recent adaptive LSSM algorithm for model identification [61]. We present this algorithm briefly in the next section.

2.2. Adaptive LSSM algorithm for model identification

[…] Fig. S1(a)). The key idea in this algorithm is to track the change in the covariance matrices by computing time-variant covariance matrices for the ECoG network activity at each time t as:

Λ_{t,τ} = E[y_{t+τ} y_t^T] ∝ Σ_{k=1}^{t} β^{t−k} y_{k+τ} y_k^T,   (2)

where β ∈ (0, 1) is a forgetting factor that determines how the past observed ECoG network activity is weighted in computing the covariance matrices. Then, for a fixed latent state dimension n_x, the model parameters M_t can be computed using standard subspace projection methods. The forgetting factor β is a key algorithm parameter, which controls how fast past observed ECoG network activity is forgotten in updating the covariance matrices in equation (2). A smaller β indicates faster decay of the weights on past ECoG network activity. We select β in a given subject by sweeping over a range and selecting the one that improves the prediction of ECoG network dynamics either for that subject or for other subjects on average (see section 2.6). Here our focus is to employ the adaptive LSSM algorithm [61] in order to build adaptive dynamic latent state models of ECoG network activity, study whether adaptation can enable more precise and parsimonious modeling of ECoG dynamics, and whether […]

[…] state x_{t+1} using y_1, y_2, ..., y_t, and ŷ_{t+1|t} is the associated one-step-ahead prediction of y_{t+1}, and K_t ∈ R^{n_x × n_y} is the Kalman gain:

K_t = (A_t X C_t^T + S_t)(C_t X C_t^T + P_t)^{−1}   (4)
[Figure 1 graphic. (a) Multi-site ECoG recordings are preprocessed into ECoG network activity: log power of binned data in 10 s windows in the 1–8 Hz, 8–12 Hz, 12–30 Hz, 30–55 Hz and 65–100 Hz bands. (b) The non-adaptive and adaptive LSSM algorithms fit non-adaptive and adaptive LSSMs from the ECoG network activity y_1, ..., y_t. (c) Kalman predictors use the fitted LSSMs to predict the future ECoG activity y_{t+1}, yielding prediction errors e_{t+1} (adaptive) and e′_{t+1} (non-adaptive).]
Figure 1: Adaptive and non-adaptive model fitting and prediction of human ECoG network dynamics (constructed from ECoG power feature time-series). (a) Human ECoG data processing and power feature extraction. (b) Fitting of adaptive LSSM parameters and non-adaptive LSSM parameters using the adaptive and non-adaptive LSSM algorithms, respectively. At time step t, the adaptive LSSM and non-adaptive LSSM are fitted based on ECoG network activity observed up to the current time t, i.e. y_1, y_2, ..., y_t. Note that the future ECoG network activity y_{t+1} is not used in fitting either the adaptive LSSM or the non-adaptive LSSM. (c) Adaptive and non-adaptive prediction using the fitted LSSMs. At time t, the fitted model parameters and observed ECoG network activity up to time t are used in a Kalman predictor of the future ECoG network activity at time t + 1, i.e., predicting y_{t+1}. This prediction provides the adaptive one-step-ahead prediction error denoted by e_{t+1} and the non-adaptive one-step-ahead prediction error denoted by e′_{t+1}. See also Fig. S1.
with X the solution of the algebraic Riccati equation:

X = A_t X A_t^T + Q_t − (A_t X C_t^T + S_t)(C_t X C_t^T + P_t)^{−1}(C_t X A_t^T + S_t^T).   (5)

[…] current ECoG network activity, i.e. y_1, y_2, ..., y_t, and x̂_{1|0} is initialized as 0.

The above adaptive prediction can run in parallel with the adaptive LSSM algorithm: at time t, an updated LSSM […] time step t + 1, i.e., y_{t+1} (Fig. S1(b)); then this procedure repeats at time t + 1 (Fig. S1(c)).

2.4. Prediction performance measure

To quantify the performance of the above adaptive prediction, we first calculate the one-step-ahead prediction […] Then, using the one-step-ahead prediction error of all time steps, we calculate the explained variance (EV) of the adaptive prediction of the i'th ECoG power feature in y_t as (Fig. S1(c)): […] vector. EV quantifies the proportion of the variance in the i'th ECoG power feature that is explained by its one-step-ahead prediction […]
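The Kalman predictor of equations (3)-(5) can be sketched as follows. This is a simplified Python illustration, not the authors' Matlab implementation: it assumes fixed model parameters A, C, Q, P, S (the adaptive algorithm would refresh these at each step as part of M_t), solves the Riccati equation (5) by fixed-point iteration rather than a dedicated DARE solver, and runs the innovations-form one-step-ahead prediction with x̂_{1|0} = 0 as in the text.

```python
import numpy as np

def solve_riccati(A, C, Q, P, S, n_iter=1000):
    """Fixed-point iteration for the algebraic Riccati equation (5):
    X = A X A^T + Q - (A X C^T + S)(C X C^T + P)^{-1}(C X A^T + S^T)."""
    X = Q.copy()
    for _ in range(n_iter):
        G = A @ X @ C.T + S
        X = A @ X @ A.T + Q - G @ np.linalg.solve(C @ X @ C.T + P, G.T)
    return X

def kalman_gain(A, C, Q, P, S):
    """Kalman gain of equation (4): K = (A X C^T + S)(C X C^T + P)^{-1}."""
    X = solve_riccati(A, C, Q, P, S)
    W = C @ X @ C.T + P
    M = A @ X @ C.T + S
    return np.linalg.solve(W.T, M.T).T        # M W^{-1} without explicit inverse

def one_step_predict(A, C, K, y):
    """Innovations-form one-step-ahead predictor (in the spirit of equation (3)):
    x_{t+1|t} = A x_{t|t-1} + K (y_t - C x_{t|t-1}),  y_hat_{t|t-1} = C x_{t|t-1}."""
    x = np.zeros(A.shape[0])                  # x_hat_{1|0} initialized as 0
    y_hat = np.zeros_like(y)
    for t in range(y.shape[0]):
        y_hat[t] = C @ x
        x = A @ x + K @ (y[t] - C @ x)
    return y_hat
```

For stable systems the fixed-point iteration converges quickly; a production implementation would instead update the gain online from the time-varying parameters fitted by the adaptive LSSM algorithm.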
[…] ECoG network dynamics by the average EV (AEV) across all components in the ECoG power features y_t (Fig. S1(c)):

AEV = (1/n_y) Σ_{i=1}^{n_y} EV(i).   (8)

2.5. Human ECoG data acquisition and processing

We fit adaptive LSSMs to multi-site ECoG data collected from 10 human subjects, and use them to predict the ECoG network dynamics.

2.5.1. Human ECoG recordings and preprocessing

We recorded human ECoG network activity from 10 epilepsy patients (S1–S10) implanted with several intracranial ECoG electrodes for seizure monitoring. The surgery was approved by the University of California, San Francisco Institutional Review Board and all the patients agreed to participate in the study with written consent. ECoG data was recorded using the Nicolet/XLTek EEG clinical recording systems (Natus Medical, Inc.) with 512 Hz or 1024 Hz sampling […]

[…] non-stationary dynamics of ECoG network activity during awake periods because many neuroscience and neurotechnology applications operate under awake conditions. However, in a control analysis, we also confirm that the same results hold if we also include the sleep periods (see section 3.1). To extract the awake data, using visual inspection, we remove data segments during sleep and also those segments with non-neuronal noises such as motion artifacts. We then filter the ECoG data above 1 Hz using an order 2 IIR Butterworth filter, and remove the 60 Hz line noise using an order 4 notch filter. Then, we downsample the ECoG data to 256 Hz by applying an order 8 IIR Chebyshev Type I antialiasing filter with a cut-off frequency of 100 Hz. Next, we apply common average referencing to contacts of the same strips. Finally, we accumulate several multi-hour segments of preprocessed ECoG data extracted from the span of two days to form a 20 hour long ECoG data time-series. For each patient, all subsequent processing and analyses are done on this 20 hour long ECoG data time-series. In a control analysis, we also analyze 40 hour long ECoG data time-series which also include sleep periods.

2.5.2. ECoG power feature extraction

We compute ECoG power features from the preprocessed 20 hour long ECoG data and use them as the ECoG network dynamics {y_t} to be modeled in equation (1) (Fig. 1(a)). To obtain the ECoG power features, we first causally filter the ECoG data in five frequency bands (using an order 8 Butterworth IIR filter): 1–8 Hz (delta and theta), 8–12 Hz (alpha), 12–30 Hz (beta), 30–55 Hz (low-gamma), and 65–100 Hz (high-gamma). These frequency bands have been shown to have […] functions and dysfunctions [22, 26, 27, 61, 65, 66, 73]. We then bin the filtered ECoG data over time into 10 s non-overlapping segments. We take the mean of the squared filtered signal in each 10 s segment for each frequency band and contact. We then compute the logarithm of this mean-squared signal to obtain the log powers. We next collect the log powers at each time t into a vector z_t ∈ R^{n_z}, where n_z = 5 frequency bands × number of contacts and t ∈ {1, 2, ..., T}, where T = (20 hours of ECoG signal)/(10 s segments) = 7200 is […]

[…] network dynamics

We compare adaptive and non-adaptive prediction of ECoG network dynamics (Fig. 1(c)). Note that non-adaptive model fitting and prediction is a special case of adaptive model fitting and prediction in which the forgetting factor β = 1 in equation (2), corresponding to equal weighting of all past ECoG network activity regardless of how far they are from the current time. For the adaptive algorithm, we investigate two methods to pick β. First, for each subject, we pick the β that maximizes the adaptive prediction performance for that subject (Fig. S1(c)). To do so, we calculate the AEV for each value of β over a grid G of 41 points [0.992:0.0002:1] and find the optimal β. The lower limit of the grid is chosen such that 124 steps into the past, the weighting of the ECoG network activity decays to 1/e (36.79%); this essentially means that in this case 124 is the effective number of ECoG power feature samples used for model fitting, as the weight of samples farther into the past will be small. A smaller lower limit than this uses too few effective samples to fit the model parameters and can lead to a large variance in the fitted model parameters [72]. The upper limit of the grid
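The power feature pipeline of section 2.5.2 (causal order-8 Butterworth band-pass filtering, 10 s non-overlapping binning, log mean-squared amplitude) can be sketched in Python as below. This is an illustrative reimplementation, not the authors' code; it uses SciPy's second-order-sections form for numerical stability, and a band-pass design of order 4 per section pair, which yields an order-8 filter as in the paper.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 256                                              # Hz, after downsampling
BANDS = [(1, 8), (8, 12), (12, 30), (30, 55), (65, 100)]  # Hz, the five bands

def log_power_features(ecog, fs=FS, win_s=10):
    """Sketch of section 2.5.2: causally band-pass filter each contact in each
    band, bin into non-overlapping 10 s windows, and take the log of the mean
    squared filtered signal per window, band, and contact.
    ecog: array of shape (n_samples, n_contacts)."""
    n_win = ecog.shape[0] // (fs * win_s)
    feats = []
    for lo, hi in BANDS:
        # butter band-pass of order N=4 has overall order 2N = 8
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        filt = sosfilt(sos, ecog, axis=0)              # causal filtering
        seg = filt[: n_win * fs * win_s].reshape(n_win, fs * win_s, -1)
        feats.append(np.log(np.mean(seg ** 2, axis=1)))  # log power per window
    return np.concatenate(feats, axis=1)               # (n_win, 5 * n_contacts)
```

With 10 s windows, a 20 hour recording yields T = 20 × 3600 / 10 = 7200 feature vectors, matching the count in the text.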
[…] past ECoG network activity with equal weight for model fitting. Second, in addition to the above method, we also do a control analysis in which we select the forgetting factor for a given subject based on ECoG data from other subjects. Here we used the average of the best forgetting factors in the other nine subjects to redo the adaptive modeling in the given subject. We show that the same conclusions hold in […]

[…] how much adaptation improves the prediction performance using the relative EV improvement measure defined as (AEV_adapt − AEV_non-adapt)/AEV_non-adapt.

[…] n_x is a design choice, we compare adaptive and non-adaptive prediction of ECoG network dynamics across different latent state dimensions n_x ∈ H = [2:2:20]. More precisely, we compare EV_adapt and EV_non-adapt across all ECoG power features pooled from 10 subjects (using a right-sided Wilcoxon signed rank test). Since we make this comparison for each of the latent state dimensions n_x ∈ [2:2:20], we use false discovery rate (FDR) control [74] to account for the multiple comparisons being made. We declare that adaptation significantly improves ECoG prediction if the FDR-corrected P value < 0.05.

We also compute the relative improvement in prediction performance for each of the above latent state dimensions for each subject. We study whether and how this improvement changes as a function of latent state dimension n_x. We use rank correlation to test if relative EV improve- […] -man's rank correlation P < 0.05 [75].

Our second hypothesis is that adaptive modeling can achieve further dimensionality reduction compared with non-adaptive modeling without degrading the prediction performance. Equivalently, we hypothesize that despite using a much smaller latent state dimension, adaptive modeling can achieve the same prediction as non-adaptive modeling. To test this hypothesis, we (i) find the latent state dimension at which non-adaptive modeling achieves its best prediction performance, (ii) find the dimension at which adaptive modeling achieves a prediction performance equal to the best non-adaptive prediction performance, and (iii) compare the two dimensions.

More precisely, for each subject we first find the latent state dimension that achieves the best non-adaptive prediction […] across the 10 subjects by conducting a one-sided Wilcoxon signed rank test. Similarly, we can find the above quantities associated with adaptive prediction and refer to them with the subscript adapt. Then, we can also compare the best […]

[…] the value of ECoG network activity y_{t+1} is computed using the adaptive LSSM fitted with data up to time t (equation (3)). To evaluate whether the adaptive LSSM fitted at time t remains advantageous into the future, we also perform the following analysis. At each time step t, after updating the adaptive LSSM M_t, we stop the adaptive LSSM fitting procedure and fix M_t. We then keep performing one-step-ahead prediction with this model into the future. This means that we predict y_{t+t_f+1} using the ECoG network activity observed up to time t + t_f but with the LSSM fixed as the one fitted at time t. Then, based on the prediction error e_{t+t_f+1} = y_{t+t_f+1} − ŷ_{t+t_f+1|t+t_f}, we calculate the corresponding EV in equation (7) and repeat the comparison of adaptive prediction and non-adaptive prediction. We assess future time-periods t_f of 1 minute, 5 minutes, and 30 minutes.

[…] network dynamics

[…] that ECoG network dynamics are non-stationary over the recording periods in our data. To confirm this conclusion, we conduct a numerical simulation to show that if ECoG network dynamics are stationary, adaptive modeling combined with dimensionality reduction does not perform better than non-adaptive modeling combined with dimensionality reduction. So if there is an improvement due to adaptation, this result would suggest non-stationarity.
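The performance measures used throughout these comparisons, EV per feature, AEV of equation (8), and the relative EV improvement, can be sketched as follows. Equation (7) is not reproduced in this version of the manuscript, so the EV definition below is the standard explained-variance form consistent with the description "the proportion of the variance explained by the one-step-ahead prediction"; function names are illustrative.

```python
import numpy as np

def ev_per_feature(y, y_hat):
    """Explained variance per feature: EV(i) = 1 - var(e_i) / var(y_i), with
    e = y - y_hat the one-step-ahead prediction error. This is a standard EV
    definition; the exact form of equation (7) is not shown in this version.
    y, y_hat: arrays of shape (T, n_y)."""
    err = y - y_hat
    return 1.0 - err.var(axis=0) / y.var(axis=0)

def aev(y, y_hat):
    """Average EV across the n_y power features, equation (8)."""
    return ev_per_feature(y, y_hat).mean()

def relative_ev_improvement(aev_adapt, aev_nonadapt):
    """Relative EV improvement = (AEV_adapt - AEV_non-adapt) / AEV_non-adapt."""
    return (aev_adapt - aev_nonadapt) / aev_nonadapt
```

A perfect prediction gives EV = 1 for every feature and hence AEV = 1; a predictor no better than the feature mean gives EV near 0.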
[…] network dynamics (see more in the next paragraph); (ii) based on simulated data, fit an adaptive LSSM with a lower latent state dimension than the true dimension; (iii) repeat (ii) for a non-adaptive LSSM; and (iv) compare the low-dimensional adaptive LSSM with the low-dimensional non-adaptive LSSM in terms of their prediction performance on simulated data.

We simulate the ECoG network dynamics from the non-adaptive LSSMs fitted on the actual data from our 10 subjects with a latent state dimension of 10. We choose this latent state dimension because non-adaptive prediction performance is highest at this dimension on average among subjects (Fig. 3). We then examine how adaptive modeling and non-adaptive modeling perform with latent state dimensions that are lower than the actual latent state dimension of 10. More specifically, for each of the latent state dimensions n_x ∈ [2:2:10], we fit adaptive LSSMs and non-adaptive LSSMs to the simulated ECoG network dynamics and compare the prediction performances on simulated data.

3. Results

3.1. Adaptation improves prediction of ECoG network dynamics

Our first hypothesis is that adaptive modeling enables better prediction of ECoG network dynamics compared with non-adaptive modeling. To test this hypothesis, we compared the adaptive LSSM and non-adaptive LSSM in predicting ECoG network dynamics in 10 human subjects (see sections 2.2, 2.6, 2.7). The prediction was evaluated by how well the values of the future ECoG network activity in the next time step can be predicted from its observed past and current values. We quantified prediction performance as the explained variance (EV), which computes the proportion of the variance in ECoG network dynamics that is explained by the one-step-ahead prediction of the ECoG network dynamics (see sections 2.3, 2.4). Higher EV thus represents better prediction performance.

We found that adaptive modeling improved the prediction of ECoG network dynamics compared with non-adaptive modeling across subjects. First, we examine the prediction traces in one subject as an illustrative example. Fig. 2(a) shows the prediction traces of three example ECoG power feature time-series in subject S1 for a fixed latent state dimension of n_x = 4. We can see that adaptive predictions more closely followed the observed ECoG power feature time-series than non-adaptive predictions, showing the advantage of adaptive modeling. Across all ECoG power features in this subject and for this latent state dimension, the EV of adaptive prediction was significantly larger than that of non-adaptive prediction (one-sided Wilcoxon signed-rank test, Ns = 50, P = 1.02 × 10^−8, Fig. 2(b); Ns indicates the number of samples in statistical comparisons). We next compared adaptive prediction and non-adaptive prediction across all latent state dimensions (Fig. 2(c) and (d)). In this subject, the EV of adaptive prediction was significantly larger than non-adaptive prediction for almost all latent state dimensions (one-sided Wilcoxon signed-rank test, Ns = 50, FDR-corrected P ≤ 4.28 × 10^−3 for all latent state dimensions other than 16, for which predictions were similar; Fig. 2(c)). Moreover, in this subject, the best EV for adaptive prediction EV*_adapt was significantly larger than the best EV for non-adaptive prediction EV*_non-adapt (one-sided Wilcoxon signed-rank test, Ns = 50, P = 9.35 × 10^−5). Also, the best adaptive prediction was achieved at a latent state dimension of n_x = 6 whereas the best non-adaptive prediction was achieved at the higher latent state dimension of n_x = 10.

Second, consistent results held across all ECoG power features pooled from all 10 subjects. The EV of adaptive prediction was significantly larger than that of non-adaptive prediction for each latent state dimension (one-sided Wilcoxon signed-rank test, Ns = 500, FDR-corrected […] adaptive prediction EV*_adapt was significantly larger than the […] Wilcoxon signed-rank test, Ns = 500, P = 5.23 × 10^−31).

Third, even though our focus here is to track the non-stationary dynamics of awake ECoG network activity (see section 2.5.1), we also adaptively modeled 40 hour long ECoG network activity periods, which also included sleep periods in addition to awake periods, and compared with non-adaptive modeling. We found that the adaptive prediction again significantly outperformed the non-adaptive prediction across the 10 subjects for each latent state dimension (one-sided Wilcoxon signed-rank test, Ns = 500, FDR-corrected P ≤ 1.27 × 10^−9, Fig. S2(a)). In another control analysis, we found that the same results hold when we alter the random selection of contacts. We repeated the random selection of 10 contacts in each subject (see section 2.5.2) five times independently. This led to five different random 10-contact sets in each subject, and 50 random 10-contact sets across our 10 subjects. We again found that the adaptive prediction significantly outperformed the non-adaptive prediction across 10 subjects for each latent state dimension (one-sided Wilcoxon signed-rank test, Ns = 2500, FDR-corrected P ≤ 1.15 × 10^−88, Fig. S2(b)).

Finally, we confirmed that both adaptive and non-adaptive modeling can process ECoG network activity in real time and have similar computational cost because the non-adaptive model is a special case of the adaptive model in which β = 1 (see section 2.6). In our Matlab implementation, processing each 10 s time step of data took less than 0.01 s on a PC with an Intel Core i7-7700K CPU. Further software optimizations may allow even faster processing for future applications. Taken together, these results show that adaptation improves the prediction of
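The statistical procedure used in these comparisons, a paired one-sided Wilcoxon signed-rank test at each latent state dimension followed by FDR control over the dimensions, can be sketched as below. This is an illustrative reconstruction on synthetic numbers, not the paper's data; the FDR step is implemented here as the Benjamini-Hochberg procedure, a common choice for [74] (the specific FDR method is an assumption).

```python
import numpy as np
from scipy.stats import wilcoxon

def fdr_bh(pvals, alpha=0.05):
    """Benjamini-Hochberg FDR control: boolean mask of which hypotheses are
    declared significant at level alpha."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

# Paired one-sided comparison of adaptive vs non-adaptive EV at each of the
# latent state dimensions (synthetic EV values, for illustration only).
rng = np.random.default_rng(1)
pvals = []
for _ in range(10):                                   # e.g. nx in [2:2:20]
    ev_nonadapt = rng.normal(0.2, 0.05, 50)           # 50 power features
    ev_adapt = ev_nonadapt + rng.uniform(0.01, 0.1, 50)  # uniformly better
    p = wilcoxon(ev_adapt, ev_nonadapt, alternative="greater").pvalue
    pvals.append(p)
significant = fdr_bh(pvals)                           # per-dimension verdicts
```

Since every synthetic difference is positive, each one-sided test yields a very small P value and all dimensions survive the FDR correction, mirroring the structure (though of course not the values) of the reported results.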
[Figure 2 graphic. (a) Example ECoG power feature traces with adaptive and non-adaptive prediction traces; (b) probability density and box plots of EV; (c) normalized prediction performance versus latent state dimension; (d) relative EV improvement (%) versus latent state dimension.]
Figure 2: Adaptation improves the prediction of ECoG network dynamics (constructed from ECoG power feature time-
series) in subject S1 and the improvement is more prominent for lower latent state dimensions. (a) Three example
ECoG power features and their corresponding adaptive and non-adaptive prediction traces in subject S1. (b) Distribution
of prediction performance (EV) of ECoG power features from adaptive prediction (green) and non-adaptive prediction
(grey) in the same subject. The top panel shows the histogram of EVs. In the box plot in the bottom panel, the line
represents the median, the box edges represent the 25th and 75th percentiles, and whiskers represent the minimum and maximum,
respectively. Asterisks indicate significance (*: P < 0.05, **: P < 0.005, and ***: P < 0.0005), and n.s. indicates
P > 0.05. (c) Prediction performance of adaptive modeling (green) and non-adaptive modeling (grey) across multiple
latent state dimensions in the same subject. For easier visualization, we show a normalized prediction performance, which
is defined as the EV minus the mean of non-adaptive EV at latent state dimension of 2 (lowest dimension considered). This
means that normalized EV for non-adaptive prediction at a dimension of 2 is 0. Solid curves represent the mean across the
50 ECoG power features being modeled, and shaded area represents s.e.m. The top solid line represents the latent state
dimensions where the adaptive prediction is significantly better than the non-adaptive prediction (blank represents n.s.).
(d) Relative improvement in prediction performance (i.e. relative EV improvement) due to adaptive modeling across latent
state dimensions in the same subject.
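The paired comparison summarized in panel (b) can be sketched with SciPy; the EV arrays below are illustrative placeholders, not the paper's data:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
# Hypothetical EVs for Ns = 50 ECoG power features (placeholders):
# adaptive prediction is simulated to explain slightly more variance.
ev_nonadaptive = rng.uniform(0.05, 0.30, size=50)
ev_adaptive = ev_nonadaptive + rng.uniform(0.0, 0.05, size=50)

# One-sided paired test: is the adaptive EV significantly larger?
stat, p = wilcoxon(ev_adaptive, ev_nonadaptive, alternative="greater")
print(f"Wilcoxon statistic = {stat:.1f}, one-sided P = {p:.3g}")
```

A small one-sided P here would earn the asterisks used in the figure's significance legend.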
ECoG network dynamics.

3.2. Advantage of adaptation is more prominent for lower latent state dimensions

We found that the performance improvement due to adaptation was higher for lower latent state dimensions. First, as an illustrative example, we consider subject S1. In this subject, the relative improvement in prediction performance (relative EV improvement) varied strongly as a function of the latent state dimension, and decreased as the latent state dimension increased (Fig. 2(d)). More specifically, the rank correlation coefficient between relative EV improvement and latent state dimension nx for this subject was significantly smaller than 0 (Spearman's rank correlation coefficient ρ = −0.733, Ns = 10, Spearman's P = 2.21 × 10⁻²; Fig. 2(d)). Second, this result also held across all 10 subjects (Fig. 3(b)). We pooled the relative EV improvement across all subjects for each latent state dimension and found that the relative EV improvement varied significantly as a monotonically decreasing function of the latent state dimension (Spearman's rank correlation coefficient ρ = −0.498, Ns = 100, Spearman's P = 1.36 × 10⁻⁷; Fig. 3(b)).

3.3. Adaptation enables further dimensionality reduction without degrading prediction performance

adaptive modeling (Fig. 4(a); nx,matched-adapt = 4.200 ± 0.466 vs. nx,non-adapt = 10.200 ± 0.696 (mean ± s.e.m.), one-sided Wilcoxon signed-rank test, Ns = 10, P = 9.76 × 10⁻⁴) while achieving similar prediction performance (Fig. 4(b); AEVmatched-adapt = 0.1925 ± 0.0169 vs. AEV∗non-adapt = 0.1897 ± 0.0184, two-sided Wilcoxon signed-rank test, Ns = 10, P = 1). Also, this latent state dimension for adaptive modeling was significantly smaller than the number of ECoG power features being modeled per subject (one-sided Wilcoxon signed-rank test, Ns = 10, P = 9.76 × 10⁻⁴).

time, the advantage of adaptive modeling over non-adaptive modeling still persists into the future. In these analyses (see section 2.9), we stopped the adaptation at some time t; we then used a fixed LSSM equal to the adaptive LSSM fitted at time t to keep performing one-step-ahead prediction into the future at times larger than t. We compared these predictions with those using a non-adaptive LSSM fitted at time t, i.e. using the data up to time t. We compared the predictions at time-periods t_f of 1 minute, 5 minutes and 30 minutes into the future (see section 2.9).

We found that adaptive modeling consistently outperformed non-adaptive modeling for prediction into the future after model fitting stops. For all time-periods, the EV of adaptive modeling across the 10 subjects remained significantly better than that of non-adaptive modeling for any latent state dimension (Fig. 5(a); one-sided Wilcoxon signed-rank test, Ns = 500, FDR-corrected P < 1.05 × 10⁻² for all three tested time-periods into the future). Also, across subjects, the

better than the best EV of non-adaptive modeling selected over all latent state dimensions (EV∗non-adapt) (one-sided

LSSM algorithm. In the main analyses above, for each subject, we had selected the forgetting factor based on the subject's own ECoG network activity (see section 2.6). In practice, it could be convenient to select the forgetting factor for a given subject based on historical data from other subjects, and thus it is interesting to assess performance in such a case. We thus performed a control analysis for each given subject, where we used the average of the best
Figure 3: Adaptation improves the prediction of ECoG network dynamics (constructed from ECoG power feature time-series) across 10 subjects, and the improvement is more prominent for lower latent state dimensions. (a) Prediction performance (EV) of adaptive modeling (green) and non-adaptive modeling (grey) across all ECoG power features pooled from 10 subjects. Prediction performance is shown across different latent state dimensions. For each subject, the normalized EV of each ECoG power feature is computed similarly to Fig. 2, i.e. by subtracting the mean non-adaptive EV at a latent state dimension of 2 (the smallest dimension considered) from the EV of each feature. Solid lines represent the mean across the ECoG power features and shaded areas represent s.e.m. (b) Relative improvement of prediction performance (relative EV improvement) due to adaptive modeling across latent state dimensions. Solid line represents the mean across the 10 subjects and shaded area represents s.e.m. The top solid line in panel (a) and the significance legends are defined as in Fig. 2.
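The EV normalization described in this caption can be sketched as follows; the EV arrays are illustrative placeholders, not the paper's values:

```python
import numpy as np

dims = np.array([2, 6, 10, 14, 18])            # latent state dimensions shown
# Hypothetical EV matrices (features x dimensions), placeholders only.
ev_adapt = np.array([[0.20, 0.22, 0.23, 0.23, 0.24],
                     [0.10, 0.13, 0.14, 0.15, 0.15]])
ev_nonadapt = np.array([[0.15, 0.19, 0.21, 0.22, 0.23],
                        [0.05, 0.10, 0.12, 0.14, 0.15]])

# Baseline: mean non-adaptive EV at the lowest dimension considered (2).
baseline = ev_nonadapt[:, 0].mean()
norm_ev_adapt = ev_adapt - baseline
norm_ev_nonadapt = ev_nonadapt - baseline
# By construction, the normalized non-adaptive EV at dimension 2 averages to 0.
print(norm_ev_nonadapt[:, 0].mean())
```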
forgetting factors in the other nine subjects to redo the adaptive modeling in the given subject (leave-subject-out analysis). Thus the choice of the forgetting factor did not

own data. In particular, in this case and across all the ECoG power features in the 10 subjects, the EV of adaptive prediction was significantly larger than that of non-adaptive prediction for almost every latent state dimension (Fig. 6(a); one-sided Wilcoxon signed-rank test, Ns = 500, FDR-corrected P ≤ 2.09 × 10⁻⁵ for all dimensions except 16, for which the predictions were not significantly different). Moreover, across subjects, the relative improvement in prediction performance (relative EV improvement) was again a monotonically decreasing function of the latent state dimension (Fig. 6(b); Spearman's rank correlation

Figure 4: Adaptation enables further dimensionality reduction without degrading prediction performance. (a) Comparison of the latent state dimension that achieves the best non-adaptive prediction (n∗x,non-adapt, grey) with the latent state dimension that achieves a similar performance in adaptive prediction (nx,matched-adapt, green). (b) Comparison of the prediction performances (AEVs) for the latent state dimensions in panel (a). Bars represent the mean across the 10 subjects, and whiskers show s.e.m. Significance legends are defined as in Fig. 2.
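The leave-subject-out selection of the forgetting factor described above can be sketched as follows; the per-subject values are hypothetical placeholders, not the paper's fitted values:

```python
import numpy as np

# Hypothetical best forgetting factors per subject (placeholders).
best_factor = {"S1": 0.998, "S2": 0.999, "S3": 0.997, "S4": 0.999, "S5": 0.998,
               "S6": 0.996, "S7": 0.999, "S8": 0.998, "S9": 0.997, "S10": 0.999}

def leave_subject_out_factor(subject, factors):
    """Average the best forgetting factors of the other nine subjects."""
    others = [f for s, f in factors.items() if s != subject]
    return float(np.mean(others))

factor_s1 = leave_subject_out_factor("S1", best_factor)
print(f"Forgetting factor used for S1: {factor_s1:.4f}")
```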
Figure 5: The fitted adaptive model remains advantageous even after adaptation stops. (a) Adaptive prediction consistently outperformed non-adaptive prediction at all tested time-periods into the future. Each plot corresponds to a different prediction time-period into the future (t_f). The leftmost plot replicates the main result in Fig. 3(a). Prediction performance is pooled across power features from 10 subjects. (b) Same as (a) but for the relative improvement in prediction performance (relative EV improvement) of adaptive modeling pooled across 10 subjects. The relative EV improvement was again more prominent for lower latent state dimensions across all time-periods into the future. The legend and figure conventions in each plot are the same as in Fig. 3.
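The freeze-then-predict procedure behind Fig. 5 (stop adaptation at time t, then keep predicting with the frozen model) can be sketched with a toy predictor; the AR(1) fit and synthetic data below are illustrative stand-ins for the adaptive LSSM, not the paper's algorithm:

```python
import numpy as np

def fit_ar1(y):
    """Least-squares AR(1) fit; stands in for a model fitted on data up to t."""
    a = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
    return lambda history: a * history[-1]     # one-step-ahead predictor

rng = np.random.default_rng(1)
y = np.zeros(400)
for t in range(1, 400):                        # simple synthetic AR(1) series
    y[t] = 0.9 * y[t - 1] + 0.1 * rng.standard_normal()

t_stop = 300                                   # adaptation stops here
frozen = fit_ar1(y[:t_stop])                   # model frozen at time t_stop
# One-step-ahead predictions into the future with the frozen model.
preds = np.array([frozen(y[:t]) for t in range(t_stop, len(y))])
mse = float(np.mean((y[t_stop:] - preds) ** 2))
print(f"One-step MSE into the future: {mse:.4f}")
```

In the paper's analysis the frozen adaptive LSSM is compared against a non-adaptive LSSM fitted on the same data up to time t.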
over the latent state dimensions and subjects was only 3.29 ± 0.47% and was not significant at any latent state dimension (Wilcoxon signed-rank test, Ns = 500, FDR-corrected P > 0.833 for each latent state dimension).

We also compared adaptive and non-adaptive modeling using Akaike's Information Criterion (AIC) [76], which takes into account the one extra forgetting-factor parameter in adaptive modeling. Consistent with our above result, we found that adaptive modeling significantly outperformed non-adaptive modeling in terms of the AIC measure computed for one-step-ahead prediction errors across 10 subjects for each latent state dimension (one-sided Wilcoxon signed-rank test, Ns = 10, FDR-corrected

3.6. The non-stationarity in ECoG network dynamics

The above results (sections 3.1–3.5) show that adaptive modeling of ECoG network dynamics consistently outperformed non-adaptive modeling for a wide range of latent state dimensions, with more prominent improvement at lower latent state dimensions. These results suggest that there exists non-stationarity in ECoG network dynamics. Indeed, if the dynamics to be modeled were stationary, adaptive modeling would not have outperformed non-adaptive modeling even after dimensionality reduction. This is because if dynamics are stationary, their statistics are time-invariant by definition. To perform dimensionality reduction for a given lower latent state dimension, the LSSM fitting algorithm finds the closest possible approximation of these time-invariant statistics using the lower latent state dimension. As the original statistics are time-invariant to begin with, their best approximation should also stay the same over time. Thus, adaptive modeling should not help in the stationary case; this suggests that when adaptation does help in modeling combined with dimensionality reduction, as is the case in our data, dynamics are non-stationary.

We further confirmed the above conclusion using

adaptive LSSMs to our ECoG data whose dimension was also chosen based on our data. We then used the simulated stationary ECoG network dynamics to fit both adaptive and non-adaptive LSSMs, but with a lower latent state dimension than the actual LSSM dimension from which the data was simulated. We found that adaptive prediction was almost the same as non-adaptive prediction (Fig. 7(a)), with the relative difference being less than 1.82 ± 0.83% for all latent state dimensions (Fig. 7(b)). This result for simulated stationary ECoG network dynamics was in contrast to what we observed in our real datasets of ECoG
Figure 6: Adaptive prediction outperforms non-adaptive prediction in a given subject even when the forgetting factor is selected based on other subjects' ECoG network activity (constructed from ECoG power feature time-series). (a) Adaptive prediction was advantageous regardless of whether the forgetting factor was selected based on a subject's own ECoG network activity (green) or other subjects' ECoG network activity (orange). (b) The improvement of adaptive prediction over non-adaptive prediction was more prominent at lower latent state dimensions, again regardless of whether the forgetting factor was selected based on the subject's own ECoG network activity (red) or other subjects' ECoG network activity (blue). Figure conventions are the same as in Fig. 3.
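The monotonic-decrease statistics reported for the relative EV improvement (e.g. Spearman's ρ in Figs. 2(d), 3(b) and 6(b)) can be sketched with SciPy; the improvement values below are illustrative placeholders, not the paper's measurements:

```python
import numpy as np
from scipy.stats import spearmanr

dims = np.arange(2, 21, 2)          # latent state dimensions (Ns = 10 values)
# Hypothetical relative EV improvements (%) shrinking with dimension.
rel_improvement = np.array([80., 55., 40., 30., 22., 17., 13., 10., 8., 6.])

rho, p = spearmanr(dims, rel_improvement)
print(f"Spearman rho = {rho:.3f}, P = {p:.3g}")
```

A significantly negative ρ indicates that the improvement decreases monotonically with the latent state dimension.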
network dynamics, where adaptation markedly improved the prediction (Fig. 7(b)). In particular, the improvement averaged over latent state dimensions and subjects for

over time. The fact that adaptive modeling outperforms non-adaptive modeling shows that the adaptive model can successfully track the ECoG non-stationarity.
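The ability of an adaptive estimator with a forgetting factor to track a non-stationary parameter can be illustrated with a scalar recursive least-squares sketch; the model, values, and parameter jump below are illustrative assumptions, not the paper's adaptive LSSM algorithm:

```python
import numpy as np

def rls_forgetting(y, u, lam=0.98):
    """Recursive least squares with forgetting factor `lam` for a scalar model
    y_t = w_t * u_t + noise. Smaller lam forgets old data faster and tracks
    parameter drift; lam = 1 weighs all history equally (non-adaptive limit)."""
    w, P = 0.0, 1e3
    w_hist = np.empty(len(y))
    for t in range(len(y)):
        k = P * u[t] / (lam + u[t] ** 2 * P)   # gain
        w += k * (y[t] - w * u[t])             # update estimate from error
        P = (P - k * u[t] * P) / lam           # inflate covariance (forgetting)
        w_hist[t] = w
    return w_hist

rng = np.random.default_rng(4)
T = 2000
u = rng.standard_normal(T)
w_true = np.where(np.arange(T) < T // 2, 1.0, -1.0)   # parameter jumps mid-way
y = w_true * u + 0.05 * rng.standard_normal(T)

w_hat = rls_forgetting(y, u, lam=0.98)
print(f"Estimate before jump: {w_hat[T // 2 - 1]:.3f}, after: {w_hat[-1]:.3f}")
```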
Figure 7: Comparison with adaptive tracking of simulated stationary ECoG network dynamics further suggests the existence of non-stationarity in ECoG network dynamics (constructed from ECoG power feature time-series). (a) Adaptive prediction (green) of simulated stationary ECoG network dynamics was essentially the same as its non-adaptive prediction (grey) at each latent state dimension used for modeling. Note that here modeling is combined with dimensionality reduction, as the latent state dimensions of the models fitted to the simulated data are smaller than the actual simulated latent state dimension (i.e. 10). Solid lines represent the mean across 500 power features of 10 simulated stationary networks (each simulated from the fitted model in one subject) and shaded areas represent s.e.m. (b) Relative improvement in prediction performance (relative EV improvement) for real ECoG network dynamics (red) was significantly larger than that for the simulated stationary ECoG network dynamics (cyan), for which this improvement was essentially non-existent. Solid lines represent the mean and shaded areas represent s.e.m.
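Simulating data from a stationary LSSM, as in the control analysis of Fig. 7, can be sketched as follows; a random stable model is used here, not the LSSMs fitted to the subjects' data:

```python
import numpy as np

rng = np.random.default_rng(3)
nx, ny, T = 10, 5, 20000   # latent dimension (10, as simulated), features, samples

# Random stable state-transition matrix: scaled orthogonal matrix, so all
# eigenvalues have magnitude 0.9 < 1 and the simulated dynamics are stationary.
A = 0.9 * np.linalg.qr(rng.standard_normal((nx, nx)))[0]
C = rng.standard_normal((ny, nx))  # observation matrix (features from states)

x = np.zeros(nx)
Y = np.empty((T, ny))
for t in range(T):                 # x_{t+1} = A x_t + w_t ;  y_t = C x_t + v_t
    Y[t] = C @ x + 0.1 * rng.standard_normal(ny)
    x = A @ x + rng.standard_normal(nx)

# Stationarity check: feature variances agree between the two halves.
v1, v2 = Y[:T // 2].var(axis=0), Y[T // 2:].var(axis=0)
print(np.max(np.abs(v1 - v2) / v1))
```

Because the statistics of such data are time-invariant, an adaptive fit should offer no advantage over a non-adaptive one, which is the control comparison the figure makes.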
4.1. Choice of ECoG as the signal modality

Our results show that adaptive tracking of ECoG network dynamics can extract more accurate and lower-dimensional latent representations and track the changes in these low-dimensional representations. Thus, studying neural mechanisms in behavioral contexts that involve a change in brain state will likely benefit from such adaptive tracking to uncover whether and how low-dimensional neural dynamics change. For example, learning in various

representations in several brain regions can happen when

there is the additional possibility of stimulation-induced neural plasticity leading to non-stationarity [59, 60]. Finally, these neurotechnologies need to operate over periods of months and years during which recording instability can happen, leading to non-stationarity [50, 51]. Tracking the changes in low-dimensional ECoG network dynamics due to any of the above reasons will be important for more precise decoding and modulation in these neurotechnologies.
4.3. Choice of LSSM model structure for adaptive tracking of ECoG network dynamics

nonlinear dynamic model (using radial basis functions) does not improve the prediction of ECoG network dynamics compared with a non-adaptive LSSM [46]. Further, compared with nonlinear dynamic models, the LSSM has a simpler form and fewer parameters, which can make adaptation easier [46]. Finally, the LSSM has been able to successfully decode mood from human ECoG activity [22]. Together, this prior evidence on the usefulness of the LSSM for modeling ECoG motivated us to investigate the tracking and non-stationarity of ECoG network dynamics using the LSSM model structure [61, 62].

4.4. Dimensionality reduction simultaneously with dynamic modeling

modeling, especially in applications where dynamic dimensionality reduction is desired. The further dimensionality

Here we focused on adaptive tracking for ECoG network dynamics given the clinical relevance of ECoG. However, brain functions and dysfunctions involve multiple spatial and temporal scales of brain activity, ranging from spiking activity to field potential activity including local field potentials and ECoG [92–96]. Thus an important future direction is to develop adaptive algorithms to track dynamic latent state models for combined multiscale spike and field potential activity [97]. Such adaptive tracking can improve the decoding of behavior [97] and help investigate the changes in neural dynamics and in connectivity and

One key parameter in any adaptive algorithm is the forgetting factor, which dictates how fast model parameters
potentials (LFP) [5, 6, 35, 36, 69, 101], or even the band-pass filtered time-series of the original ECoG signal is an important area of future investigation.

Code availability

The code for the adaptive modeling algorithm is available from the corresponding author (M.M.S.) and will be available online at https://github.com/ShanechiLab/AdaptiveLSSM.

Acknowledgments

Bilateral Academic Research Initiative (BARI). The authors also acknowledge support of the National Institutes of Health (NIH) under contracts 1UH3NS109956-01 and 1UH3NS100544-01. Finally, the authors acknowledge support of the Defense Advanced Research Projects Agency (DARPA) under Cooperative Agreement Number W911NF-14-2-0043, issued by the Army Research Office contracting office in support of DARPA's SUBNETS program. The views, opinions, and/or findings expressed are those of the author(s) and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.

References

[1] A. K. Engel, C. K. E. Moll, I. Fried, and G. A. Ojemann, "Invasive recordings from the human brain: clinical insights and beyond," Nature Reviews Neuroscience, vol. 6, pp. 35–47, Jan. 2005.

[2] J. Jacobs and M. J. Kahana, "Direct brain recordings fuel advances in cognitive electrophysiology," Trends in Cognitive Sciences, vol. 14, pp. 162–171, Apr. 2010.

[3] J.-P. Lachaux, N. Axmacher, F. Mormann, E. Halgren, and N. E. Crone, "High-frequency neural activity and human cognition: Past, present and possible future of intracranial EEG research," Progress in Neurobiology, p. 23, 2012.

[4] A. Gunduz and G. Schalk, "ECoG-based BCIs," in Brain–Computer Interfaces Handbook, pp. 297–322, CRC Press, 2018.

[5] D. Moran, "Evolution of brain–computer interface: action potentials, local field potentials and electrocorticograms," Current Opinion in Neurobiology, vol. 20, pp. 741–745, Dec. 2010.

[6] M. M. Shanechi, "Brain-machine interfaces from motor to mood," Nature Neuroscience, vol. 22, pp. 1554–1564, 2019.

[7] L. Karumbaiah, T. Saxena, D. Carlson, K. Patil, R. Patkar, E. A. Gaupp, M. Betancur, G. B. Stanley, L. Carin, and R. V. Bellamkonda, "Relationship between intracortical electrode design and chronic recording function," Biomaterials, vol. 34, pp. 8061–8074, Nov. 2013.

[8] E. Chang, "Towards Large-Scale, Human-Based, Mesoscopic Neurotechnologies," Neuron, vol. 86, pp. 68–78, Apr. 2015.

[9] V. Woods, M. Trumpis, B. Bent, K. Palopoli-Trojani, C.-H. Chiang, C. Wang, C. Yu, M. N. Insanally, R. C. Froemke, and J. Viventi, "Long-term recording reliability of liquid crystal polymer µECoG arrays," Journal of Neural Engineering, vol. 15, no. 6, p. 066024, 2018.

[10] G. Buzsáki, C. A. Anastassiou, and C. Koch, "The origin of extracellular fields and currents — EEG, ECoG, LFP and spikes," Nature Reviews Neuroscience, vol. 13, pp. 407–420, June 2012.

[11] G. Hotson, D. P. McMullen, M. S. Fifer, M. S. Johannes, K. D. Katyal, M. P. Para, R. Armiger, W. S. Anderson, N. V. Thakor, B. A. Wester, and N. E. Crone, "Individual Finger Control of the Modular Prosthetic Limb using High-Density Electrocorticography in a Human Subject," Journal of Neural Engineering, vol. 13, p. 026017, Apr. 2016.

[12] S. Acharya, M. S. Fifer, H. L. Benz, N. E. Crone, and N. V. Thakor, "Electrocorticographic amplitude predicts finger positions during slow grasping motions of the hand," Journal of Neural Engineering, vol. 7, p. 046002, Aug. 2010.

[13] K. E. Bouchard, N. Mesgarani, K. Johnson, and E. F. Chang,

[15] D. K. Lee, E. Fedorenko, M. V. Simon, W. T. Curry, B. V. Nahed, D. P. Cahill, and Z. M. Williams, "Neural encoding and production of functional morphemes in the posterior temporal lobe," Nature Communications, vol. 9, p. 1877, May 2018.

[16] E. M. Mugler, M. C. Tate, K. Livescu, J. W. Templer, M. A. Goldrick, and M. W. Slutzky, "Differential representation of articulatory gestures and phonemes in precentral and inferior frontal gyri," Journal of Neuroscience, vol. 38, no. 46, pp. 9803–9813, 2018.

[17] O. Jensen, J. Kaiser, and J.-P. Lachaux, "Human gamma-frequency oscillations associated with attention and memory," Trends in Neurosciences, vol. 30, pp. 317–324, July 2007.

[18] A. Gunduz, P. Brunner, A. Daitch, E. C. Leuthardt, A. L. Ritaccio, B. Pesaran, and G. Schalk, "Decoding covert spatial attention using electrocorticographic (ECoG) signals in humans," NeuroImage, vol. 60, pp. 2285–2293, May 2012.

[19] J. O'Sullivan, Z. Chen, J. Herrero, G. M. McKhann, S. A. Sheth, A. D. Mehta, and N. Mesgarani, "Neural decoding of attentional selection in multi-speaker environments without access to clean sources," Journal of Neural Engineering, vol. 14, p. 056001, Aug. 2017.

[20] E. L. Johnson and R. T. Knight, "Intracranial recordings and human memory," Current Opinion in Neurobiology, vol. 31, pp. 18–25, Apr. 2015.

[21] F. Roux and P. J. Uhlhaas, "Working memory and neural oscillations: alpha–gamma versus theta–gamma codes for distinct WM information?," Trends in Cognitive Sciences, vol. 18, pp. 16–25, Jan. 2014.

[22] O. G. Sani, Y. Yang, M. B. Lee, H. E. Dawes, E. F. Chang, and M. M. Shanechi, "Mood variations decoded from multi-site intracranial human brain activity," Nature Biotechnology, Sept. 2018.

[23] A. Vassileva, D. van Blooijs, F. Leijten, and G. Huiskamp, "Neocortical electrical stimulation for epilepsy: Closed-loop versus open-loop," Epilepsy Research, vol. 141, pp. 95–101, Mar. 2018.

[24] C. de Hemptinne, E. S. Ryapolova-Webb, E. L. Air, P. A. Garcia, K. J. Miller, J. G. Ojemann, J. L. Ostrem, N. B. Galifianakis, and P. A. Starr, "Exaggerated phase–amplitude coupling in the primary motor cortex in Parkinson disease," Proceedings of the National Academy of Sciences, vol. 110, pp. 4780–4785, Mar. 2013.

[25] S. L. Schmidt, J. J. Peters, D. A. Turner, and W. M. Grill, "Continuous deep brain stimulation of the subthalamic nucleus may not modulate beta bursts in patients with Parkinson's disease," Brain Stimulation, vol. 13, no. 2, pp. 433–443, 2020.

[26] A. Beuter, J.-P. Lefaucheur, and J. Modolo, "Closed-loop cortical neuromodulation in Parkinson's disease: An alternative to deep brain stimulation?," Clinical Neurophysiology, vol. 125, pp. 874–885, May 2014.

[27] S. Little and P. Brown, "The functional role of beta oscillations in Parkinson's disease," Parkinsonism & Related Disorders, vol. 20,
pp. S44–S48, 2014.

[28] A. L. Crowell, E. S. Ryapolova-Webb, J. L. Ostrem, N. B. Galifianakis, S. Shimamoto, D. A. Lim, and P. A. Starr, "Oscillations in sensorimotor cortex in movement disorders: an electrocorticography study," Brain, vol. 135, pp. 615–630, Feb. 2012.

[29] S. Miocinovic, N. C. Swann, C. de Hemptinne, A. Miller, J. L. Ostrem, and P. A. Starr, "Cortical gamma oscillations in isolated dystonia," Parkinsonism & Related Disorders, vol. 49, pp. 104–105, Apr. 2018.

[30] N. C. Rowland, C. de Hemptinne, N. C. Swann, S. Qasim, S. Miocinovic, J. Ostrem, R. T. Knight, and P. A. Starr, "Task-related activity in sensorimotor cortex in Parkinson's disease and essential tremor: changes in beta and gamma bands," Frontiers in Human Neuroscience, vol. 9, 2015.

[31] E. D. Kondylis, M. J. Randazzo, A. Alhourani, W. J. Lipski, T. A. Wozny, Y. Pandya, A. S. Ghuman, R. S. Turner, D. J. Crammond, and R. M. Richardson, "Movement-related dynamics of cortical oscillations in Parkinson's disease and essential tremor," Brain, vol. 139, pp. 2211–2223, Aug. 2016.

[32] G. Schalk, K. J. Miller, N. R. Anderson, J. A. Wilson, M. D. Smyth, J. G. Ojemann, D. W. Moran, J. R. Wolpaw, and E. C. Leuthardt, "Two-dimensional movement control using electrocorticographic signals in humans," Journal of Neural Engineering, vol. 5, no. 1, p. 75, 2008.

[33] W. Wang, J. L. Collinger, A. D. Degenhart, E. C. Tyler-Kabara, A. B. Schwartz, D. W. Moran, D. J. Weber, B. Wodlinger, R. K. Vinjamuri, R. C. Ashmore, J. W. Kelly, and M. L. Boninger, "An Electrocorticographic Brain Interface in an Individual with Tetraplegia," PLOS ONE, vol. 8, p. e55344, Feb. 2013.

[34] M. M. Shanechi, "Brain–Machine Interface Control Algorithms," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, pp. 1725–1734, Oct. 2017.

[35] K. B. Hoang, I. R. Cassar, W. M. Grill, and D. A. Turner, "Biomarkers and stimulation algorithms for adaptive brain stimulation," Frontiers in Neuroscience, vol. 11, p. 564, 2017.

[36] A. C. Meidahl, G. Tinkhauser, D. M. Herz, H. Cagnan, J. Debarros, and P. Brown, "Adaptive deep brain stimulation for movement disorders: the long road to clinical therapy," Movement Disorders, vol. 32, no. 6, pp. 810–819, 2017.

[37] J. P. Cunningham and B. M. Yu, "Dimensionality reduction for large-scale neural recordings," Nature Neuroscience, vol. 17, pp. 1500–1509, Nov. 2014.

[38] M. M. Churchland, J. P. Cunningham, M. T. Kaufman, J. D. Foster, P. Nuyujukian, S. I. Ryu, and K. V. Shenoy, "Neural population dynamics during reaching," Nature, vol. 487, pp. 51–56, July 2012.

[39] M. Aghagolzadeh and W. Truccolo, "Inference and Decoding of Motor Cortex Low-Dimensional Dynamics via Latent State-Space Models," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 24, pp. 272–282, Feb. 2016.

[40] B. Petreska, B. M. Yu, J. P. Cunningham, G. Santhanam, S. I. Ryu, K. V. Shenoy, and M. Sahani, "Dynamical segmentation of single trials from population neural data," in Advances in Neural Information Processing Systems 24 (J. Shawe-Taylor, R. S. Zemel, P. L. Bartlett, F. Pereira, and K. Q. Weinberger, eds.), pp. 756–764, Curran Associates, Inc., 2011.

[41] B. Yu, J. Cunningham, G. Santhanam, S. Ryu, K. Shenoy, and M. Sahani, "Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity," J Neurophysiol, vol. 102, no. 3, pp. 614–635, 2009.

[42] J. C. Kao, P. Nuyujukian, S. I. Ryu, M. M. Churchland, J. P. Cunningham, and K. V. Shenoy, "Single-trial dynamics of motor cortex and their applications to brain-machine interfaces," Nature Communications, vol. 6, p. 7759, 2015.

[43] C. Pandarinath, D. J. O'Shea, J. Collins, R. Jozefowicz, S. D. Stavisky, J. C. Kao, E. M. Trautmann, M. T. Kaufman, S. I. Ryu, L. R. Hochberg, J. M. Henderson, K. V. Shenoy, L. F. Abbott, and D. Sussillo, "Inferring single-trial neural population dynamics using sequential auto-encoders," Nature Methods, vol. 15, pp. 805–815, Oct. 2018.

[44] A. Afshar, G. Santhanam, B. Yu, S. Ryu, M. Sahani, and K. Shenoy, "Single-Trial Neural Correlates of Arm Movement Preparation," Neuron, vol. 71, pp. 555–564, Aug. 2011.

[45] J. H. Macke, L. Buesing, J. P. Cunningham, B. M. Yu, K. V. Shenoy, and M. Sahani, "Empirical models of spiking in neural populations," in Advances in Neural Information Processing Systems, pp. 1350–1358, 2011.

[46] Y. Yang, O. Sani, E. F. Chang, and M. M. Shanechi, "Dynamic network modeling and dimensionality reduction for human ECoG activity," Journal of Neural Engineering, p. 056014, 2019.

[47] Y. Yang, A. T. Connolly, and M. M. Shanechi, "A control-theoretic system identification framework and a real-time closed-loop clinical simulation testbed for electrical brain stimulation," Journal of Neural Engineering, vol. 15, no. 6, p. 066007, 2018.

[48] Y. Yang, S. Qiao, O. Sani, I. Sedillo, B. Ferrentino, B. Pesaran, and M. M. Shanechi, "Modeling large-scale brain network dynamics in response to electrical stimulation," in Computational and Systems Neuroscience (Cosyne) Abstract, Denver, CO, 2020.

[49] E. B. Geller, T. L. Skarpaas, R. E. Gross, R. R. Goodman, G. L. Barkley, C. W. Bazil, M. J. Berg, G. K. Bergey, S. S. Cash, A. J. Cole, et al., "Brain-responsive neurostimulation in patients with medically intractable mesial temporal lobe epilepsy," Epilepsia, vol. 58, no. 6, pp. 994–1004, 2017.

[50] A. Campbell and C. Wu, "Chronically implanted intracranial electrodes: tissue reaction and electrical changes," Micromachines, vol. 9, no. 9, p. 430, 2018.

[51] W. M. Grill, S. E. Norman, and R. V. Bellamkonda, "Implanted neural interfaces: biochallenges and engineered solutions," Annual Review of Biomedical Engineering, vol. 11, pp. 1–24, 2009.

[52] D. M. Taylor, S. I. H. Tillery, and A. B. Schwartz, "Direct Cortical Control of 3D Neuroprosthetic Devices," Science, vol. 296, pp. 1829–1832, June 2002.

[53] J. M. Carmena, M. A. Lebedev, R. E. Crist, J. E. O'Doherty, D. M. Santucci, D. F. Dimitrov, P. G. Patil, C. S. Henriquez, and M. A. L. Nicolelis, "Learning to Control a Brain–Machine Interface for Reaching and Grasping by Primates," PLOS Biology, vol. 1, p. e42, Oct. 2003.

[54] K. Ganguly and J. M. Carmena, "Emergence of a Stable Cortical Map for Neuroprosthetic Control," PLOS Biology, vol. 7, p. e1000153, July 2009.

[55] B. Jarosiewicz, S. M. Chase, G. W. Fraser, M. Velliste, R. E. Kass, and A. B. Schwartz, "Functional network reorganization during learning in a brain-computer interface paradigm," Proceedings of the National Academy of Sciences, vol. 105, pp. 19486–19491, Dec. 2008.

[56] J. D. Wander, T. Blakely, K. J. Miller, K. E. Weaver, L. A. Johnson, J. D. Olson, E. E. Fetz, R. P. Rao, and J. G. Ojemann, "Distributed cortical adaptation during learning of a brain–computer interface task," Proceedings of the National Academy of Sciences, vol. 110, no. 26, pp. 10818–10823, 2013.

[57] E. R. Oby, M. D. Golub, J. A. Hennig, A. D. Degenhart, E. C. Tyler-Kabara, B. M. Yu, S. M. Chase, and A. P. Batista, "New neural activity patterns emerge with long-term learning," Proceedings of the National Academy of Sciences, vol. 116, no. 30, pp. 15210–15215, 2019.

[58] L. M. Williams, "Defining biotypes for depression and anxiety based on large-scale circuit dysfunction: a theoretical review of the evidence and future directions for clinical translation," Depression and Anxiety, vol. 34, no. 1, pp. 9–24, 2017.

[59] P. V. Massey and Z. I. Bashir, "Long-term depression: multiple forms and implications for brain function," Trends in Neurosciences, vol. 30, pp. 176–184, Apr. 2007.

[60] R. A. Nicoll, "A Brief History of Long-Term Potentiation," Neuron, vol. 93, pp. 281–290, Jan. 2017.

[61] Y. Yang, E. F. Chang, and M. M. Shanechi, "Dynamic tracking of non-stationarity in human ECoG activity," in Engineering in Medicine and Biology Society (EMBC), 2017 39th Annual
International Conference of the IEEE, pp. 1660–1663, IEEE, 2017.
[62] P. Ahmadipour, Y. Yang, and M. M. Shanechi, "Investigating the effect of forgetting factor on tracking non-stationary neural dynamics," in 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), pp. 291–294, IEEE, 2019.
[63] M. S. Fifer, G. Hotson, B. A. Wester, D. P. McMullen, Y. Wang, M. S. Johannes, K. D. Katyal, J. B. Helder, M. P. Para, R. J. Vogelstein, W. S. Anderson, N. V. Thakor, and N. E. Crone, "Simultaneous Neural Control of Simple Reaching and Grasping With the Modular Prosthetic Limb Using Intracranial EEG," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 22, pp. 695–705, May 2014.
[64] T. Yanagisawa, M. Hirata, Y. Saitoh, H. Kishima, K. Matsushita, T. Goto, R. Fukuma, H. Yokoi, Y. Kamitani, and T. Yoshimine, "Electrocorticographic control of a prosthetic arm in paralyzed patients," Annals of Neurology, vol. 71, no. 3, pp. 353–361, 2012.
[65] V. R. Rao, K. K. Sellers, D. L. Wallace, M. B. Lee, M. Bijanzadeh, O. G. Sani, Y. Yang, M. M. Shanechi, H. E. Dawes, and E. F. Chang, "Direct Electrical Stimulation of Lateral Orbitofrontal Cortex Acutely Improves Mood in Individuals with Symptoms of Depression," Current Biology, vol. 28, pp. 3893–3902.e4, Dec. 2018.
[66] K. W. Scangos, H. S. Ahmad, A. Shafi, K. K. Sellers, H. E. Dawes, A. Krystal, and E. F. Chang, "Pilot study of an intracranial electroencephalography biomarker of depressive symptoms in epilepsy," The Journal of Neuropsychiatry and Clinical Neurosciences, vol. 32, no. 2, pp. 185–190, 2020.
[67] A. A. Fingelkurts and A. A. Fingelkurts, "Altered structure of dynamic electroencephalogram oscillatory pattern in major depression," Biological Psychiatry, vol. 77, no. 12, pp. 1050–1060, 2015.
[68] Y. Salimpour and W. S. Anderson, "Cross-frequency coupling based neuromodulation for treating neurological disorders," Frontiers in Neuroscience, vol. 13, p. 125, 2019.
[69] A. Hyafil, A.-L. Giraud, L. Fontolan, and B. Gutkin, "Neural cross-frequency coupling: connecting architectures, mechanisms, and functions," Trends in Neurosciences, vol. 38, no. 11, pp. 725–740, 2015.
[70] W. C. Drevets, "Neuroimaging and neuropathological studies of depression: implications for the cognitive-emotional features of mood disorders," Current Opinion in Neurobiology, vol. 11, no. 2, pp. 240–249, 2001.
[71] P. Van Overschee and B. De Moor, Subspace Identification for Linear Systems: Theory — Implementation — Applications. Springer Science & Business Media, Dec. 2012.
[72] L. Ljung, "System Identification," in Wiley Encyclopedia of Electrical and Electronics Engineering, John Wiley & Sons, 1999.
[73] K. Gadhoumi, J.-M. Lina, F. Mormann, and J. Gotman, "Seizure prediction for therapeutic devices: A review," Journal of Neuroscience Methods, vol. 260, pp. 270–282, 2016.
[74] Y. Benjamini and Y. Hochberg, "Controlling the false discovery rate: a practical and powerful approach to multiple testing," Journal of the Royal Statistical Society: Series B (Methodological), vol. 57, no. 1, pp. 289–300, 1995.
[75] M. G. Kendall, Rank Correlation Methods. London: Charles Griffin, 1948.
[76] J. E. Cavanaugh and A. A. Neath, "The Akaike information criterion: Background, derivation, properties, application, interpretation, and refinements," Wiley Interdisciplinary Reviews: Computational Statistics, vol. 11, no. 3, p. e1460, 2019.
[77] G. G. Calhoon and K. M. Tye, "Resolving the neural circuits of anxiety," Nature Neuroscience, vol. 18, pp. 1394–1404, Oct. 2015.
[78] I. Tracey and M. C. Bushnell, "How Neuroimaging Studies Have Challenged Us to Rethink: Is Chronic Pain a Disease?," The Journal of Pain, vol. 10, pp. 1113–1120, Nov. 2009.
[79] R. Z. Goldstein and N. D. Volkow, "Dysfunction of the prefrontal cortex in addiction: neuroimaging findings and clinical implications," Nature Reviews Neuroscience, vol. 12, pp. 652–669, Nov. 2011.
[80] L. M. Frank, U. T. Eden, V. Solo, M. A. Wilson, and E. N. Brown, "Contrasting Patterns of Receptive Field Plasticity in the Hippocampus and the Entorhinal Cortex: An Adaptive Filtering Approach," Journal of Neuroscience, vol. 22, pp. 3817–3830, May 2002.
[81] L. M. Frank, G. B. Stanley, and E. N. Brown, "Hippocampal Plasticity across Multiple Days of Exposure to Novel Environments," Journal of Neuroscience, vol. 24, pp. 7681–7689, Sept. 2004.
[82] S. Cheng and L. M. Frank, "New Experiences Enhance Coordinated Neural Activity in the Hippocampus," Neuron, vol. 57, pp. 303–313, Jan. 2008.
[83] D. C. Millard, Q. Wang, C. A. Gollnick, and G. B. Stanley, "System identification of the nonlinear dynamics in the thalamocortical circuit in response to patterned thalamic microstimulation in vivo," Journal of Neural Engineering, vol. 10, no. 6, p. 066011, 2013.
[84] L. Citi, E. N. Brown, and R. Barbieri, "A real-time automated point-process method for the detection and correction of erroneous and ectopic heartbeats," IEEE Transactions on Biomedical Engineering, vol. 59, no. 10, pp. 2828–2837, 2012.
[85] G. Valenza, L. Citi, R. G. Garcia, J. N. Taylor, N. Toschi, and R. Barbieri, "Complexity variability assessment of nonlinear time-varying cardiovascular control," Scientific Reports, vol. 7, no. 1, pp. 1–15, 2017.
[86] M. Bolus, A. Willats, C. Whitmire, C. Rozell, and G. Stanley, "Design strategies for dynamic closed-loop optogenetic neurocontrol in vivo," Journal of Neural Engineering, vol. 15, no. 2, p. 026011, 2018.
[87] Y. Yang and M. M. Shanechi, "An adaptive and generalizable closed-loop system for control of medically induced coma and other states of anesthesia," Journal of Neural Engineering, vol. 13, no. 6, p. 066019, 2016.
[88] A. Charles, M. S. Asif, J. Romberg, and C. Rozell, "Sparsity penalties in dynamical system estimation," in 2011 45th Annual Conference on Information Sciences and Systems, pp. 1–6, IEEE, 2011.
[89] N. Sadras, B. Pesaran, and M. M. Shanechi, "A point-process matched filter for event detection and decoding from population spike trains," Journal of Neural Engineering, vol. 16, p. 066016, Oct. 2019.
[90] G. Obinata and B. D. O. Anderson, Model Reduction for Control System Design. Springer Science & Business Media, Dec. 2012.
[91] Y. Yang, J. T. Lee, J. A. Guidera, K. Y. Vlasov, J. Pei, E. N. Brown, K. Solt, and M. M. Shanechi, "Developing a personalized closed-loop controller of medically-induced coma in a rodent model," Journal of Neural Engineering, vol. 16, p. 036022, Mar. 2019.
[92] H.-L. Hsieh, Y. T. Wong, B. Pesaran, and M. M. Shanechi, "Multiscale modeling and decoding algorithms for spike-field activity," Journal of Neural Engineering, vol. 16, p. 016018, Feb. 2019.
[93] C. Mehring, J. Rickert, E. Vaadia, S. C. de Oliveira, A. Aertsen, and S. Rotter, "Inference of hand movements from local field potentials in monkey motor cortex," Nature Neuroscience, vol. 6, no. 12, pp. 1253–1254, 2003.
[94] A. K. Bansal, W. Truccolo, C. E. Vargas-Irwin, and J. P. Donoghue, "Decoding 3D reach and grasp from hybrid signals in motor and premotor cortices: spikes, multiunit activity, and local field potentials," Journal of Neurophysiology, vol. 107, no. 5, pp. 1337–1355, 2012.
[95] S. D. Stavisky, J. C. Kao, P. Nuyujukian, S. I. Ryu, and K. V. Shenoy, "A high performing brain–machine interface driven by low-frequency local field potentials alone and together with spikes," Journal of Neural Engineering, vol. 12, no. 3, p. 036009, 2015.