Vector autoregressive models provide a simple generative model for multivariate time-series data. The autoregressive coefficients of the vector autoregressive model describe a network process. However, in real-world applications such as macroeconomics or neuroimaging, time-series data arise not from an isolated network process but from the simultaneous occurrence of multiple network processes. Standard vector autoregressive models cannot provide insights into the underlying structure of such time-series data. In this work, we present the autoregressive linear mixture (ALM) model, which decomposes time-series data into co-occurring network processes that we call autoregressive components. We also present a non-convex likelihood-based estimator for fitting the ALM model and show that it can be solved using the proximal alternating linearized minimization (PALM) algorithm. We validate the ALM on both synthetic and real-world electroencephalography data, showing that we can disambiguate task-relevant autoregressive components that correspond to distinct network processes.
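To make the modeling idea concrete, the following is a minimal sketch of the generative structure the abstract describes: each lag matrix of a VAR(p) process is expressed as a linear mixture of a small number of shared "autoregressive components." All dimensions, variable names, and scale constants here are illustrative assumptions, not the paper's actual parameterization or estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

d, p, J, T = 4, 2, 3, 500  # signal dim, lag order, #components, series length (illustrative)

# Hypothetical shared "autoregressive components": J small d x d matrices.
components = [0.2 * rng.standard_normal((d, d)) for _ in range(J)]

# Nonnegative mixing weights per lag; each lag matrix mixes the same components.
weights = rng.random((p, J))
weights /= weights.sum(axis=1, keepdims=True)

# Lag matrices A_l = sum_j weights[l, j] * components[j]  (the ALM-style mixture)
A = [sum(weights[l, j] * components[j] for j in range(J)) for l in range(p)]

# Simulate the VAR(p) process: x_t = sum_l A_l x_{t-1-l} + noise
x = np.zeros((T, d))
x[:p] = 0.1 * rng.standard_normal((p, d))
for t in range(p, T):
    x[t] = sum(A[l] @ x[t - 1 - l] for l in range(p)) + 0.05 * rng.standard_normal(d)

print(x.shape)
```

Recovering the components and weights from observed data is the hard part; the paper's non-convex likelihood-based estimator (solved with PALM) alternates updates over the component matrices and the mixing coefficients, which this simulation-only sketch does not attempt.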