System identification learns mathematical models of dynamic systems from input-output data. Despite its long history, this research area is still extremely active. New challenges are posed by the identification of complex physical processes formed by interconnections of dynamic systems. Examples arise in biology and industry, e.g., in the study of brain dynamics or sensor networks. In recent years, regularized kernel-based identification, inspired by machine learning, has emerged as an interesting alternative to the classical approach commonly adopted in the literature. In the linear setting, it uses the class of stable kernels to encode fundamental features of physical dynamical systems, e.g., smooth exponential decay of impulse responses. This class also contains unknown continuous parameters, called hyperparameters, which play a role similar to that of the discrete model order in controlling complexity. In this paper, we develop a linear system identification procedure by casting stable kernels in a full Bayesian framework. Our models incorporate hyperparameter uncertainty and consist of a mixture of dynamic systems over a continuous spectrum of dimensions. They are obtained by overcoming drawbacks of classical Markov chain Monte Carlo schemes which, when applied to stable kernels, are shown to become nearly reducible (i.e., unable to reconstruct the posteriors of interest in reasonable time). Numerical experiments show that the full Bayes approach frequently outperforms state-of-the-art results on typical benchmark problems. Two real applications related to brain dynamics (neural activity) and sensor networks are also included.
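To make the ingredients mentioned in the abstract concrete, the sketch below illustrates regularized kernel-based impulse-response estimation with a TC ("Tuned/Correlated") stable kernel, together with a crude full-Bayes flavour obtained by averaging estimates over a grid of hyperparameters weighted by their marginal likelihood. This is not the paper's MCMC-based scheme; the kernel choice, hyperparameter grids, noise level, and all variable names are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's algorithm): TC stable-kernel
# regularization plus grid-based marginalization over hyperparameters.
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)

# --- simulate input/output data from an unknown stable FIR system ---
n, m = 200, 50                       # data length, impulse-response length
u = rng.standard_normal(n)           # white-noise input (assumed)
g_true = 0.8 ** np.arange(m) * np.sin(0.5 * np.arange(m))
Phi = toeplitz(u, np.r_[u[0], np.zeros(m - 1)])   # convolution (regression) matrix
sigma2 = 0.1                         # noise variance (assumed known here)
y = Phi @ g_true + np.sqrt(sigma2) * rng.standard_normal(n)

def tc_kernel(m, c, alpha):
    """TC stable kernel K[i, j] = c * alpha**max(i, j): encodes smooth,
    exponentially decaying impulse responses."""
    idx = np.arange(m)
    return c * alpha ** np.maximum.outer(idx, idx)

def posterior_mean_and_evidence(K):
    """Posterior mean of g and log marginal likelihood of y for a fixed kernel."""
    S = Phi @ K @ Phi.T + sigma2 * np.eye(n)       # output covariance
    L = np.linalg.cholesky(S)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    g_hat = K @ Phi.T @ a                          # regularized estimate
    logev = -0.5 * (y @ a) - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)
    return g_hat, logev

# --- full-Bayes flavour: average over hyperparameters instead of fixing them ---
cs = np.logspace(-2, 2, 15)           # scale hyperparameter grid (assumed)
alphas = np.linspace(0.5, 0.99, 15)   # decay-rate hyperparameter grid (assumed)
estimates, logevs = [], []
for c in cs:
    for alpha in alphas:
        g_hat, logev = posterior_mean_and_evidence(tc_kernel(m, c, alpha))
        estimates.append(g_hat)
        logevs.append(logev)

w = np.exp(np.array(logevs) - max(logevs))
w /= w.sum()                          # posterior weights under a flat prior on the grid
g_bayes = (w[:, None] * np.array(estimates)).sum(axis=0)
fit = 100 * (1 - np.linalg.norm(g_bayes - g_true) / np.linalg.norm(g_true - g_true.mean()))
print(f"impulse-response fit: {fit:.1f}%")
```

Here each grid point corresponds to a different effective model complexity, so the weighted average is a (discretized) mixture of dynamic systems; the paper instead handles this marginalization over a continuum of hyperparameter values within a full Bayesian sampling framework.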
               