Recent advances in kernel regression assume that target signals lie on a feature graph, so that their values can be predicted with the assistance of a graph learned from training data. In this article, we propose a novel kernel regression framework whose outputs follow a matrix-variate Gaussian distribution (MVGD), so that the kernel matrix can be viewed as the column covariance matrix of the outputs and the hyperparameters of a chosen kernel can be optimized with gradient methods. Furthermore, in contrast to state-of-the-art kernel regression over graphs (KRG) algorithms, our algorithms introduce a sample graph of the target outputs that works jointly with the regression coefficients and the hyperparameters of the chosen kernel. The proposed KRG framework is decomposed into two stages: (i) estimating the row and column covariance matrices of the MVGD, and (ii) learning the graph while estimating the regression coefficients. Numerical approaches are developed to solve the corresponding optimization problems. Experimental results on synthetic and real-world datasets demonstrate that the proposed algorithms outperform state-of-the-art methods.
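The abstract does not give the paper's equations, but the general idea of kernel regression with matrix-variate Gaussian outputs can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' algorithm: the RBF kernel, the chain-graph Laplacian used to build one of the covariance factors, and all variable names are hypothetical choices made for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """RBF (Gaussian) kernel; gamma is a kernel hyperparameter
    that could be tuned by gradient methods, as the abstract suggests."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy data: n training samples with d features and m output channels.
rng = np.random.default_rng(0)
n, d, m = 20, 3, 4
X = rng.normal(size=(n, d))
Y = np.sin(X @ rng.normal(size=(d, m)))

# Hypothetical covariance factor built from a graph over the m output
# channels: here a simple chain graph, with U = (L + eps*I)^{-1}.
A = np.diag(np.ones(m - 1), 1)
A = A + A.T                       # adjacency of a chain graph
L = np.diag(A.sum(1)) - A         # combinatorial graph Laplacian
U = np.linalg.inv(L + 0.1 * np.eye(m))

# Kernel matrix over training inputs (the other covariance factor of
# the matrix-variate Gaussian), with a small noise jitter.
K = rbf_kernel(X, X) + 1e-3 * np.eye(n)

# Conditional mean for new inputs. Under a matrix-variate Gaussian
# prior, the posterior mean reduces to the usual kernel ridge form and
# does not involve U; U enters only the predictive covariance below.
Xs = rng.normal(size=(5, d))
Ks = rbf_kernel(Xs, X)
Y_pred = Ks @ np.linalg.solve(K, Y)          # shape (5, m)

# Predictive covariance of the vectorized prediction: Kronecker product
# of the input-side Schur complement and the graph-side factor U.
Ss = rbf_kernel(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
pred_cov = np.kron(Ss, U)                    # shape (5*m, 5*m)
print(Y_pred.shape, pred_cov.shape)
```

The Kronecker structure is what makes the matrix-variate Gaussian attractive here: the two covariance factors can be estimated separately, which mirrors the two-stage decomposition described in the abstract.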