
Learning Rates of Regularized Regression With Multiple Gaussian Kernels for Multi-Task Learning



This paper considers a least squares regularized regression algorithm for multi-task learning in a union of reproducing kernel Hilbert spaces (RKHSs) with Gaussian kernels. It is assumed that the optimal prediction functions of the target task and of the related tasks lie in RKHSs induced by Gaussian kernels with the same, but unknown, width. The samples of the related tasks are used to select the Gaussian kernel width, and the sample of the target task is used to obtain the prediction function in the RKHS with the selected width. Using an error decomposition result, a fast learning rate is derived for the target task. The key step is estimating the sample errors of the related tasks in the union of RKHSs with Gaussian kernels. The utility of the algorithm is illustrated on one simulated data set and four real data sets. The experimental results show that the algorithm yields significant improvements in prediction error when only a few samples are available for the target task but more are available for the related tasks.
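The procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a simple held-out-error criterion for choosing the shared Gaussian width from the related tasks, and uses standard kernel ridge regression (regularized least squares in the RKHS) for the target task. All function names and the candidate width grid are hypothetical.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Gram matrix of the Gaussian (RBF) kernel with width sigma
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def krr_fit(X, y, sigma, lam):
    # Regularized least squares in the RKHS: solve (K + lam*n*I) alpha = y
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(X_train, alpha, X_new, sigma):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

def select_width(related_tasks, widths, lam=1e-3):
    # Pick the width minimizing average held-out error over the related
    # tasks (a simple stand-in for the paper's selection step).
    best_sigma, best_err = None, np.inf
    for sigma in widths:
        errs = []
        for X, y in related_tasks:
            half = len(X) // 2
            alpha = krr_fit(X[:half], y[:half], sigma, lam)
            pred = krr_predict(X[:half], alpha, X[half:], sigma)
            errs.append(np.mean((pred - y[half:]) ** 2))
        if np.mean(errs) < best_err:
            best_sigma, best_err = sigma, np.mean(errs)
    return best_sigma

# Toy demo: related tasks share the target task's smoothness scale,
# so their (larger) samples can calibrate the kernel width.
rng = np.random.default_rng(0)
def make_task(n, f):
    X = rng.uniform(-1, 1, (n, 1))
    return X, f(X[:, 0]) + 0.05 * rng.normal(size=n)

related = [make_task(100, np.sin) for _ in range(3)]   # many related samples
X_t, y_t = make_task(20, np.sin)                       # few target samples
sigma = select_width(related, widths=[0.05, 0.2, 0.5, 1.0])
alpha = krr_fit(X_t, y_t, sigma, lam=1e-3)
X_test = np.linspace(-1, 1, 50)[:, None]
mse = np.mean((krr_predict(X_t, alpha, X_test, sigma) - np.sin(X_test[:, 0])) ** 2)
print("selected width:", sigma, "target-task test MSE:", round(mse, 4))
```

The two-stage structure mirrors the abstract: the width (the only kernel hyperparameter) is chosen from the related tasks' data alone, so the small target sample is spent entirely on fitting the prediction function.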

Keywords: multi-task learning; target task; Gaussian kernels; regularized regression

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2018


