
Learning Non-Parametric Models with Guarantees: A Smooth Lipschitz Regression Approach



Abstract: We propose a non-parametric regression methodology that constrains the regressor to be fully consistent with both the sample set and the ground-truth regularity assumptions. Unlike the Nonlinear Set Membership technique, this constraint guarantees everywhere-differentiable surrogate models, which are better suited to optimization-based controllers that rely heavily on gradient computations. The presented approach, named Smooth Lipschitz Regression (SLR), provides bounds on the prediction error at unseen points in the space. A numerical example shows the effectiveness of this method compared to alternative approaches in a Model Predictive Control setting.
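The SLR construction itself is not detailed in this abstract, but the Lipschitz-consistency idea it builds on can be sketched. For a function with known Lipschitz constant L, the classic Nonlinear Set Membership bounds state that any function consistent with the samples must lie between an upper envelope min_i (y_i + L·||x − x_i||) and a lower envelope max_i (y_i − L·||x − x_i||). A minimal illustrative sketch (function name and the sine example are assumptions for illustration, not the paper's code):

```python
import numpy as np

def lipschitz_bounds(x_query, X, y, L):
    """Nonlinear Set Membership bounds for an L-Lipschitz function.

    Any L-Lipschitz function consistent with the samples (X, y)
    satisfies lower <= f(x_query) <= upper.
    """
    d = np.linalg.norm(X - x_query, axis=1)  # distances to sample points
    upper = np.min(y + L * d)                # tightest upper envelope
    lower = np.max(y - L * d)                # tightest lower envelope
    return lower, upper

# Samples of f(x) = sin(x), which is 1-Lipschitz.
X = np.linspace(0.0, 3.0, 7).reshape(-1, 1)
y = np.sin(X).ravel()

lo, hi = lipschitz_bounds(np.array([1.3]), X, y, L=1.0)
assert lo <= np.sin(1.3) <= hi  # the true value is always enclosed
```

The midpoint (lo + hi) / 2 is the Nonlinear Set Membership central estimate; because it is built from pointwise min/max envelopes, it is generally not everywhere differentiable, which is the limitation the abstract says SLR addresses.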

Keywords: regression; non-parametric learning; smooth Lipschitz regression

Journal Title: IFAC-PapersOnLine
Year Published: 2020



