This article proposes an algorithm for solving multivariate regression and classification problems using piecewise linear predictors over a polyhedral partition of the feature space. The resulting algorithm, which we call piecewise affine regression and classification (PARC), alternates between two steps: first, solving ridge regression problems for numeric targets, softmax regression problems for categorical targets, and either softmax regression or cluster-centroid computation for piecewise-linear separation; and second, reassigning the training points to clusters on the basis of a criterion that balances prediction accuracy and piecewise-linear separability. We prove that PARC is a block-coordinate descent algorithm that minimizes a suitably constructed objective function and that it converges in a finite number of steps. The algorithm is used to learn hybrid numerical/categorical dynamical models from data that contain both real-valued and discrete labeled values. The resulting model has a piecewise linear structure that is particularly useful for formulating model predictive control problems and solving them by mixed-integer programming.
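As a rough illustration of the alternating scheme described in the abstract, the sketch below fits one ridge regressor per cluster for a numeric target and uses cluster centroids for the piecewise-linear separation step. The function name `parc_sketch`, the weight `alpha`, and the exact reassignment criterion are illustrative assumptions, not the paper's formulation; the original algorithm also handles categorical targets via softmax regression, which is omitted here.

```python
# Minimal sketch of the alternating (block-coordinate descent) scheme,
# restricted to numeric targets. Assumptions: ridge regression per cluster,
# cluster centroids for the separation step, and a hypothetical weight
# `alpha` trading off prediction error against separability.
import numpy as np
from sklearn.linear_model import Ridge


def parc_sketch(X, y, K=3, alpha=0.5, ridge_lambda=1.0, max_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, K, size=len(X))          # random initial partition
    for _ in range(max_iter):
        models, centroids = {}, {}
        # Step 1: fit one ridge regressor and one centroid per cluster.
        for k in range(K):
            idx = labels == k
            if idx.sum() == 0:
                continue
            models[k] = Ridge(alpha=ridge_lambda).fit(X[idx], y[idx])
            centroids[k] = X[idx].mean(axis=0)
        # Step 2: reassign each point to the cluster minimizing a weighted
        # sum of squared prediction error and distance to the cluster centroid.
        cost = np.full((len(X), K), np.inf)
        for k, model in models.items():
            pred_err = (model.predict(X) - y) ** 2
            sep_err = ((X - centroids[k]) ** 2).sum(axis=1)
            cost[:, k] = pred_err + alpha * sep_err
        new_labels = cost.argmin(axis=1)
        if np.array_equal(new_labels, labels):        # partition unchanged: stop
            break
        labels = new_labels
    return models, labels


# Usage on synthetic piecewise-linear data.
if __name__ == "__main__":
    X = np.random.default_rng(1).uniform(-1, 1, size=(200, 2))
    y = np.where(X[:, 0] > 0, 2 * X[:, 0] + X[:, 1], -X[:, 0] + 0.5 * X[:, 1])
    models, labels = parc_sketch(X, y, K=2)
    print("cluster sizes:", np.bincount(labels))
```

Because each step (fitting per-cluster predictors, then reassigning points to the lowest-cost cluster) can only decrease the combined objective, the loop terminates after finitely many iterations, mirroring the convergence argument stated in the abstract.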
               