Non-interior-point smoothing Newton methods (SNMs) for optimization have been widely studied for over three decades. The SNM is a popular approach for solving small- and medium-scale complementarity problems (CPs) and many other optimization problems. The main purpose of this paper is to revisit the SNM and show that its Hessian matrix becomes increasingly ill-conditioned as the smoothing parameter approaches zero, which is why its practical use for large-scale CPs remains limited by computational difficulties. To tackle this, we design a new smoothing method, called the accelerated preconditioned smoothing method (APSM), for the efficient solution of regularized support vector machines in machine learning. With the help of a suitable preconditioner, we correct the ill-conditioning of the smoothing Hessian matrix, so that the associated smoothing Hessian equation can be solved in a few iterations by iterative methods from numerical linear algebra. Two acceleration techniques are designed to reduce computation time. Finally, we present numerical experiments that support our theoretical guarantees and demonstrate the accelerated convergence achieved by APSM. The results show that APSM runs faster than state-of-the-art algorithms without reducing classification accuracy.
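The following is a minimal, self-contained sketch of the core idea the abstract describes: an ill-conditioned system (standing in for a smoothing Hessian equation with a tiny smoothing parameter) solved by the conjugate gradient method with and without a preconditioner. The toy matrix, the Jacobi (diagonal) preconditioner, and all parameter values are illustrative assumptions, not the paper's actual APSM construction.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, max_iter=500):
    """Preconditioned conjugate gradient for an SPD matrix A.

    M_inv is a callable applying the inverse preconditioner to a vector.
    Returns the approximate solution and the iteration count.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Toy stand-in for a smoothing Hessian: a diagonally dominant SPD matrix
# whose diagonal spans many orders of magnitude, plus mu * I with tiny mu,
# giving a very large condition number (illustrative, not the paper's model).
rng = np.random.default_rng(0)
n = 200
d = np.logspace(0, 6, n)
E = rng.standard_normal((n, n))
H = np.diag(d) + 0.001 * (E + E.T) / 2 + 1e-8 * np.eye(n)
b = rng.standard_normal(n)

# Plain CG (identity preconditioner) vs. Jacobi-preconditioned CG.
x_plain, it_plain = pcg(H, b, lambda r: r)
diag_H = np.diag(H)
x_jac, it_jac = pcg(H, b, lambda r: r / diag_H)
print(f"CG iterations: plain = {it_plain}, Jacobi-preconditioned = {it_jac}")
```

With the diagonal preconditioner, the effective condition number of the system drops to nearly 1, so the preconditioned solve converges in far fewer iterations than the plain one; this mirrors the abstract's claim that correcting the ill-conditioning lets iterative linear algebra finish the smoothing Hessian solve quickly.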