This paper employs a modified secant equation within the framework of a hybrid conjugate gradient (CG) method based on Andrei's approach for solving large-scale unconstrained optimization problems. The CG parameter of this hybrid method is a convex combination of the CG parameters of the Hestenes–Stiefel and Dai–Yuan algorithms, and the main feature of such hybrid methods is that the search direction coincides with the Newton direction. The modified secant equation is derived from a fifth-order tensor model to improve the curvature information of the objective function. To achieve convergence for general functions, a revised version of the method is also proposed, based on a linear combination of this secant equation and Li and Fukushima's modified secant equation. Under suitable conditions, the global convergence of the new hybrid CG algorithm is established, even without a convexity assumption on the objective function. Numerical experiments on a set of test problems from the CUTEr collection demonstrate the practical effectiveness of the proposed hybrid conjugate gradient algorithm.
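
For illustration only, the minimal Python sketch below shows how a convex combination of the Hestenes–Stiefel and Dai–Yuan parameters produces a hybrid CG search direction. The function name hybrid_cg_direction, its signature, and the way the combination weight theta is supplied are assumptions for this sketch; the paper's actual rule for choosing theta (obtained from the modified secant equation so that the search direction matches the Newton direction) is not reproduced here.

import numpy as np

def hybrid_cg_direction(g_new, g_old, d_old, theta):
    """One hybrid CG direction update (illustrative sketch, not the paper's exact rule).

    beta is the convex combination (1 - theta) * beta_HS + theta * beta_DY;
    the choice of theta via the paper's modified secant equation is not shown.
    """
    y = g_new - g_old                      # gradient difference y_k = g_{k+1} - g_k
    denom = d_old @ y                      # shared denominator d_k^T y_k
    if abs(denom) < 1e-12:                 # safeguard: restart with steepest descent
        return -g_new
    beta_hs = (g_new @ y) / denom          # Hestenes–Stiefel parameter
    beta_dy = (g_new @ g_new) / denom      # Dai–Yuan parameter
    theta = min(max(theta, 0.0), 1.0)      # keep the combination convex
    beta = (1.0 - theta) * beta_hs + theta * beta_dy
    return -g_new + beta * d_old           # d_{k+1} = -g_{k+1} + beta_k d_k

At theta = 0 this reduces to the Hestenes–Stiefel direction and at theta = 1 to the Dai–Yuan direction; intermediate values blend the two, which is the structural idea behind the hybrid scheme described in the abstract.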