
An efficient hybrid conjugate gradient method for unconstrained optimization



In this paper, we propose a hybrid conjugate gradient method for unconstrained optimization, obtained as a convex combination of the LS and KMD conjugate gradient parameters. A favourable property of the proposed method is that its search direction satisfies the Dai–Liao conjugacy condition as well as the quasi-Newton direction, and this property holds independently of the line search. Under a modified strong Wolfe line search, we establish the global convergence of the method. Numerical comparisons on a set of 109 unconstrained optimization test problems from the CUTEst library show that the proposed method outperforms the Liu–Storey and Hager–Zhang conjugate gradient methods.
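The following is a minimal, illustrative sketch of the general idea described in the abstract: a conjugate gradient iteration whose parameter is a convex combination of two formulas, with step sizes from a strong Wolfe line search. The Liu–Storey (LS) parameter is standard; the paper's KMD parameter and its specific rule for the mixing weight are not given in the abstract, so a Hestenes–Stiefel parameter and a fixed weight `theta` are used here purely as placeholders. This is not the authors' method.

```python
# Sketch of a hybrid conjugate gradient iteration (illustrative only).
# Assumptions: the second CG parameter (Hestenes-Stiefel) and the fixed
# mixing weight `theta` stand in for the paper's KMD parameter and its
# convex-combination rule, which are not reproduced here.
import numpy as np
from scipy.optimize import line_search  # satisfies strong Wolfe conditions


def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Strong Wolfe line search along d.
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                    # line search failed; small step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Liu-Storey parameter: beta_LS = g_{k+1}^T y_k / (-d_k^T g_k).
        beta_ls = (g_new @ y) / (-(d @ g))
        # Placeholder second parameter (Hestenes-Stiefel), NOT the KMD one.
        denom = d @ y
        beta_hs = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        # Convex combination of the two CG parameters.
        beta = theta * beta_ls + (1.0 - theta) * beta_hs
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Usage example: minimize the Rosenbrock function.
    from scipy.optimize import rosen, rosen_der
    print(hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```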

Keywords: unconstrained optimization; conjugate gradient method

Journal Title: Optimization Methods and Software
Year Published: 2022



