
A subspace conjugate gradient algorithm for large-scale unconstrained optimization



In this paper, a subspace three-term conjugate gradient method is proposed. The search directions of the method are generated by minimizing a quadratic approximation of the objective function on a subspace, and they satisfy both the descent condition and the Dai-Liao conjugacy condition. At each iteration, the subspace is spanned by the current negative gradient and the latest two search directions, so its dimension is 2 or 3. Under some appropriate assumptions, the global convergence of the proposed method is established. Numerical experiments on a set of 80 unconstrained optimization test problems show that the proposed method is competitive.
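The abstract only outlines the idea, so the sketch below is an illustration of the general subspace-minimization scheme it describes, not the authors' method: at each iteration a quadratic model of the objective is minimized over the subspace spanned by the current negative gradient and the two most recent search directions. The scalar Hessian approximation (a Barzilai-Borwein-type scaling gamma * I), the backtracking Armijo line search, and all function and parameter names are assumptions standing in for the paper's unspecified formulas.

```python
import numpy as np

def subspace_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Illustrative subspace-minimization CG iteration (not the paper's exact method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    dirs = []                      # the two most recent search directions
    s = y = None
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Subspace basis: current negative gradient plus the latest two
        # search directions (dimension 1, 2, or 3).
        V = np.column_stack([-g] + dirs[-2:])
        # Quadratic model m(z) = g^T V z + 0.5 z^T (gamma V^T V) z with the
        # assumed Hessian approximation B = gamma * I (BB-type scaling).
        gamma = 1.0
        if s is not None and np.dot(s, y) > 1e-12:
            gamma = np.dot(y, y) / np.dot(s, y)
        Q = gamma * (V.T @ V)
        Vg = V.T @ g
        z = np.linalg.lstsq(Q, -Vg, rcond=None)[0]
        d = V @ z
        if np.dot(g, d) >= 0:      # safeguard: fall back to steepest descent
            d = -g
        # Backtracking Armijo line search (illustrative choice).
        alpha, fx, gTd = 1.0, f(x), np.dot(g, d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * gTd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dirs.append(d)             # remember the latest search direction
        x, g = x_new, g_new
    return x

# Example: minimize a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b (here x* = [0.2, 0.4]).
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda v: 0.5 * v @ A @ v - b @ v
    grad = lambda v: A @ v - b
    print(subspace_cg(f, grad, np.zeros(2)))
```

The lstsq solve and the steepest-descent fallback are safeguards for the case where the basis vectors become nearly linearly dependent; the paper's own construction guarantees descent and the Dai-Liao conjugacy condition by design rather than by a fallback.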

Keywords: gradient; subspace; unconstrained optimization; conjugate gradient

Journal Title: Numerical Algorithms
Year Published: 2017


