In a Hilbert framework, we discuss a continuous Newton-like model that is well suited to numerical purposes for solving convex minimization and, more generally, monotone inclusion problems. Algorithms for these problems were recently inspired by implicit temporal discretizations of the (stabilized) continuous version of Nesterov's accelerated gradient method with an additional Hessian damping term (introduced to attenuate oscillation effects). Unfortunately, owing to the presence of the Hessian term, these discrete variants require several gradient or proximal evaluations per iteration. An alternative methodology can be realized by means of a first-order model that no longer involves the Hessian term and that extends to the case of an arbitrary maximally monotone operator. Our first-order model originates from the reformulation of a variant closely related to the Nesterov-like equation. Its dynamics are studied simultaneously with regard to convex minimization and monotone inclusion problems by considering it governed by the sum of the gradient of a convex differentiable function and (up to a multiplicative constant) the Yosida approximation of a maximally monotone operator, with an appropriate adjustment of the regularization parameter. It turns out that our model offers a new framework for discrete variants while keeping the main asymptotic features of the (stabilized) Nesterov-like equation. Two new algorithms are then suggested for the optimization problems under consideration.
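For orientation, a minimal sketch of the kind of reformulation described above, written for the classical constant-damping Hessian-driven dynamic rather than the exact stabilized Nesterov-like system of the paper (whose damping coefficients and parameters may differ): the second-order equation

\[ \ddot{x}(t) + \alpha\,\dot{x}(t) + \beta\,\nabla^2 f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0 \]

is equivalent, by a standard change of variables, to the first-order system

\[ \dot{x}(t) + \beta\,\nabla f(x(t)) + \Big(\alpha - \tfrac{1}{\beta}\Big) x(t) + \tfrac{1}{\beta}\, y(t) = 0, \qquad \dot{y}(t) + \Big(\alpha - \tfrac{1}{\beta}\Big) x(t) + \tfrac{1}{\beta}\, y(t) = 0, \]

in which the Hessian no longer appears. Because only \( \nabla f \) enters this formulation, the gradient can be replaced by the Yosida approximation of a maximally monotone operator \( A \) with regularization parameter \( \lambda > 0 \),

\[ A_\lambda = \tfrac{1}{\lambda}\big( I - (I + \lambda A)^{-1} \big), \]

which is the single-valued, Lipschitz-continuous surrogate that makes the extension from convex minimization to general monotone inclusions possible.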