We suggest a conjugate subgradient type method without any line search for the minimization of convex non-differentiable functions. Unlike the customary methods of this class, it does not require a monotone decrease of the goal function and substantially reduces the implementation cost of each iteration. At the same time, its step-size procedure takes into account the behavior of the method along the iteration points. Preliminary results of computational experiments confirm the efficiency of the proposed modification.
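To illustrate the general idea of a conjugate subgradient iteration that avoids line search, the sketch below combines the current subgradient with the previous direction and uses a simple adaptive step-size reduction instead of a monotone line search. This is a hypothetical, generic illustration under assumed rules (Polyak-Ribiere-type combination coefficient, step reduction when no progress is recorded); it is not the exact procedure proposed in the paper.

```python
import numpy as np

def conjugate_subgradient(f, subgrad, x0, step0=1.0, gamma=0.8, beta=0.5, max_iter=200):
    """Generic conjugate subgradient sketch without line search.

    f       : convex (possibly non-differentiable) objective
    subgrad : returns an arbitrary subgradient of f at a point
    The step-size and combination rules here are illustrative assumptions,
    not the step-size procedure of the cited method.
    """
    x = x0.copy()
    best_x, best_f = x.copy(), f(x)
    g = subgrad(x)
    d = -g                          # initial direction: negative subgradient
    step = step0
    for _ in range(max_iter):
        # Move along the normalized direction with the current step; no line search.
        x = x + step * d / max(np.linalg.norm(d), 1e-12)
        fx, g_new = f(x), subgrad(x)
        if fx < best_f:
            # Keep the best point found; monotone decrease is not enforced.
            best_x, best_f = x.copy(), fx
        else:
            # Simple adaptive reduction when no improvement is observed (assumed rule).
            step *= beta
        # Polyak-Ribiere-type coefficient for combining directions (assumed rule).
        tau = max(0.0, g_new @ (g_new - g)) / max(g @ g, 1e-12)
        d = -g_new + gamma * tau * d
        g = g_new
    return best_x, best_f

# Usage example: minimize the non-smooth convex function f(x) = ||x - 1||_1.
f = lambda x: np.abs(x - 1.0).sum()
subgrad = lambda x: np.sign(x - 1.0)   # a valid subgradient of the l1 term
x_star, f_star = conjugate_subgradient(f, subgrad, np.zeros(5))
```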
               