A Globally and $R$-Linearly Convergent Hybrid HS and PRP Method and its Inexact Version with Applications
Abstract
We present a hybrid HS- and PRP-type conjugate gradient method for smooth optimization that converges globally and $R$-linearly for general functions. We also introduce an inexact version of the method for problems in which gradient or function values are unavailable or expensive to compute. Moreover, we apply the inexact method to a nonsmooth convex optimization problem by converting it into a once continuously differentiable problem via Moreau–Yosida regularization.
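To illustrate the flavor of such a method, here is a minimal sketch of a hybrid HS/PRP conjugate gradient iteration. It is not the authors' exact scheme: the hybridization rule $\beta_k = \max\{0, \min\{\beta_k^{HS}, \beta_k^{PRP}\}\}$, the toy quadratic objective, the Armijo backtracking parameters, and the restart safeguard are all illustrative assumptions.

```python
# Sketch of a hybrid HS/PRP conjugate gradient method (illustrative, not the
# paper's exact scheme). HS: beta = g1.y / (d.y); PRP: beta = g1.y / ||g0||^2;
# combined here as beta = max(0, min(beta_HS, beta_PRP)), one common hybrid.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def f(x):
    # toy strongly convex quadratic: f(x) = 0.5*(x0^2 + 10*x1^2) (assumption)
    return 0.5 * (x[0] ** 2 + 10.0 * x[1] ** 2)

def grad(x):
    return [x[0], 10.0 * x[1]]

def hybrid_cg(x, tol=1e-8, max_iter=200):
    g = grad(x)
    d = [-gi for gi in g]                       # initial steepest-descent step
    for _ in range(max_iter):
        if dot(g, g) < tol ** 2:                # stop when ||grad|| < tol
            break
        # backtracking Armijo line search (illustrative parameters)
        t, c = 1.0, 1e-4
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + c * t * dot(g, d):
            t *= 0.5
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        denom_hs = dot(d, y)
        beta_hs = dot(g_new, y) / denom_hs if denom_hs != 0 else 0.0
        beta_prp = dot(g_new, y) / dot(g, g)
        beta = max(0.0, min(beta_hs, beta_prp))  # hybrid HS/PRP choice
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0:                  # restart if not a descent direction
            d = [-gn for gn in g_new]
        x, g = x_new, g_new
    return x

x_star = hybrid_cg([3.0, -2.0])                 # converges to the minimizer (0, 0)
```

The nonnegative truncation of $\beta_k$ is what typically enables global convergence for general (nonconvex) functions; the inexact version discussed in the paper would replace `f` and `grad` with approximate oracles.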
Published
25.06.2015
How to Cite
Zhou W. “A Globally and $R$-Linearly Convergent Hybrid HS and PRP Method and Its Inexact Version With Applications”. Ukrains’kyi Matematychnyi Zhurnal, Vol. 67, no. 6, June 2015, pp. 752–762, https://umj.imath.kiev.ua/index.php/umj/article/view/2019.
Section
Research articles