A Globally and $R$-Linearly Convergent Hybrid HS and PRP Method and its Inexact Version with Applications
We present a hybrid HS- and PRP-type conjugate gradient method for smooth optimization that converges globally and $R$-linearly for general functions. We also introduce an inexact version of the method for problems in which the function values or gradients are unknown or difficult to compute. Moreover, we apply the inexact method to a nonsmooth convex optimization problem by converting it into a once continuously differentiable function via Moreau–Yosida regularization.
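For context, the standard HS and PRP conjugate gradient parameters and the Moreau–Yosida regularization mentioned above are defined as follows (these are the classical textbook definitions; the paper's specific hybridization rule is not reproduced here):

```latex
% Conjugate gradient update: d_{k+1} = -g_{k+1} + \beta_k d_k,
% with y_k = g_{k+1} - g_k. Classical choices of \beta_k:
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\qquad
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2}.

% Moreau--Yosida regularization of a convex function f with parameter \lambda > 0:
F_{\lambda}(x) = \min_{y} \left\{ f(y) + \frac{1}{2\lambda}\,\|y - x\|^2 \right\},
% F_{\lambda} is convex and once continuously differentiable even when f is nonsmooth.
```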
English version (Springer): Ukrainian Mathematical Journal 67 (2015), no. 6, pp. 853–865.
Citation Example: Zhou Weijun. A Globally and $R$-Linearly Convergent Hybrid HS and PRP Method and its Inexact Version with Applications // Ukr. Mat. Zh. - 2015. - 67, № 6. - pp. 752–762.