Analogues of the relative strong convexity condition for relatively smooth problems and adaptive gradient-type methods

This paper is devoted to several approaches to improving the convergence rate guarantees of gradient-type algorithms for relatively smooth and relatively Lipschitz-continuous problems when additional information is available in the form of analogues of strong convexity of the objective function. We consider two classes of problems: convex problems satisfying a relative functional growth condition, and (generally non-convex) problems satisfying an analogue of the Polyak–Łojasiewicz gradient dominance condition with respect to the Bregman divergence. For the first class of problems we propose two restart schemes for gradient-type methods and justify theoretical convergence estimates for two algorithms with adaptively chosen parameters corresponding to the relative smoothness or relative Lipschitz continuity of the objective function. The first of these algorithms has a simpler stopping criterion for its iterations, but its near-optimal computational guarantees are justified only on the class of relatively Lipschitz-continuous problems. The restart procedure of the second algorithm, in turn, yields more universal theoretical results: we prove a near-optimal complexity estimate on the class of convex relatively Lipschitz-continuous problems with a functional growth condition, as well as linear convergence rate guarantees on the class of relatively smooth problems with a functional growth condition. For the class of problems satisfying an analogue of the gradient dominance condition with respect to the Bregman divergence, we obtain estimates of the quality of the output solution using adaptively selected parameters. The paper concludes with computational experiments illustrating the performance of the methods for the second approach. As examples, we consider a linear inverse Poisson problem (minimizing the Kullback–Leibler divergence), its regularized version, which guarantees relative strong convexity of the objective function, and an example of a relatively smooth and relatively strongly convex problem. In particular, the calculations show that a relatively strongly convex function may fail to satisfy the relative variant of the gradient dominance condition.
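
For reference, the basic notions behind these conditions admit the following standard formulations from the relative-smoothness literature (the exact definitions used in the paper may differ in normalization; the functional growth form below is our reading of the condition). Given a prox-function $d$, the Bregman divergence is

$$ V(y, x) = d(y) - d(x) - \langle \nabla d(x),\, y - x \rangle; $$

a function $f$ is called $L$-smooth relative to $d$ if

$$ f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + L\, V(y, x) \quad \text{for all } x, y; $$

and the relative functional growth condition with parameter $\mu > 0$ requires, for a minimizer $x_*$,

$$ f(x) - f(x_*) \ge \mu\, V(x_*, x) \quad \text{for all feasible } x. $$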
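
To make the Poisson example concrete, below is a minimal Python sketch of a Bregman gradient step with an adaptively backtracked relative-smoothness constant, applied to minimizing the Kullback–Leibler objective with the standard Burg-entropy kernel (for which this objective is known to be relatively smooth with constant at most the sum of the entries of b). This is only an illustration of the general technique under these assumptions, not the authors' restarted algorithms; all names and the synthetic data are ours.

import numpy as np

def kl_objective(x, A, b):
    # f(x) = sum_i [(Ax)_i - b_i log (Ax)_i], i.e. KL(b, Ax) up to an additive constant
    Ax = A @ x
    return np.sum(Ax - b * np.log(Ax))

def kl_grad(x, A, b):
    Ax = A @ x
    return A.T @ (1.0 - b / Ax)

def burg_bregman(y, x):
    # Bregman divergence V(y, x) of the Burg entropy d(x) = -sum_i log x_i
    r = y / x
    return np.sum(r - np.log(r) - 1.0)

def adaptive_bregman_gradient(A, b, x0, n_iters=200, L0=1.0):
    # Each step halves the current estimate of L, then doubles it until the
    # relative-smoothness inequality
    #     f(x+) <= f(x) + <grad f(x), x+ - x> + L * V(x+, x)
    # holds: a standard backtracking scheme, used here as a stand-in for
    # the adaptive parameter selection described in the abstract.
    x, L = x0.copy(), L0
    for _ in range(n_iters):
        f_x, g = kl_objective(x, A, b), kl_grad(x, A, b)
        L = max(L / 2.0, 1e-10)
        while True:
            denom = 1.0 + x * g / L  # Burg-kernel Bregman step: 1/x+ = 1/x + g/L
            if np.all(denom > 0):
                x_new = x / denom
                if kl_objective(x_new, A, b) <= (f_x + g @ (x_new - x)
                                                 + L * burg_bregman(x_new, x)):
                    break
            L *= 2.0
        x = x_new
    return x

# Small synthetic instance with A, b > 0 so that KL(b, Ax) is well defined.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(30, 10))
x_true = rng.uniform(0.5, 2.0, size=10)
b = A @ x_true
x_hat = adaptive_bregman_gradient(A, b, x0=np.ones(10))
print(kl_objective(x_hat, A, b) - kl_objective(x_true, A, b))  # should approach 0

For the Burg kernel the Bregman step has the closed form 1/x⁺ = 1/x + ∇f(x)/L, so the iterates stay positive whenever the denominator does and no inner subproblem solver is needed.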

Keywords: relative strong convexity, relative smoothness, relative functional growth, adaptive method, restarts
Citation in English: Stonyakin F.S., Savchuk O.S., Baran I.V., Alkousa M.S., Titov A.A. Analogues of the relative strong convexity condition for relatively smooth problems and adaptive gradient-type methods // Computer Research and Modeling, 2023, vol. 15, no. 2, pp. 413-432
DOI: 10.20537/2076-7633-2023-15-2-413-432
