Adaptive first-order methods for relatively strongly convex optimization problems


The article is devoted to first-order adaptive methods for optimization problems with relatively strongly convex functionals. The concept of relative strong convexity significantly extends the classical notion of strong convexity by replacing the Euclidean norm in the definition with a more general notion of distance (more precisely, with the Bregman divergence). An important feature of the considered classes of problems is the reduced smoothness requirements on the objective functionals. More precisely, we consider relatively smooth and relatively Lipschitz-continuous objective functionals, which allows the proposed techniques to be applied to many applied problems, such as the Intersection of Ellipsoids Problem (IEP), the Support Vector Machine (SVM) for binary classification, etc. If the objective functional is merely convex, the relative strong convexity condition can be ensured by regularizing the problem. In this work, we propose, for the first time, adaptive gradient-type methods for optimization problems with relatively strongly convex and relatively Lipschitz-continuous functionals. Further, we propose universal methods for relatively strongly convex optimization problems. This technique is based on introducing an artificial inaccuracy into the optimization model, so the proposed methods can be applied both to relatively smooth and to relatively Lipschitz-continuous functionals. Additionally, we demonstrate that the proposed universal gradient-type methods are optimal up to a constant factor for both classes of relatively strongly convex problems. We also show how the restart technique for the mirror descent algorithm can be applied to solve relatively Lipschitz-continuous optimization problems, and we prove an optimal convergence rate estimate for this technique. Finally, we present the results of numerical experiments comparing the performance of the proposed methods.
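To illustrate the mirror-descent machinery the abstract builds on, the following is a minimal sketch of classical (non-adaptive) entropic mirror descent on the probability simplex, where the Bregman divergence is the Kullback–Leibler divergence and the prox step reduces to a multiplicative-weights update. This is a standard textbook instance, not the adaptive or restarted variants proposed in the article; the function names, the fixed step size, and the linear test objective are illustrative assumptions.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step, n_iters):
    """Entropic mirror descent on the probability simplex.

    With the entropy prox function, the Bregman divergence is the KL
    divergence, and each prox step is a multiplicative-weights update
    followed by renormalization onto the simplex.
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for _ in range(n_iters):
        g = grad(x)
        x = x * np.exp(-step * g)   # multiplicative prox step
        x /= x.sum()                # renormalize onto the simplex
        avg += x
    return avg / n_iters            # averaged iterate

# Illustrative example: minimize the linear function f(x) = <c, x> over
# the simplex; the minimizer puts all mass on the smallest entry of c.
c = np.array([3.0, 1.0, 2.0])
x = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, step=0.5, n_iters=200)
```

The restart technique mentioned in the abstract runs such a method in stages, restarting it from the averaged iterate of the previous stage; under relative strong convexity this converts the sublinear rate of plain mirror descent into the optimal rate for the class.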

Keywords: adaptive method, relatively strongly convex functional, relatively smooth functional, relatively Lipschitz-continuous functional, optimal method, mirror descent
Citation in English: Savchuk O.S., Titov A.A., Stonyakin F.S., Alkousa M.S. Adaptive first-order methods for relatively strongly convex optimization problems // Computer Research and Modeling, 2022, vol. 14, no. 2, pp. 445-472
DOI: 10.20537/2076-7633-2022-14-2-445-472
