
Subgradient methods for weakly convex and relatively weakly convex problems with a sharp minimum


The work is devoted to the study of subgradient methods with various forms of the Polyak stepsize for minimizing functions from the classes of weakly convex and relatively weakly convex functions that possess the corresponding analogue of a sharp minimum. It turns out that, under certain assumptions on the starting point, such an approach makes it possible to justify convergence of the subgradient method at the rate of a geometric progression. For the subgradient method with the Polyak stepsize, a refined convergence-rate estimate is proved for minimization problems with weakly convex functions having a sharp minimum. The distinguishing feature of this estimate is that it additionally accounts for the decrease of the distance from the current iterate to the solution set as the number of iterations grows. Results of numerical experiments for the phase retrieval problem (which is weakly convex and has a sharp minimum) are presented, demonstrating the effectiveness of the proposed approach to estimating the convergence rate compared to the previously known one. Next, we propose a variant of the subgradient method that switches between productive and non-productive steps for weakly convex problems with inequality constraints and obtain the corresponding analogue of the result on convergence at the rate of a geometric progression. For the subgradient method with the corresponding variant of the Polyak stepsize on the class of relatively Lipschitz and relatively weakly convex functions with a relative analogue of a sharp minimum, we obtain conditions that guarantee convergence of this subgradient method at the rate of a geometric progression. Finally, a theoretical result is obtained describing the influence of inexact information about the (sub)gradient and the objective function values on the quality of the approximate solution produced by the subgradient method. It is proved that for a sufficiently small error $\delta > 0$, one can guarantee that the accuracy of the solution is comparable to $\delta$.
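The core iteration discussed in the abstract is the subgradient method with the Polyak stepsize, $x_{k+1} = x_k - \frac{f(x_k) - f^*}{\|g_k\|^2}\, g_k$ with $g_k \in \partial f(x_k)$. A minimal sketch of this iteration on a toy sharp-minimum problem $f(x) = \|x - x^\*\|_1$ (not the phase retrieval experiment from the paper; the names `x_star` and the iteration count are illustrative assumptions):

```python
import numpy as np

# Toy sharp-minimum problem: f(x) = ||x - x_star||_1, with f* = 0.
# x_star is a hypothetical solution chosen for illustration only.
rng = np.random.default_rng(0)
x_star = rng.standard_normal(5)

def f(x):
    return np.sum(np.abs(x - x_star))

def subgrad(x):
    # A valid subgradient of the l1-distance at x.
    return np.sign(x - x_star)

f_opt = 0.0                          # optimal value, known for this toy problem
x = x_star + rng.standard_normal(5)  # starting point

for _ in range(200):
    gap = f(x) - f_opt
    if gap == 0.0:
        break
    g = subgrad(x)
    # Polyak stepsize: (f(x_k) - f*) / ||g_k||^2.
    x = x - (gap / np.dot(g, g)) * g

print(f(x))
```

For sharp-minimum problems of this kind, the distance to the solution set contracts by a constant factor per step, which is the geometric-progression behavior the abstract refers to; here $f(x_k)$ is driven essentially to machine precision within the 200 iterations.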

Keywords: subgradient method, sharp minimum, Lipschitz-continuous function, relatively Lipschitz-continuous function, relatively sharp minimum, phase retrieval problem
Citation in English: Stonyakin F.S., Ablaev S.S., Baran I.V., Alkousa M.S. Subgradient methods for weakly convex and relatively weakly convex problems with a sharp minimum // Computer Research and Modeling, 2023, vol. 15, no. 2, pp. 393-412
DOI: 10.20537/2076-7633-2023-15-2-393-412

Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU

The journal is included in the Russian Science Citation Index

The journal is included in the List of Russian peer-reviewed journals publishing the main research results of PhD and doctoral dissertations.



Indexed in Scopus