Linearly convergent gradient-free methods for minimization of parabolic approximation


Finding the global minimum of a nonconvex function is one of the key and most difficult problems of modern optimization. In this paper we consider special classes of nonconvex problems that have a well-defined global minimum.

In the first part of the paper we consider two classes of "good" nonconvex functions that can be bounded from below and above by parabolic functions. This class of problems has not been widely studied in the literature, although it is quite interesting from an applied point of view. Moreover, for such problems first-order and higher-order methods may be completely ineffective in finding a global minimum, because the function may oscillate heavily or be very noisy. Therefore, our new methods use only zero-order information and are based on grid search. The size and fineness of this grid, and hence the guaranteed convergence speed and oracle complexity, depend on the "goodness" of the problem. In particular, we show that if the function is bounded by fairly close parabolic functions, then the complexity is independent of the dimension of the problem. We show that our new methods converge to a global minimum on the cube at a linear rate, i.e., with $\log(1/\varepsilon)$ complexity.
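The grid-search idea above can be illustrated with a minimal sketch: sample the current box on a uniform grid, keep the best point, and shrink the box around it. Since the box edge contracts geometrically, reaching accuracy $\varepsilon$ takes $O(\log(1/\varepsilon))$ iterations, which is the linear rate in question. The function name, the fixed grid size, and the shrink factor are illustrative choices; in the paper the grid fineness is driven by the parabolic bounds on the objective.

```python
import numpy as np

def grid_search_min(f, lo, hi, points_per_dim=5, shrink=0.5, tol=1e-6, max_iter=100):
    """Illustrative zero-order grid search on the box [lo, hi].

    Each iteration evaluates f on a uniform grid over the current box
    and re-centers a shrunken box at the best grid point.  The box
    width halves every iteration, so the number of iterations to reach
    accuracy eps in the argument is O(log(1/eps)).
    """
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    best = (lo + hi) / 2
    for _ in range(max_iter):
        # Build the grid of candidate points over the current box.
        axes = [np.linspace(l, h, points_per_dim) for l, h in zip(lo, hi)]
        grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, lo.size)
        # Keep the best sampled point (only zero-order information is used).
        best = grid[np.argmin([f(x) for x in grid])]
        # Shrink the box around the best point.
        half = shrink * (hi - lo) / 2
        lo, hi = best - half, best + half
        if np.all(hi - lo < tol):
            break
    return best
```

With 5 points per dimension the best grid point lies within one eighth of the box width of the minimizer, so the minimizer always stays inside the halved box and the iterates converge to it.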

In the second part of the paper, we consider the nonconvex optimization problem from a different angle. We assume that the objective is the sum of a convex quadratic function and a nonconvex "noise" term proportional to the distance to the global solution. Considering functions under such noise assumptions is new in the zero-order literature. For this problem, we use the classical gradient-free approach with gradient approximation through finite differences. We show how the convergence analysis for our problems can be reduced to the standard analysis for convex optimization problems. In particular, we achieve a linear convergence rate for such problems as well.
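The classical gradient-free scheme mentioned above can be sketched as follows: approximate each partial derivative by a central finite difference and plug the resulting vector into plain gradient descent. The function names, step size, and difference parameter here are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Central finite-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def zero_order_gd(f, x0, lr=0.1, n_iter=200):
    """Gradient descent driven only by zero-order (function-value) oracle calls."""
    x = np.asarray(x0, float)
    for _ in range(n_iter):
        x = x - lr * fd_gradient(f, x)
    return x
```

On a convex quadratic objective this recursion contracts the distance to the solution by a constant factor per step, which is the linear rate the analysis recovers for the noisy case.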

Experimental results confirm the efficiency and practical applicability of all the obtained methods.

Keywords: zeroth-order optimization, nonconvex problem, linear rate
Citation in English: Bazarova A.I., Beznosikov A.N., Gasnikov A.V. Linearly convergent gradient-free methods for minimization of parabolic approximation // Computer Research and Modeling, 2022, vol. 14, no. 2, pp. 239-255
DOI: 10.20537/2076-7633-2022-14-2-239-255
