A hypothesis about the rate of global convergence for optimal methods (Newton’s type) in smooth convex optimization


In this paper we discuss lower bounds on the convergence rate of high-order convex optimization methods and the attainability of these bounds. We formulate a hypothesis that covers all the cases; note that we state it without a proof. Newton's method is the best-known method that uses both the gradient and the Hessian of the objective function; however, even for strongly convex functions it converges only locally. Global convergence can be achieved via cubic regularization of Newton's method [Nesterov, Polyak, 2006], whose iteration cost is comparable to that of Newton's method and is dominated by the inversion of the Hessian of the objective. In 2008 Yu. Nesterov proposed an accelerated variant of Newton's method with cubic regularization [Nesterov, 2008]. In 2013 R. Monteiro and B. Svaiter managed to improve the global convergence rate of the cubic regularized method [Monteiro, Svaiter, 2013]. In 2017 Y. Arjevani, O. Shamir and R. Shiff showed that the convergence bound of Monteiro and Svaiter is optimal (it cannot be improved by more than a logarithmic factor by any second-order method) [Arjevani et al., 2017]. They also obtained lower bounds for convex optimization methods of p-th order for $p ≥ 2$; for strongly convex functions, however, they obtained bounds only for first- and second-order methods. In 2018 Yu. Nesterov proposed third-order convex optimization methods whose rate of convergence is close to these lower bounds and whose iteration cost is similar to that of Newton's method [Nesterov, 2018]. This showed that high-order methods can be practical. In this paper we formulate lower bounds for p-th order methods with $p ≥ 3$ for strongly convex unconstrained optimization problems. The paper can also be viewed as a short survey of the state of the art in high-order optimization methods.
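As a rough illustration of the method discussed above, the sketch below implements one step of a cubically regularized Newton method in the spirit of [Nesterov, Polyak, 2006]: the quadratic model of the objective at the current point is augmented with a cubic penalty and minimized over the step. The helper names, the toy objective, and the use of a generic solver for the subproblem are illustrative assumptions, not part of the paper.

```python
import numpy as np
from scipy.optimize import minimize

def cubic_newton_step(grad, hess, x, M):
    """One cubically regularized Newton step (illustrative sketch).

    Minimizes over the step h the model
        g^T h + 0.5 * h^T H h + (M / 6) * ||h||^3,
    where g, H are the gradient and Hessian at x.  For simplicity the
    subproblem is solved here with a generic optimizer; specialized solvers
    whose cost is comparable to a Hessian inversion exist (see the references).
    """
    g, H = grad(x), hess(x)

    def model(h):
        return g @ h + 0.5 * h @ H @ h + (M / 6.0) * np.linalg.norm(h) ** 3

    h_star = minimize(model, np.zeros_like(x), method="BFGS").x
    return x + h_star

# Toy convex objective (our assumption): f(x) = 0.25 * sum(x_i^4) + 0.5 * ||x - b||^2
b = np.array([1.0, -2.0, 0.5])
grad = lambda x: x ** 3 + (x - b)                     # gradient of f
hess = lambda x: np.diag(3.0 * x ** 2) + np.eye(len(x))  # Hessian of f

x = np.zeros(3)
for _ in range(10):            # a few steps suffice on this toy problem
    x = cubic_newton_step(grad, hess, x, M=5.0)
print(x)                       # approaches the unique minimizer of f
```

When M is at least the Lipschitz constant of the Hessian, the cubic model upper-bounds the objective, which is what yields the global convergence guarantees mentioned above.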

Keywords: Newton's method, Hessian matrix, lower bounds, Chebyshev-type methods, superlinear rate of convergence
Citation in English: Gasnikov A.V., Kovalev D.A. A hypothesis about the rate of global convergence for optimal methods (Newton’s type) in smooth convex optimization // Computer Research and Modeling, 2018, vol. 10, no. 3, pp. 305-314
DOI: 10.20537/2076-7633-2018-10-3-305-314
According to Crossref, this article is cited by:
  • Alexander Vladimirovich Gasnikov, Eduard Alexandrovich Gorbunov, Dmitry A. Kovalev, Ahmed Abdelnafi Mahmoud Mohammed, Elena Olegovna Chernousova. The global rate of convergence for optimal tensor methods in smooth convex optimization. // Computer Research and Modeling. 2018. — V. 10, no. 6. — P. 737. DOI: 10.20537/2076-7633-2018-10-6-737-753