A hypothesis about the rate of global convergence for optimal methods (Newton’s type) in smooth convex optimization
List of references:
- Gradient-free two-point methods for solving stochastic nonsmooth convex optimization problems in the presence of small non-random noises // Automation and Remote Control. — 2018. — https://arxiv.org/ftp/arxiv/papers/1701/1701.03821.pdf. — in Russian.
- Optimization methods. — Moscow: MCCME, 2011. — V. 2. — 433 p. — in Russian.
- Accelerated descents along a random direction and gradient-free methods with non-Euclidean prox-structure // Automation and Remote Control. — 2018. — https://arxiv.org/pdf/1710.00162.pdf. — in Russian.
- Effective numerical methods for finding equilibria in large transport networks. — Moscow: MFTI, 2016. — 487 p. — Doctor of Sciences (Phys.-Math.) dissertation, specialty 05.13.18 (mathematical modeling, numerical methods, and program complexes). — in Russian.
- Deep Learning. — DMK Press, 2017. — 652 p. — in Russian. — MathSciNet: MR3617773.
- Interior point methods in linear and nonlinear programming. — Moscow: KRASAND, 2010. — 120 p. — in Russian.
- Mathematical analysis of problems in the natural sciences. — Moscow: MCCME, 2017. — 160 p. — in Russian. — MathSciNet: MR2762339.
- Evtushenko. Optimization and fast automatic differentiation. — Moscow: CC RAS, 2013. — 144 p. — in Russian.
- Mathematical programming. — Moscow: Nauka, 1986. — 288 p. — in Russian. — MathSciNet: MR1024787.
- Introduction to Algorithms. — Moscow: MCCME, 2002. — 960 p. — in Russian. — MathSciNet: MR1066870.
- Problem complexity and method efficiency in optimization. — Moscow: Nauka, 1979. — 384 p. — in Russian. — MathSciNet: MR0702836.
- Introductory lectures on convex optimization. — Moscow: MCCME, 2010. — 262 p. — in Russian. — MathSciNet: MR2142598.
- On the question of algorithms for the approximate calculation of the minimum of a convex function from its values // Math. Notes. — 1996. — V. 59, no. 1. — P. 95–102. — in Russian. — DOI: 10.1007/BF02312467. — Math-Net: Mi eng/mzm1697. — MathSciNet: MR1391825.
- Algebraic complexity. — Moscow: MCCME, 2016. — 32 p. — in Russian.
- Finding approximate local minima faster than gradient descent // Proceedings of the Forty-Ninth Annual ACM Symposium on Theory of Computing. — 2017. — MathSciNet: MR3678262.
- Oracle complexity of second-order methods for smooth convex optimization. — 2017. — https://arxiv.org/pdf/1705.07260.pdf.
- Estimate sequence methods: extensions and approximations. — 2009. — http://www.optimization-online.org/DB_FILE/2009/08/2372.pdf.
- Automatic differentiation in machine learning: a survey. — 2015. — https://arxiv.org/pdf/1502.05767.pdf.
- Convex optimization: algorithms and complexity // Foundations and Trends in Machine Learning. — 2015. — V. 8, no. 3–4. — P. 231–357. — DOI: 10.1561/2200000050.
- Accelerated methods for non-convex optimization. — 2017. — https://arxiv.org/pdf/1611.00756.pdf. — MathSciNet: MR3814027.
- Randomized Similar Triangles Method: A Unifying Framework for Accelerated Randomized Optimization Methods (Coordinate Descent, Directional Search, Derivative-Free Method). — https://arxiv.org/pdf/1707.08486.pdf.
- Second-order methods with cubic regularization under inexact information. — 2017. — https://arxiv.org/pdf/1710.05782.pdf.
- Regularized Newton methods for minimizing functions with Hölder continuous Hessian // SIAM J. Optim. — 2017. — V. 27, no. 1. — P. 478–506. — DOI: 10.1137/16M1087801. — MathSciNet: MR3625807.
- A faster cutting plane method and its implications for combinatorial and convex optimization. — 2015. — https://arxiv.org/pdf/1508.04874.pdf. — MathSciNet: MR3473356.
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods // SIAM Journal on Optimization. — 2013. — V. 23. — P. 1092–1125. — DOI: 10.1137/110833786. — MathSciNet: MR3063151.
- Lectures on modern convex optimization: analysis, algorithms, and engineering applications. — Philadelphia: SIAM, 2015. — http://www2.isye.gatech.edu/nemirovs/Lect_ModConvOpt.pdf.
- Accelerating the cubic regularization of Newton’s method on convex problems // Math. Prog., Ser. A. — 2008. — V. 112. — P. 159–181. — DOI: 10.1007/s10107-006-0089-x. — MathSciNet: MR2327005.
- Implementable tensor methods in unconstrained convex optimization. — 2018. — CORE Discussion Papers 2018005. — https://ideas.repec.org/p/cor/louvco/2018005.html.
- Minimizing functions with bounded variation of subgradients. — 2005. — 13 p. — CORE Discussion Papers 2005/79. — http://webdoc.sub.gwdg.de/ebook/serien/e/CORE/dp2005_79.pdf.
- Cubic regularization of Newton method and its global performance // Math. Program., Ser. A. — 2006. — V. 108. — P. 177–205. — DOI: 10.1007/s10107-006-0706-8. — MathSciNet: MR2229459.
- Random gradient-free minimization of convex functions // Foundations of Computational Mathematics. — 2017. — V. 17, no. 2. — P. 527–566. — DOI: 10.1007/s10208-015-9296-2. — MathSciNet: MR3627456.
- Numerical optimization. — Springer, 2006.