One method for minimizing a convex Lipschitz-continuous function of two variables on a fixed square


List of references:

  1. Anaconda Python distribution, official website. — Electronic resource. — https://www.anaconda.com. — accessed 20.02.2019.
  2. A. S. Bayandina, A. V. Gasnikov, A. A. Lagunovskaya. Gradient-free two-point methods for solving stochastic nonsmooth convex optimization problems with small non-random noises // Automation and Remote Control. — 2018. — V. 79, no. 8. — P. 1399–1408. — DOI: 10.1134/S0005117918080039. — MathSciNet: MR3860298. — Russian original: Avtomatika i Telemekhanika. — 2018. — no. 8. — P. 38–49. — Math-Net: Mi eng/at14643.
  3. F. P. Vasiliev. Methods of Optimization. — Moscow: MCCME, 2011. — V. 2. — 433 p. — in Russian.
  4. E. A. Vorontsova, A. V. Gasnikov, E. A. Gorbunov. Accelerated directional search with non-Euclidean prox-structure // Avtomatika i Telemekhanika. — 2019. — no. 4. — P. 126–143. — in Russian. — Math-Net: Mi eng/at15270.
  5. A. V. Gasnikov. Modern Numerical Optimization Methods. The Universal Gradient Descent Method. — Moscow: MIPT, 2018. — 240 p. — in Russian.
  6. A. S. Nemirovski, D. B. Yudin. Problem Complexity and Method Efficiency in Optimization. — New York: Wiley-Interscience, 1983. — MathSciNet: MR0702836. — Russian original: Slozhnost' zadach i effektivnost' metodov optimizatsii. — Moscow: Nauka, 1979.
  7. B. T. Polyak. Introduction to Optimization. — New York: Optimization Software, 1987. — MathSciNet: MR1099605. — Russian original: Vvedenie v optimizatsiyu. — Moscow: Nauka, 1983. — MathSciNet: MR0719196.
  8. N. Z. Shor. Minimization Methods for Non-Differentiable Functions. — Springer Series in Computational Mathematics. — Springer, 1985. — MathSciNet: MR0775136. — Russian original: Metody minimizatsii nedifferentsiruemykh funktsii i ikh prilozheniya. — Kiev: Naukova Dumka, 1979.
  9. A. Bayandina, P. Dvurechensky, A. Gasnikov, F. Stonyakin, A. Titov. Mirror descent and convex optimization problems with non-smooth inequality constraints // Large-Scale and Distributed Optimization / P. Giselsson, A. Rantzer (eds.). — Springer International Publishing, 2018. — Chap. 8. — P. 181–215. — MathSciNet: MR3888675.
  10. L. Bogolubsky, P. Dvurechensky, A. Gasnikov, G. Gusev, Yu. Nesterov, A. M. Raigorodskii, A. Tikhonov, M. Zhukovskii. Learning supervised PageRank with gradient-based and gradient-free optimization methods // Advances in Neural Information Processing Systems / D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, R. Garnett (eds.). — Curran Associates, Inc., 2016. — V. 29. — P. 4914–4922.
  11. S. Bubeck. Convex optimization: algorithms and complexity // Foundations and Trends in Machine Learning. — 2015. — V. 8, no. 3–4. — P. 231–357. — DOI: 10.1561/2200000050.
  12. Y. Chen, G. Lan, Y. Ouyang. Optimal primal-dual methods for a class of saddle point problems // SIAM Journal on Optimization. — 2014. — V. 24, no. 4. — P. 1779–1814. — DOI: 10.1137/130919362. — MathSciNet: MR3272627.
  13. C. D. Dang, G. Lan. Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization // SIAM Journal on Optimization. — 2015. — V. 25, no. 2. — P. 856–881. — DOI: 10.1137/130936361. — MathSciNet: MR3341135.
  14. O. Devolder, F. Glineur, Yu. Nesterov. First-order methods of smooth convex optimization with inexact oracle // Mathematical Programming. — 2014. — V. 146, no. 1–2. — P. 37–75. — DOI: 10.1007/s10107-013-0677-5. — MathSciNet: MR3232608.
  15. P. Dvurechensky, A. Gasnikov. Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle // Journal of Optimization Theory and Applications. — 2016. — V. 171, no. 1. — P. 121–145. — DOI: 10.1007/s10957-016-0999-6. — MathSciNet: MR3547846.
  16. A. V. Gasnikov, A. A. Lagunovskaya, I. N. Usmanova, F. A. Fedorenko. Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex // Automation and Remote Control. — 2016. — V. 77, no. 11. — P. 2018–2034. — DOI: 10.1134/S0005117916110114. — MathSciNet: MR3664202.
  17. Y.-T. Lee, A. Sidford, S. C.-W. Wong. A faster cutting plane method and its implications for combinatorial and convex optimization. — 2015. — E-print. — https://arxiv.org/pdf/1508.04874.pdf. — accessed 02.01.2019. — MathSciNet: MR3473356.
  18. A. Nemirovski. Lectures on modern convex optimization: analysis, algorithms, and engineering applications. — Philadelphia: SIAM, 2015. — http://www2.isye.gatech.edu/~nemirovs/Lect_ModConvOpt.pdf. — accessed 02.01.2019.
  19. Yu. Nesterov. Primal-dual subgradient methods for convex problems // Mathematical Programming. — 2009. — V. 120, no. 1. — P. 221–259. — DOI: 10.1007/s10107-007-0149-x. — MathSciNet: MR2496434.
  20. Yu. Nesterov, S. Shpirko. Primal-Dual Subgradient Method for Huge-Scale Linear Conic Problems // SIAM Journal on Optimization. — 2014. — V. 24, no. 3. — P. 1444–1457. — DOI: 10.1137/130929345. — MathSciNet: MR3257645.
  21. Yu. Nesterov. Lectures on convex optimization. — Springer, 2018. — MathSciNet: MR3839649.
  22. R. Tappenden, P. Richtárik, J. Gondzio. Inexact Coordinate Descent: Complexity and Preconditioning // Journal of Optimization Theory and Applications. — 2016. — V. 170, no. 1. — P. 144–176. — DOI: 10.1007/s10957-016-0867-4. — MathSciNet: MR3513271.
