Ellipsoid method for convex stochastic optimization in small dimension


The article considers minimization of the expectation of a convex function. Problems of this type often arise in machine learning and a variety of other applications. In practice, stochastic gradient descent (SGD) and similar procedures are usually used to solve such problems. We propose to use the ellipsoid method with mini-batching, which converges linearly and can be more efficient than SGD for a class of problems. This is verified by our experiments, which are publicly available. The algorithm requires neither smoothness nor strong convexity of the objective to achieve linear convergence; thus, its complexity does not depend on the condition number of the problem. We prove that the method arrives at an approximate solution with a given probability when using mini-batches whose size is proportional to the desired accuracy to the power −2. This enables efficient parallel execution of the algorithm, whereas possibilities for batch parallelization of SGD are rather limited. Despite its fast convergence, the ellipsoid method can result in a greater total number of oracle calls than SGD, which works decently with small batches. The complexity is quadratic in the dimension of the problem, hence the method is suitable for relatively small dimensionalities.
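As an illustration of the approach summarized above, the sketch below (Python with NumPy) combines the classical ellipsoid update with a mini-batched stochastic subgradient oracle. The test objective (an expected absolute loss), the initial ball radius R, the batch size, and the helper subgrad_batch are illustrative assumptions, not the authors' implementation; the result in the abstract suggests taking the batch size proportional to the inverse square of the target accuracy.

import numpy as np

def ellipsoid_minibatch(subgrad_batch, x0, R, n_iters, batch_size, rng):
    """Sketch: ellipsoid method driven by a mini-batched stochastic
    subgradient oracle (dimension n >= 2 assumed)."""
    n = x0.size
    x = x0.astype(float).copy()
    H = (R ** 2) * np.eye(n)   # ellipsoid {y : (y - x)^T H^{-1} (y - x) <= 1}
    for _ in range(n_iters):
        g = subgrad_batch(x, batch_size, rng)   # averaged stochastic subgradient
        norm_H = np.sqrt(g @ H @ g)
        if norm_H < 1e-12:                      # (near-)zero subgradient: stop
            break
        w = (H @ g) / norm_H
        x = x - w / (n + 1)                                         # new center
        H = n ** 2 / (n ** 2 - 1.0) * (H - 2.0 / (n + 1) * np.outer(w, w))
    return x   # the paper may output a differently chosen iterate

# Toy usage: minimize E|<a, x> - b|, a non-smooth stochastic objective.
rng = np.random.default_rng(0)
n, x_star = 5, np.arange(1.0, 6.0)

def subgrad_batch(x, m, rng):
    A = rng.standard_normal((m, x.size))
    b = A @ x_star + 0.1 * rng.standard_normal(m)
    s = np.sign(A @ x - b)          # subgradient of |<a, x> - b| is sign(.) * a
    return (s[:, None] * A).mean(axis=0)

x_hat = ellipsoid_minibatch(subgrad_batch, np.zeros(n), R=10.0,
                            n_iters=300, batch_size=2000, rng=rng)
print(np.linalg.norm(x_hat - x_star))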

Keywords: stochastic optimization, convex optimization, ellipsoid method, mini-batching
Citation in English: Gladin E.L., Zainullina K.E. Ellipsoid method for convex stochastic optimization in small dimension // Computer Research and Modeling, 2021, vol. 13, no. 6, pp. 1137-1147
DOI: 10.20537/2076-7633-2021-13-6-1137-1147
