Reachability of optimal convergence rate estimates for high-order numerical convex optimization methods
- Authors: A. V. Gasnikov (1, 2), E. A. Gorbunov (1), D. A. Kovalev (1), A. A. M. Mokhammed (1), E. A. Chernousova (1)
- Affiliations:
  1. Moscow Institute of Physics and Technology
  2. Institute for Information Transmission Problems (Kharkevich Institute), Russian Academy of Sciences
- Issue: Vol 484, No 6 (2019)
- Pages: 667-671
- Section: Mathematics
- URL: https://journals.eco-vector.com/0869-5652/article/view/12813
- DOI: https://doi.org/10.31857/S0869-56524846667-671
- ID: 12813
Abstract
We consider the Monteiro-Svaiter accelerated hybrid proximal extragradient method (2013), in which one step of Newton's method is used at every iteration to approximately solve the auxiliary problem. The Monteiro-Svaiter method is optimal (with respect to the number of gradient and Hessian evaluations of the objective function) for sufficiently smooth convex optimization problems in the class of methods that use only the gradient and the Hessian of the objective. By replacing the Newton step with a step of the tensor method recently proposed by Yu. E. Nesterov (2018), and by using a special generalization of the step-size selection condition in the outer accelerated proximal extragradient scheme, we obtain an optimal tensor method that exploits higher derivatives. The resulting tensor method with derivatives up to and including the third order is fairly practical, since the cost of its iteration is comparable to that of a Newton step. This yields a constructive solution to the problem posed by Nesterov (2018) of closing the gap between the tight lower bounds and the overstated upper bounds on the convergence rate of existing tensor methods of order p ≥ 3.
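For context, the rates at stake can be summarized in standard notation (a sketch of well-known bounds, not formulas quoted from the article; here f is convex with L_p-Lipschitz p-th derivative and R = ||x_0 − x*||):

```latex
% Nesterov's tensor method (2018): upper bound
f(x_k) - f(x^*) = O\!\left( \frac{L_p R^{p+1}}{k^{\,p+1}} \right)

% Lower bound for p-th order methods (tight)
f(x_k) - f(x^*) = \Omega\!\left( \frac{L_p R^{p+1}}{k^{\,(3p+1)/2}} \right)

% Accelerated (near-optimal) tensor schemes close the gap:
f(x_k) - f(x^*) = O\!\left( \frac{L_p R^{p+1}}{k^{\,(3p+1)/2}} \right)
```

For p = 2 this is the familiar pair k^{−3} (cubic regularization of Newton's method) versus k^{−7/2} (Monteiro-Svaiter); the article's contribution concerns attaining the (3p+1)/2 exponent for p ≥ 3.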
Keywords: accelerated proximal method; tensor method; Newton method; lower bounds
About the authors
A. V. Gasnikov
Moscow Institute of Physics and Technology; Institute for Information Transmission Problems (Kharkevich Institute), Russian Academy of Sciences
Author for correspondence.
Email: gasnikov@yandex.ru
Russian Federation, 9 Institutskij per., Dolgoprudny, Moscow region, 141701; 19 B. Karetny per., GSP-4, Moscow, 127994
E. A. Gorbunov
Moscow Institute of Physics and Technology
Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701
D. A. Kovalev
Moscow Institute of Physics and Technology
Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701
A. A. M. Mokhammed
Moscow Institute of Physics and Technology
Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701
E. A. Chernousova
Moscow Institute of Physics and Technology
Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701
References
- Arjevani Y., Shamir O., Shiff R. Oracle Complexity of Second-Order Methods for Smooth Convex Optimization // Math. Programming. 2017. P. 1-34.
- Nesterov Yu. Implementable Tensor Methods in Unconstrained Convex Optimization. CORE Discussion Paper No. 2018/05. Louvain-la-Neuve: Univ. catholique de Louvain, Center for Operations Research and Econometrics (CORE), 2018.
- Gasnikov A.V., Kovalev D.A. A Hypothesis about Optimal Convergence Rate Estimates for High-Order Numerical Convex Optimization Methods // Computer Research and Modeling. 2018. V. 10. No. 3. P. 305-314. (In Russian.)
- Nemirovsky A.S., Yudin D.B. Problem Complexity and Method Efficiency in Optimization. Moscow: Nauka, 1979. (In Russian.)
- Monteiro R., Svaiter B. An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods // SIAM J. Optim. 2013. V. 23. No. 2. P. 1092-1125.
- Nesterov Yu. Introductory Lectures on Convex Optimization: A Basic Course. Boston: Springer Science & Business Media, 2013. (Applied Optimization. V. 87).
- Nesterov Yu.E. Introduction to Convex Optimization. Moscow: MCCME, 2010. (In Russian.)
- Lin H., Mairal J., Harchaoui Z. Catalyst Acceleration for First-Order Convex Optimization: From Theory to Practice // J. Machine Learning Res. 2018. V. 18. No. 212. P. 1-54.
- Nesterov Yu., Polyak B. Cubic Regularization of Newton Method and Its Global Performance // Math. Program. 2006. V. 108. No. 1. P. 177-205.
- Gasnikov A.V. Modern Numerical Optimization Methods: The Universal Gradient Descent Method. Moscow: MIPT, 2018. 166 p. (In Russian.)
- Cormen T.H., Leiserson C.E., Rivest R.L., Stein C. Introduction to Algorithms. Moscow: Williams, 2009. (Russian translation.)