Reachability of optimal convergence rate estimates for high-order numerical convex optimization methods

Abstract

The Monteiro-Svaiter accelerated hybrid proximal extragradient method (2013), in which a single Newton step is used at every iteration to approximately solve the auxiliary problem, is considered. The Monteiro-Svaiter method is optimal (with respect to the number of gradient and Hessian evaluations of the objective function) for sufficiently smooth convex optimization problems in the class of methods that use only the gradient and Hessian of the objective. An optimal tensor method involving higher-order derivatives is proposed by replacing the Newton step with a step of the tensor method recently proposed by Yu.E. Nesterov (2018) and by using a special generalization of the step-size selection condition in the outer accelerated proximal extragradient scheme. This tensor method with derivatives up to the third order inclusive turns out to be fairly practical, since the cost of its iteration is comparable with that of a Newton iteration. Thus, a constructive solution is obtained for the problem posed by Nesterov (2018) of closing the gap between the tight lower bounds and the overstated upper bounds on the convergence rate of existing tensor methods of order p ≥ 3.
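
For context, the gap mentioned at the end of the abstract can be summarized as follows (a sketch under the standard assumptions of the cited works: f is convex with a Lipschitz continuous p-th derivative with constant L_p, R is the distance from the starting point to a solution, absolute constants are omitted, and Õ hides a logarithmic factor coming from the line search in the outer scheme):

    Nesterov's tensor method [2]:          f(x_k) - f_* = O( L_p R^{p+1} / k^{p+1} )
    lower bound [1, 3]:                    f(x_k) - f_* = Ω( L_p R^{p+1} / k^{(3p+1)/2} )
    proposed accelerated tensor method:    f(x_k) - f_* = Õ( L_p R^{p+1} / k^{(3p+1)/2} )

The kind of auxiliary step on which such accelerated proximal schemes rely (for p = 2, a Newton-type step; see [5, 9]) can be illustrated by the following minimal Python sketch of a cubic-regularized Newton step in the sense of [9]. The function names and the use of a generic inner solver are assumptions made purely for illustration; this is not the authors' implementation.

    import numpy as np
    from scipy.optimize import minimize

    def cubic_newton_step(grad, hess, x, M):
        """One cubic-regularized Newton step (Nesterov-Polyak model [9]):
        minimize <g, y - x> + 0.5*<H (y - x), y - x> + (M/6)*||y - x||^3 over y,
        where g = grad(x) and H = hess(x)."""
        g, H = grad(x), hess(x)

        def model(y):
            d = y - x
            return g @ d + 0.5 * d @ (H @ d) + (M / 6.0) * np.linalg.norm(d) ** 3

        # The cubic model is bounded below even when H is indefinite; it is
        # minimized here with a generic solver purely for illustration, whereas
        # a practical implementation would exploit the one-dimensional structure
        # of the subproblem (a search over the step radius).
        return minimize(model, x, method="BFGS").x

    # Toy usage on a strongly convex quadratic f(x) = 0.5 * x^T A x.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    x0 = np.array([1.0, -1.0])
    print(cubic_newton_step(lambda x: A @ x, lambda x: A, x0, M=1.0))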

Keywords

accelerated proximal method; tensor method; Newton method; lower bounds

About the authors

A. V. Gasnikov

Moscow Institute of Physics and Technology; Institute for Information Transmission Problems (Kharkevich Institute), Russian Academy of Sciences

Author for correspondence.
Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701; 19, B. Karetny per., GSP-4, 127994

E. A. Gorbunov

Moscow Institute of Physics and Technology

Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701

D. A. Kovalev

Moscow Institute of Physics and Technology

Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701

A. A. M. Mokhammed

Moscow Institute of Physics and Technology

Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701

E. A. Chernousova

Moscow Institute of Physics and Technology

Email: gasnikov@yandex.ru
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701

References

  1. Arjevani Y., Shamir O., Shiff R. Oracle Complexity of Second-Order Methods for Smooth Convex Optimization // Math. Programming. 2017. P. 1-34.
  2. Nesterov Yu. Implementable Tensor Methods in Unconstrained Convex Optimization. Prepr. Univ. catholique de Louvain; Center for Operations Res. and Econometrics (CORE). 2018. № 2018005.
  3. Gasnikov A.V., Kovalev D.A. A hypothesis on optimal convergence rate estimates for high-order numerical methods of convex optimization // Computer Research and Modeling. 2018. V. 10. № 3. P. 305-314 (in Russian).
  4. Nemirovsky A.S., Yudin D.B. Problem Complexity and Efficiency of Optimization Methods. Moscow: Nauka, 1979 (in Russian).
  5. Monteiro R., Svaiter B. An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and its Implications to Second-Order Methods // SIAM J. Optim. 2013. V. 23. № 2. P. 1092-1125.
  6. Nesterov Yu. Introductory Lectures on Convex Optimization: A Basic Course. Berlin: Springer Science & Business Media, 2013. 87 p.
  7. Nesterov Yu.E. Introduction to Convex Optimization. Moscow: MCCME, 2010 (in Russian).
  8. Lin H., Mairal J., Harchaoui Z. Catalyst Acceleration for First-Order Convex Optimization: from Theory to Practice // J. Machine Learning Res. 2018. V. 18. № 212. P. 1-54.
  9. Nesterov Yu., Polyak B. Cubic Regularization of Newton Method and Its Global Performance // Math. Program. 2006. V. 108. № 1. P. 177-205.
  10. Gasnikov A.V. Modern Numerical Optimization Methods: The Universal Gradient Descent Method. Moscow: MIPT, 2018. 166 p. (in Russian).
  11. Cormen T., Leiserson C., Rivest R., Stein C. Introduction to Algorithms. Moscow: Williams, 2009 (Russian translation).

Copyright (c) 2019 Russian Academy of Sciences
