Primal-dual accelerated gradient descent with line search for convex and nonconvex optimization problems

Abstract

In this paper, a new variant of accelerated gradient descent is proposed. The proposed method does not require any prior information about the objective function, uses an exact line search to accelerate convergence in practice, converges at the rates given by the well-known lower complexity bounds for both convex and non-convex objective functions, and possesses primal-dual properties. We also provide a universal version of the method, which converges at the rates given by the known lower bounds for both smooth and non-smooth problems.
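To illustrate the idea described in the abstract, the following is a minimal sketch of Nesterov-style accelerated gradient descent in which the step size along the gradient direction is chosen by exact line search. This is an assumption-laden illustration, not the authors' algorithm: it uses a convex quadratic f(x) = ½xᵀAx − bᵀx (for which the exact line-search step has a closed form), a textbook momentum coefficient k/(k+3), and hypothetical function names.

```python
# Illustrative sketch (NOT the paper's exact method): accelerated gradient
# descent with exact line search, specialized to the convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose gradient is g(x) = A x - b.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def accelerated_gd_line_search(A, b, x0, iters=200):
    x = list(x0)
    x_prev = list(x0)
    for k in range(iters):
        beta = k / (k + 3)  # textbook Nesterov-style momentum coefficient
        # extrapolated point y_k = x_k + beta * (x_k - x_{k-1})
        y = [xi + beta * (xi - xpi) for xi, xpi in zip(x, x_prev)]
        g = [gi - bi for gi, bi in zip(matvec(A, y), b)]  # gradient at y
        Ag = matvec(A, g)
        gAg = dot(g, Ag)
        if gAg == 0:  # zero gradient: y is already a minimizer
            return y
        # exact line search along -g: argmin_t f(y - t g) for a quadratic
        t = dot(g, g) / gAg
        x_prev = x
        x = [yi - t * gi for yi, gi in zip(y, g)]
    return x
```

For example, with A = [[3, 1], [1, 2]] and b = [1, 1], the iterates approach the minimizer x* = A⁻¹b = (0.2, 0.4). The closed-form step t = gᵀg / gᵀAg is what "exact line search" reduces to on a quadratic; for a general smooth objective it would be replaced by a one-dimensional minimization of f along the search direction.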

About the authors

S. V. Guminov

Moscow Institute of Physics and Technology; Institute for Information Transmission Problems of the Russian Academy of Sciences

Author for correspondence.
Email: sergey.guminov@phystech.edu
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701; 19, B.Karetny, Moscow, 127051

Yu. E. Nesterov

Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; Higher School of Economics

Email: sergey.guminov@phystech.edu
Belgium; 20, Myasnitskaya str., Moscow, 101000, Russian Federation

P. E. Dvurechensky

Institute for Information Transmission Problems of the Russian Academy of Sciences; Weierstrass Institute for Applied Analysis and Stochastics

Email: sergey.guminov@phystech.edu
19, B.Karetny, Moscow, 127051, Russian Federation; Berlin, Germany

A. V. Gasnikov

Moscow Institute of Physics and Technology; Institute for Information Transmission Problems of the Russian Academy of Sciences

Email: sergey.guminov@phystech.edu
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701; 19, B.Karetny, Moscow, 127051

Copyright (c) 2019 Russian Academy of Sciences
