Primal-dual accelerated gradient descent with line search for convex and nonconvex optimization problems
- Authors: S.V. Guminov (1, 2), Yu.E. Nesterov (3, 4), P.E. Dvurechensky (2, 5), A.V. Gasnikov (1, 2)
- Affiliations:
  1. Moscow Institute of Physics and Technology
  2. Institute for Information Transmission Problems of the Russian Academy of Sciences
  3. Center for Operations Research and Econometrics (CORE), Catholic University of Louvain
  4. Higher School of Economics
  5. Weierstrass Institute for Applied Analysis and Stochastics
- Issue: Vol 485, No 1 (2019)
- Pages: 15-18
- Section: Mathematics
- URL: https://journals.eco-vector.com/0869-5652/article/view/12803
- DOI: https://doi.org/10.31857/S0869-5652485115-18
- ID: 12803
Abstract
In this paper, a new variant of accelerated gradient descent is proposed. The proposed method requires no information about the parameters of the objective function, uses exact line search to accelerate convergence in practice, converges at the rates given by the well-known lower bounds for both convex and non-convex objective functions, and possesses primal-dual properties. We also provide a universal version of the method, which converges at the rates given by the known lower bounds for both smooth and non-smooth problems.
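As an illustration of the scheme described in the abstract, here is a minimal Python sketch in the linear-coupling style with exact line search: the point y_k is chosen by line search on the segment between the primal point x_k and the dual point v_k, and x_{k+1} by line search from y_k along the negative gradient. This is a sketch under stated assumptions, not the authors' exact algorithm; the `golden_section` helper, the step-size recursion, and the quadratic test function are illustrative choices.

```python
import numpy as np

def golden_section(phi, a, b, tol=1e-8):
    """Minimize a 1-D function phi on [a, b] to tolerance tol."""
    gr = (np.sqrt(5.0) - 1.0) / 2.0
    c, d = b - gr * (b - a), a + gr * (b - a)
    fc, fd = phi(c), phi(d)
    while b - a > tol:
        if fc < fd:
            b, d, fd = d, c, fc
            c = b - gr * (b - a)
            fc = phi(c)
        else:
            a, c, fc = c, d, fd
            d = a + gr * (b - a)
            fd = phi(d)
    return 0.5 * (a + b)

def accelerated_gd_line_search(f, grad, x0, iters=100):
    """Sketch of accelerated gradient descent with exact line search.

    y_k: line search on the segment [x_k, v_k];
    x_{k+1}: line search from y_k along -grad f(y_k);
    v_k: dual point accumulating weighted gradients.
    No Lipschitz constant of the gradient is needed: the weight
    a_{k+1} is recovered from the observed decrease f(y_k) - f(x_{k+1}).
    """
    x = x0.astype(float).copy()
    v = x.copy()
    A = 0.0
    for _ in range(iters):
        beta = golden_section(lambda b: f(x + b * (v - x)), 0.0, 1.0)
        y = x + beta * (v - x)
        g = grad(y)
        gn2 = float(g @ g)
        if gn2 < 1e-18:
            break
        h = golden_section(lambda t: f(y - t * g), 0.0, 100.0)
        x_new = y - h * g
        # pick a_{k+1} from f(y_k) - f(x_{k+1}) = a^2 ||g||^2 / (2 A_{k+1})
        delta = max(f(y) - f(x_new), 0.0)
        a = (delta + np.sqrt(delta**2 + 2.0 * delta * A * gn2)) / gn2
        A += a
        v = v - a * g  # dual (momentum) point update
        x = x_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 x^T diag(D) x
D = np.array([4.0, 1.0, 0.5])
f = lambda x: 0.5 * float(x @ (D * x))
grad = lambda x: D * x
x_star = accelerated_gd_line_search(f, grad, np.array([1.0, 1.0, 1.0]))
```

On a quadratic, each gradient step already decreases f at least at the steepest-descent rate, while the dual point v supplies the momentum that yields the accelerated guarantee in the convex case.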
About the authors
S. V. Guminov
Moscow Institute of Physics and Technology; Institute for Information Transmission Problems of the Russian Academy of Sciences
Author for correspondence.
Email: sergey.guminov@phystech.edu
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701; 19, B.Karetny, Moscow, 127051
Yu. E. Nesterov
Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; Higher School of Economics
Email: sergey.guminov@phystech.edu
Belgium; 20, Myasnitskaya str., Moscow, 101000, Russian Federation
P. E. Dvurechensky
Institute for Information Transmission Problems of the Russian Academy of Sciences; Weierstrass Institute for Applied Analysis and Stochastics
Email: sergey.guminov@phystech.edu
Russian Federation, 19, B.Karetny, Moscow, 127051; Berlin, Germany
A. V. Gasnikov
Moscow Institute of Physics and Technology; Institute for Information Transmission Problems of the Russian Academy of Sciences
Email: sergey.guminov@phystech.edu
Russian Federation, 9, Institutskij, Dolgoprudny, Moscow region, 141701; 19, B.Karetny, Moscow, 127051
References
- Nesterov Yu.E. Effective Methods in Nonlinear Programming. Moscow: Radio i Svyaz, 1989. (in Russian)
- Waltz R. A., Morales J.L., Nocedal J., Orban D. An Interior Algorithm for Nonlinear Optimization that Combines Line Search and Trust Region Steps //Math. Progr. 2006. V. 107. № 3. P. 391–408.
- Narkiss G., Zibulevsky M. Sequential Subspace Optimization Method for Large-Scale Unconstrained Problems. Tech. Report CCIT № 559. Haifa: EE Dept., Technion, 2005.
- Nesterov Yu. Smooth Minimization of Non-Smooth Functions // Math. Progr. 2005. V. 103. № 1. P. 127–152.
- Nesterov Yu. Primal-Dual Subgradient Methods for Convex Problems // Math. Progr. 2009. V. 120. № 1. P. 221–259.
- Dvurechensky P., Gasnikov A., Kroshnin A. Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn’s Algorithm. Proc. of the 35th International Conference on Machine Learning. Stockholm: PMLR, 2018. P. 1367–1376.
- Dvurechensky P., Gasnikov A., Matsievsky S., Rodomanov A., Usik I. Primal-Dual Method for Searching Equilibrium in Hierarchical Congestion Population Games // Supplementary Proc. of the 9th Intern. Conf. on Discrete Optimization and Operations Research and Scientific School (DOOR 2016). CEUR-WS, 2016. P. 584–595. http://ceur-ws.org/Vol-1623/
- Chernov A., Dvurechensky P., Gasnikov A. Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints // Proc. of the 9th Intern. Conf. on Discrete Optimization and Operations Research (DOOR 2016). B.: Springer, 2016. P. 391–403.
- Nesterov Yu.E. A Method of Solving a Convex Programming Problem with Convergence Rate O(1/k²) // Dokl. Akad. Nauk SSSR. 1983. V. 269. № 3. P. 543–547. (in Russian)
- Nesterov Yu.E. Introduction to Convex Optimization. Moscow: MCCME, 2010. 280 p. (in Russian)
- Allen-Zhu Z., Orecchia L. Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent. Proc. of the 8th Innovations in Theoretical Computer Science. Saarbrücken: Schloss Dagstuhl, 2017.
- Nesterov Yu. Universal Gradient Methods for Convex Optimization Problems // Math. Progr. 2015. V. 152. № 1/2. P. 381–404.
- Guminov S., Gasnikov A., Anikin A., Gornov A. A Universal Modification of the Linear Coupling Method // Optim. Methods Softw. 2018. doi: 10.1080/10556788.2018.1517158.