Doklady Akademii Nauk (ISSN 0869-5652), The Russian Academy of Sciences
DOI: 10.31857/S0869-5652485115-18
Research Article. Vol. 485, No. 1, pp. 15-18. Dates: 19.05.2019; 22.05.2019.
Copyright © 2019, Russian Academy of Sciences.

Primal-dual accelerated gradient descent with line search for convex and nonconvex optimization problems

S. V. Guminov, Yu. E. Nesterov, P. E. Dvurechensky, A. V. Gasnikov
Corresponding author: sergey.guminov@phystech.edu

Affiliations: Moscow Institute of Physics and Technology; Institute for Information Transmission Problems of the Russian Academy of Sciences; Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; Higher School of Economics; Weierstrass Institute for Applied Analysis and Stochastics.

Abstract: In this paper, a new variant of accelerated gradient descent is proposed. The method requires no prior information about the objective function, uses exact line search to accelerate convergence in practice, converges at the rates given by the known lower bounds for both convex and nonconvex objective functions, and possesses primal-dual properties. We also provide a universal version of this method, which converges at the known lower-bound rates for both smooth and nonsmooth problems.

Keywords: accelerated gradient descent; line search; primal-dual methods; convex optimization; nonconvex optimization.
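The combination the abstract describes, accelerated (momentum-based) gradient steps with an exact line search for the step size, can be illustrated with a minimal sketch. This is a generic Nesterov-type scheme applied to a quadratic, where the exact line search has a closed form; it is an illustrative assumption, not the authors' exact algorithm, and the function name and test problem are invented for the example.

```python
import numpy as np

def accel_gd_linesearch(A, b, x0, iters=100):
    """Minimize f(x) = 0.5 * x^T A x - b^T x (A symmetric positive definite)
    with a Nesterov-type accelerated gradient method. The step size comes
    from an exact line search, which is available in closed form for
    quadratics. Generic sketch, not the method from the paper."""
    x = x0.copy()
    y = x0.copy()
    for k in range(iters):
        g = A @ y - b  # gradient of f at the extrapolated point y
        # Exact line search: t minimizing f(y - t*g) is (g^T g) / (g^T A g).
        denom = g @ (A @ g)
        t = (g @ g) / denom if denom > 0 else 0.0
        x_new = y - t * g
        # Standard momentum extrapolation with weight k / (k + 3).
        y = x_new + k / (k + 3) * (x_new - x)
        x = x_new
    return x
```

For a quadratic objective the exact line search replaces any knowledge of the Lipschitz constant of the gradient, which mirrors the abstract's claim that the method needs no prior information about the objective.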