Machine Learning Lab PhD Student Forum (Session 18): Recent Progress on the Explicit Superlinear Convergence Rates of Quasi-Newton Methods
Speaker: Dachao Lin (PKU)
Time: 2021-11-10, 15:10-16:10
Venue: Room 1513, Science Building No. 1, Peking University & Tencent Meeting 761 4699 1810
Abstract: Quasi-Newton methods have a reputation as the most efficient numerical schemes for smooth unconstrained optimization. Their most attractive feature is superlinear convergence. However, the classical convergence results are only asymptotic: they guarantee that the error ratio eventually vanishes, but give no explicit, non-asymptotic rate.
Recently, Rodomanov and Nesterov (2021a, 2021b, 2021c) were the first to establish explicit superlinear convergence rates for the greedy and standard quasi-Newton methods. In this talk we go through these three papers to gain a rough understanding of these well-known quasi-Newton methods.
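For readers unfamiliar with the setting, the following is a minimal LaTeX sketch of the standard BFGS update and of what superlinear convergence means; the notation is ours for illustration and is not taken from the talk or the papers.

% Standard BFGS update of the Hessian approximation B_k (illustrative notation).
% s_k is the step and y_k the gradient difference at iteration k.
\begin{align*}
  s_k &= x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k), \\
  B_{k+1} &= B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{y_k^\top s_k}.
\end{align*}
% Superlinear convergence means the error ratio itself tends to zero:
\[
  \lim_{k \to \infty} \frac{\|x_{k+1} - x^*\|}{\|x_k - x^*\|} = 0.
\]
% An "explicit" rate, in the sense discussed in the talk, bounds this ratio by a
% concrete quantity at every iteration k, rather than only asserting the limit.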