Convergence of Adaptive Stochastic Mirror Descent
Speaker: Ting Hu (胡婷)
Affiliation: Xi'an Jiaotong University
Time: November 16, 10:30–11:30
Venue: Room 7215, Teaching Building 7
Abstract: In this talk, we present a family of adaptive optimization methods associated with mirror maps, which are widely used to capture the geometric properties of optimization problems during the iteration process. The well-known adaptive moment estimation (Adam) type algorithms fall into this family when the mirror maps take the form of temporal adaptation. For convex objective functions, we show that with proper step sizes and hyperparameters, the average regret achieves the convergence rate after iterations under some standard assumptions, and we further improve this rate when the objective functions are strongly convex. For smooth (not necessarily convex) objective functions, using properties of the strongly convex differentiable mirror map, our algorithms achieve convergence rates of the given order up to a logarithmic term, requiring large or increasing hyperparameters that coincide with the practical usage of Adam-type algorithms. Our work thus offers an explanation for the choice of hyperparameters in implementations of Adam-type algorithms.
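To illustrate the algorithm family discussed in the abstract, the following is a minimal sketch (not the speaker's exact method) of the standard Adam update, in which the diagonal second-moment preconditioner plays the role of a temporally adaptive strongly convex mirror map; the function name and all parameter defaults are illustrative assumptions.

```python
import numpy as np

def adam_sketch(grad, x0, steps=500, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """Standard Adam iteration; diag(sqrt(v_t)) acts as a time-varying
    (temporally adaptive) mirror map, as in the abstract's framing."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (gradient) estimate
    v = np.zeros_like(x)  # second-moment estimate
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)  # bias correction
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x
```

With the quadratic objective f(x) = ||x||^2 (gradient 2x), the iterates approach the minimizer at the origin; note that with beta2 close to 1 the preconditioner changes slowly, matching the "large or increasing hyperparameters" regime mentioned above.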
Speaker biography: Ting Hu is a professor and doctoral supervisor at the School of Management, Xi'an Jiaotong University. She was selected for the Class A Young Top Talent program of Xi'an Jiaotong University and is the principal investigator of four National Natural Science Foundation of China (NSFC) grants. Her research focuses on the theoretical study of mathematical problems and learning algorithms in machine learning. She has published a series of papers in influential journals in applied mathematics and machine learning, including Applied and Computational Harmonic Analysis, Journal of Machine Learning Research, IEEE Transactions on Signal Processing, Inverse Problems, Constructive Approximation, Neural Networks, and Journal of Multivariate Analysis.