Seminar No. 2909: Chebyshev Polynomial Acceleration of the Stochastic Newton Method in Machine Learning

Created: 2025/10/09  By: 谭福平

Title: Chebyshev polynomial acceleration of stochastic Newton method for machine learning

Speaker: Prof. 潘建瑜 (East China Normal University)

Time: Tuesday, October 14, 2025, 15:00

Place: Room GJ303, Main Campus

Inviter: 刘巧华


Abstract: In this talk, we consider the acceleration of the stochastic Newton method for large-scale optimization problems arising in machine learning. To reduce the cost of computing the Hessian and its inverse, we propose to approximate the Hessian inverse with a Chebyshev polynomial. We show that, by exploiting the short-term recurrence formula, the Chebyshev polynomial approximation can effectively reduce the computational cost. A convergence analysis is given, and experiments on multiple benchmarks are carried out to illustrate the performance of the proposed acceleration method.
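The core idea in the abstract, approximating the action of the Hessian inverse through a Chebyshev polynomial built from a short-term (three-term) recurrence, can be sketched with the classical Chebyshev iteration. This is a generic illustration, not the speaker's actual algorithm: it approximates p = H⁻¹g for a symmetric positive definite Hessian H using only matrix-vector products, and it assumes bounds `lmin`/`lmax` on the spectrum of H are available.

```python
import numpy as np

def chebyshev_solve(H, g, lmin, lmax, num_iters=50):
    """Approximate p = H^{-1} g for SPD H with eigenvalues in [lmin, lmax],
    using the classical Chebyshev iteration (a three-term recurrence).
    Only matrix-vector products with H are needed -- no factorization."""
    d = (lmax + lmin) / 2.0   # center of the spectral interval
    c = (lmax - lmin) / 2.0   # half-width of the spectral interval
    x = np.zeros_like(g)
    r = g.copy()              # residual g - H x, with x = 0
    p = np.zeros_like(g)
    alpha = 0.0
    for k in range(num_iters):
        if k == 0:
            p = r.copy()
            alpha = 1.0 / d
        else:
            # Chebyshev recurrence coefficients
            beta = 0.5 * (c * alpha) ** 2 if k == 1 else (c * alpha / 2.0) ** 2
            alpha = 1.0 / (d - beta / alpha)
            p = r + beta * p
        x = x + alpha * p
        r = r - alpha * (H @ p)
    return x

# Toy example: a small SPD "Hessian" with known spectral bounds.
# (In a stochastic Newton setting, H would be a subsampled Hessian.)
H = np.array([[4.0, 1.0], [1.0, 3.0]])   # eigenvalues ~ 2.38 and 4.62
g = np.array([1.0, 2.0])
p = chebyshev_solve(H, g, lmin=2.0, lmax=5.0, num_iters=40)
```

Because each step needs only one product H @ p and a fixed number of vectors, the cost per Newton step stays linear in the problem size, which is the cost reduction the abstract attributes to the short-term recurrence.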
