Seminar No. 2065: Smoothing fast iterative hard thresholding algorithm for L0 regularized nonsmooth convex regression problem

Posted: 2020/12/15  Tan Fuping

Title: Smoothing fast iterative hard thresholding algorithm for L0 regularized nonsmooth convex regression problem

Speaker: Prof. Bian Wei (Harbin Institute of Technology)

Time: Monday, December 21, 2020, 9:30

Venue: G507

Host: Xu Zi

Organizer: Department of Mathematics, College of Sciences

Abstract: We first investigate a class of constrained sparse regression problems with cardinality penalty, where the feasible set is a box constraint and the loss function is convex but not differentiable. We propose a smoothing fast iterative hard thresholding (SFIHT) algorithm for solving such optimization problems, which combines smoothing approximations, extrapolation techniques and iterative hard thresholding methods. The extrapolation coefficients $\beta_k$ in the proposed algorithm can satisfy $\lim_{k\to\infty}\beta_k = 1$. We establish that any accumulation point of the iterative sequence is a local minimizer of the original cardinality penalty problem. We then consider the case where the loss function is differentiable, and propose a fast iterative hard thresholding (FIHT) algorithm to solve such problems. We prove that the iterates converge to a local minimizer of the problem that satisfies a lower bound property. In particular, we show that the convergence rate of the corresponding objective function value sequence is $O(k^{-2})$. Finally, we present some numerical examples to illustrate the theoretical results.
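To give a flavor of the type of iteration the abstract describes, below is a minimal sketch of an accelerated iterative hard thresholding loop for the differentiable case (an L0-penalized least-squares model). This is an illustrative assumption, not the speaker's exact FIHT algorithm: the loss, the Nesterov-type extrapolation coefficients, and the threshold level $\sqrt{2\lambda/L}$ are all standard choices swapped in for concreteness.

```python
import numpy as np

def fiht_sketch(A, b, lam, n_iter=500):
    """Illustrative accelerated IHT sketch (assumed form, not the talk's exact
    FIHT) for min_x 0.5*||Ax - b||^2 + lam * ||x||_0."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    thresh = np.sqrt(2 * lam / L)          # hard-threshold level for the L0 penalty
    x_prev = x = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        t_next = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        beta = (t - 1) / t_next            # Nesterov-type coefficients; beta_k -> 1
        y = x + beta * (x - x_prev)        # extrapolation step
        z = y - (A.T @ (A @ y - b)) / L    # gradient step on the smooth loss
        x_prev, x = x, np.where(np.abs(z) > thresh, z, 0.0)  # hard thresholding
        t = t_next
    return x
```

For the nonsmooth convex loss treated by SFIHT, the gradient step above would be taken on a smoothing approximation of the loss, with the smoothing parameter driven to zero along the iterations.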


Faculty and students are welcome to attend!

