Seminar No. 2228: An Introduction to Hyperparameter Optimization

Created: 2021/12/13 by 谭福平

Title: An Introduction to Hyperparameter Optimization

Speaker: Associate Professor 方慧 (Shanghai University of Finance and Economics)

Time: 16:00 - 17:00, Tuesday, December 14, 2021

Place: G507

Inviter: 余长君


Abstract: Machine learning (ML) has been widely adopted in both academia and industry. Building an effective machine learning model is a time-consuming process that involves obtaining an optimal model architecture with fine-tuned hyperparameters. In addition, recent interest in complex ML models with relatively large numbers of hyperparameters (e.g., AutoML and deep learning methods) has led to a growing body of studies on hyperparameter optimization (HPO).

In this talk, I will first formally define the HPO problem and give an overview of existing work in this field of research. Secondly, three types of HPO methods, namely sampling-based, model-based, and gradient-based methods, will be elaborated. Finally, I will conclude the talk by summarizing the challenging issues that remain on this topic.
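As a simple illustration of the sampling-based family mentioned above (not taken from the talk itself), the Python sketch below performs random search over two hyperparameters. The objective validation_error is a hypothetical stand-in for training a model and measuring its validation loss, and the search-space bounds are illustrative assumptions.

import random

# Hypothetical validation-error function of two hyperparameters:
# learning rate (searched on a log scale) and regularization strength.
# In practice this would train a model and return its validation loss.
def validation_error(learning_rate, reg_strength):
    return (learning_rate - 0.01) ** 2 + 0.5 * (reg_strength - 0.1) ** 2

def random_search(n_trials=50, seed=0):
    rng = random.Random(seed)
    best_config, best_error = None, float("inf")
    for _ in range(n_trials):
        # Sample each hyperparameter from its (assumed) search space.
        config = {
            "learning_rate": 10 ** rng.uniform(-4, 0),  # log-uniform in [1e-4, 1]
            "reg_strength": rng.uniform(0.0, 1.0),
        }
        error = validation_error(**config)
        if error < best_error:
            best_config, best_error = config, error
    return best_config, best_error

if __name__ == "__main__":
    config, error = random_search()
    print("best configuration:", config, "validation error:", error)

Model-based methods (e.g., Bayesian optimization) replace the uniform sampling step with a surrogate model that proposes promising configurations, while gradient-based methods differentiate the validation loss with respect to the hyperparameters directly.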

