Journal of Chaohu University ›› 2021, Vol. 23 ›› Issue (3): 51-54+60. doi: 10.12152/j.issn.1672-2868.2021.03.007
• Mathematical and Physical Sciences •
FEI Jing-tai, ZHA Xing-xing, WANG Dong-yin: School of Mathematics and Statistics, Chaohu University
Received:
Online:
Published:
About the authors:
Funding:
Abstract: This paper studies the convergence rate of the stochastic Classical Momentum (CM) algorithm. By modifying the iterative formula of the traditional stochastic gradient descent algorithm with momentum, we obtain the convergence rate of the algorithm under non-strongly convex and non-smooth conditions. When the momentum coefficient pt is a constant, the algorithm achieves a convergence rate of O; when pt varies with t, convergence rates are obtained by setting different learning rates. Finally, numerical experiments demonstrate the soundness of the results.
Key words: machine learning, stochastic Classical Momentum algorithm, convergence rate, momentum coefficient, learning rate
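As a rough illustration only (the paper modifies this iteration, and its exact formula and step-size schedule are not given on this page), the classical momentum (heavy-ball) update that the stochastic CM algorithm builds on can be sketched as follows. The objective f(x) = |x|, the noise model, and the decaying learning rate a_t = lr / sqrt(t+1) are all illustrative assumptions, chosen because |x| is convex but neither strongly convex nor smooth:

```python
import random

def cm_sgd(grad, x0, steps, lr=0.1, momentum=0.9):
    """Stochastic gradient descent with classical (heavy-ball) momentum.

    Update: v_{t+1} = p * v_t - a_t * g_t,  x_{t+1} = x_t + v_{t+1},
    where g_t is a stochastic (sub)gradient, p is the momentum
    coefficient, and a_t is the learning rate (here a_t = lr/sqrt(t+1),
    a common choice for non-strongly convex, non-smooth objectives).
    """
    x, v = x0, 0.0
    for t in range(steps):
        a_t = lr / (t + 1) ** 0.5
        v = momentum * v - a_t * grad(x)
        x = x + v
    return x

# Illustrative non-smooth convex objective f(x) = |x|,
# queried through a noisy subgradient oracle.
def noisy_subgrad(x):
    g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
    return g + random.gauss(0.0, 0.1)

random.seed(0)
x_final = cm_sgd(noisy_subgrad, x0=5.0, steps=2000)
print(abs(x_final))  # distance to the minimizer x* = 0
```

With a constant momentum coefficient and a decaying learning rate, the iterates drift toward the minimizer and then oscillate in a shrinking band around it, which is the regime whose convergence order the paper analyzes.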
CLC number:
FEI Jing-tai, ZHA Xing-xing, WANG Dong-yin. Convergence of Stochastic Classical Momentum Algorithm with Non-strongly Convex and Non-smooth[J]. Journal of Chaohu University, 2021, 23(3): 51-54+60.
Link to this article: http://xb.chu.edu.cn/CN/10.12152/j.issn.1672-2868.2021.03.007
http://xb.chu.edu.cn/CN/Y2021/V23/I3/51