Journal of Chaohu University ›› 2019, Vol. 21 ›› Issue (6): 70-74. doi: 10.12152/j.issn.1672-2868.2019.06.010

• Mathematical Sciences •

On Accelerated Gradient Approximation for Least Square Regression with L1-regularization

CHENG Yi-yuan

1. School of Mathematics and Statistics, Chaohu University, Chaohu, Anhui 238000; 2. Department of Basic Courses, Urban Construction College of AHJZU, Hefei, Anhui 238076

  • Received: 2019-10-18  Online: 2019-11-25  Published: 2020-03-13
  • Contact: CHENG Yi-yuan, School of Mathematics and Statistics, Chaohu University
  • About the author: CHENG Yi-yuan (1992-), male, a native of Anqing, Anhui; teaching assistant at the School of Mathematics and Statistics, Chaohu University; main research interest: machine learning.
  • Supported by: Anhui Province Young Talents Support Project for Universities (Grant No. gxyq2019082); Chaohu University Institutional Research Project (Grant No. XLY-201903); Provincial College Students' Innovation and Entrepreneurship Training Program of Chaohu University (Grant No. S201910380067)



Abstract: In this paper, we study the convergence rate of stochastic optimization algorithms. We consider the least-squares regression problem whose objective function contains an L1 regularization term, and propose an effective accelerated stochastic approximation algorithm. Based on a non-strong-convexity condition, and by approximating the L1 regularizer with a smooth function, we analyze the learning algorithm and derive its convergence rate. This result improves on previous convergence results.
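To make the setting concrete (the paper's exact formulation is not reproduced on this page, so the notation and the particular smoothing below are assumptions, not the authors' definitions): the objective combines a least-squares loss with an L1 penalty, and the non-smooth penalty is replaced by a smooth surrogate before a stochastic gradient method is applied.

```latex
% L1-regularized least-squares objective (assumed notation):
F(w) \;=\; \frac{1}{2n}\sum_{i=1}^{n}\bigl(\langle w, x_i\rangle - y_i\bigr)^{2} \;+\; \lambda\,\lVert w\rVert_{1}
% One standard smooth approximation of |t| with parameter \mu > 0
% (the paper's choice of smoothing function may differ):
\phi_{\mu}(t) \;=\; \sqrt{t^{2}+\mu^{2}} - \mu, \qquad
F_{\mu}(w) \;=\; \frac{1}{2n}\sum_{i=1}^{n}\bigl(\langle w, x_i\rangle - y_i\bigr)^{2} \;+\; \lambda\sum_{j=1}^{d}\phi_{\mu}(w_{j})
```

A minimal runnable sketch of this idea in Python, assuming Nesterov-style acceleration with an O(1/sqrt(t)) step size; the algorithm, step-size schedule, and momentum weights in the paper itself may differ:

```python
import numpy as np

def smoothed_l1_grad(w, mu):
    """Gradient of the smooth surrogate phi_mu(t) = sqrt(t^2 + mu^2) - mu."""
    return w / np.sqrt(w**2 + mu**2)

def accelerated_sgd(X, y, lam=0.1, mu=1e-3, n_iters=10000, seed=0):
    """Nesterov-style accelerated stochastic gradient descent on the
    smoothed L1-regularized least-squares objective. A generic sketch,
    not the paper's exact method."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)          # current iterate
    z = np.zeros(d)          # extrapolated (momentum) point
    for t in range(1, n_iters + 1):
        i = rng.integers(n)                  # sample one data point
        eta = 1.0 / np.sqrt(t)               # assumed O(1/sqrt(t)) step size
        # stochastic gradient of the smoothed objective at the extrapolated point
        g = (z @ X[i] - y[i]) * X[i] + lam * smoothed_l1_grad(z, mu)
        w_new = z - eta * g                          # gradient step
        z = w_new + (t - 1) / (t + 2) * (w_new - w)  # Nesterov extrapolation
        w = w_new
    return w
```

For example, on synthetic data one would call `accelerated_sgd(np.random.randn(200, 10), np.random.randn(200))`; as `mu` decreases, the smoothed objective approaches the original L1-regularized one.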

Key words: stochastic optimization, least square regression, learning algorithm, convergence speed

CLC number: TP181