Journal of Chaohu University ›› 2019, Vol. 21 ›› Issue (6): 70-74.doi: 10.12152/j.issn.1672-2868.2019.06.010


On Accelerated Gradient Approximation for Least Square Regression with L1-regularization

1. School of Mathematics and Statistics, Chaohu University, Chaohu, Anhui 238000; 2. Foundation Department, Urban Construction College of AHJZU, Hefei, Anhui 238076

  • Received: 2019-10-18  Online: 2019-11-25  Published: 2020-03-13
  • Contact: CHENG Yi-yuan, School of Mathematics and Statistics, Chaohu University
  • About author: CHENG Yi-yuan, School of Mathematics and Statistics, Chaohu University

Abstract: In this paper, we present an in-depth and systematic study of the convergence rate of stochastic optimization. We consider the least-squares regression problem whose objective function contains an L1-regularization term, and we propose an effective accelerated stochastic approximation algorithm. Under a non-strong-convexity condition, and by approximating the L1-regularization term with a smooth function, we analyze the learning algorithm and derive its convergence rate. This result improves on previous convergence results.
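The abstract's central idea, smoothing the nondifferentiable L1 penalty so that an accelerated gradient method applies, can be illustrated with a minimal sketch. The smoothing function, step size, and parameter values below are illustrative assumptions, not the paper's exact algorithm: the penalty |w| is replaced by the smooth surrogate sqrt(w^2 + mu^2) - mu, and a standard Nesterov-accelerated gradient loop minimizes the smoothed least-squares objective on synthetic data.

```python
import numpy as np

# Illustrative sketch (NOT the paper's exact method): accelerated gradient
# descent on least-squares regression with a smoothed L1 penalty.
rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.5, 1.0]              # sparse ground-truth coefficients
y = X @ w_true + 0.01 * rng.standard_normal(n)

lam, mu = 0.1, 1e-3                        # penalty weight, smoothing parameter

def loss(w):
    # 1/(2n) ||Xw - y||^2 + lam * sum( sqrt(w_i^2 + mu^2) - mu )
    return 0.5 / n * np.sum((X @ w - y) ** 2) \
        + lam * np.sum(np.sqrt(w ** 2 + mu ** 2) - mu)

def grad(w):
    # Gradient of the smoothed objective (the surrogate is differentiable).
    return X.T @ (X @ w - y) / n + lam * w / np.sqrt(w ** 2 + mu ** 2)

# Lipschitz constant of the gradient: spectral part plus lam/mu from smoothing.
L = np.linalg.norm(X, 2) ** 2 / n + lam / mu

w = np.zeros(d)                            # iterate
z = np.zeros(d)                            # extrapolated (momentum) point
t = 1.0
for _ in range(500):
    w_next = z - grad(z) / L               # gradient step at extrapolated point
    t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    z = w_next + (t - 1.0) / t_next * (w_next - w)   # Nesterov momentum
    w, t = w_next, t_next
```

Shrinking mu tightens the approximation to the true L1 penalty but increases the Lipschitz constant lam/mu, forcing a smaller step size; this trade-off is exactly why the convergence analysis of such smoothed schemes is nontrivial.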

Key words: stochastic optimization, least square regression, learning algorithm, convergence speed

CLC Number: TP181