Paper Title

On the superiority of PGMs to PDCAs in nonsmooth nonconvex sparse regression

Authors

Shummin Nakayama, Jun-ya Gotoh

Abstract

This paper conducts a comparative study of proximal gradient methods (PGMs) and proximal DC algorithms (PDCAs) for sparse regression problems, which can be cast as Difference-of-two-Convex-functions (DC) optimization problems. It has been shown that, for DC optimization problems, both the General Iterative Shrinkage and Thresholding algorithm (GIST), a modified version of PGM, and PDCA converge to critical points. Recently, some enhanced versions of PDCA have been shown to converge to d-stationary points, which satisfy a stronger necessary condition for local optimality than critical points. In this paper we claim that, without any modification, PGMs converge to d-stationary points not only for DC problems but also for more general nonsmooth nonconvex problems under some technical assumptions. While convergence to d-stationary points is well known for the case where the step size is sufficiently small, the finding of this paper also holds for extended versions such as GIST and its alternating optimization variant, which is developed in this paper. Numerical results show that, among several algorithms in the two categories, modified versions of PGM perform best not only in solution quality but also in computation time.
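As a brief illustration of the proximal gradient iteration compared in the abstract, the following is a minimal sketch of a PGM (ISTA-style) loop for the convex l1-penalized least-squares case. It is illustrative only and not the paper's algorithm: the paper's setting covers nonconvex DC-decomposable penalties, and GIST additionally uses a Barzilai-Borwein step size with a nonmonotone line search, all omitted here; the names `pgm_lasso` and `soft_threshold` are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pgm_lasso(A, b, lam, n_iter=500):
    # Minimal PGM sketch for: min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    # Constant step 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth part.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                         # gradient of smooth term
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x

# Tiny usage example on synthetic sparse data.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = pgm_lasso(A, b, lam=0.1)
```

For the nonconvex penalties studied in the paper, the soft-thresholding step would be replaced by the proximal mapping of the nonconvex regularizer; the abstract's claim is that such PGM iterations still converge to d-stationary points under the stated technical assumptions.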
