Paper Title
Coordinate Descent Methods for Fractional Minimization
Paper Authors
Paper Abstract
We consider a class of structured fractional minimization problems, in which the numerator part of the objective is the sum of a differentiable convex function and a convex non-smooth function, while the denominator part is a convex or concave function. This problem is difficult to solve since it is non-convex. By exploiting the structure of the problem, we propose two Coordinate Descent (CD) methods for solving it. The proposed methods iteratively solve a one-dimensional subproblem \textit{globally}, and they are guaranteed to converge to coordinate-wise stationary points. In the case of a convex denominator, under a weak \textit{locally bounded non-convexity condition}, we prove that the optimality of coordinate-wise stationary points is stronger than that of standard critical points and directional points. Under additional suitable conditions, the CD methods converge Q-linearly to coordinate-wise stationary points. In the case of a concave denominator, we show that any critical point is a global minimum, and the CD methods converge to the global minimum with a sublinear convergence rate. We demonstrate the applicability of the proposed methods to some machine learning and signal processing models. Our experiments on real-world data show that our methods significantly and consistently outperform existing methods in terms of accuracy.
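As a reading aid, the structured problem described above can be sketched as follows; the symbols $f$, $h$, $g$, and $\mathbf{e}_i$ below are our own illustrative notation and need not match the paper's:

\[
\min_{\mathbf{x} \in \mathbb{R}^n} \; \frac{f(\mathbf{x}) + h(\mathbf{x})}{g(\mathbf{x})},
\]

where $f$ is differentiable and convex, $h$ is convex and possibly non-smooth, and $g$ is convex or concave (assumed positive so that the ratio is well-defined). In this reading, a generic CD step fixes all but the $i$-th coordinate and updates

\[
t^{\star} \in \arg\min_{t \in \mathbb{R}} \; \frac{f(\mathbf{x} + t\,\mathbf{e}_i) + h(\mathbf{x} + t\,\mathbf{e}_i)}{g(\mathbf{x} + t\,\mathbf{e}_i)}, \qquad \mathbf{x} \leftarrow \mathbf{x} + t^{\star}\,\mathbf{e}_i,
\]

with the one-dimensional subproblem solved \textit{globally}, as stated in the abstract; the paper's actual subproblem (e.g., whether it includes an additional proximal or quadratic term) may differ from this sketch.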