Paper Title

A gradient method exploiting the two dimensional quadratic termination property

Authors

Xinrui Li, Yakui Huang

Abstract

The quadratic termination property is important to the efficiency of gradient methods. We consider equipping a family of gradient methods, where the stepsize is given by the ratio of two norms, with two-dimensional quadratic termination. This desired property is achieved by incorporating a new stepsize, derived by maximizing the stepsize of the considered family at the next iteration. It is proved that each method in the family will asymptotically alternate in a two-dimensional subspace spanned by the eigenvectors corresponding to the largest and smallest eigenvalues. Based on this asymptotic behavior, we show that the new stepsize converges to the reciprocal of the largest eigenvalue of the Hessian. Furthermore, by adaptively taking the long Barzilai--Borwein stepsize and reusing the new stepsize with retard, we propose an efficient gradient method for unconstrained quadratic optimization. We prove that the new method is $R$-linearly convergent with a rate of $1-1/\kappa$, where $\kappa$ is the condition number of the Hessian. Numerical experiments show the efficiency of our proposed method.
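To make the setting concrete, the following is a minimal sketch of a gradient method using the long Barzilai--Borwein stepsize on an unconstrained quadratic, the baseline that the paper's adaptive scheme builds on. This is not the proposed method: the new stepsize from two-dimensional quadratic termination and the retard mechanism are not reproduced here, and the function name and initial-stepsize choice are our own illustrative assumptions.

```python
import numpy as np

def bb_gradient_quadratic(A, b, x0, max_iter=1000, tol=1e-8):
    """Gradient method with the long Barzilai--Borwein (BB1) stepsize
    for min 0.5 x^T A x - b^T x, with A symmetric positive definite.

    Illustrative sketch only; not the paper's adaptive method."""
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # crude initial stepsize (assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # Long BB stepsize: alpha_BB1 = (s^T s) / (s^T y);
        # s^T y > 0 holds for SPD A whenever s != 0.
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x
```

On a two-dimensional quadratic, the iterates of such ratio-of-norms methods eventually live in the span of the extreme eigenvectors, which is the asymptotic behavior the abstract exploits.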
