Paper title
A Hierarchical Convex Optimization for Multiclass SVM Achieving Maximum Pairwise Margins with Least Empirical Hinge-Loss
Paper authors
Paper abstract
In this paper, we newly formulate a hierarchical convex optimization for multiclass SVM that achieves maximum pairwise margins with least empirical hinge loss. This optimization problem is the most faithful, as well as robust, multiclass extension of an NP-hard hierarchical optimization that appeared for the first time in the seminal paper by C.~Cortes and V.~Vapnik almost 25 years ago. By extending the very recent fixed-point-theoretic idea [Yamada-Yamagishi 2019] with the generalized hinge loss function [Crammer-Singer 2001], we show that the hybrid steepest descent method [Yamada 2001] from computational fixed point theory is applicable to this much more complex hierarchical convex optimization problem.
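The generalized hinge loss [Crammer-Singer 2001] referenced in the abstract can be sketched for a single sample as follows. This is a minimal illustration of the standard Crammer–Singer loss, not code from the paper; the function name and arguments are hypothetical.

```python
def multiclass_hinge_loss(scores, y):
    """Crammer-Singer multiclass hinge loss for one sample.

    scores: per-class classifier scores f_k(x)
    y: index of the true class
    Returns max(0, 1 + max_{k != y} f_k(x) - f_y(x)),
    i.e. zero only when the true class beats every rival
    class by a margin of at least 1.
    """
    # Score of the strongest competing (wrong) class.
    worst_rival = max(s for k, s in enumerate(scores) if k != y)
    return max(0.0, 1.0 + worst_rival - scores[y])
```

For example, with scores `[2.0, 0.5, 1.0]` and true class `0`, the true class wins by a margin of 1, so the loss is 0; with scores `[1.0, 1.5, 0.2]` and true class `0`, the loss is 1.5.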