Paper Title
Safe Screening Rules for $\ell_0$-Regression
Paper Authors
Paper Abstract
We give safe screening rules to eliminate variables from regression with $\ell_0$ regularization or a cardinality constraint. These rules are based on guarantees that a feature may or may not be selected in an optimal solution. The screening rules can be computed from a convex relaxation solution in linear time, without solving the $\ell_0$ optimization problem. Thus, they can be used in a preprocessing step to safely remove variables from consideration a priori. Numerical experiments on real and synthetic data indicate that, on average, 76\% of the variables can be fixed to their optimal values, thereby substantially reducing the computational burden of optimization. Therefore, the proposed fast and effective screening rules extend the scope of algorithms for $\ell_0$-regression to larger data sets.
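To make the described preprocessing workflow concrete, the minimal sketch below illustrates the general pattern of screening before an exact solve: fit a convex relaxation, use its solution to decide which features to keep, and pass only the reduced problem to an exact $\ell_0$ solver. Note the assumptions: the relaxation here is an off-the-shelf Lasso rather than the relaxation analyzed in the paper, and the screening test is a hypothetical placeholder threshold, not the paper's certified safe rules.

```python
# Illustrative sketch only: an off-the-shelf Lasso stands in for the paper's
# convex relaxation, and the screening test is a placeholder threshold rather
# than the paper's certified safe rules.
import numpy as np
from sklearn.linear_model import Lasso

def screen_and_reduce(X, y, alpha=0.1, tol=1e-8):
    """Fit a convex relaxation and return the indices of features that a
    screening step would keep for the exact l0 solve, plus the reduced X."""
    relaxed = Lasso(alpha=alpha).fit(X, y)
    coef = relaxed.coef_
    # Placeholder test: drop features whose relaxed coefficient is numerically
    # zero. The paper's rules instead certify, in linear time, whether a
    # feature can or cannot appear in an optimal l0 solution.
    keep = np.flatnonzero(np.abs(coef) > tol)
    return keep, X[:, keep]

# Usage on synthetic data: only the reduced design matrix would be handed to
# an exact l0 solver, shrinking the search space.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
beta = np.zeros(50)
beta[:5] = 1.0
y = X @ beta + 0.1 * rng.standard_normal(100)
kept, X_reduced = screen_and_reduce(X, y)
print(f"Kept {kept.size} of {X.shape[1]} features:", kept)
```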