Paper Title

On the Rademacher Complexity of Linear Hypothesis Sets

Paper Authors

Pranjal Awasthi, Natalie Frank, Mehryar Mohri

Paper Abstract

Linear predictors form a rich class of hypotheses used in a variety of learning algorithms. We present a tight analysis of the empirical Rademacher complexity of the family of linear hypothesis classes with weight vectors bounded in $\ell_p$-norm for any $p \geq 1$. This provides a tight analysis of generalization using these hypothesis sets and helps derive sharp data-dependent learning guarantees. We give both upper and lower bounds on the Rademacher complexity of these families and show that our bounds improve upon or match existing bounds, which are known only for $1 \leq p \leq 2$.
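For context, the quantity studied in the abstract can be written out explicitly. The sketch below uses the standard definition of empirical Rademacher complexity together with Hölder (dual-norm) duality; the notation ($W$ for the weight-norm bound, $S = (x_1, \dots, x_m)$ for the sample, $q$ for the conjugate exponent) is chosen here for illustration and may differ from the paper's own.

$$
H_p = \{\, x \mapsto w \cdot x \;:\; \|w\|_p \le W \,\}, \qquad
\widehat{\mathfrak{R}}_S(H_p)
= \frac{1}{m}\,\mathbb{E}_{\boldsymbol{\sigma}}\!\left[\sup_{\|w\|_p \le W} \sum_{i=1}^{m} \sigma_i\, w \cdot x_i\right]
= \frac{W}{m}\,\mathbb{E}_{\boldsymbol{\sigma}}\!\left[\,\Bigl\|\sum_{i=1}^{m} \sigma_i x_i\Bigr\|_q\right],
$$

where the $\sigma_i$ are independent uniform $\pm 1$ (Rademacher) variables and $\tfrac{1}{p} + \tfrac{1}{q} = 1$. Upper and lower bounds on this expectation, of the kind the paper derives, translate into data-dependent generalization guarantees through standard Rademacher-complexity arguments.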
