Title

Bias-Variance Decompositions for Margin Losses

Authors

Danny Wood, Tingting Mu, Gavin Brown

Abstract

We introduce a novel bias-variance decomposition for a range of strictly convex margin losses, including the logistic loss (minimized by the classic LogitBoost algorithm), as well as the squared margin loss and canonical boosting loss. Furthermore, we show that, for all strictly convex margin losses, the expected risk decomposes into the risk of a "central" model and a term quantifying variation in the functional margin with respect to variations in the training data. These decompositions provide a diagnostic tool for practitioners to understand model overfitting/underfitting, and have implications for additive ensemble models -- for example, when our bias-variance decomposition holds, there is a corresponding "ambiguity" decomposition, which can be used to quantify model diversity.
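To make the shape of these results concrete, the following is a minimal schematic sketch, not taken verbatim from the paper: the symbols $\bar{f}$ (the "central" model), $V$ (the margin-variation term), and $A$ (the ambiguity term) are illustrative assumptions whose precise definitions depend on the loss and are given in the paper. For a strictly convex margin loss $\ell$ and a model $f_D$ trained on data $D$, the expected risk at a point $(x, y)$ takes the form

$$
\mathbb{E}_{D}\!\left[\ell\big(y\,f_D(x)\big)\right]
\;=\;
\underbrace{\ell\big(y\,\bar{f}(x)\big)}_{\text{risk of the central model}}
\;+\;
\underbrace{\mathbb{E}_{D}\!\left[V\big(y\,f_D(x),\,y\,\bar{f}(x)\big)\right]}_{\text{variation of the functional margin over }D},
$$

and, schematically, when such a decomposition holds for an additive ensemble $f_{\mathrm{ens}}$ built from members $f_1, \dots, f_M$, the corresponding ambiguity decomposition relates the average member risk to the ensemble risk via a non-negative diversity term:

$$
\frac{1}{M}\sum_{i=1}^{M}\ell\big(y\,f_i(x)\big)
\;=\;
\ell\big(y\,f_{\mathrm{ens}}(x)\big)
\;+\;
\underbrace{A\big(f_1,\dots,f_M;\,x,y\big)}_{\text{ambiguity (diversity) term}}.
$$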
