Paper Title

Simple and Effective Prevention of Mode Collapse in Deep One-Class Classification

Paper Authors

Penny Chong, Lukas Ruff, Marius Kloft, Alexander Binder

Paper Abstract


Anomaly detection algorithms find extensive use in various fields. This area of research has recently made great advances thanks to deep learning. A recent method, deep Support Vector Data Description (deep SVDD), which is inspired by the classic kernel-based Support Vector Data Description (SVDD), is capable of simultaneously learning a feature representation of the data and a data-enclosing hypersphere. The method has shown promising results in both unsupervised and semi-supervised settings. However, deep SVDD suffers from hypersphere collapse (also known as mode collapse) if the architecture of the model does not comply with certain constraints, e.g., the removal of bias terms. These constraints limit the adaptability of the model and, in some cases, may hurt model performance due to the learning of sub-optimal features. In this work, we consider two regularizers to prevent hypersphere collapse in deep SVDD. The first regularizer is based on injecting random noise via the standard cross-entropy loss. The second regularizer penalizes the minibatch variance when it becomes too small. Moreover, we introduce an adaptive weighting scheme to control the amount of penalization between the SVDD loss and the respective regularizer. Our proposed regularized variants of deep SVDD show encouraging results and outperform a prominent state-of-the-art method on a setup where the anomalies have no apparent geometrical structure.
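To make the second regularizer concrete, below is a minimal PyTorch sketch of a deep SVDD objective combined with a minibatch-variance penalty. This is an illustration of the idea described in the abstract, not the paper's exact formulation: the hinge form of the penalty, the `var_threshold` parameter, and the fixed `weight` are assumptions made for this sketch, and the paper's adaptive weighting scheme is only noted in a comment rather than reproduced.

```python
import torch

def svdd_with_variance_penalty(z, center, var_threshold=0.1, weight=1.0):
    """Deep SVDD loss plus a minibatch-variance regularizer (sketch).

    z:      minibatch of embeddings, shape (batch, dim)
    center: fixed hypersphere center, shape (dim,)

    The hinge form and `var_threshold` are illustrative assumptions;
    the paper's exact regularizer and weighting scheme may differ.
    """
    # One-class deep SVDD term: mean squared distance of the embedded
    # minibatch to the fixed hypersphere center.
    svdd_loss = ((z - center) ** 2).sum(dim=1).mean()

    # Average per-dimension variance of the minibatch embeddings.
    # Hypersphere (mode) collapse drives this quantity toward zero.
    batch_var = z.var(dim=0).mean()

    # Penalize only when the variance drops below the threshold,
    # discouraging the trivial constant ("collapsed") mapping.
    var_penalty = torch.clamp(var_threshold - batch_var, min=0.0)

    # The paper balances the two terms with an adaptive weighting
    # scheme; a fixed `weight` is used here for simplicity.
    return svdd_loss + weight * var_penalty

# Example usage with random embeddings.
z = torch.randn(64, 32)   # batch of 64 embeddings of dimension 32
c = torch.zeros(32)       # hypersphere center
loss = svdd_with_variance_penalty(z, c)
```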
