Paper Title
Be Like Water: Robustness to Extraneous Variables Via Adaptive Feature Normalization
Paper Authors
Abstract
Extraneous variables are variables that are irrelevant for a certain task, but heavily affect the distribution of the available data. In this work, we show that the presence of such variables can degrade the performance of deep-learning models. We study three datasets where there is a strong influence of known extraneous variables: classification of upper-body movements in stroke patients, annotation of surgical activities, and recognition of corrupted images. Models trained with batch normalization learn features that are highly dependent on the extraneous variables. In batch normalization, the statistics used to normalize the features are learned from the training set and fixed at test time, which produces a mismatch in the presence of varying extraneous variables. We demonstrate that estimating the feature statistics adaptively during inference, as in instance normalization, addresses this issue, producing normalized features that are more robust to changes in the extraneous variables. This results in a significant gain in performance for different network architectures and choices of feature statistics.
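The core distinction in the abstract is between normalizing with statistics fixed after training (batch normalization at inference) and statistics estimated adaptively from each test input (instance normalization). The following is a minimal NumPy sketch of that distinction for a single input of shape (channels, height, width); it is illustrative only, not the paper's implementation, and the function names and the omission of learned affine parameters are assumptions.

```python
import numpy as np

def batch_norm_inference(x, running_mean, running_var, eps=1e-5):
    # Batch normalization at test time: the statistics are fixed,
    # having been estimated from the training set. If the test
    # distribution shifts (e.g. due to an extraneous variable),
    # the normalized features remain shifted.
    return (x - running_mean) / np.sqrt(running_var + eps)

def instance_norm_inference(x, eps=1e-5):
    # Instance normalization: statistics are estimated adaptively
    # from the test sample itself, per channel over spatial
    # dimensions, so each input is re-centered and re-scaled
    # regardless of distribution shift.
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)
```

For an input whose mean has drifted away from the training distribution, the adaptive variant still produces approximately zero-mean, unit-variance features, whereas the fixed-statistics variant passes the shift through to the features.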