Paper Title


Latent Domain Learning with Dynamic Residual Adapters

Authors

Lucas Deecke, Timothy Hospedales, Hakan Bilen

Abstract


A practical shortcoming of deep neural networks is their specialization to a single task and domain. While recent techniques in domain adaptation and multi-domain learning enable the learning of more domain-agnostic features, their success relies on the presence of domain labels, typically requiring manual annotation and careful curation of datasets. Here we focus on a less explored, but more realistic case: learning from data from multiple domains, without access to domain annotations. In this scenario, standard model training leads to the overfitting of large domains, while disregarding smaller ones. We address this limitation via dynamic residual adapters, an adaptive gating mechanism that helps account for latent domains, coupled with an augmentation strategy inspired by recent style transfer techniques. Our proposed approach is examined on image classification tasks containing multiple latent domains, and we showcase its ability to obtain robust performance across these. Dynamic residual adapters significantly outperform off-the-shelf networks with much larger capacity, and can be incorporated seamlessly with existing architectures in an end-to-end manner.
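The core mechanism described above, an input-conditioned gate that softly mixes several lightweight residual adapter branches on top of a shared backbone layer, can be sketched as follows. This is a minimal illustration assuming simple linear adapters and a linear gating projection; the class name, shapes, and parameterization are assumptions for exposition, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class DynamicResidualAdapter:
    """Sketch of a dynamic residual adapter layer (illustrative only).

    K small residual corrections are computed in parallel and mixed by
    a gate conditioned on the input features, so different inputs can be
    softly routed to different latent-domain adapters.
    """

    def __init__(self, dim, num_adapters, scale=0.01):
        # K lightweight adapters (here: plain linear maps, near-zero init
        # so the layer starts close to the identity)
        self.W = rng.normal(0.0, scale, size=(num_adapters, dim, dim))
        # gate parameters: project features to K logits
        self.G = rng.normal(0.0, scale, size=(dim, num_adapters))

    def __call__(self, x):
        # x: (batch, dim) features from the shared backbone layer
        alpha = softmax(x @ self.G)                        # (batch, K) soft assignment
        corr = np.einsum('kdo,bd->bko', self.W, x)         # per-adapter corrections
        mix = np.einsum('bk,bko->bo', alpha, corr)         # gated mixture
        return x + mix                                     # residual connection

adapter = DynamicResidualAdapter(dim=8, num_adapters=3)
x = rng.normal(size=(4, 8))
y = adapter(x)
print(y.shape)  # (4, 8)
```

Because the gate is a differentiable soft assignment rather than a hard domain label, the whole layer trains end-to-end with the host network, which is what lets it handle latent (unannotated) domains.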
