Paper Title

Generalized One-shot Domain Adaptation of Generative Adversarial Networks

Authors

Zicheng Zhang, Yinglu Liu, Congying Han, Tiande Guo, Ting Yao, Tao Mei

Abstract

The adaptation of a Generative Adversarial Network (GAN) aims to transfer a pre-trained GAN to a target domain with limited training data. In this paper, we focus on the one-shot case, which is more challenging and rarely explored in previous works. We consider that the adaptation from a source domain to a target domain can be decoupled into two parts: the transfer of global style like texture and color, and the emergence of new entities that do not belong to the source domain. While previous works mainly focus on style transfer, we propose a novel and concise framework to address the \textit{generalized one-shot adaptation} task for both style and entity transfer, in which a reference image and its binary entity mask are provided. Our core idea is to constrain the gap between the internal distributions of the reference and syntheses by sliced Wasserstein distance. To better achieve it, style fixation is used at first to roughly obtain the exemplary style, and an auxiliary network is introduced to the generator to disentangle entity and style transfer. Besides, to realize cross-domain correspondence, we propose the variational Laplacian regularization to constrain the smoothness of the adapted generator. Both quantitative and qualitative experiments demonstrate the effectiveness of our method in various scenarios. Code is available at \url{https://github.com/zhangzc21/Generalized-One-shot-GAN-adaptation}.
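The core constraint in the abstract is the sliced Wasserstein distance between the internal feature distributions of the reference image and the syntheses. As a rough illustration of that distance (not the authors' implementation; the function name, quantile grid, and sample shapes are illustrative), the sets of feature vectors can be projected onto random unit directions and the resulting 1-D distributions compared via their sorted values:

```python
import numpy as np

def sliced_wasserstein_distance(x, y, n_projections=128, rng=None):
    """Monte Carlo estimate of the sliced Wasserstein-1 distance.

    x, y: (n, d) and (m, d) arrays of feature vectors (e.g. patch features).
    Each random direction reduces the problem to a 1-D optimal transport,
    which is solved exactly by comparing empirical quantile functions.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    q = np.linspace(0.0, 1.0, 256)  # shared quantile grid for unequal n, m
    total = 0.0
    for _ in range(n_projections):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)       # random direction on the sphere
        px = np.sort(x @ theta)              # 1-D projection of each set
        py = np.sort(y @ theta)
        qx = np.quantile(px, q)
        qy = np.quantile(py, q)
        total += np.mean(np.abs(qx - qy))    # 1-D W1 via quantile matching
    return total / n_projections
```

The distance is zero for identical point sets and grows with a constant shift between them, which is what makes it usable as a training loss for matching internal distributions.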
