Paper Title


Towards Zero-Shot Learning with Fewer Seen Class Examples

Paper Authors

Verma, Vinay Kumar, Mishra, Ashish, Pandey, Anubha, Murthy, Hema A., Rai, Piyush

Paper Abstract


We present a meta-learning based generative model for zero-shot learning (ZSL) in the challenging setting where the number of training examples from each \emph{seen} class is very small. This setup contrasts with conventional ZSL approaches, where training typically assumes the availability of a sufficiently large number of training examples from each seen class. The proposed approach leverages meta-learning to train a deep generative model that integrates a variational autoencoder and a generative adversarial network. We propose a novel task distribution in which the meta-train and meta-validation classes are disjoint, simulating the ZSL behaviour during training. Once trained, the model can generate synthetic examples from both seen and unseen classes. The synthesized samples can then be used to train the ZSL framework in a supervised manner. The meta-learner enables our model to generate high-fidelity samples using only a small number of training examples from the seen classes. We conduct extensive experiments and ablation studies on four benchmark ZSL datasets and observe that the proposed model outperforms state-of-the-art approaches by a significant margin when the number of examples per seen class is very small.
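The disjoint task distribution described in the abstract can be illustrated with a minimal episode-sampling sketch. This is an illustrative reconstruction, not the paper's implementation: the function name `make_task` and its arguments are hypothetical, and the split ratio is a placeholder.

```python
import random

def make_task(class_ids, examples_per_class, n_train_classes, k_shot):
    """Sample one meta-learning episode whose meta-train and
    meta-validation classes are disjoint, mimicking the seen/unseen
    split of ZSL (a sketch; names and arguments are illustrative)."""
    shuffled = random.sample(class_ids, len(class_ids))
    train_classes = shuffled[:n_train_classes]
    val_classes = shuffled[n_train_classes:]  # no overlap with train_classes

    # Meta-train: only k_shot examples per class, reflecting the
    # few-examples-per-seen-class regime the paper targets.
    meta_train = {c: random.sample(examples_per_class[c], k_shot)
                  for c in train_classes}
    # Meta-validation: held-out classes never seen during the inner update.
    meta_val = {c: examples_per_class[c] for c in val_classes}
    return meta_train, meta_val
```

Because the validation classes never appear in the inner training split, each episode rehearses the same generalization problem the model faces at test time on unseen classes.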
