Paper Title

Generative Latent Implicit Conditional Optimization when Learning from Small Sample

Authors

Idan Azuri, Daphna Weinshall

Abstract

We revisit the long-standing problem of learning from a small sample, to which end we propose a novel method called GLICO (Generative Latent Implicit Conditional Optimization). GLICO learns a mapping from the training examples to a latent space and a generator that generates images from vectors in the latent space. Unlike most recent works, which rely on access to large amounts of unlabeled data, GLICO does not require access to any additional data other than the small set of labeled points. In fact, GLICO learns to synthesize completely new samples for every class using as few as 5 or 10 examples per class, and as few as 10 such classes, without imposing any prior. GLICO is then used to augment the small training set while training a classifier on the small sample. To this end, our proposed method samples the learned latent space using spherical interpolation, and generates new examples using the trained generator. Empirical results show that the new sampled set is diverse enough, leading to improvement in image classification in comparison with the state of the art, when trained on small samples obtained from CIFAR-10, CIFAR-100, and CUB-200.
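
As a rough illustration of the sampling step described in the abstract (not the authors' implementation), the sketch below shows how spherical interpolation (slerp) over learned latent codes could be used to synthesize new training examples with an already-trained generator. The `generator` and the per-class latent table `latents` are hypothetical placeholders for components that GLICO learns during training; the latent-optimization procedure itself is not reproduced here.

```python
import torch
import torch.nn.functional as F


def slerp(z1, z2, t):
    # Spherical interpolation: move along the great circle between z1 and z2.
    # t in [0, 1]; t = 0 returns z1, t = 1 returns z2.
    z1n = F.normalize(z1, dim=-1)
    z2n = F.normalize(z2, dim=-1)
    omega = torch.acos((z1n * z2n).sum(-1, keepdim=True).clamp(-1 + 1e-7, 1 - 1e-7))
    so = torch.sin(omega)
    return torch.sin((1 - t) * omega) / so * z1 + torch.sin(t * omega) / so * z2


def sample_augmented_batch(generator, latents, class_id, n_new):
    # `latents[class_id]` is assumed to be a (num_examples, latent_dim) tensor of
    # latent codes learned for that class; `generator` maps latent codes to images.
    z_pool = latents[class_id]
    idx = torch.randint(len(z_pool), (n_new, 2))            # random intra-class pairs
    t = torch.rand(n_new, 1)                                 # interpolation coefficients
    z_new = slerp(z_pool[idx[:, 0]], z_pool[idx[:, 1]], t)   # new codes on the sphere
    with torch.no_grad():
        images = generator(z_new)                            # synthesize new examples
    labels = torch.full((n_new,), class_id, dtype=torch.long)
    return images, labels
```

Interpolating only between codes belonging to the same class keeps the label of each synthesized image well defined, which is presumably what allows the generated samples to be mixed into the labeled set during classifier training.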
