Paper Title

K-NN active learning under local smoothness assumption

Paper Authors

Boris Ndjia Njike, Xavier Siebert

Abstract

There is a large body of work on convergence rates either in passive or active learning. Here we first outline some of the main results that have been obtained, more specifically in a nonparametric setting under assumptions about the smoothness of the regression function (or the boundary between classes) and the margin noise. We discuss the relative merits of these underlying assumptions by putting active learning in perspective with recent work on passive learning. We design an active learning algorithm with a rate of convergence better than in passive learning, using a particular smoothness assumption customized for k-nearest neighbors. Unlike previous active learning algorithms, we use a smoothness assumption that provides a dependence on the marginal distribution of the instance space. Additionally, our algorithm avoids the strong density assumption that supposes the existence of the density function of the marginal distribution of the instance space and is therefore more generally applicable.
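The abstract describes an active learner built around k-nearest neighbors, where label queries are spent where the classifier is least certain. As a loose illustration of that general idea only (this is plain uncertainty sampling, not the paper's algorithm, and the dataset, k, and query budget below are arbitrary illustrative assumptions), the following sketch runs a pool-based k-NN active learning loop on a synthetic one-dimensional pool:

```python
import random

# Illustrative pool-based active learning with a k-NN learner.
# Generic uncertainty-sampling sketch, NOT the algorithm from the paper;
# data, k, and the budget are arbitrary choices for demonstration.

def knn_vote(labeled, x, k=5):
    """Fraction of the k nearest labeled points carrying label 1."""
    neighbors = sorted(labeled, key=lambda p: abs(p[0] - x))[:k]
    return sum(y for _, y in neighbors) / len(neighbors)

random.seed(0)
# Synthetic 1-D pool: the true label is 1 iff x > 0.5 (noiseless boundary).
pool = [random.random() for _ in range(200)]
oracle = lambda x: int(x > 0.5)

# Seed with a few labels, then spend the budget on the most uncertain points.
labeled = [(x, oracle(x)) for x in pool[:10]]
unlabeled = pool[10:]
budget = 20
for _ in range(budget):
    # Query the point whose k-NN vote is closest to 1/2 (most uncertain).
    x_star = min(unlabeled, key=lambda x: abs(knn_vote(labeled, x) - 0.5))
    unlabeled.remove(x_star)
    labeled.append((x_star, oracle(x_star)))

# Evaluate the resulting k-NN classifier on a uniform grid.
test = [i / 100 for i in range(100)]
acc = sum(int(knn_vote(labeled, x) >= 0.5) == oracle(x) for x in test) / len(test)
print(f"accuracy after {budget} queries: {acc:.2f}")
```

Because the queried points concentrate near the decision boundary, the labeled set ends up far more informative than a uniformly drawn sample of the same size, which is the intuition behind the improved convergence rates discussed above.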
