Paper Title
A General Framework for PAC-Bayes Bounds for Meta-Learning
Paper Authors
Paper Abstract
Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of generalization error: the environment-level and task-level gaps, which result from observing a finite number of tasks and a finite number of data samples per task, respectively. By upper-bounding arbitrary convex functions that link the expected and empirical losses at the environment level and at the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
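The two-level structure described in the abstract can be sketched as a simple decomposition; the notation below (a hyper-posterior $\mathcal{Q}$, with $\mathcal{L}$ the expected transfer risk over new tasks, $\tilde{\mathcal{L}}$ the average expected risk over the observed tasks, and $\hat{\mathcal{L}}$ the empirical multi-task risk) is illustrative and is assumed here, not taken from the paper:

```latex
% Illustrative sketch of the meta-generalization gap decomposition
% (notation assumed for exposition, not the paper's own).
\[
\underbrace{\mathcal{L}(\mathcal{Q}) - \hat{\mathcal{L}}(\mathcal{Q})}_{\text{meta-generalization gap}}
\;=\;
\underbrace{\mathcal{L}(\mathcal{Q}) - \tilde{\mathcal{L}}(\mathcal{Q})}_{\substack{\text{environment-level gap}\\ \text{(finite number of tasks)}}}
\;+\;
\underbrace{\tilde{\mathcal{L}}(\mathcal{Q}) - \hat{\mathcal{L}}(\mathcal{Q})}_{\substack{\text{task-level gap}\\ \text{(finite samples per task)}}}
\]
```

A PAC-Bayes analysis would then bound each of the two terms on the right, which is where the convex functions linking expected and empirical losses enter.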