Paper Title

Kernel Selection for Modal Linear Regression: Optimal Kernel and IRLS Algorithm

Authors

Ryoya Yamasaki, Toshiyuki Tanaka

Abstract

Modal linear regression (MLR) is a method for obtaining a conditional mode predictor as a linear model. We study kernel selection for MLR from two perspectives: "which kernel achieves smaller error?" and "which kernel is computationally efficient?". First, we show that the Biweight kernel is optimal in the sense of minimizing the asymptotic mean squared error of the resulting MLR parameter estimate. This result is derived from our refined analysis of the asymptotic statistical behavior of MLR. Second, we provide a kernel class for which the iteratively reweighted least squares (IRLS) algorithm is guaranteed to converge, and in particular prove that IRLS with the Epanechnikov kernel terminates in a finite number of iterations. Simulation studies empirically verify that using the Biweight kernel provides good estimation accuracy and that using the Epanechnikov kernel is computationally efficient. By providing guidelines for kernel selection, our results improve on existing MLR studies, which often stick to the Gaussian kernel and the modal EM algorithm specialized for it.
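
Illustrative IRLS Sketch

To make the IRLS procedure mentioned in the abstract concrete, below is a minimal sketch (not the authors' implementation) of modal linear regression fitted by IRLS. It assumes the common half-quadratic form of the algorithm, in which each iteration solves a weighted least-squares problem with per-sample weights proportional to -K'(r)/r for scaled residuals r; for the Epanechnikov kernel these weights reduce to a 0/1 indicator, which is consistent with the finite-termination claim above. The function names (mlr_irls, biweight_weights, epanechnikov_weights), the bandwidth h, the ridge safeguard, and the toy data are illustrative assumptions, not taken from the paper.

import numpy as np

# Kernel-induced IRLS weights. For a symmetric kernel K, the weight of a sample
# with scaled residual u is taken proportional to -K'(u)/u (constants cancel
# in the weighted least-squares update).

def biweight_weights(u):
    # Biweight kernel K(u) = (15/16)(1 - u^2)^2 on |u| <= 1
    # gives weights proportional to (1 - u^2)_+.
    return np.clip(1.0 - u**2, 0.0, None)

def epanechnikov_weights(u):
    # Epanechnikov kernel K(u) = (3/4)(1 - u^2) on |u| <= 1 gives a 0/1 indicator,
    # so each IRLS step is a least-squares fit on the samples within one bandwidth
    # of the current fit.
    return (np.abs(u) <= 1.0).astype(float)

def mlr_irls(X, y, h, weight_fn, n_iter=100, tol=1e-8):
    # Maximize (1/n) * sum_i K((y_i - x_i^T beta) / h) over beta by iteratively
    # reweighted least squares, starting from the ordinary least-squares solution.
    n, d = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = (y - X @ beta) / h            # scaled residuals
        w = weight_fn(r)                  # per-sample weights
        if w.sum() == 0.0:                # no sample within bandwidth: stop
            break
        Xw = X * w[:, None]
        # Weighted least-squares update; the tiny ridge term only guards against singularity.
        beta_new = np.linalg.solve(X.T @ Xw + 1e-12 * np.eye(d), Xw.T @ y)
        if np.linalg.norm(beta_new - beta) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta

# Illustrative usage on synthetic data with heavy-tailed noise.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.uniform(-1.0, 1.0, n)])
y = X @ np.array([1.0, 2.0]) + 0.3 * rng.standard_t(df=3, size=n)
beta_biweight = mlr_irls(X, y, h=0.5, weight_fn=biweight_weights)
beta_epanechnikov = mlr_irls(X, y, h=0.5, weight_fn=epanechnikov_weights)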
