Paper Title

Small Tuning Parameter Selection for the Debiased Lasso

Paper Authors

Akira Shinkyu, Naoya Sueishi

Paper Abstract

In this study, we investigate the bias and variance properties of the debiased Lasso in linear regression when the tuning parameter of the node-wise Lasso is selected to be smaller than in previous studies. We consider the case where the number of covariates $p$ is bounded by a constant multiple of the sample size $n$. First, we show that the bias of the debiased Lasso can be reduced without diverging the asymptotic variance by setting the order of the tuning parameter to $1/\sqrt{n}$. This implies that the debiased Lasso has asymptotic normality provided that the number of nonzero coefficients $s_0$ satisfies $s_0 = o(\sqrt{n/\log p})$, whereas previous studies require $s_0 = o(\sqrt{n}/\log p)$ if no sparsity assumption is imposed on the precision matrix. Second, we propose a data-driven tuning parameter selection procedure for the node-wise Lasso that is consistent with our theoretical results. Simulation studies show that our procedure yields confidence intervals with good coverage properties in various settings. We also present a real economic data example to demonstrate the efficacy of our selection procedure.
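
The objects named in the abstract (the debiased Lasso, the node-wise Lasso, and a tuning parameter of order $1/\sqrt{n}$) can be illustrated with a minimal sketch. The code below is an illustrative toy example, not the paper's data-driven selection procedure: the simulated design, the initial Lasso penalty `lam_init`, and the node-wise constant `c_node` are all assumed values chosen only to show where each quantity enters.

```python
# Minimal sketch of the debiased Lasso with a node-wise Lasso whose
# regularization level is of order 1/sqrt(n). Constants are illustrative
# assumptions, not the paper's data-driven choices.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s0 = 200, 100, 3
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s0] = 1.0
y = X @ beta + rng.standard_normal(n)

# Step 1: initial Lasso fit (penalty constant chosen for illustration only).
lam_init = 2.0 * np.sqrt(np.log(p) / n)
beta_hat = Lasso(alpha=lam_init, fit_intercept=False).fit(X, y).coef_

# Step 2: node-wise Lasso with tuning parameter of order 1/sqrt(n).
c_node = 0.5                      # illustrative constant
lam_node = c_node / np.sqrt(n)
Theta = np.zeros((p, p))          # approximate inverse of the Gram matrix
for j in range(p):
    X_mj = np.delete(X, j, axis=1)
    gamma_j = Lasso(alpha=lam_node, fit_intercept=False).fit(X_mj, X[:, j]).coef_
    resid_j = X[:, j] - X_mj @ gamma_j
    tau2_j = X[:, j] @ resid_j / n
    Theta[j] = np.insert(-gamma_j, j, 1.0) / tau2_j

# Step 3: debiasing step and normal-based confidence intervals.
resid = y - X @ beta_hat
b_hat = beta_hat + Theta @ X.T @ resid / n
sigma_hat = np.sqrt(resid @ resid / n)            # crude noise-level estimate
Sigma_hat = X.T @ X / n
se = sigma_hat * np.sqrt(np.einsum('ij,jk,ik->i', Theta, Sigma_hat, Theta) / n)
ci_lower, ci_upper = b_hat - 1.96 * se, b_hat + 1.96 * se
print(ci_lower[:s0], ci_upper[:s0])
```

The only point of the sketch is to show where the node-wise tuning parameter enters: shrinking `lam_node` toward the $1/\sqrt{n}$ scale is the regime in which the paper shows the bias of the debiased estimates can be reduced without inflating the asymptotic variance.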
