Paper Title


On Universality and Training in Binary Hypothesis Testing

Authors

Michael Bell, Yuval Kochman

Abstract


The classical binary hypothesis testing problem is revisited. We notice that when one of the hypotheses is composite, there is an inherent difficulty in defining an optimality criterion that is both informative and well-justified. For testing in the simple normal location problem (that is, testing for the mean of multivariate Gaussians), we overcome the difficulty as follows. In this problem there exists a natural hardness order between parameters, as for different parameters the error-probability curves (when the parameter is known) are either identical, or one dominates the other. We can thus define minimax performance as the worst case among parameters which lie below some hardness level. Fortunately, there exists a universal minimax test, in the sense that it is minimax for all hardness levels simultaneously. Under this criterion we also find the optimal test for composite hypothesis testing with training data. This criterion extends to the wide class of locally asymptotically normal models, in an asymptotic sense where the approximation of the error probabilities is additive. Since we have the asymptotically optimal tests for composite hypothesis testing with and without training data, we quantify the loss of universality and the gain of training data for these models.
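The normal location setting described above can be illustrated with a small Monte Carlo sketch. The parameter values, threshold, and the choice of the norm statistic below are illustrative assumptions, not the paper's construction: for H0: X ~ N(0, I_d) versus a composite alternative with unknown mean, the rotation-invariant statistic ||x|| has error probabilities that depend on the alternative mean theta only through ||theta||, which is one way the hardness order between parameters can be seen.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5  # dimension of the observation (illustrative choice)

# Normal location problem: H0: X ~ N(0, I_d) vs H1: X ~ N(theta, I_d),
# with theta unknown under H1. The norm statistic ||x|| is invariant to
# rotations, so its error probabilities depend on theta only via ||theta||.

def norm_test(x, tau):
    """Reject H0 when ||x|| exceeds the threshold tau (hypothetical test)."""
    return np.linalg.norm(x) > tau

# Monte Carlo estimate of the two error probabilities for one alternative.
theta = np.full(d, 0.8)   # hypothetical alternative mean
tau = 3.0                 # hypothetical threshold
n_trials = 20000

x0 = rng.normal(size=(n_trials, d))           # samples under H0
x1 = theta + rng.normal(size=(n_trials, d))   # samples under H1
p_false_alarm = np.mean(np.linalg.norm(x0, axis=1) > tau)
p_miss = np.mean(np.linalg.norm(x1, axis=1) <= tau)
print(f"false alarm ~ {p_false_alarm:.3f}, miss ~ {p_miss:.3f}")
```

Sweeping tau traces out one error-probability curve per value of ||theta||; larger ||theta|| pushes the whole curve down, which is the dominance relation the abstract uses to order parameters by hardness.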
