Paper Title


Single Shot MC Dropout Approximation

Authors

Kai Brach, Beate Sick, Oliver Dürr

Abstract


Deep neural networks (DNNs) are known for their high prediction performance, especially in perceptual tasks such as object recognition or autonomous driving. Still, DNNs are prone to yield unreliable predictions when encountering completely new situations without indicating their uncertainty. Bayesian variants of DNNs (BDNNs), such as MC dropout BDNNs, do provide uncertainty measures. However, BDNNs are slow during test time because they rely on a sampling approach. Here we present a single shot MC dropout approximation that preserves the advantages of BDNNs without being slower than a DNN. Our approach is to analytically approximate for each layer in a fully connected network the expected value and the variance of the MC dropout signal. We evaluate our approach on different benchmark datasets and a simulated toy example. We demonstrate that our single shot MC dropout approximation resembles the point estimate and the uncertainty estimate of the predictive distribution that is achieved with an MC approach, while being fast enough for real-time deployments of BDNNs.
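The abstract's key idea, analytically propagating the expected value and variance of the MC dropout signal through each fully connected layer instead of sampling, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes inverted dropout, statistically independent activations, and a single linear layer with illustrative sizes, and it checks the closed-form moments against a plain MC dropout estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_moments(mu, var, keep_prob):
    """Mean/variance of inverted dropout d*x/keep_prob with d ~ Bernoulli(keep_prob).

    Inverted dropout is unbiased, so the mean is unchanged; the second
    moment scales as E[(d*x/p)^2] = E[x^2] / p.
    """
    second = (var + mu**2) / keep_prob
    return mu, second - mu**2

def linear_moments(mu, var, W, b):
    """Propagate moments through y = W @ x + b, assuming independent inputs."""
    return W @ mu + b, (W**2) @ var

# Illustrative layer: 5 inputs, 3 outputs, deterministic input signal.
W = rng.normal(size=(3, 5))
b = rng.normal(size=3)
mu_x = rng.normal(size=5)
var_x = np.zeros(5)
keep = 0.8

# Single-shot analytic pass: one forward computation, no sampling.
mu_d, var_d = dropout_moments(mu_x, var_x, keep)
mu_y, var_y = linear_moments(mu_d, var_d, W, b)

# Reference: classic MC dropout with many stochastic forward passes.
samples = np.array([
    W @ ((rng.binomial(1, keep, size=5) / keep) * mu_x) + b
    for _ in range(100_000)
])
```

With enough MC samples, `samples.mean(axis=0)` and `samples.var(axis=0)` converge to `mu_y` and `var_y`, while the analytic pass needs only a single forward computation — the speed advantage the paper targets.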
