Paper Title
Scalable Stochastic Parametric Verification with Stochastic Variational Smoothed Model Checking
Paper Authors
Paper Abstract
Parametric verification of linear temporal properties of stochastic models can be expressed as computing the satisfaction probability of a given property as a function of the parameters of the model. Smoothed model checking (smMC) aims at inferring this satisfaction function over the entire parameter space from a limited set of observations obtained via simulation. Because observations are costly and noisy, smMC is framed as a Bayesian inference problem, so that the estimates come with a quantification of their uncertainty. In the original smMC formulation, the satisfaction function is modeled as a Gaussian Process (GP), inferred by means of the Expectation Propagation algorithm. This approach provides accurate reconstructions with statistically sound quantification of the uncertainty, but it inherits the well-known scalability issues of GPs. In this paper, we exploit recent advances in probabilistic machine learning to push this limit forward, making Bayesian inference for smMC scalable to larger datasets and applicable to models with high-dimensional parameter spaces. We propose Stochastic Variational Smoothed Model Checking (SV-smMC), a solution that exploits stochastic variational inference (SVI) to approximate the posterior distribution of the smMC problem. The strength and flexibility of SVI make SV-smMC applicable to two alternative probabilistic models: Gaussian Processes (GP) and Bayesian Neural Networks (BNN). The core ingredient of SVI is a stochastic gradient-based optimization that makes inference easily parallelizable and enables GPU acceleration. Finally, we compare the performance of smMC against that of SV-smMC in terms of scalability, computational efficiency, and accuracy of the reconstructed satisfaction function.
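For the GP variant, the following is a minimal sketch of how SVI can be combined with a sparse variational GP classifier to infer a satisfaction function. It assumes GPyTorch as the library (the abstract does not prescribe one); the class name, synthetic `theta`/`y` data, and all hyperparameters are illustrative, not the authors' implementation. The Bernoulli likelihood models the Boolean verdict of each simulation, and a fixed set of inducing points keeps the cost of every minibatch gradient step independent of the total number of observations.

```python
import torch
import gpytorch
from torch.utils.data import DataLoader, TensorDataset

class SVGPSatisfactionModel(gpytorch.models.ApproximateGP):
    """Sparse variational GP: posterior is approximated at m inducing points."""
    def __init__(self, inducing_points):
        var_dist = gpytorch.variational.CholeskyVariationalDistribution(
            inducing_points.size(0))
        var_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, var_dist, learn_inducing_locations=True)
        super().__init__(var_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x))

# Synthetic stand-in data: theta are parameter points, y are noisy Boolean
# satisfaction outcomes (one verdict per simulated trajectory).
theta = torch.rand(5000, 2)
y = (torch.rand(5000) < 0.5).float()

model = SVGPSatisfactionModel(inducing_points=theta[:100].clone())
likelihood = gpytorch.likelihoods.BernoulliLikelihood()
elbo = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=theta.size(0))
optimizer = torch.optim.Adam(
    list(model.parameters()) + list(likelihood.parameters()), lr=0.01)

# Minibatch SGD on the negative ELBO; moving model/likelihood/tensors
# .to("cuda") gives the GPU acceleration mentioned in the abstract.
loader = DataLoader(TensorDataset(theta, y), batch_size=256, shuffle=True)
model.train(); likelihood.train()
for epoch in range(50):
    for tb, yb in loader:
        optimizer.zero_grad()
        loss = -elbo(model(tb), yb)  # stochastic estimate of the negative ELBO
        loss.backward()
        optimizer.step()

# Posterior predictive satisfaction probability on unseen parameter values.
model.eval(); likelihood.eval()
with torch.no_grad():
    pred = likelihood(model(torch.rand(10, 2)))  # Bernoulli predictive
    print(pred.mean)  # estimated satisfaction probabilities
```

With batch size b and m inducing points, each step costs roughly O(bm^2 + m^3) rather than the cubic cost in the full dataset size incurred by exact GP inference, which is what makes the minibatch updates scalable and easy to parallelize.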
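For the BNN variant, a comparable sketch using Pyro (again an assumption; the architecture, priors, and data below are hypothetical placeholders rather than the paper's model): a small Bayesian feed-forward classifier whose weight posterior is approximated by a mean-field Gaussian guide, trained with SVI on random minibatches.

```python
import torch
import torch.nn as nn
import pyro
import pyro.distributions as dist
from pyro.nn import PyroModule, PyroSample
from pyro.infer import SVI, Trace_ELBO, Predictive
from pyro.infer.autoguide import AutoNormal

class BayesianSatNet(PyroModule):
    """Small Bayesian feed-forward net with standard-normal priors on weights."""
    def __init__(self, in_dim, hidden=32):
        super().__init__()
        self.fc1 = PyroModule[nn.Linear](in_dim, hidden)
        self.fc1.weight = PyroSample(dist.Normal(0., 1.).expand([hidden, in_dim]).to_event(2))
        self.fc1.bias = PyroSample(dist.Normal(0., 1.).expand([hidden]).to_event(1))
        self.fc2 = PyroModule[nn.Linear](hidden, 1)
        self.fc2.weight = PyroSample(dist.Normal(0., 1.).expand([1, hidden]).to_event(2))
        self.fc2.bias = PyroSample(dist.Normal(0., 1.).expand([1]).to_event(1))

    def forward(self, x, y=None, n_total=None):
        logits = self.fc2(torch.tanh(self.fc1(x))).squeeze(-1)
        n_total = n_total or x.size(0)
        # Rescale the minibatch likelihood so each SVI step is an unbiased
        # stochastic estimate of the full-data ELBO gradient.
        with pyro.plate("data", n_total, subsample=torch.arange(x.size(0))):
            pyro.sample("obs", dist.Bernoulli(logits=logits), obs=y)
        return logits

theta = torch.rand(5000, 2)            # parameter points (synthetic stand-in)
y = (torch.rand(5000) < 0.5).float()   # Boolean satisfaction labels

pyro.clear_param_store()
bnn = BayesianSatNet(in_dim=2)
guide = AutoNormal(bnn)                # mean-field Gaussian over all weights
svi = SVI(bnn, guide, pyro.optim.Adam({"lr": 1e-3}), loss=Trace_ELBO())

for step in range(2000):
    idx = torch.randint(0, theta.size(0), (256,))   # random minibatch
    svi.step(theta[idx], y[idx], n_total=theta.size(0))

# Monte Carlo posterior predictive: satisfaction probability with uncertainty.
predictive = Predictive(bnn, guide=guide, num_samples=200)
p_samples = predictive(torch.rand(10, 2))["obs"]    # (200, 10) Bernoulli draws
print(p_samples.mean(0))                            # estimated satisfaction prob.
```

The spread of the 200 posterior predictive draws per test point provides the uncertainty quantification that the Bayesian framing of smMC calls for, while the minibatch `svi.step` updates are the stochastic gradient optimization that the abstract highlights.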