Paper Title
DeepAbstract: Neural Network Abstraction for Accelerating Verification
Paper Authors
Paper Abstract
While abstraction is a classic tool for scaling up verification, it is rarely used for verifying neural networks. However, it can help with the still-open task of scaling existing algorithms to state-of-the-art network architectures. We introduce an abstraction framework applicable to fully-connected feed-forward neural networks, based on clustering neurons that behave similarly on some inputs. For the particular case of ReLU, we additionally provide error bounds incurred by the abstraction. We show how the abstraction reduces the size of the network while preserving its accuracy, and how verification results on the abstract network can be transferred back to the original network.
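To make the core idea concrete, the following is a minimal sketch of the kind of abstraction the abstract describes: in a fully-connected ReLU network, hidden neurons whose activation vectors over a set of sample inputs are similar get merged into a single representative neuron. This is an illustrative toy in NumPy, not the paper's implementation; the greedy distance-based clustering (the `cluster_neurons` helper and its `tol` threshold are assumptions for this sketch) stands in for whatever clustering method the framework actually uses, and the weight-merging rule (keep the representative's incoming weights, sum the outgoing weights of merged neurons) is one natural choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Toy fully-connected ReLU network: input -> hidden -> output.
n_in, n_hidden, n_out = 4, 12, 3
W1 = rng.normal(size=(n_hidden, n_in))   # incoming weights of hidden layer
W2 = rng.normal(size=(n_out, n_hidden))  # outgoing weights of hidden layer

X = rng.normal(size=(50, n_in))          # sample inputs used for clustering
H = relu(X @ W1.T)                        # hidden activations, shape (50, n_hidden)

def cluster_neurons(acts, tol=5.0):
    """Greedily group neurons whose activation vectors are within tol
    (illustrative stand-in for a proper clustering algorithm)."""
    clusters = []
    for j in range(acts.shape[1]):
        for c in clusters:
            if np.linalg.norm(acts[:, j] - acts[:, c[0]]) < tol:
                c.append(j)
                break
        else:
            clusters.append([j])
    return clusters

clusters = cluster_neurons(H)

# Abstract layer: one representative neuron per cluster. Its incoming
# weights are the representative's; its outgoing weights are the sum
# of the outgoing weights of all merged neurons.
W1_abs = np.stack([W1[c[0]] for c in clusters])
W2_abs = np.stack([W2[:, c].sum(axis=1) for c in clusters], axis=1)

orig = relu(X @ W1.T) @ W2.T          # original network output
abst = relu(X @ W1_abs.T) @ W2_abs.T  # abstract network output

print("abstract hidden size:", len(clusters), "of", n_hidden)
print("max output deviation on samples:", np.abs(orig - abst).max())
```

The deviation printed at the end is the empirical counterpart of the error bounds the paper provides for ReLU networks: a smaller, merged network whose outputs stay close to the original's on the sampled inputs.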