Paper Title

SATformer: Transformer-Based UNSAT Core Learning

Authors

Zhengyuan Shi, Min Li, Yi Liu, Sadaf Khan, Junhua Huang, Hui-Ling Zhen, Mingxuan Yuan, Qiang Xu

Abstract

This paper introduces SATformer, a novel Transformer-based approach for the Boolean Satisfiability (SAT) problem. Rather than solving the problem directly, SATformer approaches it from the opposite direction by focusing on unsatisfiability. Specifically, it models clause interactions to identify unsatisfiable sub-problems. Using a graph neural network, we convert clauses into clause embeddings and employ a hierarchical Transformer-based model to capture clause correlations. SATformer is trained through a multi-task learning approach, using the single-bit satisfiability result and, for UNSAT problems, the minimal unsatisfiable core (MUC) as clause-level supervision. As an end-to-end learning-based satisfiability classifier, SATformer significantly outperforms NeuroSAT. Furthermore, we integrate the clause predictions made by SATformer into modern heuristic-based SAT solvers and validate our approach on a logic equivalence checking task. Experimental results show that SATformer can decrease the runtime of existing solvers by an average of 21.33%.
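The multi-task objective described in the abstract combines two supervision signals: the single-bit satisfiability label and per-clause MUC-membership labels. The sketch below is a minimal, hypothetical illustration of such a combined loss in plain Python; the function names, the binary-cross-entropy formulation, and the weighting hyperparameter `alpha` are assumptions for illustration, not the authors' implementation.

```python
import math

def bce(probs, labels):
    """Mean binary cross-entropy over a list of predicted probabilities."""
    eps = 1e-9  # numerical guard against log(0)
    total = 0.0
    for p, y in zip(probs, labels):
        total += -(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps))
    return total / len(probs)

def multitask_loss(sat_prob, sat_label, clause_probs, muc_labels, alpha=0.5):
    # Hypothetical weighted sum of the two objectives from the abstract:
    # (1) the single-bit satisfiability prediction, and
    # (2) the per-clause MUC-membership predictions (UNSAT instances only).
    # `alpha` is an assumed balancing hyperparameter, not from the paper.
    loss_sat = bce([sat_prob], [sat_label])
    loss_core = bce(clause_probs, muc_labels)
    return alpha * loss_sat + (1 - alpha) * loss_core
```

Under this formulation, a model that is confident and correct on both tasks yields a smaller loss than one that is correct but uncertain, which is the usual behavior of a cross-entropy-based multi-task objective.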
