计算机科学 ›› 2024, Vol. 51 ›› Issue (5): 193-199. doi: 10.11896/jsjkx.230300193
兰咏琪1, 何星星1, 李莹芳2, 李天瑞3
LAN Yongqi1, HE Xingxing1, LI Yingfang2, LI Tianrui3
Abstract: When proving problems, the search space of an automated theorem prover typically grows explosively, and premise selection offers a new way to address this problem. To tackle the difficulty that the logical formula graphs and graph neural network models used in existing premise selection methods struggle to capture the latent information of formula graphs, this paper proposes a simplified logical formula graph representation based on removing duplicate quantifiers, together with a term-walk graph neural network model equipped with an attention mechanism, which makes full use of the syntactic and semantic information of logical formulas to improve the classification accuracy of premise selection. First, first-order logic conjectures and premise formulas are converted into simplified first-order logic formula graphs with duplicate quantifiers removed. Second, a message-passing graph neural network aggregates and updates node features and node term-walk features, and an attention mechanism then assigns weights to the nodes of the graph to adjust the node embeddings. Finally, the premise graph vector and the conjecture graph vector are concatenated and fed into a binary classifier to perform premise classification. Experimental results show that the proposed method achieves accuracies of 88.61% on the MPTP dataset and 84.74% on the CNF dataset, surpassing the best existing premise selection methods.
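The following is a minimal sketch, not the authors' released implementation, of the pipeline the abstract describes, written in PyTorch (which the paper uses for its experiments). The tensor layout is an assumption: each formula graph is represented by node features `x`, precomputed term-walk features `tw`, and a dense adjacency matrix `adj`; the class and function names (`TermWalkMPLayer`, `AttentionReadout`, `PremiseSelector`) are hypothetical. It only illustrates the three steps at a high level: message passing that mixes node and term-walk information, attention-weighted graph readout, and concatenation of the premise and conjecture vectors into a binary classifier.

```python
# Hedged sketch of the premise-selection architecture described in the abstract.
# Assumed inputs: per-graph (x, tw, adj) with x = node features, tw = term-walk
# features, adj = dense adjacency. Names and shapes are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TermWalkMPLayer(nn.Module):
    """One message-passing layer mixing node features with term-walk features."""
    def __init__(self, in_dim: int, tw_dim: int, out_dim: int):
        super().__init__()
        self.node_proj = nn.Linear(in_dim, out_dim)    # a node's own features
        self.neigh_proj = nn.Linear(in_dim, out_dim)   # aggregated neighbor messages
        self.tw_proj = nn.Linear(tw_dim, out_dim)      # the node's term-walk features

    def forward(self, x, tw, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        neigh = adj @ x / deg                          # mean over neighbors
        return F.relu(self.node_proj(x) + self.neigh_proj(neigh) + self.tw_proj(tw))


class AttentionReadout(nn.Module):
    """Assigns an attention weight to every node and pools to one graph vector."""
    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, h):
        alpha = torch.softmax(self.score(h), dim=0)    # (N, 1) node weights
        return (alpha * h).sum(dim=0)                  # weighted sum -> (dim,)


class PremiseSelector(nn.Module):
    """Encodes a premise graph and a conjecture graph, then classifies the pair."""
    def __init__(self, in_dim: int, tw_dim: int, hid_dim: int):
        super().__init__()
        self.mp1 = TermWalkMPLayer(in_dim, tw_dim, hid_dim)
        self.mp2 = TermWalkMPLayer(hid_dim, tw_dim, hid_dim)
        self.readout = AttentionReadout(hid_dim)
        self.classifier = nn.Sequential(               # binary classifier on the concatenation
            nn.Linear(2 * hid_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, 2)
        )

    def encode(self, x, tw, adj):
        h = self.mp1(x, tw, adj)
        h = self.mp2(h, tw, adj)
        return self.readout(h)

    def forward(self, premise, conjecture):
        g_p = self.encode(*premise)                    # premise graph vector
        g_c = self.encode(*conjecture)                 # conjecture graph vector
        return self.classifier(torch.cat([g_p, g_c], dim=-1))


if __name__ == "__main__":
    # Toy example with random graphs, only to show the expected shapes.
    def random_graph(n, in_dim=16, tw_dim=8):
        adj = (torch.rand(n, n) > 0.7).float()
        adj = ((adj + adj.t()) > 0).float()            # symmetrize
        return torch.randn(n, in_dim), torch.randn(n, tw_dim), adj

    model = PremiseSelector(in_dim=16, tw_dim=8, hid_dim=32)
    logits = model(random_graph(10), random_graph(14))
    print(logits.shape)                                # torch.Size([2]): useful / not useful
```

In this sketch the attention readout plays the role the abstract assigns to the attention mechanism, re-weighting node embeddings before pooling; how the simplified formula graphs and term-walk features are actually constructed from first-order formulas is defined in the paper itself and is not reproduced here.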
CLC Number: