Computer Science ›› 2023, Vol. 50 ›› Issue (3): 42-48. doi: 10.11896/jsjkx.220600239

• Special Issue of Knowledge Engineering Enabled By Knowledge Graph: Theory, Technology and System •

BGPNRE: A BERT-based Global Pointer Network for Named Entity-Relation Joint Extraction Method

DENG Liang1,2,3, QI Panhu4, LIU Zhenlong4, LI Jingxin4, TANG Jiqiang5   

  1 University of Chinese Academy of Sciences, Beijing 100049, China
    2 Shenyang Institute of Computing Technology, Chinese Academy of Sciences, Shenyang 110168, China
    3 China National Intellectual Property Administration, Beijing 100083, China
    4 Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China
    5 National Computer Network Emergency Response Technical Team/Coordination Center of China, Beijing 100029, China
  • Received:2022-06-27 Revised:2022-12-15 Online:2023-03-15 Published:2023-03-15
  • About author:DENG Liang, born in 1980, postgraduate. His main research interests include deep learning and knowledge graphs.
    QI Panhu, born in 1990, postgraduate. His main research interests include deep learning and natural language processing.

Abstract: Named entity-relation joint extraction refers to extracting entity-relation triples from unstructured text and is an important task in information extraction and knowledge graph construction. This paper proposes a new method, the BERT-based global pointer network for named entity-relation joint extraction (BGPNRE). First, a potential relation prediction module predicts the relations contained in the text, filtering out impossible relations and restricting entity recognition to the predicted relation subset. Then, a relation-specific global pointer network locates all subject and object entities. Finally, a global pointer correspondence component aligns the subject and object positions into entity-relation triples. The method avoids the error propagation of pipeline models and also addresses redundant relation prediction, overlapping entities, and the poor generalization of span-based extraction. Extensive experiments show that the model achieves state-of-the-art performance on the NYT and WebNLG public benchmarks, with larger gains on sentences containing multiple relations and overlapping entities.
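To make the three-stage design above concrete, the following is a minimal, hypothetical PyTorch sketch of the relation-specific global pointer scoring step: for each candidate relation, every (start, end) token pair from a BERT-style encoder is scored with a bilinear query-key product, and high-scoring spans are taken as subject or object entities. All class names, shapes, and the thresholding convention are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of a relation-specific global pointer scoring head.
# Names and dimensions are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn

class GlobalPointerHead(nn.Module):
    """Scores every (start, end) token span for each candidate relation."""
    def __init__(self, hidden_size: int, num_relations: int, head_size: int = 64):
        super().__init__()
        # One query/key projection pair per relation type.
        self.proj = nn.Linear(hidden_size, num_relations * head_size * 2)
        self.num_relations = num_relations
        self.head_size = head_size

    def forward(self, encoder_out: torch.Tensor) -> torch.Tensor:
        # encoder_out: [batch, seq_len, hidden] from a BERT-style encoder.
        b, n, _ = encoder_out.shape
        qk = self.proj(encoder_out).view(b, n, self.num_relations, 2, self.head_size)
        q, k = qk[..., 0, :], qk[..., 1, :]            # each [batch, seq_len, R, head]
        # Bilinear span score: scores[b, r, i, j] = q_i . k_j for relation r.
        scores = torch.einsum("bmrh,bnrh->brmn", q, k) / self.head_size ** 0.5
        return scores                                   # [batch, R, seq_len, seq_len]

# Toy usage: 2 sentences, 10 tokens, BERT-base hidden size 768, 3 candidate relations.
head = GlobalPointerHead(hidden_size=768, num_relations=3)
spans = head(torch.randn(2, 10, 768))
print(spans.shape)  # torch.Size([2, 3, 10, 10]); spans above a threshold become entities

In the full method as described in the abstract, such a head would be applied only to the relations kept by the potential relation prediction module, and a separate correspondence component would pair the resulting subject and object spans into entity-relation triples.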

Key words: Named entity-relation joint extraction, BGPNRE, Global pointer network, BERT

CLC Number: 

  • TP391