Computer Science ›› 2019, Vol. 46 ›› Issue (9): 237-242.doi: 10.11896/j.issn.1002-137X.2019.09.035

• Artificial Intelligence •

Chinese Named Entity Recognition Method Based on BGRU-CRF

SHI Chun-dan, QIN Lin   

  (School of Computer Science and Technology, Nanjing Tech University, Nanjing 211816, China)
  Received: 2018-08-13; Online: 2019-09-15; Published: 2019-09-02

Abstract: Traditional named entity recognition methods rely heavily on hand-crafted features, domain knowledge, and word segmentation quality, and they do not make full use of word-order information. To address these problems, a named entity recognition model based on a bidirectional gated recurrent unit (BGRU) was proposed. The model exploits external data: words are pre-trained on large automatically segmented corpora and collected into a dictionary, and this potential-word information is integrated into a character-based BGRU-CRF network. The model thereby makes full use of potential words, extracts comprehensive contextual information, and more effectively resolves entity ambiguity. In addition, an attention mechanism is used to allocate weights to specific information within the BGRU structure, selecting the most relevant characters and words from the sentence and capturing long-distance dependencies of specific words in the text, which improves the identification of named entities. The model explicitly uses the sequence information between words and is not affected by word segmentation errors. Experimental results on MSRA and OntoNotes show that, compared with traditional sequence labeling models and neural network models, the proposed model improves the overall F1 score over the state-of-the-art models by 3.08% and 0.16% respectively.
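The architecture described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: all names, layer sizes, and the tag count are illustrative assumptions, and the CRF output layer is replaced by plain per-character tag scores for brevity. It shows the two core ideas, a character-level bidirectional GRU encoder and an attention mechanism that weights positions to form a sentence-level context vector, which is then combined with each character's hidden state before tagging.

```python
# Hypothetical sketch of a character-level BiGRU tagger with additive
# attention, loosely following the BGRU design described in the abstract.
# The CRF layer of the paper is omitted; raw tag scores are emitted instead.
import torch
import torch.nn as nn

class BGRUAttnTagger(nn.Module):
    def __init__(self, vocab_size=100, emb_dim=32, hidden=64, n_tags=7):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional GRU reads the character sequence in both directions.
        self.bgru = nn.GRU(emb_dim, hidden, batch_first=True,
                           bidirectional=True)
        # Additive attention scores each position; the softmax-normalized
        # weights let the model emphasize the most relevant characters.
        self.attn = nn.Linear(2 * hidden, 1)
        # Classifier sees [local state; attended context] per character.
        self.out = nn.Linear(4 * hidden, n_tags)

    def forward(self, char_ids):                       # (batch, seq_len)
        h, _ = self.bgru(self.emb(char_ids))           # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)   # (batch, seq, 1)
        context = (weights * h).sum(dim=1, keepdim=True)  # sentence summary
        context = context.expand(-1, h.size(1), -1)    # broadcast to each step
        return self.out(torch.cat([h, context], dim=-1))  # (batch, seq, n_tags)

model = BGRUAttnTagger()
scores = model(torch.randint(0, 100, (2, 10)))  # two sentences, 10 chars each
print(scores.shape)  # torch.Size([2, 10, 7])
```

In the full model, these per-character scores would feed a CRF layer, which scores whole tag sequences and so enforces valid transitions (e.g. an inside tag must follow a begin tag) at decoding time.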

Key words: Named entity recognition, Bidirectional gated recurrent unit, Attention mechanism

CLC Number: TP391