Computer Science ›› 2021, Vol. 48 ›› Issue (11A): 154-158. doi: 10.11896/jsjkx.210100215

• Intelligent Computing •

Chinese Ship Fault Relation Extraction Method Based on Bidirectional GRU Neural Network and Attention Mechanism

HOU Tong-jia, ZHOU Liang   

  1. School of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
  • Online: 2021-11-10  Published: 2021-11-12
  • About author: HOU Tong-jia, born in 1997, postgraduate. Her main research interests include information system integration and knowledge graph.

Abstract: With the development of deep learning, more and more deep learning models are being applied to relation extraction tasks. Traditional deep learning models cannot handle long-distance dependencies, and their performance degrades further when the text to be extracted is noisy. To address these two problems, a deep learning model based on a bidirectional GRU (gated recurrent unit) neural network and an attention mechanism is proposed to extract relations between Chinese ship faults. First, a bidirectional GRU network is used to extract text features, which resolves the long-distance dependency problem and also reduces the model's running time and number of training iterations. Second, a sentence-level attention mechanism is introduced to reduce the negative impact of noisy sentences on the overall relation extraction. Finally, the model is trained on the training set, and precision, recall, and F1 score are computed on a real test set to compare it with existing methods.
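
As a concrete illustration of the architecture described in the abstract, the following PyTorch sketch pairs a bidirectional GRU sentence encoder with sentence-level attention for bag-level relation classification. It is not the authors' implementation: the class name, hyperparameters (embedding size, hidden size, number of relation classes), the use of concatenated final hidden states as the sentence vector, and the exact attention scoring are assumptions made for illustration only.

```python
# Illustrative sketch (assumed details, not the paper's code): BiGRU encoder +
# sentence-level attention over a bag of sentences for one entity pair.
import torch
import torch.nn as nn


class BiGRURelationExtractor(nn.Module):
    """Bidirectional GRU sentence encoder with sentence-level attention."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_relations=10):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Learnable query that scores each sentence; noisy sentences receive low weight.
        self.relation_query = nn.Parameter(torch.randn(2 * hidden_dim))
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, bag_token_ids):
        # bag_token_ids: (num_sentences, seq_len) -- all sentences mentioning one fault pair.
        emb = self.embedding(bag_token_ids)                   # (S, L, E)
        _, h_n = self.gru(emb)                                # h_n: (2, S, H)
        sent = torch.cat([h_n[0], h_n[1]], dim=-1)            # (S, 2H) forward + backward states
        # Sentence-level attention: weight each sentence, then pool the bag.
        beta = torch.softmax(sent @ self.relation_query, 0)   # (S,)
        bag_repr = beta @ sent                                # (2H,)
        return self.classifier(bag_repr)                      # (num_relations,) relation logits


if __name__ == "__main__":
    model = BiGRURelationExtractor(vocab_size=5000)
    bag = torch.randint(1, 5000, (4, 30))  # toy bag: 4 sentences, 30 token ids each
    print(model(bag).shape)                # torch.Size([10])
```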

Key words: Attention mechanism, Deep learning, Gated recurrent unit, Relation extraction, Ship fault

CLC Number: TP183