Computer Science ›› 2017, Vol. 44 ›› Issue (1): 60-64.doi: 10.11896/j.issn.1002-137X.2017.01.011


Self-adaptation Multi-gram Weight Learning Strategy for Sentence Representation Based on Convolutional Neural Network

ZHANG Chun-yun, QIN Peng-da and YIN Yi-long   

  • Online:2018-11-13 Published:2018-11-13

Abstract: With the explosive growth of information, natural language processing (NLP) has attracted increasing attention. Traditional NLP systems depend heavily on expensive handcrafted features annotated by experts and on the syntactic output of language analysis tools. Deep neural networks can achieve end-to-end learning without such costly features. To extract more information from input sentences, most neural networks for NLP adopt a multi-gram strategy. However, across different tasks and datasets, the information carried by different n-grams varies. With this in mind, this paper proposes a self-adaptive weight learning strategy for multi-gram features, which derives an importance ordering of the n-grams during network training. Moreover, a novel method for combining multi-gram feature vectors is exploited. Experimental results show that the method not only reduces the complexity of the network, but also improves performance on positive/negative sentiment classification of movie reviews and on relation classification.

Key words: Deep learning, Natural language processing, Self-adaptation, Multi-gram
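The abstract does not spell out the combination method beyond "self-adaptive weights over multi-gram features", so the following is only a minimal sketch of one plausible reading: a convolutional layer per n-gram width with max-over-time pooling, and one learnable gate per width whose softmax gives the importance distribution the training procedure can rank. All function and variable names here (`ngram_feature`, `adaptive_multigram`, `gate_logits`, the filter shapes) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of gate logits.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def ngram_feature(embedded, n, filters):
    """Max-over-time pooled feature vector for one n-gram width.

    embedded: (seq_len, dim) word-embedding matrix of one sentence.
    filters:  (num_filters, n * dim) convolution filters for width n.
    Returns a (num_filters,) feature vector.
    """
    seq_len, _ = embedded.shape
    # Slide a window of n tokens and flatten each window.
    windows = np.stack([embedded[i:i + n].ravel()
                        for i in range(seq_len - n + 1)])
    conv = np.maximum(windows @ filters.T, 0.0)  # ReLU activations
    return conv.max(axis=0)                      # max pooling over positions

def adaptive_multigram(embedded, filter_bank, gate_logits):
    """Combine per-n-gram features by a learned importance distribution.

    Instead of concatenating the per-width vectors (which grows the
    network with each added width), a softmax over one scalar gate per
    width yields weights that sum to 1; the sentence vector is the
    weighted sum, so its size stays fixed and the learned weights give
    an importance ordering of the n-grams.
    """
    weights = softmax(gate_logits)
    feats = [ngram_feature(embedded, n, f) for n, f in filter_bank]
    return sum(w * f for w, f in zip(weights, feats)), weights

rng = np.random.default_rng(0)
dim, num_filters = 8, 16
sentence = rng.normal(size=(12, dim))  # a 12-token sentence, embedded
bank = [(n, rng.normal(size=(num_filters, n * dim))) for n in (2, 3, 4)]
logits = np.array([0.2, 1.5, -0.3])    # in practice learned by backprop
vec, w = adaptive_multigram(sentence, bank, logits)
```

Under this reading, the fixed-size output is what "reduces the complexity of the network": the classifier on top sees `num_filters` inputs regardless of how many n-gram widths are used.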

