Computer Science, 2020, Vol. 47, Issue 3: 231-236. doi: 10.11896/jsjkx.190100108
FU Jian, KONG Fang