Computer Science ›› 2016, Vol. 43 ›› Issue (10): 225-228.doi: 10.11896/j.issn.1002-137X.2016.10.043


Feature Selection Method Based on MRMR for Text Classification

LI Jun-huai, FU Jing-fei, JIANG Wen-jie, FEI Rong and WANG Huai-jun   

  • Online: 2018-12-01    Published: 2018-12-01

Abstract: Feature selection is the most important preprocessing step in text classification, and the quality of the selected feature words has a significant impact on classification accuracy. Traditional feature selection methods such as mutual information (MI), information gain (IG) and CHI-square statistics (CHI) still leave redundancy among the selected feature words. To address this problem, this paper combines term frequency-inverse document frequency (TF-IDF) with maximal relevance minimal redundancy (MRMR) and proposes TFIDF_MRMR, an MRMR-based secondary selection method for feature words. Experimental results indicate that the method reduces redundancy among feature words and improves classification accuracy.
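
The abstract does not spell out the algorithm, but the two-stage idea it describes can be sketched as follows: rank terms by a TF-IDF-style weight to obtain a candidate set, then greedily re-select within that set using the MRMR criterion (maximal mutual information with the class label, minimal average mutual information with the already-selected terms). The Python sketch below is only an illustration of that idea under these assumptions, not the authors' exact TFIDF_MRMR procedure; the function name tfidf_mrmr, its parameters, and the use of binary term-occurrence features for the mutual-information estimates are hypothetical choices.

# Illustrative sketch of a TF-IDF + MRMR secondary feature selection
# (assumed reading of the abstract, not the paper's exact algorithm).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import mutual_info_score

def tfidf_mrmr(docs, labels, n_candidates=500, n_final=100):
    # Stage 1: TF-IDF ranking -- keep the n_candidates highest-weighted terms.
    vec = TfidfVectorizer()
    X = vec.fit_transform(docs)                      # documents x terms, TF-IDF weights
    scores = np.asarray(X.sum(axis=0)).ravel()       # aggregate TF-IDF weight per term
    cand = np.argsort(scores)[::-1][:n_candidates]

    # Binary term-occurrence features for the mutual-information estimates.
    B = (X[:, cand].toarray() > 0).astype(int)
    y = np.asarray(labels)

    # Relevance: I(term; class) for every candidate term.
    relevance = np.array([mutual_info_score(B[:, j], y) for j in range(B.shape[1])])

    # Stage 2: greedy MRMR -- at each step pick the term maximizing
    # I(term; class) minus the mean I(term; already-selected term).
    selected, remaining = [], list(range(B.shape[1]))
    while remaining and len(selected) < n_final:
        best_j, best_score = None, -np.inf
        for j in remaining:
            redundancy = (np.mean([mutual_info_score(B[:, j], B[:, s]) for s in selected])
                          if selected else 0.0)
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)

    terms = np.array(vec.get_feature_names_out())[cand]
    return terms[selected]

A call such as tfidf_mrmr(docs, labels, n_candidates=500, n_final=100) would return 100 feature words that are individually informative about the class labels while remaining mutually non-redundant, which is the behaviour the abstract attributes to the proposed method.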

Key words: Feature selection, Maximal relevance minimal redundancy (MRMR), Term frequency-inverse document frequency (TF-IDF), Text classification
