Computer Science ›› 2016, Vol. 43 ›› Issue (10): 225-228.  doi: 10.11896/j.issn.1002-137X.2016.10.043

• Artificial Intelligence •

Feature Selection Method Based on MRMR for Text Classification

LI Jun-huai, FU Jing-fei, JIANG Wen-jie, FEI Rong and WANG Huai-jun

  1. Xi'an University of Technology, Xi'an 710048, China
  • Online: 2018-12-01  Published: 2018-12-01
  • Supported by: National Natural Science Foundation of China (61172018), Science and Technology Program of the Shaanxi Provincial Department of Education (15JS077), and Xi'an Science and Technology Plan (CXY1439(8))

Abstract: Feature selection is an important preprocessing step in text classification: the quality of the selected feature words directly determines the accuracy of the subsequent classification results. Feature words extracted with traditional feature selection methods such as mutual information (MI), information gain (IG), and the χ2 statistic (CHI) still contain redundancy. To address this problem, this paper combines term frequency-inverse document frequency (TF-IDF) with the maximal-relevance minimal-redundancy criterion (MRMR) and proposes TFIDF_MRMR, an MRMR-based secondary feature selection method. Experimental results indicate that the method effectively reduces redundancy among feature words and improves the accuracy of text classification.

Key words: Feature selection, Maximal relevance minimal redundancy (MRMR), Term frequency-inverse document frequency (TF-IDF), Text classification
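The two-stage scheme summarized in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes binary term-presence features, discrete mutual information, and a summed TF-IDF score for the first-stage ranking; all function names are hypothetical.

```python
# Sketch of a two-stage TF-IDF + MRMR term selection.
# Assumptions (not from the paper): binary term-presence features,
# discrete mutual information, summed TF-IDF for the first-stage ranking.
import math
from collections import Counter

def mutual_info(xs, ys):
    """Discrete mutual information I(X;Y) in nats."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def tfidf_rank(docs, top_k):
    """Stage 1: rank terms by TF-IDF summed over documents; keep top_k."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))
    score = Counter()
    for d in docs:
        for t, c in Counter(d).items():
            score[t] += (c / len(d)) * math.log(n / df[t])
    return [t for t, _ in score.most_common(top_k)]

def mrmr_select(docs, labels, candidates, m):
    """Stage 2: greedy MRMR -- pick the term maximizing relevance to the
    class label minus mean redundancy with already-selected terms."""
    feats = {t: [1 if t in d else 0 for d in docs] for t in candidates}
    relevance = {t: mutual_info(feats[t], labels) for t in candidates}
    selected, remaining = [], list(candidates)
    while remaining and len(selected) < m:
        def mrmr_score(t):
            if not selected:
                return relevance[t]
            red = sum(mutual_info(feats[t], feats[s]) for s in selected)
            return relevance[t] - red / len(selected)
        best = max(remaining, key=mrmr_score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage: terms that separate the two classes are preferred.
docs = [["buy", "cheap", "pills"], ["buy", "cheap", "now"],
        ["meeting", "agenda", "notes"], ["project", "agenda", "notes"]]
labels = [1, 1, 0, 0]
candidates = tfidf_rank(docs, 6)
print(mrmr_select(docs, labels, candidates, 2))
```

The redundancy penalty is what distinguishes this from plain TF-IDF ranking: a term that perfectly tracks an already-selected term scores near zero in later rounds, even if its own relevance is high.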

