Computer Science ›› 2012, Vol. 39 ›› Issue (12): 228-232.

• Artificial Intelligence •

KL-divergence Based Feature Selection Algorithm with the Separate-class Strategy

李晓艳 张子刚 张逸石 张谧

  (School of Management, Huazhong University of Science and Technology, Wuhan 430074)
  (School of Economics and Management, Chongqing University of Posts and Telecommunications, Chongqing 400065)
  • Online: 2018-11-16   Published: 2018-11-16

Abstract: Feature selection is one of the core issues in pattern recognition and machine learning, since the quality of the selected feature subset directly affects the efficiency and accuracy of the subsequent classification algorithm, and it has attracted considerable attention in the literature. Most existing feature selection methods handle relevance and redundancy analysis only from the point of view of the whole class label set, neglecting the relation between the features and each separate class label. To this end, a novel KL-divergence based feature selection algorithm was proposed, which uses a separate-class strategy to examine the relation between every class label and the features, and a KL-divergence based effective distance to measure both the relevance between classes and features and the redundancy among features. Experimental results show that the proposed algorithm is efficient and, with respect to the quality of the selected feature subset, significantly outperforms the representative CFS, FCBF and ReliefF feature selection algorithms.

Key words: Feature selection, KL-divergence, Separate-class strategy, Effective distance
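
The abstract describes the method only at a high level. The fragment below is a minimal Python sketch of the general idea, assuming a one-vs-rest reading of the separate-class strategy and a symmetric KL divergence as a stand-in for the paper's effective-distance measure; all function names, parameters and the greedy selection rule are illustrative assumptions, not the authors' actual algorithm.

# Illustrative sketch only, NOT the algorithm from the paper: per-class
# ("separate-class") relevance is scored with a KL divergence between each
# feature's class-conditional distribution and its overall distribution, and a
# symmetric KL between feature histograms serves as a crude redundancy proxy.
import numpy as np


def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum_x p(x) * log(p(x) / q(x)), with additive smoothing."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))


def feature_histogram(values, edges):
    """Discretize a (possibly continuous) feature column with fixed bin edges."""
    counts, _ = np.histogram(values, bins=edges)
    return counts


def separate_class_relevance(X, y, bins=10):
    """Score every feature against EACH class separately: KL divergence between
    the feature's distribution inside the class and its overall distribution."""
    classes = np.unique(y)
    n_features = X.shape[1]
    edges = [np.histogram_bin_edges(X[:, f], bins=bins) for f in range(n_features)]
    scores = np.zeros((len(classes), n_features))
    for ci, c in enumerate(classes):
        mask = (y == c)
        for f in range(n_features):
            p = feature_histogram(X[mask, f], edges[f])   # within class c
            q = feature_histogram(X[:, f], edges[f])      # over all samples
            scores[ci, f] = kl_divergence(p, q)
    return classes, scores


def select_features(X, y, k=5, bins=10, redundancy_weight=1.0):
    """Greedy selection: prefer high per-class relevance (max over classes),
    penalize candidates whose histogram is close (small symmetric KL) to an
    already selected feature's histogram, i.e. candidates that look redundant."""
    n_features = X.shape[1]
    edges = [np.histogram_bin_edges(X[:, f], bins=bins) for f in range(n_features)]
    hists = [feature_histogram(X[:, f], edges[f]) for f in range(n_features)]
    _, scores = separate_class_relevance(X, y, bins)
    relevance = scores.max(axis=0)
    selected, candidates = [], list(range(n_features))
    while candidates and len(selected) < k:
        def utility(f):
            if not selected:
                return relevance[f]
            # symmetric KL as a rough "effective distance" between features
            dist = np.mean([0.5 * (kl_divergence(hists[f], hists[s]) +
                                   kl_divergence(hists[s], hists[f]))
                            for s in selected])
            return relevance[f] - redundancy_weight / (1.0 + dist)
        best = max(candidates, key=utility)
        selected.append(best)
        candidates.remove(best)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 3, size=300)
    X = rng.normal(size=(300, 8))
    X[:, 0] += y                                      # informative feature
    X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=300)    # redundant near-copy
    print(select_features(X, y, k=3))                 # indices of chosen features

In this sketch the per-class scores are aggregated with a maximum, so a feature only needs to discriminate one class well to rank highly; the paper's actual aggregation, effective-distance definition and redundancy handling may differ.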
