Computer Science ›› 2018, Vol. 45 ›› Issue (2): 63-68. doi: 10.11896/j.issn.1002-137X.2018.02.011

• 2017 China Computer Federation (CCF) Conference on Artificial Intelligence •

Feature Selection Algorithm Using SAC Algorithm

ZHANG Meng-lin, LI Zhan-shan

  1. College of Computer Science and Technology, Jilin University, Changchun 130012; Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012
  • Online: 2018-02-15  Published: 2018-11-13
  • Supported by:
    Science and Technology Development Plan Project of Jilin Province (20140101200JC)



Abstract: Feature selection improves the performance of learning algorithms by removing irrelevant and redundant features. Since evolutionary algorithms have shown good performance on optimization tasks, this paper proposes a new feature selection algorithm, FSSAC. A new initialization strategy and evaluation function allow SAC to treat feature selection as a search problem in a discrete space, and the classification accuracy of candidate feature subsets is used to guide the sampling phase of SAC. In the experiments, FSSAC was combined with the SVM, J48 and KNN classifiers, validated on UCI machine learning datasets, and compared with FSFOA, HGAFS, PSO and other algorithms. The results show that FSSAC improves the classification accuracy of the classifiers and generalizes well. In addition, FSSAC was compared with the other methods in terms of dimensionality reduction of the feature space.

Key words: Feature selection, SAC, FSSAC, Dimensionality reduction
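
The abstract describes a wrapper-style search: candidate feature subsets are sampled from a distribution over the discrete space of feature masks, each subset is scored by the cross-validated accuracy of a classifier trained on it, and the sampling distribution is then steered toward the better subsets. The Python sketch below illustrates only that general idea; it is not the authors' FSSAC implementation, and the dataset (load_wine), the KNN classifier, the top-half "good/bad" labelling, the population size and the probability-update rule are all illustrative assumptions standing in for the SAC sampling-and-classification step.

# Hypothetical sketch of a sampling-based wrapper for feature selection.
# Not the authors' FSSAC; all parameters below are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)
n_features = X.shape[1]

def subset_accuracy(mask):
    """Score a candidate subset by cross-validated accuracy of a KNN classifier."""
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

# Sampling distribution: per-feature inclusion probabilities, initialized to 0.5.
probs = np.full(n_features, 0.5)
best_mask, best_acc = None, -1.0

for generation in range(30):
    # Sample candidate subsets (binary masks) from the current distribution.
    samples = rng.random((20, n_features)) < probs
    scores = np.array([subset_accuracy(m) for m in samples])

    # "Classification" step (simplified): label the top half of the samples as good.
    good = samples[np.argsort(scores)[-10:]]

    # Move the sampling distribution toward the region occupied by good subsets.
    probs = 0.7 * probs + 0.3 * good.mean(axis=0)

    if scores.max() > best_acc:
        best_acc = scores.max()
        best_mask = samples[scores.argmax()]

print("selected features:", np.flatnonzero(best_mask))
print("cross-validated accuracy: %.3f" % best_acc)

In this simplified loop the good/bad labelling of sampled subsets plays the role that the learned classification step plays in SAC: it is what biases later sampling toward promising regions of the feature space.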

