Computer Science ›› 2018, Vol. 45 ›› Issue (2): 63-68.doi: 10.11896/j.issn.1002-137X.2018.02.011


Feature Selection Algorithm Using SAC Algorithm

ZHANG Meng-lin and LI Zhan-shan   

• Online: 2018-02-15    Published: 2018-11-13

Abstract: Feature selection improves the performance of learning algorithms by removing irrelevant and redundant features. Since evolutionary algorithms are reported to be well suited to optimization tasks, this paper proposes a new feature selection algorithm, FSSAC. A new initialization strategy and evaluation function allow FSSAC to treat feature selection as a search problem in a discrete space, and the accuracy of candidate feature subsets is used to guide the sampling process. In the experiments, FSSAC was combined with the SVM, J48 and KNN classifiers and validated on UCI machine learning datasets against FSFOA, HGAFS, PSO and other methods. The results show that FSSAC improves the classification accuracy of the classifiers and generalizes well. FSSAC was also compared with other available methods in terms of dimensionality reduction.
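The abstract describes FSSAC only at a high level. As a rough illustration of the sampling-and-classification (SAC) idea applied to feature selection as a discrete search, the sketch below repeatedly samples binary feature masks, scores each subset by wrapper classification accuracy, and biases the sampling distribution toward the masks that scored well. Everything here is an illustrative assumption, not the authors' exact method: the names `fs_sac` and `knn_accuracy`, the leave-one-out 1-NN evaluator, the elite split, and the probability-update rule are all hypothetical stand-ins.

```python
import random

def knn_accuracy(X, y, mask):
    # Leave-one-out 1-NN accuracy restricted to the selected features.
    # (Hypothetical evaluator; the paper also uses SVM and J48 as wrappers.)
    idx = [i for i, m in enumerate(mask) if m]
    if not idx:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best, best_d = None, float("inf")
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in idx)
            if d < best_d:
                best_d, best = d, j
        correct += y[i] == y[best]
    return correct / len(X)

def fs_sac(X, y, iters=30, pop=10, seed=0):
    # SAC-style loop: sample masks, keep the "good" half,
    # and shift the sampler toward the features they share.
    rng = random.Random(seed)
    n = len(X[0])
    probs = [0.5] * n          # inclusion probability per feature
    best_mask, best_acc = None, -1.0
    for _ in range(iters):
        samples = [[int(rng.random() < p) for p in probs] for _ in range(pop)]
        scored = sorted(samples, key=lambda m: knn_accuracy(X, y, m),
                        reverse=True)
        elite = scored[: pop // 2]          # positively classified region
        acc = knn_accuracy(X, y, scored[0])
        if acc > best_acc:
            best_acc, best_mask = acc, scored[0]
        for f in range(n):
            freq = sum(m[f] for m in elite) / len(elite)
            probs[f] = 0.7 * probs[f] + 0.3 * freq   # assumed update rule
    return best_mask, best_acc
```

On a toy dataset where only the first feature separates the classes, the loop quickly concentrates its sampling mass on masks that include that feature; the real FSSAC additionally uses the tailored initialization strategy and evaluation function described in the abstract.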

Key words: Feature selection, SAC, FSSAC, Dimensionality reduction

[1] TAN K C,TEOH E J,YU Q,et al.A hybrid evolutionary algorithm for attribute selection in data mining[J].Expert Systems with Applications,2009,36(4):8616-8630.
[2] HALL M A.Correlation-Based Feature Selection for Machine Learning[D].Hamilton:The University of Waikato,1999.
[3] HONG Q,YANG Y.On Sampling-and-Classification Optimization in Discrete Domains[C]∥IEEE Congress on Evolutionary Computation.IEEE,2016.
[4] ALMUALLIM H,DIETTERICH T G.Learning Boolean concepts in the presence of many irrelevant features[J].Artificial Intelligence,1994,69(1-2):279-305.
[5] ALMUALLIM H,DIETTERICH T G.Learning with many irrelevant features[C]∥National Conference on Artificial Intelligence.AAAI Press,1991:547-552.
[6] PUDIL P,NOVOVIČOVÁ J,KITTLER J.Floating search methods in feature selection[J].Pattern Recognition Letters,1994,15(11):1119-1125.
[7] ZHU W,SI G,ZHANG Y,et al.Neighborhood effective information ratio for hybrid feature subset evaluation and selection[J].Neurocomputing,2013,99:25-37.
[8] GHAEMI M,FEIZI-DERAKHSHI M R.Feature selection using Forest Optimization Algorithm[J].Pattern Recognition,2016,60:121-129.
[9] YU Y,QIAN H.The sampling-and-learning framework:A statistical view of evolutionary algorithms[C]∥Evolutionary Computation.IEEE,2014:149-158.
[10] SUTTON A M,NEUMANN F.A Parameterized Runtime Analysis of Evolutionary Algorithms for the Euclidean Traveling Salesperson Problem[C]∥AAAI Conference on Artificial Intelligence.2012:595-628.
[11] HU Q,CHE X,ZHANG L,et al.Feature evaluation and selection based on neighborhood soft margin[J].Neurocomputing,2010,73(10-12):2114-2124.
[12] MOUSTAKIDIS S P,THEOCHARIS J B.SVM-FuzCoC:A novel SVM-based feature selection method using a fuzzy complementary criterion[J].Pattern Recognition,2010,43(11):3712-3729.
[13] HUANG J,RONG P.A Hybrid Genetic Algorithm for Feature Selection Based on Mutual Information[J].Pattern Recognition Letters,2007,28(13):1825-1844.
[14] TABAKHI S,MORADI P,AKHLAGHIAN F.An unsupervised feature selection algorithm based on ant colony optimization[J].Engineering Applications of Artificial Intelligence,2014,32(6):112-123.
[15] XUE B,ZHANG M,BROWNE W N.Particle swarm optimisation for feature selection in classification:Novel initialisation and updating mechanisms[J].Applied Soft Computing,2014,18(C):261-276.
