Computer Science ›› 2020, Vol. 47 ›› Issue (11A): 474-478. doi: 10.11896/jsjkx.200100037

• Big Data & Data Science •

Feature Selection Method Combined with Multi-manifold Structures and Self-representation

YI Yu-gen1, LI Shi-cheng1, PEI Yang1, CHEN Lei1, DAI Jiang-yan2   

  1. School of Software, Jiangxi Normal University, Nanchang 330022, China
    2. School of Computer Engineering, Weifang University, Weifang, Shandong 261061, China
  • Online: 2020-11-15 Published: 2020-11-17
  • About author: YI Yu-gen, born in 1986, Ph.D, lecturer. His research interests include artificial intelligence, computer vision, and machine learning.
    DAI Jiang-yan, born in 1985, postdoctoral researcher, associate professor. Her main research interests include image inpainting and computer vision.
  • Supported by:
    This work was supported by the National Natural Science Foundation of China (61602221, 61806126), Natural Science Foundation of Jiangxi Province (20171BAB212009), Science and Technology Research Project of Jiangxi Provincial Department of Education (GJJ160315, GJJ170234) and Youth Innovation Science and Technology Support Plan of Shandong Provincial Institution of Higher Education (2019KJN012).

Abstract: Feature selection reduces the dimensionality of data by removing irrelevant and redundant features, thereby improving the efficiency of learning algorithms. Unsupervised feature selection has become one of the challenging problems in dimensionality reduction. Firstly, by combining the self-representation and the manifold structure of features, an unsupervised feature selection algorithm named Joint Multi-Manifold Structures and Self-Representation (JMMSSR) is proposed. Different from existing approaches, the proposed approach designs an adaptive weighting strategy to integrate multiple manifold structures, so that the structure of the features is described more accurately. Then, a simple and effective iterative updating algorithm is proposed to solve the objective function, and the convergence of the optimization algorithm is verified by numerical experiments. Finally, experimental results on three datasets (JAFFE, ORL and COIL20) show that the proposed approach achieves better performance than existing unsupervised feature selection approaches.
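The abstract describes the method only at a high level: a self-representation term, several feature-level manifold (graph) regularizers combined through adaptive weights, and an alternating iterative solver. The sketch below illustrates one common way such a joint objective can be optimized; it is a minimal NumPy illustration under assumed choices (an l2,1-regularized self-representation term, k-NN feature graphs, and a closed-form weight update), not the authors' actual JMMSSR formulation, and every name and parameter in it (knn_laplacian, jmmssr_sketch, lam, gamma, r) is hypothetical.

```python
# Hypothetical sketch only: the objective, weighting rule and updates below are
# assumptions in the spirit of self-representation feature selection with
# multi-graph (multi-manifold) regularization, not the JMMSSR algorithm itself.
import numpy as np

def knn_laplacian(F, k=5):
    """Unnormalized graph Laplacian of a k-NN similarity graph over the rows of F."""
    m = F.shape[0]
    sq = np.sum(F ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * F @ F.T      # pairwise squared distances
    S = np.zeros((m, m))
    for i in range(m):
        idx = np.argsort(d2[i])[1:k + 1]                 # k nearest neighbours, skip self
        S[i, idx] = np.exp(-d2[i, idx] / (d2[i, idx].mean() + 1e-12))
    S = (S + S.T) / 2.0                                  # symmetrize the graph
    return np.diag(S.sum(axis=1)) - S

def jmmssr_sketch(X, laplacians, lam=1.0, gamma=1.0, r=2.0, n_iter=50):
    """Alternating updates for an assumed objective
        min_W ||X - X W||_F^2 + lam * ||W||_{2,1}
              + gamma * sum_v alpha_v^r * tr(W^T L_v W),  s.t. sum_v alpha_v = 1,
    where X is n x d, W is d x d and each L_v is a d x d feature-graph Laplacian.
    Feature importance is scored by the row norms of W, as in regularized
    self-representation methods."""
    d = X.shape[1]
    V = len(laplacians)
    W = np.eye(d)
    alpha = np.full(V, 1.0 / V)
    XtX = X.T @ X
    for _ in range(n_iter):
        # Reweighted diagonal matrix that smooths the non-smooth l2,1 penalty.
        D = np.diag(1.0 / (2.0 * np.linalg.norm(W, axis=1) + 1e-12))
        L = sum(a ** r * Lv for a, Lv in zip(alpha, laplacians))
        # W-step: closed-form solution of the smooth surrogate problem.
        W = np.linalg.solve(XtX + lam * D + gamma * L, XtX)
        # alpha-step: closed-form adaptive weights (valid for r > 1);
        # graphs with smaller tr(W^T L_v W) receive larger weight.
        t = np.array([np.trace(W.T @ Lv @ W) for Lv in laplacians]) + 1e-12
        alpha = t ** (-1.0 / (r - 1.0))
        alpha /= alpha.sum()
    scores = np.linalg.norm(W, axis=1)                   # per-feature importance
    return np.argsort(-scores), scores

# Usage sketch on synthetic data: two feature graphs with different neighbourhood sizes.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 40))                   # 100 samples, 40 features
    graphs = [knn_laplacian(X.T, k=3), knn_laplacian(X.T, k=7)]
    ranked, scores = jmmssr_sketch(X, graphs)
    print("Top-10 selected features:", ranked[:10])
```

In this sketch the graph weights alpha_v follow the standard closed-form rule alpha_v ∝ tr(W^T L_v W)^(-1/(r-1)), which assigns larger weight to the feature graphs that the current W fits better; the paper's actual adaptive weighting strategy may differ in its exact form.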

Key words: Adaptive weighting, Feature selection, Multi-manifold structures, Self-representation

CLC Number: TP391