Computer Science ›› 2020, Vol. 47 ›› Issue (6A): 480-484. doi: 10.11896/jsjkx.20190800095

• Database & Big Data & Data Science •

Improved Locality and Similarity Preserving Feature Selection Algorithm

LI Jin-xia1, ZHAO Zhi-gang1, LI Qiang1, LV Hui-xian2 and LI Ming-sheng1   

  1 College of Computer Science and Technology,Qingdao University,Qingdao,Shandong 266071,China
    2 College of Automation and Electrical Engineering,Qingdao University,Qingdao,Shandong 266071,China
  • Published:2020-07-07
  • About author:LI Jin-xia, born in 1994, postgraduate. Her main research interests include machine learning.
    ZHAO Zhi-gang, born in 1973, professor, is a member of China Computer Federation. His main research interests include image processing, machine learning and compressed sensing.
  • Supported by:
    This work was supported by the National Key Research and Development Program of China (2017YFB0203102).

Abstract: The locality and similarity preserving embedding (LSPE) feature selection algorithm first builds a pre-defined k-nearest-neighbor (KNN) graph to capture the locality of the data, and then learns low-dimensional reconstruction coefficients on that fixed graph to preserve both the locality and the similarity of the data. These two steps are independent and lack interaction. Moreover, because the number of nearest neighbors is set manually, the learned graph structure has no adaptive neighbors and is not optimal, which degrades the performance of the algorithm. To address these issues, an improved locality and similarity preserving feature selection algorithm is proposed. It incorporates graph learning, sparse reconstruction and feature selection into a single framework, so that graph learning and sparse coding are carried out simultaneously, with the coding required to be sparse, non-negative and adaptive to the neighborhood structure. The goal is to find a projection that preserves the locality and similarity of the data; an l2,1-norm constraint is applied to the projection matrix, and the features that best preserve locality and similarity are then selected. Experimental results show that the improved algorithm reduces subjective influence, eliminates the instability of the selected features, is more robust to data noise, and improves the accuracy of image classification.
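The paper itself gives no code; the following Python sketch only illustrates the flavor of the objective described above, under two simplifying assumptions: the graph is a fixed, self-tuning KNN heat-kernel graph rather than one learned jointly with the sparse coding (the paper's central improvement), and the l2,1-norm is handled with the standard iterative reweighting scheme of Nie et al. [17]. Function names and hyperparameters (knn_reconstruction_weights, n_neighbors, lambda_) are illustrative, not from the paper.

```python
# Minimal sketch (not the authors' implementation): locality-preserving
# feature selection with an l2,1-norm-regularized projection.
import numpy as np

def knn_reconstruction_weights(X, n_neighbors=5):
    """Nonnegative, row-normalized heat-kernel weights over each sample's KNN.
    Stands in for the adaptive graph that the paper learns jointly."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]            # skip the sample itself
        w = np.exp(-d2[i, idx] / (d2[i, idx].mean() + 1e-8))  # self-tuning kernel, w >= 0
        S[i, idx] = w / w.sum()                               # rows sum to one
    return S

def lspe_feature_selection(X, n_components=10, lambda_=1.0, n_iter=20):
    """Rank features by the row norms of W minimizing
    tr(W^T X^T (I-S)^T (I-S) X W) + lambda * ||W||_{2,1}  s.t.  W^T W = I,
    alternating between W (an eigen-decomposition) and the diagonal
    reweighting matrix D that majorizes the l2,1-norm (Nie et al. [17])."""
    n, d = X.shape
    S = knn_reconstruction_weights(X)
    M = np.eye(n) - S
    A = X.T @ (M.T @ M) @ X                        # locality/similarity-preserving term
    W = np.linalg.eigh(A)[1][:, :n_components]     # init with smallest eigenvectors
    for _ in range(n_iter):
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * row_norms + 1e-8))  # reweighting for ||W||_{2,1}
        _, vecs = np.linalg.eigh(A + lambda_ * D)
        W = vecs[:, :n_components]                 # smallest-eigenvalue subspace
    scores = np.linalg.norm(W, axis=1)             # row sparsity -> feature relevance
    return np.argsort(scores)[::-1]                # most relevant features first

# Toy usage: rank 20 features from 100 random samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
print(lspe_feature_selection(X)[:5])
```

Ranking by row norms works because the l2,1-norm drives whole rows of W toward zero: a feature whose row survives contributes to the locality- and similarity-preserving projection, while a zeroed row marks a feature that can be discarded.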

Key words: Feature selection, Locality and similarity preserving, Sparse reconstruction, Unsupervised learning

CLC Number: TP391.4
[1] LI T,MENG Z,NI B,et al.Robust Geometric p-norm Feature Pooling for Image Classification and Action Recognition.Image and Vision Computing,2016,55(P2):64-76.
[2] ZHAO Z,LIU H.Semi-supervised feature selection via spectral analysis//Proceedings of the 2007 SIAM International Conference on Data Mining.Minneapolis,Minnesota:SIAM,2007:26-28.
[3] LIU Y,NIE F,WU J,et al.Efficient semi-supervised feature selection with noise insensitive trace ratio criterion.Neurocomputing,2013,105:12-18.
[4] CHANG X,NIE F P,YANG Y,et al.A convex formulation for semi-supervised multi-label feature selection//Proc of the 28th AAAI Conference on Artificial Intelligence.2014:1171-1177.
[5] ROWEIS S T,SAUL L K.Nonlinear Dimensionality Reduction by Locally Linear Embedding.Science,2000,290(5500):2323-2326.
[6] BELKIN M,NIYOGI P.Laplacian Eigenmaps for Dimensionality Reduction and Data Representation.Neural Computation,2003,15(6):1373-1396.
[7] BENGIO Y,PAIEMENT J F,VINCENT P,et al.Out-of-Sample Extensions for LLE,Isomap,MDS,Eigenmaps,and Spectral Clustering//International Conference on Neural Information Processing Systems.MIT Press,2003.
[8] HE X,YAN S,HU Y,et al.Face Recognition Using Laplacianfaces.IEEE Transactions on Pattern Analysis and Machine Intelligence,2005,27(3):328-340.
[9] HE X,CAI D,YAN S,et al.Neighborhood preserving embedding//Tenth IEEE International Conference on Computer Vision.2005:1208-1213.
[10] QIAO L,CHEN S,TAN X.Sparsity preserving projections with applications to face recognition.Pattern Recognition,2010,43(1):331-341.
[11] YU K,ZHANG T,GONG Y.Nonlinear Learning Using Local Coordinate Coding//Advances in Neural Information Processing Systems.2009:2223-2231.
[12] FANG X,XU Y,LI X,et al.Locality and similarity preserving embedding for feature selection.Neurocomputing,2014,128:304-315.
[13] WANG J,YANG J,YU K,et al.Locality-constrained Linear Coding for image classification//IEEE Conference on Computer Vision and Pattern Recognition.2010:3360-3367.
[14] FANG X,XU Y,LI X,et al.Learning a Nonnegative Sparse Graph for Linear Regression.IEEE Transactions on Image Processing,2015,24(9):2760-2771.
[15] YANG J,ZHANG Y.Alternating Direction Algorithms for l1-Problems in Compressive Sensing.SIAM Journal on Scientific Computing,2011,33(1):250-278.
[16] FUKUNAGA K.Introduction to Statistical Pattern Recognition.Academic Press,1972.
[17] NIE F,HUANG H,CAI X,et al.Efficient and Robust Feature Selection via Joint l2,1-Norms Minimization//Advances in Neural Information Processing Systems.2010:1813-1821.
[18] HAN Y,XU Z,MA Z,et al.Image classification with manifold learning for out-of-sample data.Signal Processing,2013,93(8):2169-2177.
[19] YAN F,WANG X D.A semi-supervised feature selection method based on local discriminant constraint.Pattern Recognition and Artificial Intelligence,2017,30(1):89-95.