Computer Science ›› 2022, Vol. 49 ›› Issue (2): 191-197.doi: 10.11896/jsjkx.210300034

• Database & Big Data & Data Science •

Robust Joint Sparse Uncorrelated Regression

LI Zong-ran1, CHEN Xiu-hong1,2, LU Yun1, SHAO Zheng-yi1

  1. School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi, Jiangsu 214122, China
    2. Jiangsu Key Laboratory of Media Design and Software Technology, Wuxi, Jiangsu 214122, China
  • Received:2021-03-02 Revised:2021-07-10 Online:2022-02-15 Published:2022-02-23
  • About author:LI Zong-ran, born in 1995, postgraduate. His main research interests include machine learning, pattern recognition and digital image processing.
    CHEN Xiu-hong, born in 1964, Ph.D. His main research interests include digital image processing, pattern recognition, target detection and tracking, and optimization theory and methods.

Abstract: Common unsupervised feature selection methods consider only the selection of discriminative features, while ignoring feature redundancy and the small-class problem, both of which degrade classification performance. Against this background, a robust uncorrelated regression algorithm is proposed. First, uncorrelated regression is studied: uncorrelated orthogonal constraints are used to find uncorrelated yet discriminative features. The uncorrelated constraint preserves the data structure on the Stiefel manifold, so the model has a closed-form solution and avoids the trivial solutions that the traditional ridge regression model may produce. Second, both the loss function and the regularization term use the ℓ2,1 norm, which ensures the robustness of the model and yields a jointly sparse projection matrix. At the same time, the small-class problem is taken into account, so that the number of projection vectors is not limited by the number of classes, and enough projection directions are obtained to improve the classification performance of the model. Theoretical analysis and experimental results on multiple data sets show that the proposed method performs better than other feature selection methods.
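To make the ideas in the abstract concrete, the following is a minimal numerical sketch, not the paper's exact algorithm. It assumes a simplified objective min_W ||XW - Y||_{2,1} + λ||W||_{2,1} s.t. W^T W = I, where Y is a target or pseudo-label indicator matrix and the plain orthogonality constraint W^T W = I stands in for the paper's uncorrelated constraint. It combines the standard iterative reweighting used for ℓ2,1-norm minimization with a generalized-power-iteration style orthogonal update; all function names are illustrative assumptions.

```python
import numpy as np

def l21_row_weights(M, eps=1e-8):
    # Reweighting coefficients d_i = 1 / (2 * ||m_i||_2) used when the
    # l2,1 norm is minimized by iteratively reweighted least squares.
    return 1.0 / (2.0 * np.maximum(np.linalg.norm(M, axis=1), eps))

def uncorrelated_sparse_regression(X, Y, lam=1.0, n_iter=50):
    # Sketch of: min_W ||X W - Y||_{2,1} + lam * ||W||_{2,1}  s.t.  W^T W = I
    # (W^T W = I is a simplified stand-in for the paper's uncorrelated constraint).
    n, d = X.shape
    k = Y.shape[1]
    rng = np.random.default_rng(0)
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))    # random orthonormal start
    for _ in range(n_iter):
        D1 = l21_row_weights(X @ W - Y)    # robust weights on residual rows
        D2 = l21_row_weights(W)            # joint-sparsity weights on rows of W
        A = X.T @ (D1[:, None] * X) + lam * np.diag(D2)
        B = X.T @ (D1[:, None] * Y)
        # Orthogonality-constrained subproblem
        #   min_W tr(W^T A W) - 2 tr(W^T B)  s.t.  W^T W = I,
        # handled with one generalized power iteration step (Nie et al., 2017):
        alpha = np.linalg.eigvalsh(A)[-1] + 1e-3         # make (alpha*I - A) PSD
        M = (alpha * np.eye(d) - A) @ W + B
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W = U @ Vt                                       # nearest orthonormal matrix
    return W

def select_features(W, n_selected):
    # Features are ranked by the l2 norm of the corresponding row of W;
    # rows pushed toward zero by the l2,1 regularizer mark discardable features.
    return np.argsort(np.linalg.norm(W, axis=1))[::-1][:n_selected]
```

In use, W would be learned from the training data and select_features would return the indices of the highest-scoring features, which are then fed to a downstream classifier.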

Key words: Feature selection, Joint, Regression, Robust, Small-class, Uncorrelated

CLC Number: TP391.4