Computer Science ›› 2024, Vol. 51 ›› Issue (11A): 240100054-7. DOI: 10.11896/jsjkx.240100054
杨兴1,3, 王士同2, 胡文军1,3
YANG Xing1,3, WANG Shitong2, HU Wenjun1,3
Abstract: The core of manifold learning is to capture the geometric information hidden in data by preserving local structure. However, the local structure is usually estimated from raw data that contain redundancy or noise, which makes it unreliable and leaves the local structure with insufficient confidence. To address this, a locality-structure-adaptive linear projection method is proposed. Its core idea is twofold: on the one hand, it forces the low-dimensional representation obtained by linear projection to preserve the local structure of the high-dimensional space; on the other hand, it uses the low-dimensional representation to update the local structure in the high-dimensional space, and the local structure adapts through iterative refinement. Experimental results on real-world datasets show that the proposed method outperforms the compared methods on all performance metrics.
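The alternating scheme described in the abstract can be made concrete with a short sketch. The following is a minimal illustration, not the authors' released code: it alternates between an LPP-style projection that preserves a neighborhood graph and a re-estimation of that graph from the current low-dimensional embedding. The kNN heat-kernel affinity, the function names, and the stopping rule are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of a locality-structure-adaptive linear projection
# (assumed LPP-style formulation; hyperparameters are illustrative).
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def knn_affinity(Z, k=5, sigma=1.0):
    """Heat-kernel affinity restricted to k nearest neighbors (assumed graph model).
    Z has shape (features, samples)."""
    D2 = cdist(Z.T, Z.T, metric="sqeuclidean")       # pairwise squared distances
    W = np.exp(-D2 / (2 * sigma ** 2))
    far = np.argsort(D2, axis=1)[:, k + 1:]          # indices of non-neighbors
    np.put_along_axis(W, far, 0.0, axis=1)           # keep only k nearest per row
    W = np.maximum(W, W.T)                           # symmetrize
    np.fill_diagonal(W, 0.0)                         # drop self-loops
    return W

def adaptive_lpp(X, d=2, k=5, n_iter=10, tol=1e-4):
    """X: data matrix of shape (features, samples). Returns projection P (features, d)."""
    W = knn_affinity(X, k)                           # initial graph from raw (possibly noisy) data
    P_prev = None
    for _ in range(n_iter):
        Dg = np.diag(W.sum(axis=1))
        L = Dg - W                                   # graph Laplacian
        A, B = X @ L @ X.T, X @ Dg @ X.T
        # Smallest generalized eigenvectors of (A, B) give the locality-preserving projection.
        _, vecs = eigh(A, B + 1e-6 * np.eye(B.shape[0]))
        P = vecs[:, :d]
        Y = P.T @ X                                  # low-dimensional representation
        W = knn_affinity(Y, k)                       # update the local structure from the embedding
        if P_prev is not None and np.linalg.norm(P - P_prev) < tol:
            break                                    # projection has stabilized
        P_prev = P
    return P
```

In this reading, the graph W plays the role of the "local structure": it is first built from the raw data, then repeatedly rebuilt from the projected representation, so that unreliable neighborhoods estimated from redundant or noisy features are gradually corrected.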