计算机科学 ›› 2024, Vol. 51 ›› Issue (11A): 240100054-7. doi: 10.11896/jsjkx.240100054

• 大数据&数据科学 •

局部结构自适应的线性投影方法研究

杨兴1,3, 王士同2, 胡文军1,3   

  1 湖州师范学院信息工程学院 浙江 湖州 313000
    2 江南大学人工智能与计算机学院 江苏 无锡 214122
    3 浙江省现代农业资源智慧管理与应用研究重点实验室 浙江 湖州 313000
  • 出版日期:2024-11-16 发布日期:2024-11-13
  • 通讯作者: 胡文军(hoowenjun@foxmail.com)
  • 作者简介:(zzhszzx@163.com)
  • 基金资助:
    面向医学影像的小样本学习理论、关键技术与实证研究(U20A20228)

Study on Linear Projection Method for Local Structure Adaptation

YANG Xing1,3, WANG Shitong2, HU Wenjun1,3   

  1 School of Information Engineering, Huzhou University, Huzhou, Zhejiang 313000, China
    2 School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi, Jiangsu 214122, China
    3 Zhejiang Provincial Key Laboratory of Intelligent Management and Application of Modern Agricultural Resources, Huzhou, Zhejiang 313000, China
  • Online:2024-11-16 Published:2024-11-13
  • About author: YANG Xing, born in 1998, postgraduate, is a member of CCF (No.N0566G). His main research interest is manifold learning.
    HU Wenjun, born in 1977, Ph.D., professor. His main research interests include machine learning and pattern recognition.
  • Supported by:
    Learning Theory, Key Learning Techniques and Case Studies on Small-sample-sized Medical Imaging Data (U20A20228).

摘要: 流形学习的核心是通过保持局部结构来捕捉数据中隐藏的几何信息,而局部结构通常利用冗余或干扰的原始数据进行评价,这意味着局部结构是不可靠的,存在局部结构置信度不足的问题。为此,针对性地提出一种局部结构自适应的线性投影方法。该方法的核心在于:一方面,它强制线性投影后的低维表示保持高维空间中的局部结构;另一方面,通过低维空间表示更新高维空间中的局部结构,并通过循环迭代方式实现局部结构的自适应。真实数据集上的实验结果表明,所提方法在各项性能指标上均优于其他对比方法。

关键词: 流形学习, 局部结构, 置信度, 自适应, 线性投影

Abstract: The core of manifold learning lies in capturing the hidden geometric information in data by preserving local structures. These local structures, however, are typically estimated from redundant or noisy raw data, which makes them unreliable and gives rise to the problem of insufficient confidence in local structures. To address this issue, a linear projection method with local structure adaptation is proposed. The essence of the method lies in two aspects: on the one hand, it enforces that the low-dimensional representation obtained through linear projection preserves the local structures of the high-dimensional space; on the other hand, it updates the local structures in the high-dimensional space from the low-dimensional representation, achieving local structure adaptation through iterative cycles. Experimental results on real-world datasets demonstrate that the proposed method outperforms the competing methods on all performance metrics.
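The iterative scheme described in the abstract can be sketched as follows. This is a minimal illustration only, assuming an LPP-style generalized eigenproblem and a symmetrized kNN affinity graph; the function name `adaptive_lpp` and all parameters are hypothetical, not the authors' implementation. The key point it shows is the alternation: the graph is first built from the raw high-dimensional data, then rebuilt from the low-dimensional representation on each pass.

```python
import numpy as np
from scipy.linalg import eigh

def adaptive_lpp(X, d, k=5, n_iter=5, reg=1e-6):
    """Linear projection whose kNN graph is re-estimated from the
    low-dimensional representation on each iteration (hypothetical sketch)."""
    n, D = X.shape
    Z = X  # first graph is built from the raw high-dimensional data
    W = np.eye(D)[:, :d]
    for _ in range(n_iter):
        # 1. kNN affinity graph from the current representation Z
        dist = np.square(Z[:, None, :] - Z[None, :, :]).sum(axis=-1)
        idx = np.argsort(dist, axis=1)[:, 1:k + 1]  # skip self (column 0)
        S = np.zeros((n, n))
        S[np.repeat(np.arange(n), k), idx.ravel()] = 1.0
        S = np.maximum(S, S.T)                       # symmetrize
        Dg = np.diag(S.sum(axis=1))
        L = Dg - S                                   # graph Laplacian
        # 2. LPP-style generalized eigenproblem X^T L X w = lambda X^T Dg X w;
        #    the smallest eigenvectors preserve locality under projection
        A = X.T @ L @ X
        B = X.T @ Dg @ X + reg * np.eye(D)           # regularize for stability
        _, V = eigh(A, B)
        W = V[:, :d]
        # 3. update the low-dimensional representation, so the local
        #    structure (graph) is rebuilt from it on the next iteration
        Z = X @ W
    return W
```

Run on an (n, D) data matrix, e.g. `W = adaptive_lpp(X, d=2)`, then embed with `X @ W`; after each pass the neighborhood graph reflects the current low-dimensional geometry rather than the noisy raw features.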

Key words: Manifold learning, Local structure, Confidence, Adaptive, Linear projection

中图分类号: TP391