Computer Science ›› 2023, Vol. 50 ›› Issue (2): 138-145. doi: 10.11896/jsjkx.220400230

• Database & Big Data & Data Science •

Hierarchical Multiple Kernel K-Means Algorithm Based on Sparse Connectivity

WANG Lei1,2, DU Liang1,2, ZHOU Peng3   

    1 College of Computer and Information Technology,Shanxi University,Taiyuan 030006,China
    2 Institute of Big Data Science and Industry,Shanxi University,Taiyuan 030006,China
    3 College of Computer Science and Technology,Anhui University,Hefei 230601,China
  • Received:2022-04-24 Revised:2022-10-27 Online:2023-02-15 Published:2023-02-22
  • Supported by:
    National Natural Science Foundation of China(61976129,62176001) and Natural Science Foundation for Young Scientists of Shanxi Province,China(201901D211168)

Abstract: Multiple kernel learning(MKL) aims to find an optimal consistent kernel function.In the hierarchical multiple kernel clustering(HMKC) algorithm,sample features are extracted layer by layer from the high-dimensional space so as to retain as much effective information as possible,but the information interaction between layers is ignored.In that model,only the corresponding nodes in adjacent layers exchange information,while all other nodes remain isolated;on the other hand,if full connection is adopted,the diversity of the final consensus matrix is reduced.Therefore,this paper proposes a hierarchical multiple kernel K-Means algorithm based on sparse connectivity(SCHMKKM),which controls the assignment matrix through a sparsity rate to achieve sparse connections,thereby locally fusing the features obtained by information distillation between layers.Cluster analysis is carried out on multiple data sets,and the proposed algorithm is compared with the fully connected hierarchical multiple kernel K-Means(FCHMKKM) algorithm.Experimental results show that fusing more discriminative information is beneficial for learning a better consistent partition matrix,and that the sparse-connection fusion strategy outperforms the fully connected strategy.
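To make the sparse-connection idea concrete, the following Python sketch shows one way such a scheme could be wired together. It is an illustrative assumption rather than the authors' implementation: base kernels are Gaussian, each layer's node (partition) matrix comes from a simple spectral relaxation of kernel k-means, and the sparsity rate is realized by zeroing all but the strongest cross-node affinities before fusion; the function names (gaussian_kernel, kernel_partition, sparse_fuse) are hypothetical.

import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # RBF base kernel from pairwise squared distances (assumed kernel choice).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_partition(K, k):
    # Spectral relaxation of kernel k-means: the top-k eigenvectors of K act as a
    # relaxed partition (node) matrix with orthonormal columns.
    _, vecs = np.linalg.eigh(K)          # eigenvalues in ascending order
    return vecs[:, -k:]                  # n x k

def sparse_fuse(H_list, sparsity_rate):
    # Fuse one layer's node matrices through a sparse assignment: keep only the
    # strongest cross-node affinities (controlled by sparsity_rate) instead of the
    # dense, fully connected combination.
    H = np.concatenate(H_list, axis=1)              # n x (m*k) stacked nodes
    A = H.T @ H                                     # node-to-node affinity matrix
    thresh = np.quantile(np.abs(A), sparsity_rate)  # sparsity_rate in [0, 1)
    A_sparse = np.where(np.abs(A) >= thresh, A, 0.0)
    return H @ A_sparse                             # locally fused features

# Toy usage: three Gaussian base kernels, one fusion step, 3 clusters.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
kernels = [gaussian_kernel(X, s) for s in (0.5, 1.0, 2.0)]
layer = [kernel_partition(K, k=3) for K in kernels]   # per-kernel node matrices
fused = sparse_fuse(layer, sparsity_rate=0.8)         # sparse, not full, connection
consensus = kernel_partition(fused @ fused.T, k=3)    # consensus partition matrix
labels = np.argmax(np.abs(consensus), axis=1)         # crude cluster assignment
print(labels[:10])

A larger sparsity_rate prunes more cross-node connections; in the spirit of the abstract, this trades the richer mixing of full connection for more diverse, locally fused information.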

Key words: Multiple kernel learning, Hierarchical multiple kernel clustering, Sparse connectivity, Fully connected, Information distillation, Local fusion

CLC Number: TP181
[1]MACQUEEN J.Some methods for classification and analysis of multivariate observations[C]//Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability.California:University of California Press,1967:281-297.
[2]SCHÖLKOPF B,SMOLA A,MÜLLER K R.Nonlinear component analysis as a kernel eigenvalue problem[J].Neural Computation,1998,10(5):1299-1319.
[3]JIAO R H,LIU S L,WEN W,et al.Incremental kernel fuzzy c-means with optimizing cluster center initialization and delivery[J].Kybernetes:The International Journal of Systems and Cybernetics,2016,45(8):1273-1291.
[4]KANG Z,PENG C,CHENG Q,et al.Unified spectral clustering with optimal graph[C]//Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence.Palo Alto,CA:AAAI Press,2018:3366-3373.
[5]SHI Y,TRANCHEVENT L,LIU X H,et al.Optimized data fusion for kernel k-means clustering[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2012,34(5):1031-1039.
[6]XU Z L,JIN R,KING I,et al.An extended level method for efficient multiple kernel learning[C]//Advances in Neural Information Processing Systems 21.Massachusetts:MIT Press,2009:1825-1832.
[7]DU L,ZHOU P,SHI L,et al.Robust multiple kernel k-means using l21-norm[C]//Proceedings of the 24th International Joint Conference on Artificial Intelligence.Palo Alto,CA:AAAI Press,2015:3476-3482.
[8]PATEL V M,VIDAL R.Kernel sparse subspace clustering[C]//2014 IEEE International Conference on Image Processing(ICIP).Piscataway:IEEE Press,2014:2849-2853.
[9]SUN M J,WANG S W,ZHANG P,et al.Projective Multiple Kernel Subspace Clustering[J].IEEE Transactions on Multimedia,2021:2567-2579.
[10]LIU J Y,LIU X W,WANG S W,et al.Hierarchical Multiple Kernel Clustering[C]//Proceedings of the Thirty-fifth AAAI Conference on Artificial Intelligence.Palo Alto,CA:AAAI Press,2021:8671-8679.
[11]LIU X W,ZHOU S H,WANG Y Q,et al.Optimal Neighborhood Kernel Clustering with Multiple Kernels[C]//Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence.Palo Alto,CA:AAAI Press,2017:2266-2272.
[12]LIU J Y,LIU X W,XIONG J,et al.Optimal Neighborhood Multiple Kernel Clustering with Adaptive Local Kernels[J].IEEE Transactions on Knowledge and Data Engineering,2021,34(6):2872-2885.
[13]SPRINGENBERG J T,DOSOVITSKIY A,BROX T,et al.Striving for simplicity:The all convolutional net[J].arXiv:1412.6806,2014.
[14]MOCANU D C,MOCANU E,STONE P,et al.Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science[J].Nature Communications,2018,9(1):2383.
[15]LECUN Y,BENGIO Y,HINTON G.Deep learning[J].Nature,2015,521(7553):436-444.
[16]LIU X W,DOU Y,YIN J P,et al.Multiple Kernel k-Means Clustering with Matrix-Induced Regularization[C]//Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence.Palo Alto,CA:AAAI Press,2016:1888-1894.
[17]LI M M,LIU X W,WANG L,et al.Multiple Kernel Clustering with Local Kernel Alignment Maximization[C]//Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence.San Francisco:Morgan Kaufmann Press,2016:1704-1710.
[18]KANG Z,WEN L J,CHEN W Y,et al.Low-rank kernel learning for graph-based clustering[J].Knowledge-Based Systems,2019,163:510-517.
[19]KANG Z,NIE F P,WANG J,et al.Multiview Consensus Graph Clustering[J].IEEE Transactions on Image Processing,2019,28(3):1261-1270.
[20]YANG C,REN Z W,SUN Q S,et al.Joint Correntropy Metric Weighting and Block Diagonal Regularizer for Robust Multiple Kernel Subspace Clustering[J].Information Sciences,2019,500:48-66.