Computer Science ›› 2026, Vol. 53 ›› Issue (4): 48-56.doi: 10.11896/jsjkx.251000068

• Interdisciplinary Integration of Artificial Intelligence and Theoretical Computer Science •

Dynamic Ensemble Stacking Broad Learning System for High-dimensional Data

YUN Fan, YU Zhiwen, YANG Kaixiang   

  1. College of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China
  • Received:2025-10-16 Revised:2026-01-16 Online:2026-04-15 Published:2026-04-08
  • About author:YUN Fan, born in 2000, postgraduate, is a member of CCF (No.K4411M). Her main research interests include computer science, artificial intelligence, machine learning, data mining, broad learning system and ensemble learning.
    YU Zhiwen, born in 1979, Ph.D, professor, Ph.D supervisor, is a member of CCF (No.16933D). His main research interests include artificial intelligence, machine learning and data mining.
  • Supported by:
    National Natural Science Foundation of China (62572199, 92467109, U21A20478) and National Key R&D Program of China (2023YFA1011601).

Abstract: In high-dimensional small-sample classification tasks, the broad learning system (BLS) has attracted much attention for its efficiency. However, the feature extraction capability of the single-layer BLS is limited, making it difficult to handle complex high-dimensional data, and the random node generation mechanism induces node redundancy when BLS hidden layers are stacked directly, thereby hindering improvements in model performance. To address these issues, an ensemble stacking BLS (E-SBLS) algorithm is proposed. E-SBLS takes the output of the previous BLS layer as enhanced features, concatenates it with the original features weighted by classification confidence, and feeds the result into the subsequent BLS, continuously strengthening the feature representation in deeper layers. By integrating the outputs of multiple BLS layers through a meta-learner pool, the high-dimensional feature extraction ability of the original single-layer BLS is augmented, thereby improving the generalization performance of the proposed model. Furthermore, considering the complex and variable characteristics of high-dimensional data, a dynamic ensemble framework is designed that adjusts the model complexity according to data difficulty, further improving ensemble efficiency while maintaining model performance. Ablation experiments validate the effectiveness of each module in the proposed algorithm, and comparative experiments demonstrate its superior classification performance on high-dimensional disease data.
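The stacking mechanism described in the abstract can be sketched in a few lines of numpy. This is a minimal illustrative reconstruction, not the paper's implementation: `TinyBLS`, the node counts, the softmax-based confidence weight, and the score-averaging combiner (standing in for the meta-learner pool and the dynamic ensemble framework) are all assumptions made for illustration.

```python
import numpy as np

class TinyBLS:
    """Minimal BLS-style layer: random linear feature nodes, tanh
    enhancement nodes, and ridge-regression output weights.
    A toy sketch, not the paper's exact formulation."""

    def __init__(self, n_feat=20, n_enh=40, reg=1e-2, seed=0):
        self.n_feat, self.n_enh, self.reg = n_feat, n_enh, reg
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        Z = X @ self.Wf + self.bf           # feature nodes (random linear map)
        H = np.tanh(Z @ self.We + self.be)  # enhancement nodes
        return np.hstack([Z, H])

    def fit(self, X, Y):
        d = X.shape[1]
        self.Wf = self.rng.standard_normal((d, self.n_feat))
        self.bf = self.rng.standard_normal(self.n_feat)
        self.We = self.rng.standard_normal((self.n_feat, self.n_enh))
        self.be = self.rng.standard_normal(self.n_enh)
        A = self._hidden(X)
        # ridge-regularized least squares for the output weights
        self.W = np.linalg.solve(A.T @ A + self.reg * np.eye(A.shape[1]),
                                 A.T @ Y)
        return self

    def predict_scores(self, X):
        return self._hidden(X) @ self.W

def stacked_bls(X, Y, n_layers=3):
    """Stack layers E-SBLS-style: each layer receives the original
    features weighted by the previous layer's per-sample confidence,
    concatenated with that layer's class scores; averaging the
    per-layer scores stands in for the paper's meta-learner pool."""
    feats, all_scores = X, []
    for k in range(n_layers):
        bls = TinyBLS(seed=k).fit(feats, Y)
        S = bls.predict_scores(feats)
        E = np.exp(S - S.max(axis=1, keepdims=True))     # stable softmax
        conf = (E.max(axis=1) / E.sum(axis=1))[:, None]  # per-sample confidence
        feats = np.hstack([conf * X, S])  # weighted originals + layer scores
        all_scores.append(S)
    return np.mean(all_scores, axis=0)
```

On a toy two-class problem (one-hot targets `Y`), `stacked_bls(X, Y)` returns averaged class scores whose argmax serves as the prediction; the key point illustrated is that each deeper layer sees both the confidence-weighted original features and the previous layer's output.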

Key words: Broad learning system, Ensemble learning, Dynamic structure, High-dimensional data, Stacking

CLC Number: TP302