Computer Science ›› 2022, Vol. 49 ›› Issue (11): 212-220.doi: 10.11896/jsjkx.210900054

• Artificial Intelligence •

Universal Multi-class Ensemble Method with Self-adaptive Weights

WEI Jun-sheng¹, LIU Yan¹, CHEN Jing², DUAN Shun-ran¹

  1 Key Laboratory of Cyberspace Situation Awareness of Henan, Zhengzhou 450001, China
  2 Department of Big Data Analysis, Information Engineering University, Zhengzhou 450001, China
  • Received: 2021-09-06  Revised: 2022-03-21  Online: 2022-11-15  Published: 2022-11-03
  • About author: WEI Jun-sheng, born in 1992, postgraduate. His main research interests include big data analysis.
    LIU Yan, born in 1979, Ph.D, associate professor, Ph.D supervisor. Her main research interests include network topology discovery and network data analysis.
  • Supported by:
    National Natural Science Foundation of China (U1804263, 62002386).

Abstract: Ensemble learning has long been one of the main strategies for building powerful and stable predictive models: by fusing multiple models, it improves both the accuracy and the stability of the results. However, existing ensemble methods still have shortcomings in how they compute weights; when facing a variety of classification problems, they cannot select ensemble weights adaptively and are therefore not universal. To address these problems, a universal multi-class ensemble method with self-adaptive weights (UMEAW) is proposed. Unlike the usual ensemble classification methods, which target only one kind of classification task, UMEAW handles different classification problems as follows: first, it computes a weight allocation coefficient from the number of classes; then, exploiting the distribution characteristics of the exponential function, it automatically computes the weights of the base classifiers from the model evaluation index and the weight allocation coefficient; finally, the weights are adjusted adaptively through continuous iteration, realizing model ensembles under different classification tasks. Experimental results show that UMEAW achieves model ensembles on nine datasets with different numbers of classes, different domains and different scales, and that it outperforms the baselines on most tasks. Compared with a single model, the F1 value increases by 3%~25% after UMEAW fusion; compared with other ensemble methods, the F1 value improves by 1%~2%. This demonstrates that UMEAW is both universal and effective.
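
The abstract describes the weighting scheme only verbally, so the short Python sketch below illustrates one plausible reading of it. The choice of alpha = ln(number of classes) as the allocation coefficient, the use of per-classifier F1 as the evaluation index, and the iterative update rule are all illustrative assumptions, not the paper's actual formulas.

import numpy as np

def umeaw_weights(f1_scores, n_classes, n_iter=10, lr=0.5):
    """Illustrative self-adaptive ensemble weighting (a sketch, not the paper's exact method)."""
    scores = np.asarray(f1_scores, dtype=float)
    # Assumed weight allocation coefficient: grows with the number of classes,
    # so many-class tasks separate the classifiers' weights more sharply.
    alpha = np.log(n_classes)
    # Exponential-function weighting of the evaluation index, then normalize.
    weights = np.exp(alpha * scores)
    weights /= weights.sum()
    for _ in range(n_iter):
        # Assumed iterative adjustment: shift weight toward classifiers whose
        # evaluation index exceeds the current weighted average, then renormalize.
        avg = weights @ scores
        weights *= np.exp(lr * (scores - avg))
        weights /= weights.sum()
    return weights

# Example: three base classifiers on a 5-class task.
print(umeaw_weights([0.72, 0.80, 0.65], n_classes=5))

The final weights would then combine the base classifiers' predictions, e.g. as a weighted vote over their predicted class probabilities.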

Key words: Ensemble learning, Weight, Classification, Fusion, Universal method

CLC Number: TP391