Computer Science, 2022, Vol. 49, Issue (11): 212-220. doi: 10.11896/jsjkx.210900054
• Artificial Intelligence •
WEI Jun-sheng1, LIU Yan1, CHEN Jing2, DUAN Shun-ran1