Computer Science ›› 2022, Vol. 49 ›› Issue (3): 225-231. doi: 10.11896/jsjkx.201100111

• Artificial Intelligence •

Fiber Bundle Meta-learning Algorithm Based on Variational Bayes

LIU Yang, LI Fan-zhang   

  1. School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215006, China
  • Received: 2020-11-16  Revised: 2021-03-10  Online: 2022-03-15  Published: 2022-03-15
  • About author: LIU Yang, born in 1996, postgraduate. His main research interests include meta-learning.
    LI Fan-zhang, born in 1964, Ph.D, professor, Ph.D supervisor, is a member of China Computer Federation. His main research interests include Lie group machine learning and dynamic fuzzy logic.
  • Supported by:
    National Key R&D Program of China (2018YFA0701700, 2018YFA0701701) and National Natural Science Foundation of China (61902269).

Abstract: Deep learning based on neural networks has achieved excellent results in many fields, but it handles similar or previously unseen tasks poorly and has difficulty learning and adapting to new tasks. Moreover, it requires large numbers of training samples, which limits its generalization and scalability. Meta-learning is a new learning framework that aims to overcome the inability of traditional learning methods to learn quickly and adapt to new tasks. For the meta-learning problem of image classification, a novel fiber bundle meta-learning algorithm based on Bayesian theory is proposed. First, a convolutional neural network extracts features from the support-set images to obtain image representations. Then the manifold structure of the data features and the corresponding fiber bundle are constructed. The input query set selects the manifold section of the current new task to obtain the fiber suited to that task, and thereby the correct image labels. Experimental results show that the model based on the proposed algorithm (FBBML) achieves the best accuracy on the common mini-ImageNet dataset compared with the standard four-layer convolutional neural network model. At the same time, introducing fiber bundle theory into meta-learning makes the algorithm more interpretable.
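For readers unfamiliar with the episodic setup the abstract describes, the following Python sketch (PyTorch assumed) illustrates the generic pipeline: a standard four-layer convolutional encoder embeds the support-set images, each class is summarised from its support embeddings, and query images are scored against those summaries. This is a minimal illustration only: the names FourLayerEncoder and classify_episode are hypothetical, and the diagonal-Gaussian class summary merely stands in for, and does not implement, the paper's fiber-bundle section selection and variational-Bayes inference.

```python
# Sketch of an episodic few-shot classification pipeline (not the authors' FBBML):
# a four-layer conv encoder embeds the support set, per-class Gaussian statistics
# summarise each class, and queries are scored by log-likelihood under each class.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(),
        nn.MaxPool2d(2),
    )

class FourLayerEncoder(nn.Module):
    """Standard four-layer convolutional backbone (Conv-BN-ReLU-Pool x 4)."""
    def __init__(self, in_ch=3, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(in_ch, hidden),
            conv_block(hidden, hidden),
            conv_block(hidden, hidden),
            conv_block(hidden, hidden),
        )

    def forward(self, x):
        return self.net(x).flatten(1)          # (batch, feature_dim)

def classify_episode(encoder, support_x, support_y, query_x, n_way):
    """Embed support and query images, summarise each class by the mean and
    variance of its support embeddings, and score queries by a diagonal-Gaussian
    log-likelihood (a stand-in for the Bayesian task-adaptation step)."""
    z_s = encoder(support_x)                   # (n_way * k_shot, d)
    z_q = encoder(query_x)                     # (n_query, d)
    scores = []
    for c in range(n_way):
        z_c = z_s[support_y == c]              # support embeddings of class c
        mu = z_c.mean(0)
        var = z_c.var(0, unbiased=False) + 1e-4
        # log-likelihood of each query under class c (up to a constant)
        ll = -0.5 * (((z_q - mu) ** 2) / var + var.log()).sum(dim=1)
        scores.append(ll)
    return torch.stack(scores, dim=1)          # (n_query, n_way)

# Usage on a hypothetical 5-way 5-shot episode with 84x84 mini-ImageNet crops
if __name__ == "__main__":
    enc = FourLayerEncoder()
    support_x = torch.randn(25, 3, 84, 84)
    support_y = torch.arange(5).repeat_interleave(5)
    query_x = torch.randn(15, 3, 84, 84)
    preds = classify_episode(enc, support_x, support_y, query_x, n_way=5).argmax(dim=1)
    print(preds.shape)                         # torch.Size([15])
```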

Key words: Bayesian, Classification, Convolutional neural network, Fiber bundle, Manifold, Meta-learning

CLC Number: TP301.6