Computer Science ›› 2023, Vol. 50 ›› Issue (3): 254-265. doi: 10.11896/jsjkx.220600007

• Artificial Intelligence •


Survey on Evolutionary Recurrent Neural Networks

HU Zhongyuan, XUE Yu, ZHA Jiajie   

  1. School of Software, Nanjing University of Information Science and Technology, Nanjing 210044, China
  • Received:2022-05-31 Revised:2022-09-27 Online:2023-03-15 Published:2023-03-15
  • Corresponding author: XUE Yu (xueyu@nuist.edu.cn)
  • About author: HU Zhongyuan, born in 1999, postgraduate (20211220009@nuist.edu.cn). His main research interests include evolutionary computation and deep learning.
    XUE Yu, born in 1981, Ph.D., professor. His main research interests include deep learning, evolutionary computation, machine learning and computer vision.
  • Supported by:
    National Natural Science Foundation of China (61876089), Opening Project of Jiangsu Key Laboratory of Data Science and Smart Software (2019DS302), Natural Science Foundation of Jiangsu Province (BK20141005), Natural Science Foundation of the Jiangsu Higher Education Institutions of China (14KJB520025) and Postgraduate Research & Practice Innovation Program of Jiangsu Province (KYCX22_1206).


Abstract: Evolutionary computation exploits the natural selection mechanisms and genetic laws of biological evolution to solve optimization problems. The accuracy and efficiency of recurrent neural networks depend on how well their parameters and structures are optimized, so using evolutionary computation for the adaptive optimization of parameters and structures in recurrent neural networks has become a research hotspot in automated deep learning. This paper presents a detailed survey of algorithms that combine evolutionary computation with recurrent neural networks. Firstly, it briefly reviews the traditional categories, common algorithms and advantages of evolutionary computation, introduces the structures and characteristics of recurrent neural network models, and analyzes the factors that influence recurrent neural network performance. Secondly, it analyzes the algorithmic framework of evolutionary recurrent neural networks and reviews current research progress from the perspectives of weight optimization, hyperparameter optimization and structure optimization. Then, other related work on evolutionary recurrent neural networks is discussed. Finally, the challenges and development trends of evolutionary recurrent neural networks are pointed out.
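
To make the surveyed framework concrete, below is a minimal, self-contained sketch of evolutionary weight optimization for a recurrent network: a (mu + lambda) evolution strategy evolves the flat weight vector of a small Elman RNN on a toy delayed-echo task. The network sizes, the task, and all search settings (population size, mutation strength, number of generations) are illustrative assumptions and are not taken from any specific work covered by this survey.

```python
# A minimal sketch (assumptions: toy delayed-echo task, tiny Elman RNN,
# (mu + lambda) evolution strategy); illustrative only, not the method of
# any particular paper covered by this survey.
import numpy as np

rng = np.random.default_rng(0)
IN, HID, OUT, T = 1, 8, 1, 20                       # layer sizes and sequence length
N_W = HID * IN + HID * HID + HID + OUT * HID + OUT  # length of the flat weight genome


def unpack(w):
    """Slice a flat weight vector into the Elman RNN's parameter matrices."""
    i = 0
    W_xh = w[i:i + HID * IN].reshape(HID, IN); i += HID * IN
    W_hh = w[i:i + HID * HID].reshape(HID, HID); i += HID * HID
    b_h = w[i:i + HID]; i += HID
    W_hy = w[i:i + OUT * HID].reshape(OUT, HID); i += OUT * HID
    b_y = w[i:i + OUT]
    return W_xh, W_hh, b_h, W_hy, b_y


def rnn_forward(w, xs):
    """Run the Elman RNN over a sequence xs of shape (T, IN); return (T, OUT) outputs."""
    W_xh, W_hh, b_h, W_hy, b_y = unpack(w)
    h = np.zeros(HID)
    ys = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)      # recurrent state update
        ys.append(W_hy @ h + b_y)
    return np.array(ys)


def fitness(w, xs, ts):
    """Negative mean squared error, so that larger is better."""
    return -float(np.mean((rnn_forward(w, xs) - ts) ** 2))


# Toy task: reproduce the input signal delayed by two time steps.
xs = rng.uniform(-1.0, 1.0, size=(T, IN))
ts = np.vstack([np.zeros((2, OUT)), xs[:-2]])

# (mu + lambda) evolution strategy over the flat weight vector.
MU, LAM, GENS, SIGMA = 10, 40, 200, 0.1
parents = [rng.normal(0.0, 0.5, N_W) for _ in range(MU)]
for g in range(GENS):
    idx = rng.integers(0, MU, size=LAM)             # choose parents uniformly at random
    offspring = [parents[i] + SIGMA * rng.normal(size=N_W) for i in idx]  # Gaussian mutation
    pool = parents + offspring
    pool.sort(key=lambda w: fitness(w, xs, ts), reverse=True)
    parents = pool[:MU]                             # truncation (elitist) selection
    if g % 50 == 0:
        print(f"generation {g:3d}  best fitness {fitness(parents[0], xs, ts):.4f}")

print("final MSE:", -fitness(parents[0], xs, ts))
```

The same loop carries over to hyperparameter or structure optimization by changing what the genome encodes (for example, hidden size and learning rate, or a cell wiring graph) and letting the fitness function train and validate the decoded network on held-out data.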

Key words: Recurrent neural network, Evolutionary computation, Weight optimization, Hyperparameter optimization, Structure optimization, Ensemble learning, Transfer learning

CLC number: TP183