Computer Science ›› 2019, Vol. 46 ›› Issue (7): 206-210. doi: 10.11896/j.issn.1002-137X.2019.07.031

• Artificial Intelligence •

Bio-inspired Activation Function with Strong Anti-noise Ability

MAI Ying-chao, CHEN Yun-hua, ZHANG Ling

  (School of Computers, Guangdong University of Technology, Guangzhou 510006, China)
  • Received: 2018-05-22  Online: 2019-07-15  Published: 2019-07-15

Abstract: Although artificial neural networks now rival the human brain in image recognition, activation functions such as ReLU and Softplus are only highly simplified approximations of the output response characteristics of biological neurons, and a large gap remains between artificial neural networks and the brain in noise resistance, processing of uncertain information, and power consumption. Based on simulation experiments on biological neurons and their response characteristics, this paper constructs Rand Softplus, a biologically plausible activation function with strong noise resistance, by defining and computing a parameter η that reflects the randomness of each neuron. The activation function is then applied to a deep residual network and evaluated on facial expression datasets. The results show that the proposed activation function achieves recognition accuracy nearly equal to that of mainstream activation functions when the input contains no noise or only a small amount of noise, and exhibits good anti-noise performance when the input contains a large amount of noise.
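The abstract does not give the closed form of Rand Softplus, only that a per-neuron randomness parameter η controls a biologically motivated response. A minimal sketch of the idea, assuming the activation blends ReLU and Softplus weighted by η (the blend form and the way `eta` is supplied here are illustrative assumptions, not the paper's definition):

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x)).
    return np.logaddexp(0.0, x)

def rand_softplus(x, eta):
    """Illustrative Rand Softplus-style activation.

    eta in [0, 1] reflects each neuron's response randomness:
    eta -> 0 gives a deterministic ReLU-like response,
    eta -> 1 gives the smoother, noise-tolerant Softplus response.
    (Hypothetical form; the paper derives eta from LIF simulations.)
    """
    eta = np.clip(eta, 0.0, 1.0)
    return (1.0 - eta) * np.maximum(x, 0.0) + eta * softplus(x)

# Per-neuron eta lets each unit interpolate independently.
x = np.linspace(-3.0, 3.0, 7)
eta = np.full_like(x, 0.5)
y = rand_softplus(x, eta)
```

Because the blend is element-wise, such a function drops into any framework as a pointwise nonlinearity, with η fixed per neuron rather than learned.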

Key words: Activation function, Anti-noise, Leaky integrate-and-fire model, Neural networks
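The keywords name the leaky integrate-and-fire (LIF) model as the biological basis of the neuron simulations. A self-contained Euler-integration sketch of a standard LIF neuron (all parameter values and the constant-current input are illustrative, not the paper's simulation setup):

```python
import numpy as np

def simulate_lif(current, dt=1e-4, tau=0.02, r_m=1e7,
                 v_rest=-0.065, v_thresh=-0.050, v_reset=-0.065):
    """Euler integration of a leaky integrate-and-fire neuron.

    Dynamics: tau * dV/dt = -(V - v_rest) + r_m * I(t),
    with a spike and reset to v_reset whenever V crosses v_thresh.
    Returns the number of spikes fired over the input trace.
    """
    v = v_rest
    spikes = 0
    for i_t in current:
        v += (dt / tau) * (-(v - v_rest) + r_m * i_t)
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

# A constant 2 nA input for 1 s (10000 steps of 0.1 ms)
# drives the membrane 20 mV above rest, so the neuron fires regularly.
n_spikes = simulate_lif(np.full(10000, 2e-9))
```

Mapping input current to firing rate with such a simulation is the usual route from spiking models to rate-based activation functions like Softplus-family curves.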

CLC Number: TP183