Computer Science ›› 2020, Vol. 47 ›› Issue (11A): 437-443.doi: 10.11896/jsjkx.200300091

• Big Data & Data Science •

Tax Prediction Based on LSTM Recurrent Neural Network

WEN Hao, CHEN Hao   

  1. School of Computer Science & Information Engineering,Hubei University,Wuhan 430062,China
  • Online:2020-11-15 Published:2020-11-17
  • About author:WEN Hao,born in 1994,postgraduate.His main research interests include machine learning and tax informatization.
    CHEN Hao,born in 1977,Ph.D,professor.His main research interests include uncertain artificial intelligence.
  • Supported by:
    This work was supported by the General Program of National Natural Science Foundation of China (61977021) and Guizhou Tax Big Data Consolidation Platform Project (182001022).

Abstract: Analyzing the hidden relationships in historical tax data and using mathematical models to predict future tax revenue is the focus of tax forecasting research.This paper proposes a tax prediction model that combines a long short-term memory (LSTM) recurrent neural network with the wavelet transform.The wavelet transform is applied during data preprocessing to remove noise from the tax data and improve the generalization ability of the model.Through its hidden units and gated units,the LSTM network learns the correlations among historical tax data,extracts effective state information from the input sequences,and overcomes the long-term dependency problem of recurrent neural networks.Experimental results show that an encoder-decoder structure built on the LSTM network extends tax prediction over multiple time steps.In long-term tax prediction,the proposed model significantly improves prediction accuracy compared with the single-step sliding-window LSTM model,the grey model based on difference differential equations,and the autoregressive integrated moving average (ARIMA) model.

Key words: Encoder-decoder, Long short-term memory network, Tax forecasting, Wavelet transform
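
The paper does not include source code; the sketch below is a minimal, hypothetical rendering of the pipeline described in the abstract. It assumes PyWavelets (pywt) for the wavelet-denoising step and PyTorch for the LSTM encoder-decoder; the wavelet basis, threshold rule, network sizes, and placeholder data are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch (not the authors' code): wavelet denoising of the tax
# series followed by an LSTM encoder-decoder that decodes several future steps.
# Library choices (pywt, torch) and all hyperparameters are assumptions.
import numpy as np
import pywt
import torch
import torch.nn as nn

def wavelet_denoise(series, wavelet="db4", level=2):
    # Soft-threshold the detail coefficients, then reconstruct the series.
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # MAD noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(series)))      # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(series)]

class Seq2SeqLSTM(nn.Module):
    # Encoder compresses the history window; the decoder is unrolled over the
    # forecast horizon, feeding each prediction back in as the next input.
    def __init__(self, hidden=64):
        super().__init__()
        self.encoder = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.decoder = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, history, horizon):
        # history: (batch, T_in, 1) -> returns (batch, horizon, 1)
        _, state = self.encoder(history)
        step = history[:, -1:, :]              # seed decoder with the last observation
        outputs = []
        for _ in range(horizon):
            out, state = self.decoder(step, state)
            step = self.head(out)              # next input is the current prediction
            outputs.append(step)
        return torch.cat(outputs, dim=1)

# Usage with placeholder data: denoise a 120-period series, then predict
# 4 future periods from the last 12 observed ones.
tax = wavelet_denoise(np.random.rand(120))
x = torch.tensor(tax[-12:], dtype=torch.float32).view(1, 12, 1)
pred = Seq2SeqLSTM()(x, horizon=4)             # shape (1, 4, 1)

In a complete experiment, the encoder-decoder would be trained on sliding windows of the denoised tax series and compared against the single-step LSTM, grey-model, and ARIMA baselines discussed in the paper.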

CLC Number: TP183
[1] WANG Y,LI Y,WANG L L,et al.Software stage effort prediction based on analogy and grey model [J].Computer Science,2018,45(S2):480-487.
[2] ZHAO Z,WANG J Z,ZHAO J,et al.Using a grey model optimized by differential evolution algorithm to forecast the per capita annual net income of rural households in China[J].Omega-International Journal of Management Science,2012,40(5):525-532.
[3] XIANG C S,ZHANG L F.Grain yield prediction model based on grey theory and Markov[J].Computer Science,2013,40(2):245-248.
[4] MALDONADO-MOLINA M M,WAGENAAR A C.Effects of Alcohol Taxes on Alcohol-Related Mortality in Florida:Time-Series Analyses From 1969 to 2004[J].Alcoholism-Clinical And Experimental Research,2010,34 (11):1915-1921.
[5] LIN J L.Application of time series model in customs revenue forecasting[J].Statistics and Consulting,2008(5):46-47.
[6] HAVIV D,RIVKIND A,BARAK O.Understanding and Controlling Memory in Recurrent Neural Networks[C]//Thirty-sixth International Conference on Machine Learning.2019:2663-2671.
[7] WANG Y Y,SMOLA A,MADDIX D C,et al.Deep Factors for Forecasting[C]//Thirty-sixth International Conference on Machine Learning.2019.
[8] CHARLES A,YIN D,ROZELL C.Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks[J].Journal of Machine Learning Research,2017,18(7):1-37.
[9] GERS F A,ECK D,SCHMIDHUBER J.Applying LSTM to time series predictable through time-window approach[C]//Proceedings of the 2001 International Conference on Artificial Neural Networks.London:Springer-Verlag,2001:669-676.
[10] SUTSKEVER I,VINYALS O,LE Q V.Sequence to sequence learning with neural networks[C]//Advances in Neural Information Processing Systems 27 (NIPS 2014).2014:3104-3112.
[11] SONG K T,TAN X,QIN T,et al.MASS:Masked Sequence to Sequence Pre-training for Language Generation[C]//Thirty-sixth International Conference on Machine Learning.2019.
[12] TAY Y,PHAN M C,TUAN L A,et al.Learning to Rank Question Answer Pairs with Holographic Dual LSTM Architecture[C]//The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval.ACM,2018:325-324.
[13] GAO T W,CHAI Y T,LIU Y.Applying long short term memory neural networks for predicting stock closing price[C]//2017 8th IEEE International Conference on Software Engineering and Service Science.IEEE,2017:575-578.
[14] HE Z,GAO S B,XIAO L,et al.Wider and Deeper,Cheaper and Faster:Tensorized LSTMs for Sequence Learning [C]//Advances in Neural Information Processing Systems 30 (NIPS 2017).Curran Associates Inc,2017:1-11.
[15] JIA L,ZHENG C J.Short-term Forecasting Model of Agricultural Product Price Index Based on LSTM-DA Neural Network[J].Computer Science,2019,46(S2):62-65,71.
[16] ESSIEN A,GIANNETTI C.A Deep Learning Framework for Univariate Time Series Prediction Using Convolutional LSTM Stacked Autoencoders[C]//2019 IEEE International Symposium on Innovations in Intelligent Systems and Applications.IEEE,2019:1-6.
[17] HOCHREITER S,SCHMIDHUBER J.Long short-term memory[J].Neural Computation,1997,9(8):1735-1780.
[18] CHO K,BAHDANAU D,BOUGARES F,et al.Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP).2014:1724-1734.
[19] POPOOLA A,AHMAD K.Testing the suitability of wavelet preprocessing for TSK fuzzy models[C]//2006 IEEE International Conference on Fuzzy Systems.IEEE,2006:1305-1309.
[20] ZHAO A,ZHANG D,SHI J Q.Forecasting and Analysis of EUR/USD Exchange Rate Moving Direction with Support Vector Machine[C]//2018 IEEE 8th Annual International Conference on CYBER Technology in Automation,Control,and Intelligent Systems.IEEE,2018:1484-1489.
[21] ZHANG S X,CONG X R.The Application of Wavelet Analysis in Financial Multiple Change Points Time Series [C]//2018 5th International Conference on Industrial Economics System and Industrial Security Engineering.IEEE,2018:1-6.
[22] SIRCAR R.An introduction to wavelets and other filtering methods in finance and economics[M].Utah:Academic Press,2002:359.
[23] MALLAT S G.A Theory for Multiresolution Signal Decomposition:The Wavelet Representation[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,1989,11(7):674-693.
[24] KINGMA D P,BA J.Adam:A Method for Stochastic Optimization [C]//Proceedings of the 3rd International Conference on Learning Representations.2015.
[25] SRIVASTAVA N,HINTON G E,KRIZHEVSKY A,et al.Dropout:a simple way to prevent neural networks from overfitting[J].Journal of Machine Learning Research,2014,15(6):1929-1958.