Computer Science ›› 2024, Vol. 51 ›› Issue (11A): 231200008-7. doi: 10.11896/jsjkx.231200008

• Big Data & Data Science •

Time Series Prediction with Hybrid Neural Networks Based on Seasonal Decomposition

XU Junwen, CHEN Zonglei, LI Tianrui, LI Chongshou   

  1. School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu 611756, China
  • Online: 2024-11-16  Published: 2024-11-13
  • About author: XU Junwen, born in 1999, postgraduate. Her main research interests include seasonal decomposition and time series prediction.
    LI Chongshou, born in 1988, Ph.D, associate professor, is a member of CCF (No.J8308M). His main research interests include intelligent transportation, data analysis and AI.
  • Supported by:
    National Natural Science Foundation of China (62202395, 62176221), Natural Science Foundation of Sichuan Province, China (2022NSFSC0930) and Fundamental Research Funds for the Central Universities of Ministry of Education of China (2682022CX067).

Abstract: In recent years, time series forecasting has found widespread application in domains such as finance, meteorology, and the military, and deep learning has begun to demonstrate significant potential in time series forecasting tasks. However, recurrent neural networks often suffer from information loss and exploding gradients when making predictions over long horizons, while Transformer models and their variants, which rely on attention mechanisms, tend to overlook the temporal relationships among variables in time series data. To address these challenges, this paper proposes a hybrid neural network time series forecasting model based on seasonal decomposition. The model employs a seasonal decomposition module to capture the variations of components at different periodic frequencies within the time series. At the same time, by integrating multi-head self-attention mechanisms with composite dilated convolution layers, it exploits the interaction between global and local information to obtain multi-scale temporal positional information from the data. Finally, experiments on publicly available datasets from four different domains show that the predictive performance of the proposed model surpasses that of current mainstream methods.

Key words: Time series forecasting, Seasonal decomposition, Self-attention mechanism, Dilated convolution, Hybrid model

CLC Number: TP181
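
The abstract describes three ingredients: a seasonal decomposition module, multi-head self-attention for global dependencies, and composite dilated convolutions for local multi-scale context. The sketch below is a minimal illustration of how such a hybrid block could be composed in PyTorch. It is not the authors' implementation; the module names (SeriesDecomp, DilatedConvStack, HybridBlock), the simple moving-average decomposition, and all hyperparameters are assumptions chosen for clarity.

# Illustrative sketch only -- not the authors' released code. Module names and
# hyperparameters are assumptions; a moving-average split stands in for the
# paper's seasonal decomposition module.
import torch
import torch.nn as nn


class SeriesDecomp(nn.Module):
    """Split a series into seasonal and trend parts with a moving average."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1,
                                padding=kernel_size // 2,
                                count_include_pad=False)

    def forward(self, x):                     # x: (batch, length, channels)
        trend = self.avg(x.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend


class DilatedConvStack(nn.Module):
    """Stacked 1-D convolutions with growing dilation for multi-scale local context."""
    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(channels, channels, kernel_size=3, dilation=d, padding=d)
            for d in dilations
        )
        self.act = nn.GELU()

    def forward(self, x):                     # x: (batch, length, channels)
        y = x.transpose(1, 2)
        for conv in self.convs:
            y = self.act(conv(y)) + y         # residual connection per layer
        return y.transpose(1, 2)


class HybridBlock(nn.Module):
    """Seasonal decomposition + self-attention (global) + dilated conv (local)."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, kernel_size: int = 25):
        super().__init__()
        self.decomp = SeriesDecomp(kernel_size)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.local = DilatedConvStack(d_model)
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                     # x: (batch, length, d_model)
        seasonal, trend = self.decomp(x)
        global_feat, _ = self.attn(seasonal, seasonal, seasonal)
        local_feat = self.local(seasonal)
        fused = self.norm(global_feat + local_feat + trend)
        return self.head(fused)               # one predicted value per time step


if __name__ == "__main__":
    batch = torch.randn(8, 96, 64)            # (batch, look-back length, features)
    print(HybridBlock()(batch).shape)         # torch.Size([8, 96, 1])

Running the demo prints a tensor of shape (8, 96, 1), i.e. one predicted value per input position. The residual connections in the dilated stack and the additive fusion of global, local and trend signals are design choices made for this sketch, not details reported in the abstract.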