Computer Science ›› 2023, Vol. 50 ›› Issue (11A): 221100186-6.doi: 10.11896/jsjkx.221100186

• Big Data & Data Science •

Prediction Method of Long Series Time Series Based on Improved Informer Model with Kernel Technique

PAN Liqun1, WU Zhonghua1, HONG Biao2   

  1 School of Management, Shanghai University, Shanghai 200444, China
  2 School of International Business and Economics, Shanghai University of International Business and Economics, Shanghai 201620, China
  • Published: 2023-11-09
  • About author: PAN Liqun, born in 2000, postgraduate. His main research interests include deep learning and demand forecasting.

Abstract: The prediction of long sequence time series currently relies mainly on RNN-like models, and the loss function used is most often the mean square error (MSE). However, the MSE loss cannot capture the nonlinearity commonly present in long time-series data, and it is sensitive to outliers and therefore lacks robustness. This paper therefore proposes replacing the traditional MSE loss in the Informer model with an improved kernel MSE loss based on the kernel technique, which handles the nonlinearity in the data by mapping the error from the original feature space into a higher-dimensional space. In addition, the first and second derivatives of the new loss function ensure robustness to outliers. Under multivariate prediction settings, the prediction accuracy of the improved model is compared with that of the classical Informer, LSTM, and GRU models on eight data sets drawn from three types of data. The results show that the improved Informer model achieves higher prediction accuracy, that the relative improvement in accuracy grows with the size of the original data, and that the model is well suited to long sequence time series forecasting.
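The abstract does not spell out the exact form of the kernel MSE loss, so the sketch below only illustrates one plausible kernel-induced variant of the kind described: a Gaussian-kernel (correntropy-style) loss in PyTorch whose value saturates for large errors, which is what bounds the influence of outliers. The class name KernelMSELoss and the bandwidth parameter sigma are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a kernel-induced MSE loss, assuming a Gaussian kernel.
# Names (KernelMSELoss, sigma) are illustrative, not from the paper.
import torch
import torch.nn as nn

class KernelMSELoss(nn.Module):
    """Kernel-induced squared-error loss.

    With a Gaussian kernel k(e) = exp(-e^2 / (2 * sigma^2)), the loss
    1 - k(y_hat - y) behaves like a scaled squared error for small errors
    but saturates for large ones, so outliers contribute bounded gradients.
    """
    def __init__(self, sigma: float = 1.0):
        super().__init__()
        self.sigma = sigma

    def forward(self, y_hat: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        err_sq = (y_hat - y) ** 2
        kernel = torch.exp(-err_sq / (2.0 * self.sigma ** 2))
        return torch.mean(1.0 - kernel)

# Usage: swap the MSE criterion in an Informer-style training loop, e.g.
#   criterion = KernelMSELoss(sigma=1.0)
#   loss = criterion(model(batch_x), batch_y)
```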

Key words: Informer, Loss function, Kernel trick, Long series time series prediction

CLC Number: TP181