计算机科学 (Computer Science) ›› 2025, Vol. 52 ›› Issue (2): 67-79. doi: 10.11896/jsjkx.240100167
WANG Huiqiang1, HUANG Feihu1, PENG Jian1, JIANG Yuan2, ZHANG Linghao3
Abstract: Multivariate time series forecasting has broad applications, such as electricity-load forecasting and weather forecasting. Although recent models have achieved fairly good results, they still face the following challenges: 1) how to fully account for the correlations among the different variables of a multivariate series so as to make more accurate predictions; 2) modeling the correlations among different variables usually incurs a huge cost in time and memory. Existing methods fall mainly into two categories: variable-independent and variable-mixing. Variable-independent methods predict each variable from its own information alone, ignoring the correlations among different variables; variable-mixing methods embed all variables jointly into a single high-dimensional hidden space without modeling inter-variable correlations explicitly, and therefore cannot capture them fully. To address these challenges, this paper proposes FIID, a multivariate time series forecasting method based on temporal dependency and variable interaction, which models the correlations among different variables thoroughly while greatly reducing time and memory costs. First, based on the observation that correlations among different variables are usually sparse, variable folding is proposed, which sharply reduces the time and memory cost of the subsequent inter-variable correlation modeling. Then, a temporal dependency module is proposed, which applies linear transformations from a frequency perspective to capture the global correlations within each variable. Furthermore, the correlation among different variables is defined as the correlation among the different time segments of all variables; on this basis, a variable interaction module is proposed, which first aggregates each variable's local information and then models the global correlations among all the resulting local features. These two modules not only model inter-variable correlations thoroughly but also cost far less time and memory than existing methods. Comparative experiments on 12 real-world datasets show that the proposed model achieves the best performance with higher efficiency.
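The two modules described above can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction from the abstract alone, not the authors' FIID implementation: `frequency_domain_linear` stands in for the temporal dependency module (a linear transform applied to the frequency spectrum of each variable), and `local_then_global` stands in for the variable interaction module (aggregate local segments per variable, then relate all local features globally). All function names, shapes, and the patch-mean aggregation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def frequency_domain_linear(x, weight):
    """Temporal dependency sketch: linear map in the frequency domain.

    x      : (num_vars, seq_len) real-valued series
    weight : (num_freqs, num_freqs) complex matrix, num_freqs = seq_len // 2 + 1
    """
    spec = np.fft.rfft(x, axis=-1)            # per-variable spectrum: (num_vars, num_freqs)
    spec = spec @ weight                      # mix frequency components linearly
    return np.fft.irfft(spec, n=x.shape[-1], axis=-1)  # back to the time domain

def local_then_global(x, patch_len):
    """Variable interaction sketch: aggregate local segments of each
    variable, then model global pairwise relations among all local features."""
    v, t = x.shape
    patches = x[:, : t - t % patch_len].reshape(v, -1, patch_len)
    local = patches.mean(axis=-1)             # (v, num_patches) local summaries
    feats = local.reshape(-1, 1)              # flatten every (variable, segment) feature
    sim = feats @ feats.T                     # global interactions across all segments
    return local, sim

x = rng.standard_normal((3, 16))              # 3 variables, 16 time steps
w = rng.standard_normal((9, 9)) + 1j * rng.standard_normal((9, 9))
y = frequency_domain_linear(x, w)             # same shape as x
local, sim = local_then_global(x, patch_len=4)
```

Because the interaction matrix is built over variable-segment pairs rather than over all time steps of all variables, its size grows with the number of segments, which is consistent with the abstract's claim of reduced time and memory cost.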