Computer Science (计算机科学) ›› 2025, Vol. 52 ›› Issue (2): 67-79. doi: 10.11896/jsjkx.240100167

• Database & Big Data & Data Science •


Multivariate Time Series Forecasting Based on Temporal Dependency and Variable Interaction

WANG Huiqiang1, HUANG Feihu1, PENG Jian1, JIANG Yuan2, ZHANG Linghao3   

  1. College of Computer Science, Sichuan University, Chengdu 610065, China
    2. PLA 78135 Troop, Chengdu 610031, China
    3. State Grid Sichuan Electric Power Research Institute, Chengdu 610072, China
  • Received: 2024-01-23  Revised: 2024-06-05  Online: 2025-02-15  Published: 2025-02-17
  • Corresponding author: PENG Jian (jianpeng@scu.edu.cn)
  • About author: WANG Huiqiang (wanghuiqiang@stu.scu.edu.cn), born in 2000, postgraduate, is a student member of CCF (No. J8853G). His main research interests include time series analysis and deep learning.
    PENG Jian, born in 1970, Ph.D., professor, Ph.D. supervisor, is an outstanding member of CCF (No. 22761D). His main research interests include big data and wireless sensor networks.
  • Supported by:
    Key Research and Development Program of Sichuan Province, China (2023YFG0112, 2022YFG0034), Intelligent Terminal Key Laboratory of Sichuan Province (SCITLAB-20001) and Post-doctoral Interdisciplinary Innovation Fund of Sichuan University (10822041A2137).


Abstract: Multivariate time series forecasting has a wide range of applications, such as power forecasting and weather forecasting. Although recent models have achieved relatively good results, they still face two challenges: 1) fully exploiting the correlations between the different variables of a multivariate series to make more accurate predictions; 2) modeling these inter-variable correlations usually incurs a huge time and space cost. Existing methods fall mainly into variable-independent and variable-mixed approaches. Variable-independent methods predict each variable from its own information only and ignore the correlations between variables; variable-mixed methods embed all variables into a single high-dimensional hidden space without modeling the inter-variable correlations in a targeted way, and therefore cannot capture them adequately. To address these challenges, this paper proposes FIID, a multivariate time series forecasting method based on temporal dependency and variable interaction, which fully models the correlations among variables while greatly reducing the time and space costs. First, based on the observation that correlations between different variables are usually sparse, the paper proposes variable fold, which greatly reduces the time and space cost of the subsequent inter-variable correlation modeling. Then, a temporal dependency module captures the global correlations within each variable through a linear transformation from the frequency perspective. Further, the correlation between different variables is defined as the correlation between different time segments of all variables; on this basis, a variable interaction module first aggregates the local information of each variable and then models the global correlations among all the local features. Together, these two modules model the inter-variable correlations adequately while greatly reducing the time and space costs compared with existing methods. FIID is evaluated on twelve real-world datasets, and the results show that it achieves the best performance with higher efficiency.
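The pipeline described in the abstract can be pictured more concretely with a short sketch. The following PyTorch-style code is a minimal illustration of the three ideas, not the authors' implementation: the folding rule (concatenating groups of variables along the time axis), the module and parameter names (FIIDSketch, TemporalDependency, VariableInteraction, fold, patch_len, d_model), the per-frequency complex filter used as the frequency-domain linear transform, and the mean-pooled global-context mixing used as a linear-cost stand-in for the interaction among local features are all assumptions made for illustration.

# Minimal PyTorch sketch (illustration only, not the authors' code) of the three
# ideas in the abstract: variable fold, a frequency-domain temporal-dependency
# module, and a variable-interaction module over local (per-segment) features.

import torch
import torch.nn as nn


class TemporalDependency(nn.Module):
    # Global intra-variable dependency via a learned per-frequency complex filter,
    # i.e. a linear transform applied in the frequency domain (assumed form).
    def __init__(self, seq_len: int):
        super().__init__()
        n_freq = seq_len // 2 + 1                     # length of the rFFT spectrum
        self.filt = nn.Parameter(torch.randn(n_freq, dtype=torch.cfloat) * 0.02)

    def forward(self, x):                             # x: [B, C, L]
        spec = torch.fft.rfft(x, dim=-1)              # [B, C, n_freq], complex
        spec = spec * self.filt                       # per-frequency linear transform
        return torch.fft.irfft(spec, n=x.size(-1), dim=-1)


class VariableInteraction(nn.Module):
    # Aggregate each variable into local segment features, then mix a pooled
    # global context back into every local feature (a linear-cost stand-in for
    # modeling correlations among all local features of all variables).
    def __init__(self, seq_len: int, patch_len: int, d_model: int):
        super().__init__()
        assert seq_len % patch_len == 0
        self.patch_len = patch_len
        self.local_agg = nn.Linear(patch_len, d_model)       # local aggregation
        self.mix = nn.Sequential(nn.Linear(d_model, d_model), nn.GELU(),
                                 nn.Linear(d_model, d_model))
        self.proj = nn.Linear(d_model, patch_len)

    def forward(self, x):                             # x: [B, C, L]
        B, C, L = x.shape
        seg = x.reshape(B, C, L // self.patch_len, self.patch_len)
        h = self.local_agg(seg)                       # [B, C, n_seg, d_model]
        h = h.reshape(B, C * (L // self.patch_len), -1)
        h = h + self.mix(h.mean(dim=1, keepdim=True)) # broadcast global context
        h = h.reshape(B, C, L // self.patch_len, -1)
        return self.proj(h).reshape(B, C, L)


class FIIDSketch(nn.Module):
    # Assumed fold rule: concatenate groups of `fold` variables along time, so
    # the two modules see fewer (but longer) series.
    def __init__(self, n_vars, seq_len, pred_len, fold=4, patch_len=16):
        super().__init__()
        assert n_vars % fold == 0, "pad the variable dimension first (assumption)"
        self.fold = fold
        self.temporal = TemporalDependency(seq_len * fold)
        self.interact = VariableInteraction(seq_len * fold, patch_len, d_model=64)
        self.head = nn.Linear(seq_len, pred_len)      # per-variable forecasting head

    def forward(self, x):                             # x: [B, L, N]
        B, L, N = x.shape
        x = x.permute(0, 2, 1).reshape(B, N // self.fold, self.fold * L)
        x = x + self.temporal(x)                      # intra-variable, frequency domain
        x = x + self.interact(x)                      # inter-variable, via local features
        x = x.reshape(B, N, L)
        return self.head(x).permute(0, 2, 1)          # [B, pred_len, N]


if __name__ == "__main__":
    model = FIIDSketch(n_vars=8, seq_len=96, pred_len=24)
    print(model(torch.randn(2, 96, 8)).shape)         # torch.Size([2, 24, 8])

Under these assumptions, the temporal module amounts to a global circular convolution computed through the FFT, and the interaction step makes one pass over the local features plus a pooled context, so neither grows quadratically with the sequence length or the number of variables; the actual FIID modules may realize the same two ideas differently.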

Key words: Variable interaction, Temporal dependency, Linear complexity, Variable fold

CLC Number: TP183