Computer Science ›› 2025, Vol. 52 ›› Issue (2): 67-79. doi: 10.11896/jsjkx.240100167
• Database & Big Data & Data Science •
WANG Huiqiang1, HUANG Feihu1, PENG Jian1, JIANG Yuan2, ZHANG Linghao3