Computer Science ›› 2024, Vol. 51 ›› Issue (5): 62-69.doi: 10.11896/jsjkx.230300001

• Database & Big Data & Data Science •

Substation Equipment Malfunction Alarm Algorithm Based on Dual-domain Sparse Transformer

ZHANG Jianliang, LI Yang, ZHU Qingshan, XUE Hongling, MA Junwei, ZHANG Lixia, BI Sheng   

  1. Information and Communication Branch of State Grid Shanxi Electric Power, Taiyuan 030021, China
  • Received: 2023-03-01  Revised: 2023-09-16  Online: 2024-05-15  Published: 2024-05-08
  • About author: ZHANG Jianliang, born in 1981, master, senior engineer. His main research interests include electric power information and communication technology.
  • Supported by:
    State Grid Shanxi Electric Power Company Science and Technology Project Funding (52051C220003).

Abstract: Using the time series data generated during the operation of substation electrical equipment, a predictive model can be built for the equipment's future operating state, so that abnormal data are detected in advance, hidden faults are eliminated, and operational stability and reliability are improved. The Transformer is an emerging sequence-processing model with advantages on longer sequences, which suits the forward-looking nature of malfunction alarm. However, its high computational complexity and memory footprint make the standard Transformer difficult to apply directly to malfunction alarm tasks. Therefore, a Transformer-based equipment malfunction alarm method built on time series prediction is proposed, which improves the Transformer to model equipment operation data. First, the model uses a dual-tower encoder to extract sequence features in both the frequency and time domains, and fuses the time-domain and frequency-domain features across dimensions to capture finer-grained information. Second, a sparse attention mechanism replaces the standard attention mechanism, reducing the Transformer's computational complexity and memory usage to meet real-time warning requirements. Experiments on the ETT transformer equipment dataset demonstrate the superiority of the proposed model and the necessity of each improved module. Compared with other methods, the proposed model achieves the best MSE and MAE in most prediction tasks, especially long-sequence prediction tasks, and predicts faster.
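The two ideas in the abstract, extracting features from a series in both the time and frequency domains and then fusing them, and restricting each query to a small number of keys to make attention sparse, can be sketched as follows. This is a minimal NumPy illustration under assumed details (top-k key selection, FFT-magnitude frequency features, concatenation-based fusion), not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def topk_sparse_attention(Q, K, V, k):
    """Sparse attention sketch: each query attends only to its k
    highest-scoring keys instead of all L keys, cutting the cost of
    the weighted sum from O(L) to O(k) per query."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])             # (Lq, Lk) score matrix
    idx = np.argpartition(scores, -k, axis=-1)[:, -k:]  # top-k key indices per query
    out = np.zeros_like(Q)
    for i in range(Q.shape[0]):
        w = softmax(scores[i, idx[i]])                  # renormalize over kept keys
        out[i] = w @ V[idx[i]]
    return out

def frequency_features(x, m):
    """Frequency-domain branch: magnitudes of the m lowest-frequency
    FFT components of each variable."""
    spec = np.fft.rfft(x, axis=0)
    return np.abs(spec[:m])

# Toy multivariate operating series: 64 time steps, 8 measured variables.
rng = np.random.default_rng(0)
x = rng.standard_normal((64, 8))

time_feat = topk_sparse_attention(x, x, x, k=8)  # time-domain branch (self-attention)
freq_feat = frequency_features(x, m=16)          # frequency-domain branch
# Simple fusion of the two views: concatenate along the feature dimension.
fused = np.concatenate([time_feat[:16], freq_feat], axis=-1)
print(fused.shape)  # (16, 16)
```

A real dual-tower encoder would use learned projections for Q, K, V and a trained fusion layer; the sketch only shows how the two domains yield complementary feature views of the same series.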

Key words: Equipment malfunction alarm, Time series forecasting, Deep learning, Transformer

CLC Number: TP391