Computer Science ›› 2024, Vol. 51 ›› Issue (6A): 230800119-7. DOI: 10.11896/jsjkx.230800119

• Interdiscipline & Application •

Forecasting Teleconsultation Demand Based on LSTM and Attention Mechanism

ZHAI Yunkai1,2,3, QIAO Zhengwen1, QIAO Yan1   

  1 School of Management, Zhengzhou University, Zhengzhou 450001, China
    2 National Engineering Laboratory of Internet Medical Systems and Applications, Zhengzhou 450052, China
    3 Henan Province International Joint Laboratory of Intelligent Health Information System, Zhengzhou 450000, China
  • Published: 2024-06-06
  • About author: ZHAI Yunkai, born in 1980, Ph.D, professor, Ph.D supervisor. His main research interests include healthcare big data, telemedicine information systems and management.
    QIAO Yan, born in 1991, Ph.D. His main research interests include healthcare big data and medical informatization.
  • Supported by:
    National Natural Science Foundation of China (72202217, 71972012), Key Research Project Plan for Colleges and Universities of Henan Province (24A630034), and Major Project of Basic Research of Philosophy and Social Science in Colleges and Universities of Henan Province (2022-JCZD-21).

Abstract: To predict teleconsultation demand more accurately and improve the efficiency of teleconsultation resource allocation, this paper introduces multiple linear regression and an attention mechanism to optimize the long short-term memory (LSTM) network. First, holiday indices are generated according to the holiday effect observed in teleconsultation demand, and the indices with high statistical significance are selected as model inputs through multiple regression analysis. Then, an LSTM network learns the complex internal mapping relationships among the input indicators, and the attention mechanism assigns different weights to the indicators. Finally, the prediction is produced from the attention weights and the LSTM hidden states. Based on actual historical teleconsultation data from the National Telemedicine Center, this paper studies the predictive ability of MLR-Attention-LSTM and compares it with ARIMA, SVR, KNN, the BP neural network, and a standard LSTM network. The results show that the improved LSTM model achieves the highest prediction accuracy. Furthermore, this paper explores the impact of the holiday indicators on model performance; the results show that including the holiday indicators further improves the prediction accuracy. These findings verify the feasibility and applicability of MLR-Attention-LSTM with holiday-related input variables for teleconsultation demand forecasting, and provide theoretical support and practical guidance for telemedicine centers.
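The authors' implementation is not included with this abstract, but the sketch below illustrates the general attention-over-LSTM idea it describes, assuming PyTorch, an additive attention layer, and illustrative layer sizes and window length. The holiday indices selected by the multiple-regression step are assumed to enter simply as additional feature columns of the input sequence; none of these choices is taken from the paper itself.

```python
# Minimal sketch of an attention-augmented LSTM forecaster (illustrative only:
# layer sizes, window length, and feature layout are assumptions, not the paper's).
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_features, hidden_size=64):
        super().__init__()
        # LSTM learns temporal dependencies among the input indicators
        # (historical demand plus the selected holiday indices).
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Additive attention scores each hidden state in the sequence.
        self.attn = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.Tanh(),
            nn.Linear(hidden_size, 1),
        )
        self.out = nn.Linear(hidden_size, 1)  # one-step demand forecast

    def forward(self, x):                       # x: (batch, seq_len, n_features)
        h, _ = self.lstm(x)                     # h: (batch, seq_len, hidden)
        scores = self.attn(h)                   # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)  # attention weights over time steps
        context = (weights * h).sum(dim=1)      # weighted sum of hidden states
        return self.out(context)                # predicted demand

# Example: 14-day input window, 5 indicators (e.g. demand lags + holiday indices).
model = AttentionLSTM(n_features=5)
x = torch.randn(32, 14, 5)
print(model(x).shape)  # torch.Size([32, 1])
```

In this sketch the attention weights are computed over the time steps of the LSTM output and the weighted sum is fed to a linear output layer; the paper's exact scoring function and weighting scheme may differ.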

Key words: Long short-term memory, Attention mechanism, Teleconsultation, Demand forecasting, Holiday effect

CLC Number: TP183