CLC number: TP183

On-line Access: 2024-08-27

Received: 2023-10-17

Revision Accepted: 2024-05-08

Crosschecked: 2023-02-13

 ORCID:

Ran TIAN

https://orcid.org/0000-0003-4435-580X

Xinmei LI

https://orcid.org/0000-0002-5595-9230

Zhongyu MA

https://orcid.org/0000-0001-8809-0685

Yanxing LIU

https://orcid.org/0000-0002-0554-3683

Jingxia WANG

https://orcid.org/0000-0003-0696-9982

Chu WANG

https://orcid.org/0000-0002-8687-9911

Frontiers of Information Technology & Electronic Engineering  2023 Vol.24 No.9 P.1287-1301

http://doi.org/10.1631/FITEE.2200540


LDformer: a parallel neural network model for long-term power forecasting


Author(s):  Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG

Affiliation(s):  College of Computer Science & Engineering, Northwest Normal University, Lanzhou 730070, China

Corresponding email(s):   tianran@nwnu.edu.cn, 2020211978@nwnu.edu.cn, mazybg@nwnu.edu.cn, lyanxing@nwnu.edu.cn, 2020222004@nwnu.edu.cn, 2020221992@nwnu.edu.cn

Key Words:  Long-term power forecasting, Long short-term memory (LSTM), UniDrop, Self-attention mechanism


Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG. LDformer: a parallel neural network model for long-term power forecasting[J]. Frontiers of Information Technology & Electronic Engineering, 2023, 24(9): 1287-1301.

@article{title="LDformer: a parallel neural network model for long-term power forecasting",
author="Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="24",
number="9",
pages="1287-1301",
year="2023",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2200540"
}

%0 Journal Article
%T LDformer: a parallel neural network model for long-term power forecasting
%A Ran TIAN
%A Xinmei LI
%A Zhongyu MA
%A Yanxing LIU
%A Jingxia WANG
%A Chu WANG
%J Frontiers of Information Technology & Electronic Engineering
%V 24
%N 9
%P 1287-1301
%@ 2095-9184
%D 2023
%I Zhejiang University Press & Springer
%DOI 10.1631/FITEE.2200540

TY - JOUR
T1 - LDformer: a parallel neural network model for long-term power forecasting
A1 - Ran TIAN
A1 - Xinmei LI
A1 - Zhongyu MA
A1 - Yanxing LIU
A1 - Jingxia WANG
A1 - Chu WANG
JO - Frontiers of Information Technology & Electronic Engineering
VL - 24
IS - 9
SP - 1287
EP - 1301
SN - 2095-9184
Y1 - 2023
PB - Zhejiang University Press & Springer
DOI - 10.1631/FITEE.2200540
ER -


Abstract: 
Accurate long-term power forecasting is important for grid operation decision-making and customer power consumption management, as it ensures a reliable power supply from the power system and the economical, reliable operation of the grid. However, most time-series forecasting models do not perform well on long-time-series prediction tasks involving large amounts of data. To address this challenge, we propose a parallel time-series prediction model called LDformer. First, we combine Informer with long short-term memory (LSTM) to obtain deep representations of the time series. Then, we propose a parallel encoder module to improve the robustness of the model, and combine convolutional layers with the attention mechanism to avoid value redundancy in the attention mechanism. Finally, we propose a probabilistic sparse (ProbSparse) self-attention mechanism combined with UniDrop to reduce the computational overhead and mitigate the risk of losing key connections in the sequence. Experimental results on five datasets show that LDformer outperforms state-of-the-art methods in most cases across different long-time-series prediction tasks.
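The abstract outlines the main architectural ideas: LSTM-based deep representations, a parallel encoder, convolution combined with attention, and ProbSparse self-attention with UniDrop. The PyTorch sketch below is not the authors' implementation; it only illustrates how such a pipeline could be wired up, with standard dense multi-head attention and ordinary dropout as stand-ins for ProbSparse attention and UniDrop. All module names, layer sizes, and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' code): an LSTM + attention + convolution
# encoder with two parallel branches, loosely following the abstract. Dense
# multi-head attention and plain dropout stand in for ProbSparse self-attention
# and UniDrop.
import torch
import torch.nn as nn


class EncoderBlock(nn.Module):
    """One encoder block: LSTM for deep temporal representation, self-attention,
    and a convolutional feed-forward stage with residual connections."""

    def __init__(self, d_model=64, n_heads=4, dropout=0.1):
        super().__init__()
        self.lstm = nn.LSTM(d_model, d_model, batch_first=True)
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          dropout=dropout, batch_first=True)
        self.conv = nn.Sequential(
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ELU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.drop = nn.Dropout(dropout)   # stand-in for UniDrop

    def forward(self, x):                 # x: (batch, seq_len, d_model)
        h, _ = self.lstm(x)               # deep representation of the sequence
        a, _ = self.attn(h, h, h)         # stand-in for ProbSparse self-attention
        h = self.norm1(h + self.drop(a))
        c = self.conv(h.transpose(1, 2)).transpose(1, 2)   # conv feed-forward
        return self.norm2(h + self.drop(c))


class LDformerLikeModel(nn.Module):
    """Two encoder branches run in parallel and are averaged, a rough stand-in
    for the parallel encoder module; a linear head produces the forecast."""

    def __init__(self, d_in=7, d_model=64, pred_len=96):
        super().__init__()
        self.embed = nn.Linear(d_in, d_model)
        self.branch1 = EncoderBlock(d_model)
        self.branch2 = EncoderBlock(d_model)
        self.head = nn.Linear(d_model, 1)
        self.pred_len = pred_len

    def forward(self, x):                 # x: (batch, seq_len, d_in)
        z = self.embed(x)
        z = 0.5 * (self.branch1(z) + self.branch2(z))
        return self.head(z[:, -self.pred_len:, :])        # (batch, pred_len, 1)


if __name__ == "__main__":
    model = LDformerLikeModel()
    x = torch.randn(8, 336, 7)            # 8 series, 336 input steps, 7 features
    print(model(x).shape)                 # torch.Size([8, 96, 1])
```

In the paper itself, the dense attention above would be replaced by the ProbSparse variant (which evaluates only the dominant queries to reduce computational cost) and the plain dropout by UniDrop; the sketch only indicates where those components sit in the data flow.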

LDformer: a parallel neural network model for long-term power forecasting

Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG
College of Computer Science & Engineering, Northwest Normal University, Lanzhou 730070, China
Abstract: Accurate long-term power forecasting is very important for grid operation decision-making and customer power consumption management, as it ensures the reliable power supply of the power system and the economical, reliable operation of the grid. However, most time-series forecasting models perform poorly on long-time-series prediction tasks that involve large amounts of data and require high prediction accuracy. To address this challenge, a parallel time-series prediction model named LDformer is proposed. First, Informer is combined with a long short-term memory (LSTM) network to obtain deep representation capabilities for the time series. Second, a parallel encoder module is proposed to improve the robustness of the model, and convolutional layers are combined with the attention mechanism to avoid value redundancy in the attention mechanism. Finally, a probabilistic sparse self-attention mechanism combined with UniDrop is proposed to reduce the computational overhead and mitigate the risk of losing key connections in the sequence. Experimental results on five real-world datasets show that LDformer outperforms state-of-the-art baselines in most long-time-series prediction tasks.

Key words: Long-term power forecasting; Long short-term memory (LSTM) network; UniDrop; Self-attention mechanism


