Electricity load forecasting based on an Interpretable ProbSparse attention mechanism

Conference: AIIPCC 2022 - The Third International Conference on Artificial Intelligence, Information Processing and Cloud Computing
June 21-22, 2022, held online

Proceedings: AIIPCC 2022

Pages: 7 | Language: English | Type: PDF

Authors:
Zhang, Han; Peng, Chen; Li, Jun; Niu, Yajie; Li, Longxiang (School of Information Science and Engineering, Jishou University, Hunan, China)

Abstract:
Long sequence time-series forecasting (LSTF) is becoming increasingly important in many real-world applications, including solar plant energy output forecasting, wind power forecasting, and electricity consumption forecasting. LSTF requires a model with high predictive capability, i.e., the ability to capture accurate long-range dependencies between output and input. The ProbSparse self-attention mechanism has proven effective for long time series, but its limited interpretability hinders model improvement. To address this issue, this paper proposes an Interpretable ProbSparse self-attention mechanism (Intprob) and applies it to forecasting electrical load sequences. The Intprob model retains the advantages of ProbSparse self-attention for long time series while integrating the interpretable attention used in temporal fusion transformers. In addition to superior generalization ability, it also trains more efficiently. The experimental results and analysis show that the prediction performance of the Intprob model compares favorably with state-of-the-art baseline models and can be analyzed through interpretable metrics.
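To make the underlying mechanism concrete, the following is a minimal NumPy sketch of ProbSparse self-attention as introduced in the Informer architecture: each query is scored by a sparsity measure (max minus mean of its scaled dot products with the keys), only the top-u "active" queries receive a full softmax attention output, and the remaining "lazy" queries fall back to the mean of the values. This is a simplified illustration, not the paper's Intprob method; in particular, the real algorithm estimates the sparsity measure from a random sample of keys to achieve O(L log L) cost, whereas this sketch computes full scores for clarity, and the function name and u default are illustrative choices.

```python
import numpy as np

def probsparse_attention(Q, K, V, u=None):
    """Simplified ProbSparse self-attention (single head, no batching).

    Q: (L_Q, d) queries, K: (L_K, d) keys, V: (L_K, d_v) values.
    u: number of "active" queries kept; defaults to ceil(ln L_Q).
    Returns the (L_Q, d_v) output and the indices of the active queries.
    """
    d = Q.shape[-1]
    L_Q = Q.shape[0]
    if u is None:
        u = max(1, int(np.ceil(np.log(L_Q))))

    # Scaled dot-product scores. (The real algorithm samples keys here
    # instead of computing the full (L_Q, L_K) score matrix.)
    scores = Q @ K.T / np.sqrt(d)

    # Query sparsity measure: max over keys minus mean over keys.
    # A "peaked" (informative) query has a large gap; a uniform one is near 0.
    M = scores.max(axis=1) - scores.mean(axis=1)
    top = np.argsort(M)[-u:]  # indices of the u most informative queries

    # Lazy queries receive the mean of V (a uniform-attention surrogate).
    out = np.tile(V.mean(axis=0), (L_Q, 1))

    # Active queries receive the usual softmax attention output.
    s = scores[top]
    w = np.exp(s - s.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V
    return out, top
```

The key efficiency idea is that only u = O(log L) queries need full attention rows, which is what lets ProbSparse-based models scale to the long input sequences used in electricity load forecasting.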