DSSE-Based Training Data Generation for Probabilistic Time Series Forecasting in Distribution Grids with Changing Topologies
Conference: NEIS 2025 - Conference on Sustainable Energy Supply and Energy Storage Systems
15.09.2025-16.09.2025 in Hamburg, Germany
doi:10.30420/566633018
Proceedings: NEIS 2025
Pages: 6 | Language: English | Type: PDF
Authors:
Dipp, Marcel; Pau, Marco
Abstract:
A comprehensive measurement infrastructure provides real-time transparency in distribution grids and generates large pools of data, enabling the application of Deep Learning (DL) based probabilistic time series forecasting. In recent years, there has been a surge of novel DL models for time series forecasting, including recurrent models, transformers with attention mechanisms, and linear models such as NLinear, TiDE, and TSMixer. However, the performance of DL-based approaches relies not only on the architecture but also on the availability and quality of training data. Distribution grids are affected by switching operations caused by equipment maintenance, congestion prevention, or power restoration, which directly impact the collected measurement data. Additionally, most data are obtained while the grid is in its default switching state, whereas measurement data from alternative topologies are only available for short periods. This leads to significant measurement data sparsity for many feasible grid configurations. To address these limitations, this work proposes a method based on Distribution System State Estimation (DSSE) to generate suitable training sets for any feasible topology. The methodology consists of three parts. First, an Artificial Neural Network (ANN) based DSSE generates active and reactive power estimates. Second, power flows are calculated for the new topology using High-Performance Computing (HPC) techniques. Finally, TSMixer is applied as the DL model and trained for the new topology. The results demonstrate that probabilistic forecast errors are significantly lower than those of DL models trained only with measurement data from the default topology and approach the performance of DL models trained with real historical time series.
For a low-voltage test grid, the methodology resulted in a 31.6% reduction in the mean root mean square error (RMSE) of the forecast (maximum: 45.8%) for line loading and a 25.5% reduction (maximum: 36.4%) for voltage magnitudes compared to the DL model trained solely with the default topology.
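The reported improvements are relative reductions of the forecast RMSE against the baseline model trained only on default-topology data. As an illustration of how such figures are computed (the function names and sample values below are hypothetical, not taken from the paper), a minimal sketch:

```python
import math

def rmse(y_true, y_pred):
    """Root mean square error between two equal-length sequences."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def relative_reduction_pct(rmse_baseline, rmse_improved):
    """Percentage by which `rmse_improved` undercuts `rmse_baseline`."""
    return 100.0 * (rmse_baseline - rmse_improved) / rmse_baseline

# Hypothetical example: baseline model vs. model trained on DSSE-generated data.
baseline_rmse = 2.0   # e.g., line loading forecast error, default-topology training
improved_rmse = 1.368 # e.g., after training on the generated set for the new topology
print(f"RMSE reduction: {relative_reduction_pct(baseline_rmse, improved_rmse):.1f}%")
```

With these illustrative numbers the reduction evaluates to 31.6%, matching the mean improvement reported for line loading; the actual errors in the paper are averaged over the probabilistic forecasts of the low-voltage test grid.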

