Reinforcement Learning for Cost-Efficient Energy Management under Dynamic Pricing: CoSES Building Case Study

Conference: PESS 2025 - IEEE Power and Energy Student Summit
08.10.2025-10.10.2025 in Munich, Germany

doi:10.30420/566656011

Proceedings: PESS 2025 – IEEE Power and Energy Student Summit

Pages: 6
Language: English
Type: PDF

Authors:
Nezet, Zoe; Kazemi, Milad; Papadimitriou, Christina; Mohapatra, Anurag

Abstract:
In recent years, increasing energy demand and the integration of renewable energy have underscored the need for smart energy management to reduce economic and environmental costs. This paper presents a reinforcement learning (RL) model for building energy management that uses real-world data, dynamic electricity pricing and noise representation to maximize cost-efficiency and to reduce complexity compared to model predictive control (MPC). First, an RL model was built for the energy management of the Center for Combined Smart Energy Systems (CoSES) building at the Technical University of Munich. Second, the adaptability of the RL model was assessed for various residential buildings with different load profiles. The results demonstrate that the RL model achieves a cost efficiency of 0.2730 €/kWh, 7.7 times higher than that of the MPC framework. Furthermore, the RL model showed limited adaptability when assessed on three buildings, highlighting the need to extend the model's training to larger datasets.
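To make the setting concrete, the sketch below shows a toy RL-style environment for building energy management under a dynamic tariff, together with one possible cost-per-kWh metric. It is a minimal illustration, not the authors' CoSES model: the environment class, battery parameters, price and load series, the baseline policy, and the exact definition of cost efficiency are all assumptions made for this example; the paper's own metric may be defined differently.

```python
# Minimal sketch (illustrative assumptions, not the paper's CoSES setup):
# a one-day building-energy environment with hourly dynamic prices and a
# small battery, plus a EUR-per-kWh cost metric for a rollout.
import numpy as np

class ToyBuildingEnv:
    """One day at hourly resolution; the load is met from the grid and/or a battery."""

    def __init__(self, prices_eur_per_kwh, load_kwh, battery_capacity_kwh=10.0):
        self.prices = np.asarray(prices_eur_per_kwh, dtype=float)
        self.load = np.asarray(load_kwh, dtype=float)
        self.capacity = battery_capacity_kwh
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = 0.5 * self.capacity          # assumed initial state of charge
        return self._obs()

    def _obs(self):
        # Observation: current price, current load, battery state of charge.
        return np.array([self.prices[self.t], self.load[self.t], self.soc])

    def step(self, action_kwh):
        """action_kwh > 0 charges the battery from the grid, < 0 discharges it."""
        action_kwh = float(np.clip(action_kwh, -self.soc, self.capacity - self.soc))
        self.soc += action_kwh
        grid_kwh = max(self.load[self.t] + action_kwh, 0.0)   # energy bought this hour
        cost = grid_kwh * self.prices[self.t]
        self.t += 1
        done = self.t >= len(self.prices)
        reward = -cost                                         # RL objective: minimise cost
        return (None if done else self._obs()), reward, done, {"grid_kwh": grid_kwh}

def cost_per_kwh(total_cost_eur, total_load_kwh):
    """EUR spent per kWh of load served (one possible cost-efficiency definition)."""
    return total_cost_eur / total_load_kwh

# Usage: roll out a trivial "do nothing" baseline policy and report EUR/kWh.
rng = np.random.default_rng(0)
prices = 0.15 + 0.15 * rng.random(24)        # assumed dynamic tariff
load = 1.0 + rng.random(24)                  # assumed household load profile
env = ToyBuildingEnv(prices, load)
obs, total_cost, done = env.reset(), 0.0, False
while not done:
    obs, reward, done, info = env.step(0.0)  # a trained RL policy would act here
    total_cost += -reward
print(f"cost per kWh: {cost_per_kwh(total_cost, load.sum()):.4f} EUR/kWh")
```

In this sketch the RL agent would replace the zero action with charge/discharge decisions learned from the price, load, and state-of-charge observation, which is the kind of decision problem the paper addresses.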