Authors
Listed:
- Olimpiu Nicolae Moga
(Computer Science and Electrical Engineering Department, Lucian Blaga University of Sibiu, 550024 Sibiu, Romania)
- Adrian Florea
(Computer Science and Electrical Engineering Department, Lucian Blaga University of Sibiu, 550024 Sibiu, Romania)
- Claudiu Solea
(Computer Science and Electrical Engineering Department, Lucian Blaga University of Sibiu, 550024 Sibiu, Romania)
- Maria Vintan
(Computer Science and Electrical Engineering Department, Lucian Blaga University of Sibiu, 550024 Sibiu, Romania)
Abstract
Energy communities represent an important step towards clean energy; however, their management is a complex task due to fluctuating demand and energy prices, variable renewable generation, and external events such as power outages. This paper investigates the effectiveness of a Reinforcement Learning agent, based on the Proximal Policy Optimization (PPO) algorithm, for energy management across three different energy community configurations. The performance of the PPO agent is compared against a Rule-Based Controller (RBC) and a baseline scenario using solar generation but no active management. Simulations were run in the CityLearn framework using real-world data. Across the three evaluated community configurations, the PPO agent achieved its greatest single-run improvement in the scenario where all participants were prosumers (Schema 3), reducing annual costs and carbon emissions by 9.2%. The main contribution of this work is demonstrating the viability of Reinforcement Learning agents in energy optimization problems, providing an alternative to traditional RBCs for energy communities.
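Illustrative note: the abstract describes training a PPO agent on community configurations in the CityLearn framework. The sketch below shows one plausible way such a setup might look with CityLearn's Gym-style interface and stable-baselines3; it is not the authors' configuration. The schema path, central-agent choice, wrapper usage (from recent CityLearn releases), and hyperparameters are assumptions for illustration only.

    # Minimal sketch, assuming CityLearn 2.x and stable-baselines3.
    # Schema path and hyperparameters are placeholders, not the paper's settings.
    from citylearn.citylearn import CityLearnEnv
    from citylearn.wrappers import NormalizedObservationWrapper, StableBaselines3Wrapper
    from stable_baselines3 import PPO

    # Hypothetical schema describing one community configuration
    # (buildings, PV, batteries, pricing and carbon-intensity time series).
    schema_path = 'schema.json'

    # A single central agent controls the storage devices of all buildings.
    env = CityLearnEnv(schema_path, central_agent=True)
    env = NormalizedObservationWrapper(env)   # scale observations to comparable ranges
    env = StableBaselines3Wrapper(env)        # expose the interface stable-baselines3 expects

    model = PPO('MlpPolicy', env, learning_rate=3e-4, verbose=1)
    model.learn(total_timesteps=100_000)      # illustrative training budget

    # After an evaluation episode, recent CityLearn releases can summarise
    # district-level KPIs (e.g. cost, carbon emissions) relative to the
    # no-control baseline, e.g. via env.unwrapped.evaluate().

In a comparative study such as this one, the same environment would typically also be driven by a rule-based controller and left uncontrolled (solar generation only) so that the three policies can be scored on identical demand, price, and weather data.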
Suggested Citation
Olimpiu Nicolae Moga & Adrian Florea & Claudiu Solea & Maria Vintan, 2025.
"Reinforcement Learning-Based Energy Management in Community Microgrids: A Comparative Study,"
Sustainability, MDPI, vol. 17(23), pages 1-21, November.
Handle:
RePEc:gam:jsusta:v:17:y:2025:i:23:p:10696-:d:1805940