
Abstract

The global energy landscape is undergoing a transformation towards decarbonization, sustainability, and cost-efficiency. In this transition, microgrid systems integrated with renewable energy sources (RES) and energy storage systems (ESS) have emerged as a crucial component. However, existing approaches to optimizing the operational control of such an integrated energy system lack a holistic view of the environmental, infrastructural, and economic considerations involved, and must also contend with uncertainties on both the supply and demand sides. This paper presents a holistic data-driven power optimization approach based on deep reinforcement learning (DRL) for microgrid control that jointly addresses the needs of decarbonization, sustainability, and cost-efficiency. First, two data-driven control schemes, namely the prediction-based (PB) and prediction-free (PF) schemes, are devised to formulate the control problem as a Markov decision process (MDP). Second, a multivariate objective (reward) function is designed to account for the market profits, carbon emissions, peak load, and battery degradation of the microgrid system. Third, we develop a Double Dueling Deep Q-Network (D3QN) architecture to optimize the power flows for real-time energy management and determine the charging/discharging strategies of the ESS. Finally, extensive simulations are conducted to demonstrate the effectiveness and superiority of the proposed approach through a comparative analysis. The results also indicate the respective circumstances under which each of the two control schemes is preferable in practical implementations subject to uncertainty.
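To make the two technical ingredients of the abstract concrete, here is a minimal Python sketch of a multivariate step reward combining the four terms named above (market profit, carbon emissions, peak load, and battery degradation). The weights (w_*) and the exact term definitions are illustrative assumptions, not the paper's actual formulation:

```python
# Minimal sketch of a multivariate step reward for microgrid control.
# The weights (w_*) and term definitions are illustrative assumptions,
# not the paper's actual reward formulation.

def step_reward(profit, carbon_kg, peak_kw, degradation_cost,
                w_profit=1.0, w_carbon=0.5, w_peak=0.2, w_degrade=0.8):
    """Scalarize economic, environmental, infrastructural, and
    battery-health terms into a single reward.

    profit           -- market revenue minus energy cost this step
    carbon_kg        -- CO2 emitted this step (kg)
    peak_kw          -- grid import power this step (kW), penalized to shave peaks
    degradation_cost -- estimated battery wear cost of the ESS action
    """
    return (w_profit * profit
            - w_carbon * carbon_kg
            - w_peak * peak_kw
            - w_degrade * degradation_cost)
```

Likewise, a D3QN pairs a dueling network head with Double DQN target computation. The sketch below shows the standard dueling decomposition Q(s, a) = V(s) + A(s, a) - mean_a A(s, a); the layer sizes and dimensions are assumptions rather than the paper's exact network:

```python
import torch
import torch.nn as nn

class DuelingQNetwork(nn.Module):
    """Standard dueling head: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a).
    State/action dimensions and layer sizes are illustrative assumptions."""

    def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)              # state value V(s)
        self.advantage = nn.Linear(hidden, n_actions)  # advantages A(s, a)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        h = self.feature(state)
        v, a = self.value(h), self.advantage(h)
        # Subtracting the mean advantage keeps V and A identifiable.
        return v + a - a.mean(dim=1, keepdim=True)
```

In the Double DQN update, the online network selects the greedy next action while the target network evaluates it, which reduces the overestimation bias of vanilla Q-learning.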
