Abstract
Increasing energy demand in today's world emphasizes the importance of optimal scheduling of distributed energy resources to minimize energy costs and greenhouse gas (GHG) emissions. The efficiency of this decision-making process relies on accurate modeling. In this paper, reinforcement learning (RL), an artificial intelligence-based approach, is proposed to optimize the energy management system (EMS) of an energy hub (EH). The EH contains renewable energy resources (RERs), a combined heat and power (CHP) unit, and a gas furnace. To meet the electrical and thermal energy demand, the available supply options, namely day-ahead and real-time purchases from the main grid, RERs, and natural gas consumption, are managed, with RERs preferred in order to minimize GHG emissions and energy costs. Owing to the adaptability of the RL method, a nonlinear model of CHP operation that accounts for its operational costs is incorporated. Furthermore, the natural gas tariff varies with the consumption level of the microgrid. The proposed RL-based EMS optimization with day-ahead and real-time scheduling is applied to a 24-hour case study with both linear and nonlinear models of the problem, together with a sensitivity analysis of the parameters. The corresponding simulation results demonstrate the effectiveness of the presented approach.
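To make the hourly decision structure concrete, the sketch below shows a minimal tabular Q-learning loop for a dispatch problem of this kind. It is an illustration only: the state (hour of day), the three actions, and all prices, demands, and the emission penalty are hypothetical placeholders assumed for this sketch and do not reproduce the paper's EH model or data.

```python
import numpy as np

# Minimal sketch of tabular Q-learning for hourly energy-hub dispatch.
# All numbers (prices, demand profile, emission penalty) are illustrative
# placeholders, not the paper's actual data or model.

rng = np.random.default_rng(0)

HOURS = 24                     # 24-hour scheduling horizon
N_ACTIONS = 3                  # 0: use RER, 1: buy real-time, 2: burn natural gas (CHP/furnace)

price_rt = 0.05 + 0.10 * rng.random(HOURS)   # hypothetical real-time electricity prices ($/kWh)
price_gas = 0.04                              # hypothetical flat gas-equivalent price ($/kWh)
emission_gas = 0.2                            # hypothetical GHG penalty for gas ($/kWh)
rer_avail = rng.random(HOURS) < 0.5           # hypothetical hourly RER availability
demand = 1.0 + rng.random(HOURS)              # hypothetical hourly demand (kWh)

Q = np.zeros((HOURS, N_ACTIONS))              # state = hour of day, tabular Q-values
alpha, gamma, eps = 0.1, 0.95, 0.1            # learning rate, discount factor, exploration rate

def step_cost(h, a):
    """Cost (including emission penalty) of serving hour h with action a."""
    if a == 0:                                # RER: free and clean, but only if available
        return 0.0 if rer_avail[h] else 10.0  # large penalty for unserved demand
    if a == 1:                                # real-time purchase from the main grid
        return price_rt[h] * demand[h]
    return (price_gas + emission_gas) * demand[h]  # natural gas via CHP / gas furnace

for episode in range(2000):
    for h in range(HOURS):
        # epsilon-greedy action selection
        a = rng.integers(N_ACTIONS) if rng.random() < eps else int(np.argmax(Q[h]))
        r = -step_cost(h, a)                  # reward = negative cost
        # bootstrap from the next hour; the horizon ends after the last hour
        target = r + (gamma * Q[h + 1].max() if h + 1 < HOURS else 0.0)
        Q[h, a] += alpha * (target - Q[h, a])

schedule = Q.argmax(axis=1)
print("greedy 24-hour dispatch (0=RER, 1=real-time grid, 2=gas):", schedule)
```

In this toy setting the learned greedy policy simply picks RERs whenever they are available and otherwise the cheaper of grid purchase or gas; the paper's full formulation additionally handles day-ahead commitments, the nonlinear CHP cost, and the consumption-dependent gas tariff.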