
Research On Optimal Energy Management Of Integrated Energy System Based On Deep Reinforcement Learning

Posted on: 2023-02-26
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Y Zhao
Full Text: PDF
GTID: 1522307154967409
Subject: Electrical engineering

Abstract/Summary:
Energy is the essential material basis for the survival and development of modern society. With the continuous growth of energy demand, the contradiction between energy supply and energy demand is becoming increasingly acute. The construction of the integrated energy system (IES) provides a new solution for optimizing energy supply and improving energy efficiency. An IES contains multiple energy forms and energy units. However, the coexistence and interplay of multiple controllable energy units, together with the interdependence between different energy subsystems, increase the difficulty of system operation optimization. Moreover, the integration of intermittent renewable energy sources, random energy consumption behaviour and changeable weather conditions introduces multiple uncertainties into the IES. To realize multi-energy coordination and economical operation of the IES, effective energy management strategies need to be developed. This dissertation focuses on IES energy management under uncertainties. The main work is as follows:

(1) On the energy supply side of the IES, a dynamic economic scheduling method based on an improved deep deterministic policy gradient (DDPG) algorithm is proposed for the park-level IES, considering the uncertainties of renewable energy output and load demand. The IES dynamic economic scheduling problem is modeled first and then formulated as a Markov decision process (MDP) with a continuous action space and unknown transition probabilities. To solve the MDP, a deep reinforcement learning algorithm is proposed that improves policy quality and learning efficiency by introducing a prioritized experience replay mechanism and L2 regularization into DDPG. The proposed method requires neither distributional knowledge nor forecast information, can adaptively respond to stochastic fluctuations of supply and demand, and realizes a direct mapping from the current system state to the scheduling action. Simulation results for typical winter and summer days, together with comparisons against three benchmark methods, show that the proposed method can effectively reduce the system operating cost.

(2) On the energy consumption side of the IES, a joint load scheduling method based on the proximal policy optimization (PPO) algorithm is proposed for the household IES containing multiple devices, such as a household renewable energy generator, a gas-electric hybrid heating system and a gas-electric kitchen stove. The proposed method is designed to handle the uncertainties of residents' hot water demand, renewable generation, outdoor temperature and electricity price, as well as the diverse operating characteristics of the different devices. Gaussian and Bernoulli distributions are used to approximate the scheduling strategies of the different types of household devices, and PPO is used to optimize these strategies. The method can simultaneously handle the continuous actions of power-shiftable devices and the discrete actions of time-shiftable devices, as well as the optimal management of electrical and gas-fired devices, thereby jointly optimizing the operation of all household loads. Its effectiveness in terms of energy cost saving, thermal comfort guarantee and adaptability to stochastic environments is verified through simulation results for different cases and a comparison study.

(3) For the park-level IES with multiple energy supply units and dispatchable electric and heat loads, a cooperative energy management method is proposed that considers both the resources on the energy supply side and the controllable resources on the energy consumption side. By exploiting the flexibility of both the supply side and the consumption side, the method realizes "source-storage-load" cooperative operation optimization. The bilateral cooperative energy management problem is formulated as an MDP, and the observation state, energy management action and reward function are designed. To cope with the uncertainties of renewable energy generation on the supply side, the energy load on the consumption side and the external environment temperature, a two-side collaborative energy management method based on the improved DDPG algorithm is developed. The proposed method takes advantage of the flexibility of the different energy links, enlarges the optimal scheduling space through bilateral coordination, and mitigates the influence of uncertain factors on energy management decisions. Operation results for different test days show the effectiveness of the proposed method and the good economic benefits of the bilateral coordinated management strategy.

(4) In the context of the COVID-19 pandemic forcing many people to stay at home, the energy management problem on the consumption side is studied further. A home energy management method considering indoor environmental quality is proposed to ensure a healthy indoor environment while reducing the associated energy cost. The indoor thermal environment and the indoor air environment are modeled in detail, accounting for a variety of physical processes and the impact of residents, as heat and CO2 sources, on the indoor environment. In view of the challenges caused by the uncertainties of outdoor temperature, solar radiation, outdoor air quality, home occupancy and electricity price, a home energy management method based on the prioritized experience replay mechanism and the double deep Q-network (DDQN) is designed, which comprehensively considers indoor air quality, thermal comfort and energy cost so as to realize optimal management of the heating/cooling system and the ventilation system. Test results under different scenarios show that the proposed method adapts well to variations in uncertain parameters such as weather conditions, time-varying electricity prices and home occupancy patterns.
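As a rough illustration of one of the two DDPG modifications named in part (1), the sketch below implements proportional prioritized experience replay in plain NumPy. The abstract does not give the buffer design or hyperparameters, so the class name, `alpha`/`beta` values and capacity handling here are all illustrative assumptions; the L2 regularization would simply add a weight-penalty term to the critic loss when training on the sampled batch.

```python
import numpy as np

class PrioritizedReplayBuffer:
    """Proportional prioritized experience replay: transitions with larger
    TD errors are sampled more often. All hyperparameters are assumptions."""

    def __init__(self, capacity, alpha=0.6, beta=0.4, eps=1e-6):
        self.capacity = capacity
        self.alpha, self.beta, self.eps = alpha, beta, eps
        self.data, self.priorities = [], []

    def add(self, transition, td_error=1.0):
        # Drop the oldest transition once the buffer is full.
        if len(self.data) >= self.capacity:
            self.data.pop(0)
            self.priorities.pop(0)
        self.data.append(transition)
        self.priorities.append((abs(td_error) + self.eps) ** self.alpha)

    def sample(self, batch_size):
        p = np.asarray(self.priorities)
        probs = p / p.sum()
        idx = np.random.choice(len(self.data), size=batch_size, p=probs)
        # Importance-sampling weights correct the bias of non-uniform sampling;
        # they multiply the per-sample critic loss.
        weights = (len(self.data) * probs[idx]) ** (-self.beta)
        weights /= weights.max()
        return [self.data[i] for i in idx], idx, weights

    def update_priorities(self, idx, td_errors):
        # Refresh priorities with the TD errors from the latest critic update.
        for i, e in zip(idx, td_errors):
            self.priorities[i] = (abs(e) + self.eps) ** self.alpha
```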
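The key idea in part (2), one PPO policy whose Gaussian head emits continuous set-points for power-shiftable devices and whose Bernoulli head emits on/off decisions for time-shiftable devices, can be sketched as below. The function names and the clipping constant 0.2 are assumptions, not taken from the dissertation; the joint log-probability of a mixed action is the sum over the two independent heads, and the standard clipped surrogate objective is applied to it.

```python
import numpy as np

def gaussian_logpdf(a, mu, sigma):
    # Log-density of a continuous (power-shiftable) action under N(mu, sigma^2).
    return -0.5 * np.log(2 * np.pi * sigma ** 2) - (a - mu) ** 2 / (2 * sigma ** 2)

def bernoulli_logpmf(a, p):
    # Log-probability of a discrete on/off (time-shiftable) action, a in {0, 1}.
    return a * np.log(p) + (1 - a) * np.log(1 - p)

def joint_logprob(a_cont, mu, sigma, a_disc, p):
    # Independent heads: the joint policy log-probability is the sum.
    return gaussian_logpdf(a_cont, mu, sigma) + bernoulli_logpmf(a_disc, p)

def ppo_clip_loss(logp_new, logp_old, advantage, clip=0.2):
    # Standard PPO clipped surrogate objective (a loss, to be minimized).
    ratio = np.exp(logp_new - logp_old)
    return -np.minimum(ratio * advantage,
                       np.clip(ratio, 1.0 - clip, 1.0 + clip) * advantage)
```

Because the two heads are summed into one log-probability, a single clipped objective jointly updates the continuous and discrete scheduling strategies, which is what lets one PPO agent manage both device types at once.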
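The double deep Q-network named in part (4) decouples action selection (online network) from action evaluation (target network), which reduces the overestimation of standard Q-learning. A minimal sketch of the target computation, assuming a scalar reward and per-action Q-value vectors for the next state (the discount factor 0.99 is an illustrative default, not from the dissertation):

```python
import numpy as np

def ddqn_target(reward, q_online_next, q_target_next, gamma=0.99, done=False):
    """Double-DQN bootstrap target for one transition.

    The online network picks the greedy next action; the target network
    supplies the value of that action."""
    a_star = int(np.argmax(q_online_next))
    return reward + (0.0 if done else gamma * q_target_next[a_star])
```

In the home energy management setting, each discrete action would index a joint heating/cooling and ventilation setting, and the TD error `target - q_online(s, a)` would also feed the prioritized replay buffer's priorities.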
Keywords/Search Tags: Integrated energy system, Energy management, Operation optimization, Deep reinforcement learning, Uncertainties, Operating cost, Indoor environmental quality