
Micro-energy Network Optimal Scheduling Method Based On Deep Reinforcement Learning

Posted on: 2024-07-20  Degree: Master  Type: Thesis
Country: China  Candidate: W X Xu  Full Text: PDF
GTID: 2542306941959409  Subject: Electrical engineering
Abstract/Summary:
The integration of large-scale renewable energy sources poses severe challenges to the safe and stable operation of power systems. The research and construction of new energy systems represented by the micro-energy grid therefore provide a new path for the optimal utilization of renewable energy. An efficient and reliable optimal dispatching method can reasonably configure the operating mode and output power of each device, reduce operating costs, promote the utilization of renewable energy, reduce reliance on traditional fossil fuels, and improve both the quality of energy management and the overall performance of the micro-energy grid, ultimately reducing carbon emissions and improving environmental quality. This thesis studies day-ahead and real-time optimal scheduling of a micro-energy grid supplying cooling, heating, and electrical loads.

First, a model of each device in the micro-energy grid is established and a day-ahead scheduling scheme is derived, which ensures the economic operation of the micro-energy grid and serves as a guide for real-time scheduling. On this basis, a day-ahead/real-time two-stage optimal dispatching model based on model predictive control (MPC) is proposed. Real-time scheduling minimizes a composite objective combining the deviation of equipment output from the day-ahead reference values and the adjustment of equipment output between adjacent time steps, and obtains the real-time scheduling scheme by rolling optimization. The resulting real-time schedule follows the day-ahead plan closely and is economical, environmentally friendly, and energy efficient.

Secondly, the optimal scheduling problem of the micro-energy grid is formulated as a Markov decision process with discrete time steps, and specific definitions of the environment, agent, states, actions, state transitions, and rewards are given. Considering the continuous nature of equipment output, a micro-energy-grid optimal scheduling model based on the deep deterministic policy gradient (DDPG) algorithm is constructed. During offline training, an experience replay mechanism is used and exploration noise is added to the actions; during online decision-making, the saved model takes real-time data as input to produce the scheduling results. The effectiveness of the proposed model and algorithm is verified through case studies.

Finally, with practical engineering applications in mind, the optimal scheduling problem of the micro-energy grid under mass-flow regulation, taking the heat-network energy flow into account, is further analyzed. The devices in the heating network are modeled accurately, and a non-real-time heat-exchange model of the solar collector system is established. Mass-flow regulation adjusts both the flow and the temperature of the heat-network pipes, improving the economy and flexibility of the micro-energy grid. The DDPG algorithm is applied to handle the bilinear term formed by the product of mass flow and temperature. Case study results show that the resulting schedule is economical, environmentally friendly, and flexible.
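The rolling-optimization idea behind the second-stage dispatch can be sketched in a much simplified form. The snippet below is an illustrative one-step receding-horizon rule for a single device, not the thesis model: the function name, the quadratic stage cost, and all parameter values are hypothetical. It trades off tracking the day-ahead reference against the ramp between adjacent time steps, subject to output limits.

```python
# Illustrative one-step receding-horizon dispatch for a single device.
# Stage cost (assumed, not from the thesis): (P - P_ref)^2 + lam * (P - P_prev)^2,
# i.e. track the day-ahead reference while penalising ramping between steps.

def rolling_dispatch(p_ref, p0, lam=0.5, p_min=0.0, p_max=100.0):
    """Return a real-time setpoint trajectory that follows the
    day-ahead reference p_ref, starting from initial output p0."""
    plan = []
    p_prev = p0
    for ref in p_ref:
        # Unconstrained minimiser of the quadratic stage cost.
        p = (ref + lam * p_prev) / (1.0 + lam)
        # Enforce the device's output limits.
        p = min(max(p, p_min), p_max)
        plan.append(p)
        p_prev = p
    return plan
```

With `lam = 0` the rule reproduces the day-ahead plan exactly; larger `lam` smooths the trajectory at the cost of slower tracking, which mirrors the composite deviation-plus-adjustment objective described above.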
Keywords/Search Tags:Micro-energy grid, Deep Reinforcement Learning, optimal scheduling, model predictive control, deep deterministic policy gradient algorithm