As energy demand gradually shifts from quantity alone to a comprehensive consideration of quality and environmental protection, renewable energy sources such as wind and solar power are gradually replacing traditional fossil fuels and will become the mainstay of the world's energy structure. As a new generation and distribution system that effectively utilizes multiple renewable energy sources, the microgrid has become an effective solution to energy problems. However, microgrid energy management still faces several challenges: renewable power output is difficult to predict because of its uncertainty over time, energy dispatch is difficult because of the multi-energy coupling characteristics of microgrids, and energy regulation is difficult because of large fluctuations in the various load demands. This paper applies deep learning and deep reinforcement learning to the study of microgrid energy management strategies. A gated recurrent unit (GRU) network is used to predict renewable energy output and load demand, and a Rainbow Deep Q Network (Rainbow DQN) is used to reasonably regulate the energy flow in the microgrid and suppress fluctuations in the load curve. Firstly, the microgrid structure and the function and characteristics of each component are introduced, and mathematical models are derived from the operating-condition constraints of each component. Secondly, a time-division forecasting method for distributed power output and load demand based on the GRU is presented. Through correlation analysis, the factors with the highest correlation coefficients with the forecast sequence are selected as inputs to the neural network, and the GRU predicts the power of renewable energy sources and the load demand by time period. The error coefficients between actual and forecast values are compared with those of a Long Short-Term Memory (LSTM) network, verifying the accuracy and speed of time-division GRU prediction. Then, a microgrid energy management strategy based on Rainbow DQN is proposed, which takes minimizing the daily operation cost of the microgrid as the optimization goal and combines four DQN improvements: Dueling DQN, Double DQN, Prioritized Experience Replay (PER), and Noisy Net. Rainbow DQN perceives and learns from the power output information of the renewable sources, the state and control information of the energy storage batteries and electric vehicles, and the power demand information of the electrical loads, and a near-optimal microgrid energy management strategy is obtained by repeatedly adjusting the hyper-parameters and network structure in simulation. The simulation results show that the convergence speed and stability of the daily-operation-cost curve obtained with Rainbow DQN are significantly better than those of DQN with only one or two of these improvements, and that the daily operation cost of the microgrid using Rainbow DQN is the lowest.
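As a concrete illustration of the GRU forecasting step described above, the following is a minimal sketch in PyTorch; the network size, input features, and training hyper-parameters are illustrative assumptions and not the paper's exact configuration.

```python
# Minimal sketch of a GRU power forecaster (illustrative only; feature names,
# layer sizes, and hyper-parameters are hypothetical, not the paper's setup).
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    def __init__(self, n_features, hidden_size=64, horizon=1):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x):
        # x: (batch, time_steps, n_features), e.g. historical PV output plus
        # the inputs selected by correlation analysis (irradiance, temperature, ...)
        _, h = self.gru(x)              # h: (1, batch, hidden_size)
        return self.head(h.squeeze(0))  # forecast power for the next horizon steps

# One training step with a mean-squared-error loss on the forecast power
model = GRUForecaster(n_features=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 24, 3)   # 32 samples, 24 past time steps, 3 input features
y = torch.randn(32, 1)       # target power at the next time step
loss = nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Likewise, the sketch below shows two of the four Rainbow DQN ingredients named above, the dueling Q-network decomposition and the Double DQN target; prioritized experience replay, the Noisy Net layers, and the exact microgrid state and action encoding are omitted and would follow the paper's own formulation.

```python
import torch
import torch.nn as nn

class DuelingQNet(nn.Module):
    """Dueling architecture: Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)."""
    def __init__(self, state_dim, n_actions, hidden=128):
        super().__init__()
        # state s would encode forecast renewable output, load demand,
        # battery/EV state of charge, etc.; actions are discretized
        # charge/discharge and grid-exchange power levels (assumed here)
        self.feature = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)              # state-value stream V(s)
        self.advantage = nn.Linear(hidden, n_actions)  # advantage stream A(s,a)

    def forward(self, s):
        h = self.feature(s)
        v, a = self.value(h), self.advantage(h)
        return v + a - a.mean(dim=1, keepdim=True)

def double_dqn_target(online, target, r, s_next, done, gamma=0.99):
    """Double DQN: the online net selects the next action, the target net evaluates it."""
    with torch.no_grad():
        a_star = online(s_next).argmax(dim=1, keepdim=True)
        q_next = target(s_next).gather(1, a_star).squeeze(1)
        return r + gamma * (1.0 - done) * q_next
```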