With the rapid development of 5G communication technology, the number of mobile terminals and the volume of network traffic have grown significantly, and a wide variety of services with different quality-of-service requirements has emerged. These services meet user needs but also aggravate the shortage of mobile resources, placing a heavy burden on the mobile communication network. To increase network capacity and reduce latency, resource allocation in the mobile network must be optimized. To address these challenges, the Mobile Edge Computing (MEC) architecture offloads storage and computing resources to edge nodes closer to users, reducing transmission latency and bandwidth consumption. Through edge caching, content delivery is offloaded from remote servers to the network edge, where file delivery can be optimized. In addition, Reinforcement Learning (RL) techniques can make behavioral decisions in an environment and are well suited to optimizing resource allocation in communication systems. This paper studies the optimization of cache resources and frequency resources in heterogeneous mobile network scenarios, applying edge caching and reinforcement learning to improve the service quality of various services.

The main contributions can be summarized as follows:

1. This paper constructs a content caching and delivery architecture for both fixed and mobile users in an MEC system. Most existing research optimizes cache deployment or frequency allocation in isolation; this paper jointly considers cache deployment and updating, service-node selection, and frequency-resource allocation. In this scenario, fixed and mobile edge service nodes jointly serve the users, and mobile users move together with the mobile edge service nodes.

2. To improve the average cache hit rate of file requests, a popularity prediction algorithm based on Probabilistic Latent Semantic Analysis (PLSA) is used to predict file request probabilities, and both centralized and distributed cache deployment strategies are proposed. Simulations verify that the distributed deployment strategy makes better use of cache resources, compare the effect of different cache update strategies on the centralized and distributed deployment algorithms, and show that the performance gain depends on the underlying data: on Zipf-distributed data, the distributed algorithm improves the average cache hit rate by 20% to 30% over the centralized algorithm, while on MovieLens data the improvement is 3% to 6% (a minimal illustration of this comparison follows the list).

3. To improve user service quality and spectrum utilization, this paper formulates the problem of maximizing average service quality based on node mobility, random file requests, and real-time user locations, and designs a multi-node frequency allocation algorithm based on reinforcement learning to flexibly schedule downlink sub-channels. The algorithm jointly considers node mobility, random file requests, and real-time user locations, and outputs a suitable frequency allocation for the current situation. Simulations verify that the RL-based frequency allocation algorithm uses frequency resources more efficiently, and the learning process of the neural network is analyzed. Moreover, compared with the baseline algorithm, the average quality of service improves by about 0.03, which shows that the algorithm can exploit the advantages of the distributed caching algorithm, flexibly allocate sub-channels to users according to environmental conditions, minimize co-channel interference, reduce transmission delay, and improve average service quality (a minimal RL sketch also follows the list).
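To illustrate the centralized-versus-distributed comparison in contribution 2, the sketch below draws file requests from a Zipf distribution and measures the hit rate of two simplified cache deployments. The catalogue size, number of nodes, cache capacity, and the way the distributed deployment spreads files across nodes are assumptions chosen for illustration only; this is not the PLSA-based prediction or the update strategies evaluated in the paper.

```python
import numpy as np

def zipf_popularity(n_files: int, alpha: float = 0.8) -> np.ndarray:
    """Zipf-like request probabilities over a file catalogue."""
    ranks = np.arange(1, n_files + 1)
    weights = ranks ** (-alpha)
    return weights / weights.sum()

def hit_rate(cached: set, requests: np.ndarray) -> float:
    """Fraction of requests served from the given cache contents."""
    return float(np.mean([r in cached for r in requests]))

# Hypothetical setup: 1000 files, 3 edge nodes, each caching 50 files.
rng = np.random.default_rng(0)
n_files, n_nodes, cache_size = 1000, 3, 50
popularity = zipf_popularity(n_files)
requests = rng.choice(n_files, size=10_000, p=popularity)

# Centralized-style deployment: every node stores the same top-C files,
# so the system holds only C distinct files in total.
central = set(np.argsort(popularity)[::-1][:cache_size])

# Distributed-style deployment: nodes store disjoint slices of the top
# n_nodes * C files, so a user reachable by any node sees more content.
distributed = set(np.argsort(popularity)[::-1][: n_nodes * cache_size])

print(f"centralized hit rate: {hit_rate(central, requests):.3f}")
print(f"distributed hit rate: {hit_rate(distributed, requests):.3f}")
```

Under these assumptions the distributed deployment achieves a higher hit rate simply because it keeps more distinct files cached at the edge; the actual gap reported in the paper depends on the request data and the update strategy used.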
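For contribution 3, the following is a minimal tabular Q-learning sketch of sub-channel selection, not the multi-node, neural-network-based algorithm described above. The toy state (a sub-channel occupancy bitmap), the interference-penalty reward, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels = 4                      # downlink sub-channels (hypothetical)
n_states = 2 ** n_channels          # state = occupancy bitmap of sub-channels
q_table = np.zeros((n_states, n_channels))
alpha, gamma, epsilon = 0.1, 0.9, 0.1

def step(state: int, action: int) -> tuple:
    """Toy transition: assigning an occupied sub-channel causes
    co-channel interference (low reward); a free one gives high reward."""
    occupied = (state >> action) & 1
    reward = 0.2 if occupied else 1.0
    next_state = rng.integers(n_states)   # new random occupancy pattern
    return next_state, reward

state = rng.integers(n_states)
for _ in range(20_000):
    # epsilon-greedy selection over sub-channels
    if rng.random() < epsilon:
        action = int(rng.integers(n_channels))
    else:
        action = int(np.argmax(q_table[state]))
    next_state, reward = step(state, action)
    # standard Q-learning update
    td_target = reward + gamma * q_table[next_state].max()
    q_table[state, action] += alpha * (td_target - q_table[state, action])
    state = next_state

# After training, the greedy policy prefers currently idle sub-channels.
print(np.argmax(q_table, axis=1))
```

The paper's algorithm additionally conditions on node mobility, file requests, and user locations, and is trained with a neural network rather than a table; this sketch only shows the underlying reinforcement-learning loop.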