
Research On Edge Caching Technology For Wireless Heterogeneous Networks

Posted on: 2024-08-14 | Degree: Doctor | Type: Dissertation
Country: China | Candidate: D Y Li | Full Text: PDF
GTID: 1528306917994879 | Subject: Information and Communication Engineering
Abstract/Summary:
With the rapid development of mobile communication technology, new services and applications emerge constantly, resulting in explosive growth of mobile data traffic and posing a serious challenge to next-generation wireless mobile communication networks. Ultra-dense networks are a promising solution: they break the traditional flat macro-cell coverage architecture by constructing macro-micro heterogeneous networks, offer enormous potential for enhancing system capacity, and have become an effective means of coping with the explosive growth of mobile data traffic. However, the dense deployment of base stations intensifies the transmission load on the backhaul links, which can easily cause network congestion. Edge caching technology has emerged to address this challenge. By sinking storage functions to the network edge (e.g., base stations and terminal devices) and serving requested content from nearby nodes, it avoids massive repetitive content transmissions, effectively relieves the backhaul load, and alleviates the congestion caused by dense base-station deployment. It has therefore become one of the key technologies driving the future development of mobile communications.

However, caching resources at the network edge are extremely limited, so only a small fraction of the requested content can be stored there. Against this background, accurately identifying the hot content that users request is a challenging problem in edge caching strategy design. Moreover, users' communication demands are uneven in time and space and exhibit obvious tidal effects, which severely imbalance the network load; collaborative caching strategies are therefore required to maximize the utilization efficiency of edge caching resources. Finally, caching deployment and content distribution are closely coupled and mutually constrained: the placement of cached content directly affects content distribution efficiency, and the distribution efficiency in turn affects the next round of caching decisions. Joint optimization of edge caching and content distribution is thus crucial to fully exploit the potential of edge caching and maximize system caching performance.

Accordingly, this thesis centers on the core question of how to design efficient edge caching resource optimization schemes that fully exploit the potential of edge caching technology while meeting users' quality-of-service requirements, and studies edge caching technology in wireless heterogeneous networks. The main research contents and innovations are summarized as follows.

1) To cope with the dynamic, time-varying nature of content popularity and the tidal effect of communication load in wireless heterogeneous networks, a content-popularity-prediction-based hierarchical edge caching strategy is proposed. Specifically, a novel content popularity prediction framework, a Stacked Autoencoder-Long Short-Term Memory network, is designed to predict content popularity accurately by integrating multidimensional features that capture both the correlation and the temporal periodicity among different content request patterns. To predict the popularity of newly added contents, a similarity-based prediction approach is further proposed, in which content similarity serves as an evaluation index to assign weights to the popularities of existing similar contents. Based on the predicted popularity, a hierarchical edge caching optimization problem is formulated to minimize the average content downloading latency. Since the problem is NP-hard and difficult to solve directly, a low-complexity algorithm is proposed to obtain near-optimal solutions. Simulation results show that the proposed strategy significantly improves the utilization efficiency of caching resources and effectively reduces the average content downloading latency compared with the baselines.
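As a rough illustration of the second stage of this contribution, and not the algorithm proposed in the thesis, the sketch below shows how predicted popularities could drive a simple low-complexity placement over a two-tier macro/micro cache hierarchy. The tier latencies, cache capacities, and the greedy ranking rule are all illustrative assumptions.

```python
# Minimal greedy sketch: place the most popular contents at the lowest-latency tier first.
# All numbers (latencies, capacities) and the toy popularity profile are illustrative
# assumptions, not values or methods taken from the thesis.

def greedy_hierarchical_placement(popularity, micro_cap, macro_cap,
                                  lat_micro=5.0, lat_macro=15.0, lat_core=50.0):
    """popularity: {content_id: predicted request probability}.
    Returns (micro_set, macro_set, expected_latency)."""
    ranked = sorted(popularity, key=popularity.get, reverse=True)
    micro = set(ranked[:micro_cap])                       # hottest contents at the micro tier
    macro = set(ranked[micro_cap:micro_cap + macro_cap])  # next-hottest at the macro tier
    expected = sum(p * (lat_micro if c in micro else
                        lat_macro if c in macro else lat_core)
                   for c, p in popularity.items())
    return micro, macro, expected

# Toy usage with a Zipf-like popularity profile (purely illustrative).
pop = {f"video_{i}": 1.0 / (i + 1) for i in range(100)}
total = sum(pop.values())
pop = {c: p / total for c, p in pop.items()}
micro, macro, latency = greedy_hierarchical_placement(pop, micro_cap=10, macro_cap=30)
print(f"expected download latency: {latency:.2f} ms")
```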
2) Considering the differences in user preferences and the social relationships among users, a user-preference-learning-based proactive edge caching strategy is proposed. A novel context- and social-aware user preference learning method is first developed to precisely predict users' dynamic preferences by jointly exploiting the contextual correlation among contents and the influence of social relationships. Specifically, graph convolutional networks are used to capture high-order similarity representations among contents from a constructed content graph, and an attention mechanism is designed to generate social-influence weights for users with different social relationships. Based on the learned user preferences, a proactive edge caching architecture is proposed that integrates offline caching content placement with an online caching content replacement policy so that popular contents are cached continuously. Simulation results show that the proposed strategy significantly reduces the average content downloading latency under different cache capacities and numbers of users.

3) To address the additional cost and the user privacy disclosure introduced by edge caching in heterogeneous networks, a novel proactive edge caching strategy based on community detection and attention-weighted federated learning is proposed. An optimization problem is formulated to maximize the system benefit per unit cost. Since it is NP-hard and difficult to solve directly, the problem is decomposed into two stages: caching node selection and caching content placement. To select suitable users as caching nodes, a community detection method is proposed that groups users into communities according to both their mobility and their social properties, and then selects the socially most important users in each community as caching nodes. To decide how to place popular contents on these selected users, an attention-weighted federated learning based content popularity prediction framework is proposed. It reduces the global-model prediction bias caused by differences in the quality of local models, improves the accuracy of content popularity prediction while preserving user privacy, and reduces the average content downloading latency. Simulation results show that the proposed strategy achieves a favorable trade-off between operating cost and system latency while protecting users' privacy.
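To make the attention-weighted aggregation idea in the third contribution more concrete, here is a minimal server-side sketch in which each local model's contribution is weighted by a softmax over its validation quality. The weighting signal (negative validation loss), the temperature, and all shapes are assumptions made for illustration, not the federated learning design used in the thesis.

```python
import numpy as np

def attention_weighted_aggregate(local_weights, local_val_losses, temperature=1.0):
    """Aggregate per-client model parameters into a global model.

    local_weights: list of 1-D np.ndarray, flattened parameters of each local model.
    local_val_losses: list of float, each client's validation loss (lower = better).
    The softmax over negative losses plays the role of 'attention', down-weighting
    low-quality local models so they bias the global model less.
    """
    losses = np.asarray(local_val_losses, dtype=float)
    scores = -losses / temperature                 # lower-loss clients get higher scores
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()                             # softmax attention weights, sum to 1
    stacked = np.stack(local_weights)              # shape: (n_clients, n_params)
    return attn @ stacked                          # attention-weighted average of parameters

# Toy usage: three clients with different local-model quality.
clients = [np.random.randn(8) for _ in range(3)]
val_losses = [0.4, 0.9, 2.5]                       # third client's local model is the weakest
global_params = attention_weighted_aggregate(clients, val_losses)
print(global_params.shape)                         # (8,)
```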
4) To tackle the tight coupling between caching deployment and content distribution, an Artificial Intelligence (AI)-enabled joint edge content caching and power allocation strategy is proposed. A joint edge content caching and power allocation problem is first formulated to minimize the content downloading latency. To improve the timeliness of resource scheduling, this problem is transformed into a classification problem in the deep learning domain. A novel AI-enabled joint edge content caching and power allocation model is then proposed to learn the mapping from content request patterns and user locations to resource configurations, enabling intelligent joint decisions on edge content caching and power allocation. Simulation results show that the proposed strategy outperforms state-of-the-art baselines in terms of the average content downloading latency while guaranteeing real-time resource scheduling.
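To illustrate how a joint caching and power-allocation decision can be cast as a classification problem, a small network could map network-state features to the index of a precomputed resource configuration. This is a hedged sketch under assumed PyTorch tooling with a made-up feature and label layout; only the "decision as classification" framing mirrors the strategy described above.

```python
import torch
import torch.nn as nn

class JointDecisionClassifier(nn.Module):
    """Toy MLP that maps network-state features to one of N candidate
    (caching placement, power allocation) configurations.

    The feature layout and the idea of enumerating configurations as class
    labels are illustrative assumptions, not the model proposed in the thesis.
    """
    def __init__(self, n_features, n_configs, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_configs),          # logits over candidate configurations
        )

    def forward(self, x):
        return self.net(x)

# Toy forward pass: 32 request-pattern dims + 8 user-location dims -> 64 candidate configs.
model = JointDecisionClassifier(n_features=40, n_configs=64)
state = torch.randn(1, 40)                          # one snapshot of the network state
config_id = model(state).argmax(dim=-1).item()      # pick the highest-scoring configuration
print(config_id)
```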
In summary, to address the limited caching resources, dynamic content popularity, uneven communication loads, and the tight coupling between caching deployment and content distribution in heterogeneous networks, this thesis studies content popularity prediction, hierarchical edge caching design, user preference learning, caching user selection, and the joint optimization of caching and communication resources. The resulting series of edge caching resource optimization solutions significantly improves the utilization efficiency of caching resources and reduces the system latency.

Keywords/Search Tags: Edge Caching, Wireless Heterogeneous Networks, Resource Allocation, Intelligent Communication, Deep Learning