While the development of wireless communication technologies has facilitated and enriched people's daily lives, it has also brought about an explosion in data traffic and mounting pressure on network transmission. To cope with this situation, caching technology exploits the time-varying nature of network traffic: by placing content in users' cache memory during off-peak hours, it improves communication efficiency and alleviates the congestion that occurs during peak hours. Conventional caching obtains only a local caching gain by storing content in users' local storage in advance, so when local storage is limited it provides limited relief from network transmission pressure. Unlike conventional caching, coded caching creates multicast opportunities so that a single broadcast transmission from the server can simultaneously satisfy different users' demands, thereby obtaining a global caching gain. When coded caching is applied to a device-to-device (D2D) network, almost the same gain as the server-multicast coded caching scheme can be achieved without a server. For the application scenario in which both the server and the users are able to send messages, a coded caching scheme with parallel transmission has been proposed. However, existing studies on coded caching with parallel transmission are limited to the case where the server broadcast channel and the D2D network have the same transmission capacity, whereas in practical applications it is more common for the two channels to have different transmission capabilities. Therefore, this thesis studies the coded caching problem with parallel transmission when the channel capacities are different.

First, for the case where all users have cache memories of identical size and the server broadcast channel and the D2D network channel have different capacities, a specific scheme with uncoded prefetching and parallel transmission is proposed. Taking the gap between the two channel capacities into account, the optimal delivery latency is obtained by allocating different workloads to the server and the user-cooperation network. The results show that the proposed scheme achieves a shorter delivery delay than when the capacity gap between the two channels is ignored. Using the idea of bit allocation, it is further shown that the proposed scheme with parallel transmission is optimal under uncoded prefetching when users' cache resources are sufficient and the capacities of the server broadcast channel and the D2D network channel are the same.

Second, for the case where users have cache memories of different sizes and the two channel capacities differ, the delivery process is classified into two modes: the D2D network mode (D2DCC) and the server multicast mode (SMCC). The coded caching problem with parallel transmission is then modeled as an optimization problem of how to allocate all files and users' cache resources between these two modes. To solve this optimization problem and obtain a concrete scheme, an algorithm based on particle swarm optimization (PSO) and linear programming is proposed. The results show that the proposed scheme with parallel transmission requires significantly less delivery time than performing the two policies, D2DCC and SMCC, separately, especially when users' cache resources are limited.
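The benefit of accounting for unequal channel capacities can be illustrated with a simplified model (not the thesis's exact scheme): if a total load can be split between two parallel channels, delivery ends when the slower side finishes, and latency is minimized when both sides finish together. All names and the example rates below are hypothetical.

```python
# Toy model of parallel-transmission workload allocation (an illustrative
# sketch, not the scheme proposed in the thesis). A total load T is split
# between the server broadcast channel (rate r_s) and the D2D network
# (rate r_d); with both transmitting in parallel, delivery latency is the
# time at which the slower side finishes.

def delivery_latency(T, r_s, r_d, a):
    """Latency when fraction `a` of load T goes over the server channel."""
    return max(a * T / r_s, (1 - a) * T / r_d)

def optimal_split(r_s, r_d):
    """Server share that makes both channels finish at the same time."""
    return r_s / (r_s + r_d)

if __name__ == "__main__":
    T, r_s, r_d = 100.0, 4.0, 1.0    # hypothetical load and channel rates
    a = optimal_split(r_s, r_d)       # faster channel carries a larger share
    print(delivery_latency(T, r_s, r_d, a))    # 20.0
    # Ignoring the capacity gap (an even 50/50 split) is strictly worse:
    print(delivery_latency(T, r_s, r_d, 0.5))  # 50.0
```

In this toy setting the balanced split gives latency T/(r_s + r_d), mirroring the abstract's observation that allocating workloads with the capacity gap in mind yields a shorter delivery delay than ignoring it.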