In recent years, driven by the Internet of Things (IoT) and fifth-generation (5G) mobile communication, artificial intelligence (AI) has flourished and a large number of emerging applications have appeared, placing higher demands on data security and privacy. At the same time, increasingly stringent data protection laws have been enacted, restricting AI applications that require large amounts of data. Machine learning (ML) and deep learning (DL) models can satisfy many AI needs, but because data are isolated in silos, much of it cannot be fully exploited. Federated learning (FL) addresses this problem by assigning the model-training task to the data holders, i.e. the clients, and aggregating the trained local models at an aggregation server. The communication between clients and server then carries gradients or model parameters rather than raw sample data, providing clients with a low-latency, high-security model-training service over the communication network. Within this framework, client selection and resource allocation directly affect the training speed, model performance, service latency, and energy consumption of the federated learning system and are therefore key research issues; introducing differential privacy into federated learning is likewise an important direction for data security.

Targeting training acceleration and secure model transmission in federated learning, this thesis first studies client selection and resource allocation for single-cell federated learning, and then extends the study to a multi-cell scenario in which inter-cell interference serves as a differential-privacy mechanism to enhance the security of the transmitted model. The main contributions are as follows.

First, for single-cell federated learning with limited wireless resources, this thesis combines client selection with resource allocation to accelerate training. A new performance metric, called training efficiency, is introduced to balance the per-iteration time against the amount of training data, and a joint problem of client selection, bandwidth allocation, and stochastic gradient descent (SGD) batch-size selection is formulated and solved with convex optimization methods, with the objective of maximizing the training efficiency. The proposed algorithm decomposes the original problem into a first step of client selection and a second step of resource allocation, turning the non-convex problem into convex subproblems and simplifying the solution process. Simulation results show that the method is effective in accelerating the progress of federated learning.
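As an illustrative sketch only (the exact definitions are not stated in this abstract), a training-efficiency metric of this kind can be written as the amount of data processed per round divided by the per-round latency; the selected client set $\mathcal{S}$, batch sizes $D_k$, bandwidths $b_k$, per-client computation and upload times $T_k^{\mathrm{cmp}}$ and $T_k^{\mathrm{com}}$, and total bandwidth $B$ used below are assumptions made for illustration:

\[
\max_{\mathcal{S},\ \{b_k\},\ \{D_k\}}\quad
\eta=\frac{\sum_{k\in\mathcal{S}} D_k}
{\max_{k\in\mathcal{S}}\big(T_k^{\mathrm{cmp}}(D_k)+T_k^{\mathrm{com}}(b_k)\big)}
\qquad\text{s.t.}\quad \sum_{k\in\mathcal{S}} b_k\le B,\quad
D_k^{\min}\le D_k\le D_k^{\max},\ k\in\mathcal{S}.
\]

With the client set $\mathcal{S}$ fixed, a problem of this form reduces to a fractional program over $\{b_k, D_k\}$ that standard convex-optimization techniques can handle, which is consistent with the two-step decomposition described above.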
Second, addressing the security and transmission delay of federated learning, this thesis introduces differential privacy (DP) combined with over-the-air computation (AirComp). A multi-cell federated learning scenario is modeled, and a framework is built in which inter-cell interference is exploited to meet the DP requirement while the clients' gradient transmit power, the artificial-noise transmit power, and the denoising factor are jointly optimized to minimize the mean square error (MSE) of the aggregated signal. Using convex optimization theory and an alternating solution approach, closed-form solutions are obtained, and an iterative algorithm is designed that alternately optimizes the transmit power of the clients in each cell; the influence of the various system factors is also analyzed. Simulation results show that there is a trade-off between differential privacy and the performance of the federated learning system: an appropriate amount of inter-cell interference can satisfy the differential-privacy requirement while reducing the required artificial-noise transmit power. The results also reveal that the more clients participate in training, the lower the power overhead of the whole system.
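As an illustrative sketch only (the exact signal model is not given in this abstract), a DP-constrained AirComp aggregation problem of this kind is often posed with per-client transmit scalars $p_k$, channel gains $h_k$, artificial-noise power $q$, received inter-cell interference power $I$, receiver noise variance $\sigma^2$, denoising factor $\alpha$, gradient sensitivity $\Delta$, and privacy budget $(\epsilon,\delta)$, all of which are assumptions introduced here for illustration:

\[
\min_{\{p_k\},\ q\ge 0,\ \alpha>0}\quad
\mathrm{MSE}=\sum_{k=1}^{K}\Big(\frac{h_k p_k}{\sqrt{\alpha}}-1\Big)^{2}
+\frac{\sigma^{2}+q+I}{\alpha}
\qquad\text{s.t.}\quad p_k^{2}\le P_k^{\max},\quad
\frac{2\ln(1.25/\delta)\,\Delta^{2}\,(h_k p_k)^{2}}{\sigma^{2}+q+I}\le\epsilon^{2},\ \forall k.
\]

In a formulation of this shape, the interference power $I$ and the artificial-noise power $q$ play the same role in the DP constraint, so stronger inter-cell interference permits a smaller $q$, which matches the trade-off reported above.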