Machine learning is currently achieving unprecedented success in areas such as image analysis, natural language processing, and autonomous driving. Obtaining accurate machine learning models typically requires training on large amounts of data. However, most of the data that could be used for training resides on resource-constrained devices such as tablets and smartphones, and user data cannot be uploaded to the cloud for centralized model training due to concerns over data privacy and security. With the rapid development of edge computing and distributed machine learning, a new distributed machine learning paradigm, federated learning, has been proposed; it enables user devices to collaboratively train machine learning models without exposing their raw data.

However, the limitations of devices in terms of computing, communication, and data resources can significantly affect the training performance of federated learning. On the one hand, because computing and communication capabilities vary widely across devices, user devices may be unwilling to participate in federated learning; even when they do participate, devices with poor performance can prolong training and reduce overall efficiency. On the other hand, the heterogeneity of the data stored on different devices can have a significant impact on model accuracy: for a given task, devices holding more of the data needed for training contribute more to improving model accuracy. Given these two problems, this thesis investigates how to motivate user devices to participate in federated learning training. The work comprises two parts:

1) Considering the large variability in the computing and communication resources of user devices in a federated learning system, the task initiator must select suitable and effective user devices from the surrounding devices to assist training, and must also determine how much of each device's computing and communication resources to use. Moreover, the computing and communication resources available on a device may change during training. To address this problem, we propose an incentive mechanism driven by computing and communication resources based on the Bertrand game; simulation results verify the effectiveness of the proposed mechanism.

2) Considering the heterogeneity of the data stored on user devices in practical federated learning deployments, the model owner faces the problem of motivating user devices to contribute their data resources to federated learning. By analyzing the impact of data resources on energy consumption, we formulate the interaction between the model owner and the user devices over data resources as a Stackelberg game to motivate device participation. Accordingly, a data-resource-driven incentive mechanism based on the Stackelberg game is proposed; simulation results show that the proposed scheme can significantly improve federated learning efficiency.
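To make the second contribution concrete, the following is a minimal illustrative sketch of a leader-follower (Stackelberg) formulation of the kind described above, in which the model owner (leader) announces a reward rate and each device (follower) chooses how much data to contribute. The specific functional forms and the symbols R, d_i, \varepsilon_i, c_i, D_i, and \lambda are assumptions introduced here for illustration and are not taken from the thesis.

\begin{align*}
% Illustrative Stackelberg formulation (assumed functional forms, not the
% thesis's exact model).
% Leader (model owner): choose the reward rate R to trade off the model
% benefit of the total contributed data against the total payment.
\text{(Leader)}\quad
  & \max_{R \ge 0}\;
    U_{\mathrm{owner}}(R)
    = \lambda \log\!\Bigl(1 + \sum_{i=1}^{N} d_i^{*}(R)\Bigr)
      - R \sum_{i=1}^{N} d_i^{*}(R), \\
% Followers (devices): given R, device i chooses its data contribution
% d_i to maximize its reward minus its energy cost, where \varepsilon_i
% is the per-sample energy consumption and c_i the unit energy cost.
\text{(Follower } i\text{)}\quad
  & \max_{0 \le d_i \le D_i}\;
    U_i(d_i; R) = R\, d_i - c_i\, \varepsilon_i\, d_i^{2}, \\
% Interior best response obtained from the first-order condition,
% truncated to the device's data budget D_i.
  & d_i^{*}(R) = \min\!\Bigl\{\frac{R}{2\, c_i\, \varepsilon_i},\; D_i\Bigr\}.
\end{align*}

Under this kind of formulation, the leader anticipates the devices' best responses d_i^{*}(R) when setting the reward rate, which is the standard backward-induction structure of a Stackelberg equilibrium.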